By Robert Eve
January 1, 2012
Business Agility Requires Multiple Approaches
Agile businesses create business agility through a combination of business decision agility, time-to-solution agility and resource agility.
Time-To-Solution Agility = Business Value
When responding to new information needs, rapid time-to-solution is critically important and often results in significant bottom-line benefits.
Substantial time-to-solution improvements have been proven time and again across multiple industries, as shown in the ten case studies described in the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.
Consider This Example: If the business wants to enter a new market, it must first financially justify the investment, including any new IT requirements. Thus, only the highest ROI projects are approved and funded. Once the effort is approved, accelerating delivery of the IT solution also accelerates realization of the business benefits and ROI.
Therefore, if incremental revenues from the new market are $2 million per month, then the business will gain an additional $2 million for every month IT can save in time needed to deliver the solution.
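The arithmetic behind this example can be made concrete with a small sketch. The revenue figure is the hypothetical one from the example above; the function simply brings forward one month of incremental revenue per month of delivery time saved.

```python
# Hypothetical figure from the example above: a new market generating
# $2M in incremental revenue per month once the IT solution is live.
MONTHLY_REVENUE = 2_000_000  # incremental revenue per month, in dollars

def acceleration_benefit(months_saved: int) -> int:
    """Revenue realized earlier when IT delivers the solution sooner."""
    return MONTHLY_REVENUE * months_saved

# Delivering three months earlier brings forward $6M of revenue.
print(acceleration_benefit(3))  # 6000000
```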
Streamlined Approach to Data Integration
Data virtualization is significantly more agile and responsive than traditional data consolidation and ETL-based integration approaches because it uses a highly streamlined architecture and development process to build and deploy data integration solutions.
This approach greatly reduces complexity and reduces or eliminates the need for data replication and data movement. As numerous data virtualization case studies demonstrate, this elegance of design and architecture makes it far easier and faster to develop and deploy data integration solutions using a data virtualization platform. The ultimate result is faster realization of business benefits.
To better understand the difference, let's contrast these methods. In both the traditional data warehouse/ETL approach and data virtualization, understanding the information requirements and reporting schema is the common first step.
Traditional Data Integration Has Many Moving Parts
Using the traditional approach, IT then models and implements the data warehouse schema. ETL development follows to create the links between the sources and the warehouse. Finally, the ETL scripts are run to populate the warehouse. The metadata, data models/schemas and development tools used are unique to each activity.
This diverse environment of different metadata, data models/schemas and development tools is not only complex but also results in the need to coordinate and synchronize efforts and objects across them.
Experienced BI and data integration practitioners readily acknowledge the long development times that result from this complexity, as does Forrester Research in its 2011 report Data Virtualization Reaches Critical Mass:
"Extract, transform, and load (ETL) approaches require one or more copies of data staged along the physical integration process flow. Creating, storing, and manipulating these copies can be complex and error prone."
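The "one or more copies of data" Forrester describes can be illustrated with a minimal, hypothetical sketch of the traditional ETL flow: each step in the pipeline produces another physical copy of the data that must be stored, scheduled and kept in sync. The schema and field names here are invented for illustration only.

```python
# Hypothetical sketch of a traditional ETL flow: every stage yields
# another physical copy of the data.
def extract(source_rows):
    """Copy rows out of the source system into a staging area."""
    return list(source_rows)                      # copy #1: staging

def transform(staged_rows):
    """Reshape staged rows into the warehouse schema."""
    return [{"customer": r["cust_name"].strip().title(),
             "revenue": r["rev"]}
            for r in staged_rows]                 # copy #2: transformed set

def load(warehouse, transformed_rows):
    """Append transformed rows into the warehouse table."""
    warehouse.extend(transformed_rows)            # copy #3: warehouse

warehouse = []
source = [{"cust_name": " acme corp ", "rev": 1200}]
load(warehouse, transform(extract(source)))
print(warehouse)  # [{'customer': 'Acme Corp', 'revenue': 1200}]
```

Each of the three stages typically has its own metadata, schema and tooling, which is exactly the coordination burden described above.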
Data Virtualization Has Fewer Moving Parts
Data virtualization uses a more streamlined architecture that simplifies development. Once the information requirements and reporting schema are understood, the next step is to develop the objects (views and data services) used to both model and query the required data.
These virtual equivalents of the warehouse schema and ETL routines and scripts are created within a single view or data service object using a unified data virtualization development environment. This approach leverages the same metadata, data models/schemas and tools.
Not only is it easier to build the data integration layer using data virtualization, but there are also fewer "moving parts," which reduces the need for coordination and synchronization activities. With data virtualization, there is no need to physically migrate data from the sources to a warehouse. The only data that is moved is the data delivered directly from the source to the consumer on-demand. These result sets persist in the data virtualization server's memory for only a short interval.
Avoiding data warehouse loads, reloads and updates further simplifies and streamlines solution deployment and thereby improves time-to-solution agility.
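By contrast, the data virtualization pattern described above can be sketched as a single view object that joins the sources at query time. This is an illustrative stand-in, not a real data virtualization API: the in-memory dictionaries substitute for live source systems, and the result set exists only for the duration of the query, with nothing replicated into a warehouse.

```python
# Hypothetical sketch of a virtual view: sources are queried on demand
# and the combined result set lives only in memory.
ORDERS = [{"cust_id": 1, "amount": 500}]   # stand-in for source system A
CUSTOMERS = {1: "Acme Corp"}               # stand-in for source system B

def customer_orders_view():
    """Virtual view: join the sources at query time, return a result set.

    No staging area, no warehouse load -- the only data "moved" is the
    result set delivered to the consumer.
    """
    return [{"customer": CUSTOMERS[o["cust_id"]], "amount": o["amount"]}
            for o in ORDERS]

print(customer_orders_view())  # [{'customer': 'Acme Corp', 'amount': 500}]
```

Because the view is just a query definition, changing it is a matter of editing one object rather than rebuilding a warehouse schema and its ETL scripts.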
Iterative Development Process Is Better for Business Users
Another way data virtualization improves time-to-solution agility is through support for a fast, iterative development approach. Here, business users and IT collaborate to quickly define the initial solution requirements followed by an iterative "develop, get feedback and refine" process until the solution meets the user need.
Most users prefer this type of development process. Because building views of existing data is simple and fast, IT can provide business users with prospective versions of new data sets in just a few hours, rather than making them wait months while detailed solution requirements are developed. Business users can then react to these data sets and refine their requirements based on tangible insights, and IT can adjust the views and show the refined data sets to the users.
This iterative development approach enables the business and IT to home in on and deliver the needed information much faster than traditional integration methods.
Even in cases where a data warehouse solution is mandated by specific analytic needs, data virtualization can be used to support rapid prototyping of the solution. The initial solution is built using data virtualization's iterative development approach, with migration to the data warehouse approach once the business is fully satisfied with the information delivered.
In contrast, developing a new information solution using traditional data integration architecture is inherently more complex. Typically, business users must fully and accurately specify their information requirements prior to any development, with little change tolerated. Not only does the development process take longer, but there is a real risk that the resulting solution will not be what the users actually need and want.
Data virtualization offers significant value, and the opportunity to reduce risk and cost, by enabling IT to quickly deliver iterative results that enable users to truly understand what their real information needs are and get a solution that meets those needs.
Ease of Data Virtualization Change Keeps Pace with Business Change
The third way data virtualization improves time-to-solution agility is ease of change. Information needs evolve. So do the associated source systems and consuming applications. Data virtualization allows a more loosely coupled architecture between sources, consumers and the data virtualization objects and middleware that integrate them.
This level of independence makes it significantly easier to extend and adapt existing data virtualization solutions as business requirements or associated source and consumer system implementations change. In fact, changing an existing view, adding a new source or migrating from one source to another is often completed in hours or days, versus weeks or months in the traditional approach.
Data virtualization reduces complexity, data replication and data movement. Business users and IT collaborate to quickly define the initial solution requirements, followed by an iterative "develop, get feedback and refine" delivery process. Further, independent layers make it significantly easier to extend and adapt existing data virtualization solutions as business requirements or associated source and consumer system implementations change.
These time-to-solution accelerators, as numerous data virtualization case studies demonstrate, make it far easier and faster to develop and deploy data integration solutions using a data virtualization platform than other approaches. The result is faster realization of business benefits.
Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.