How Data Virtualization Improves Business Agility – Part 3

Optimize staff, infrastructure and integration approach for maximum ROI

While the benefits derived from greater business agility are significant, costs are also an important factor to consider. This is especially true in today's extremely competitive business environment and difficult economic times.

This article, the last in a series of three on how data virtualization delivers business agility, focuses on resource agility.

Parts 1 and 2 addressed business decision agility and time-to-solution agility, respectively.

Resource Agility Is a Key Enabler of Business Agility
In the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, resource agility was identified as the third key element in an enterprise's business agility strategy, along with business decision agility and time-to-solution agility.

Data virtualization directly enables greater resource agility through superior developer productivity, lower infrastructure costs and better optimization of data integration solutions.

These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects.

Superior Developer Productivity Saves Personnel Costs
At 41% of the typical enterprise IT budget, personnel staffing expenses, including salaries, benefits and occupancy, represent the largest category of IT spending according to recently published analyst research. This spending is double that of both software and outsourcing, and two-and-a-half times that of hardware.

These staffing costs are not only high in absolute terms; with data integration efforts often representing half the work in a typical IT development project, data integration developer productivity is also critically important in relative terms.

As described in Part 2 of this series, data virtualization uses a streamlined architecture and development approach. Not only does this improve time-to-solution agility, it also improves developer productivity in several ways.

  • First, data virtualization allows rapid, iterative development of views and data services. The development and deployment time savings associated with this development approach directly translate into lower staffing costs.
  • Second, the typically SQL-based views used in data virtualization are a well-understood IT paradigm, and the IDEs for building these views share common terminology and techniques with the IDEs for the most popular relational databases. The same can be said for data services and popular SOA IDEs. These factors make data virtualization easy for developers to learn and reduce the training costs typically required when adopting new tools.
  • Third, graphically oriented IDEs simplify data virtualization solution development with significant built-in code generation and automatic query optimization. This enables less senior and lower cost development staff to build data integration solutions.
  • Fourth, the views and services built for one application can easily be reused across other applications, as the sketch after this list illustrates. This further increases productivity and reduces staffing resource costs.
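
To make the reuse point concrete, here is a minimal sketch of an application querying a federated view through a standard ODBC interface. It assumes a hypothetical data virtualization server; the DSN, credentials, view name, and columns are illustrative only and do not refer to any specific product.

```python
# Minimal sketch: an application querying a federated view exposed by a
# hypothetical data virtualization server through standard ODBC.
# The DSN, credentials, view name, and columns are illustrative assumptions.
import pyodbc

# The virtualization layer looks like any other relational data source.
conn = pyodbc.connect("DSN=DataVirtServer;UID=report_user;PWD=secret")
cursor = conn.cursor()

# The same view can be reused by BI tools, data services, and applications;
# the virtualization layer federates the underlying sources at query time.
cursor.execute(
    """
    SELECT customer_id, region, SUM(order_total) AS total_sales
    FROM customer_orders_view
    WHERE order_date >= ?
    GROUP BY customer_id, region
    """,
    "2012-01-01",
)

for row in cursor.fetchall():
    print(row.customer_id, row.region, row.total_sales)

conn.close()
```

Because the virtualization layer presents itself as an ordinary relational source, the same view can serve multiple consuming applications without additional integration work.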

Better Asset Leverage Lowers Infrastructure Costs
Large enterprises typically have hundreds, if not thousands, of data sources. While these data assets can be leveraged to provide business decision agility, these returns come at a cost. Each source needs to be efficiently operated and managed and the data effectively governed. These ongoing infrastructure costs typically dwarf initial hardware and software implementation costs.

Traditional data integration approaches, where data is consolidated in data warehouses or marts, add to the overall number of data sources. This necessitates not only greater up-front capital expenditures, but also increased spending for ongoing operations and management. In addition, every new copy of the data introduces an opportunity for inconsistency and lower data quality.

Protecting against these inevitable issues is a non-value-added activity that further diverts critical resources. Finally, more sources equal more complexity. This means large, ongoing investments in coordination and synchronization activities.

These demands consume valuable resources that can be significantly reduced through the use of data virtualization. Because data virtualization requires fewer physical data repositories than traditional data integration approaches, enterprises that use data virtualization lower their capital expenditures as well as their operating, management and governance costs. In fact, many data virtualization users find these infrastructure savings alone can justify their entire investment in data virtualization technology.

Add Data Virtualization to Optimize Your Data Integration Portfolio
As a component of a broad data integration portfolio, data virtualization joins traditional data integration approaches such as data consolidation in data warehouses and marts enabled by ETL, as well as messaging- and replication-based approaches that move data from one location to another.

Each of these approaches has strengths and limitations when addressing various business information needs, data source and consumer technologies, time-to-solution and resource agility requirements.

For example, a data warehouse approach to integration is often deployed when analyzing historical time-series data across multiple dimensions. Data virtualization is typically adopted to support one or more of the five popular data virtualization usage patterns:

  • BI data federation
  • Data warehouse extension
  • Enterprise data virtualization layer
  • Big data integration
  • Cloud data integration

Given the many information needs, integration challenges, and business agility objectives organizations have to juggle, each data integration approach added to the portfolio improves the organization's data integration flexibility and thus optimizes the ability to deliver effective data integration solutions.

With data virtualization in the integration portfolio, the organization can optimally mix and match physical and virtual integration methods based on the distinct requirements of a specific application's information needs, source data characteristics and other critical factors such as time-to-solution, data latency and total cost of ownership.

In addition, data virtualization provides the opportunity to refactor and optimize data models that are distributed across multiple applications and consolidated stores. For example, many enterprises use their BI tool's semantic layer and/or data warehouse schema to manage data definitions and models. Data virtualization provides the option to centralize this key functionality in the data virtualization layer. This can be especially useful when the enterprise has several BI tools and/or multiple warehouses and marts, each with its own schemas and governance.
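
As a hedged illustration of that centralization, the sketch below registers one canonical revenue definition in a hypothetical virtualization layer instead of duplicating it in each BI tool's semantic layer. Every object name and the plain CREATE VIEW syntax are assumptions for illustration, not any particular vendor's API.

```python
# Sketch: centralizing one shared data definition in the virtualization layer
# rather than duplicating it in each BI tool's semantic layer.
# All object names and the plain CREATE VIEW syntax are illustrative assumptions.
import pyodbc

conn = pyodbc.connect("DSN=DataVirtServer;UID=admin_user;PWD=secret")
cursor = conn.cursor()

# One canonical definition of "revenue by product line", joining a warehouse
# fact table with a departmental mart, governed in a single place.
cursor.execute(
    """
    CREATE VIEW canonical_revenue AS
    SELECT p.product_line,
           f.fiscal_quarter,
           SUM(f.net_revenue) AS net_revenue
    FROM warehouse.sales_fact AS f
    JOIN mart.product_dim AS p ON f.product_key = p.product_key
    GROUP BY p.product_line, f.fiscal_quarter
    """
)
conn.commit()
conn.close()
```

Any BI tool or application that can issue SQL then reports against the one canonical view, so a change to the definition is made once in the virtualization layer rather than in every downstream schema.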

Conclusion
Data virtualization's streamlined architecture and development approach significantly improve developer productivity. Further, data virtualization requires fewer physical data repositories than traditional data integration approaches. This means that data virtualization users lower their capital expenditures as well as their operating, management and governance costs. Finally, adding data virtualization to the integration portfolio enables the optimization of physical and virtual integration methods.

These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects in the pursuit of business agility.

•   •   •

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
