
Containers Expo Blog: Article

How Data Virtualization Improves Business Agility – Part 3

Optimize staff, infrastructure and integration approach for maximum ROI

While the benefits derived from greater business agility are significant, costs are also an important factor to consider. This is especially true in today's extremely competitive business environment and difficult economic times.

This article, the last in a series of three on how data virtualization delivers business agility, focuses on resource agility. Parts 1 and 2 addressed business decision agility and time-to-solution agility.

Resource Agility Is a Key Enabler of Business Agility
In the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, resource agility was identified as the third key element in an enterprise's business agility strategy, along with business decision agility and time-to-solution agility.

Data virtualization directly enables greater resource agility through superior developer productivity, lower infrastructure costs and better optimization of data integration solutions.

These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects.

Superior Developer Productivity Saves Personnel Costs
At 41% of the typical enterprise IT budget, personnel staffing expenses, including salaries, benefits and occupancy, represent the largest category of IT spending according to recently published analyst research. This spending is double that of both software and outsourcing, and two-and-a-half times that of hardware.

These staffing costs are high in absolute terms. And because data integration efforts often represent half the work in a typical IT development project, developer productivity in data integration is critically important in relative terms as well.

As described in Part 2 of this series, data virtualization uses a streamlined architecture and development approach. Not only does this improve time-to-solution agility, it also improves developer productivity in several ways.

  • First, data virtualization allows rapid, iterative development of views and data services. The development and deployment time savings associated with this development approach directly translate into lower staffing costs.
  • Second, the SQL-based views typically used in data virtualization are a well-understood IT paradigm. The IDEs for building these views share common terminology and techniques with the IDEs for the most popular relational databases, and the same can be said for data services and popular SOA IDEs. These factors make data virtualization easy for developers to learn and reduce the training costs typically required when adopting new tools.
  • Third, graphically oriented IDEs simplify data virtualization solution development with significant built-in code generation and automatic query optimization. This enables less senior and lower cost development staff to build data integration solutions.
  • Fourth, the views and services built for one application can easily be reused across other applications. This further increases productivity and reduces staffing resource costs.
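The view-based approach described above can be sketched in plain SQL. The example below uses Python's built-in sqlite3 module as a stand-in for a data virtualization server, and all table, column, and view names are hypothetical: a single view joins two "source" tables so that any consumer queries integrated data without a physical copy being created.

```python
import sqlite3

# An in-memory database stands in for a data virtualization server;
# the two tables stand in for two independent physical sources.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE crm_customers (cust_id INTEGER, name TEXT)")
conn.execute("CREATE TABLE erp_orders (cust_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO crm_customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO erp_orders VALUES (?, ?)",
                 [(1, 250.0), (1, 100.0), (2, 75.0)])

# The "virtual" integration layer: a view rather than a copied data mart.
conn.execute("""
    CREATE VIEW customer_order_totals AS
    SELECT c.name, SUM(o.amount) AS total
    FROM crm_customers c JOIN erp_orders o ON c.cust_id = o.cust_id
    GROUP BY c.name
""")

# Any consumer (BI tool, data service, application) queries the view.
rows = conn.execute(
    "SELECT name, total FROM customer_order_totals ORDER BY name").fetchall()
print(rows)  # [('Acme', 350.0), ('Globex', 75.0)]
```

Because the view is just a stored query, it can be revised and redeployed iteratively, and the same view can be reused by multiple consuming applications, which is the reuse point made in the list above.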

Better Asset Leverage Lowers Infrastructure Costs
Large enterprises typically have hundreds, if not thousands, of data sources. While these data assets can be leveraged to provide business decision agility, these returns come at a cost. Each source needs to be efficiently operated and managed and the data effectively governed. These ongoing infrastructure costs typically dwarf initial hardware and software implementation costs.

Traditional data integration approaches, where data is consolidated in data warehouses or marts, add to the overall number of data sources. This necessitates not only greater up-front capital expenditures, but also increased spending for ongoing operations and management. In addition, every new copy of the data introduces an opportunity for inconsistency and lower data quality.

Protecting against these inevitable issues is a non-value-added activity that further diverts critical resources. Finally, more sources equal more complexity. This means large, ongoing investments in coordination and synchronization activities.

These demands consume valuable resources that can be significantly reduced through the use of data virtualization. Because data virtualization requires fewer physical data repositories than traditional data integration approaches, enterprises that use data virtualization lower their capital expenditures as well as their operating, management and governance costs. In fact, many data virtualization users find these infrastructure savings alone can justify their entire investment in data virtualization technology.

Add Data Virtualization to Optimize Your Data Integration Portfolio
As a component of a broad data integration portfolio, data virtualization joins traditional approaches such as data consolidation in warehouses and marts enabled by ETL, as well as messaging- and replication-based approaches that move data from one location to another.

Each of these approaches has strengths and limitations when addressing various business information needs, data source and consumer technologies, time-to-solution and resource agility requirements.

For example, a data warehouse approach to integration is often deployed when analyzing historical time-series data across multiple dimensions. Data virtualization is typically adopted to support one or more of the five popular data virtualization usage patterns:

  • BI data federation
  • Data warehouse extension
  • Enterprise data virtualization layer
  • Big data integration
  • Cloud data integration

Given the many information needs, integration challenges, and business agility objectives organizations must juggle, each data integration approach added to the portfolio improves the organization's flexibility and thus its ability to deliver effective data integration solutions.

With data virtualization in the integration portfolio, the organization can optimally mix and match physical and virtual integration methods based on the distinct requirements of a specific application's information needs, source data characteristics and other critical factors such as time-to-solution, data latency and total cost of ownership.

In addition, data virtualization provides the opportunity to refactor and optimize data models that are distributed across multiple applications and consolidated stores. For example, many enterprises use their BI tool's semantic layer and/or data warehouse schema to manage data definitions and models. Data virtualization provides the option to centralize this key functionality in the data virtualization layer. This can be especially useful when the enterprise has several BI tools and/or multiple warehouses and marts, each with its own schemas and governance.
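To illustrate centralizing a data definition in the virtualization layer, the hypothetical sketch below (again using sqlite3 as a stand-in, with invented names and an assumed 90-day business rule) defines "active customer" once as a view, so every consuming tool inherits the same rule instead of each BI semantic layer re-implementing it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (cust_id INTEGER, last_order_days INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, 10), (2, 400), (3, 89)])

# One shared definition of "active customer" (ordered within 90 days),
# maintained centrally rather than separately in each BI tool's schema.
conn.execute("""
    CREATE VIEW active_customers AS
    SELECT cust_id FROM customers WHERE last_order_days < 90
""")

# Two different "consumers" get consistent answers from one definition.
report_a = [r[0] for r in conn.execute(
    "SELECT cust_id FROM active_customers ORDER BY cust_id")]
report_b = conn.execute(
    "SELECT COUNT(*) FROM active_customers").fetchone()[0]
print(report_a, report_b)  # [1, 3] 2
```

If the business rule changes (say, 60 days instead of 90), only the one view definition is updated, which is the governance benefit of centralizing the model.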

Conclusion
Data virtualization's streamlined architecture and development approach significantly improve developer productivity. Further, data virtualization requires fewer physical data repositories than traditional data integration approaches, so data virtualization users lower their capital expenditures as well as their operating, management and governance costs. Finally, adding data virtualization to the integration portfolio enables the optimization of physical and virtual integration methods.

These factors combine to provide significant cost savings that can be applied flexibly to fund additional data integration activities and/or other business and IT projects in the pursuit of business agility.

•   •   •

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
