How Data Virtualization Delivers Business Agility – Part 1

Make decisions faster based on complete, high-quality actionable information

Enterprise adoption of data virtualization has accelerated along with a growing need for greater business agility.

The close relationship of business agility and data virtualization was described in my recent Virtualization Magazine article, The Agile Business - Why Data Virtualization Is Needed.

It can also be observed across hundreds of organizations and is clearly evident in the ten case studies described in the recently published Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.

Three Elements of Business Agility
In their quest to become agile businesses, these organizations address all three elements of business agility: business decision agility, time-to-solution agility and resource agility.

This article addresses how data virtualization delivers business decision agility. Part 2 and Part 3 will address time-to-solution agility and resource agility.

Agile Business Decisions Are Key
Making effective business decisions requires knowledge and insight that can only be developed from access to and analysis of complete, high-quality actionable information. Data virtualization enables the organization to deliver this information in several ways.

Complete Information
Understanding the complete picture is the first step in any decision process. Large enterprises today have thousands of data sources that span multiple transaction systems of record, complementary applications, consolidated data stores, external data sources and more. Each source is a silo with its own unique metadata and data model, data access toolset, and underlying architecture.

The challenge is to integrate data across these traditional silos in order to provide the business user with a single, complete and high-level view of whatever information is needed for analysis and decision making.

Taking this concept to the enterprise level, data virtualization can provide an organization with a unified view of information across the entire business.

One traditional solution is to consolidate all of the data in a unified, enterprise data warehouse. However, this does not always prove feasible in practice for a number of reasons, including the ongoing proliferation of new data sources and types that must be incorporated into the warehouse.

As an alternative, data virtualization offers virtual data federation functions that enable an organization to integrate its extensive range of internal and external data sources without moving any data. Accessing the data in a new source, for example, simply requires establishing a single connection from the data virtualization layer to the source, and creating virtual views and data services that access, transform, abstract and represent the source data in an appropriate format for consuming applications.
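To make the federation idea concrete, here is a minimal sketch in Python. It joins two stand-in silos (an in-memory "CRM" and "ERP") at query time through a single virtual view; the table names, schemas and sample values are invented for illustration and do not represent any particular data virtualization product's API.

```python
# Illustrative sketch only: a toy "virtual view" that joins two independent
# source silos at query time, without copying data into a warehouse.
# The schemas, table names and values are hypothetical.
import sqlite3

# Stand-ins for two separate silos: a CRM system and an ERP system.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'Acme Corp'), (2, 'Globex')")

erp = sqlite3.connect(":memory:")
erp.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
erp.execute("INSERT INTO orders VALUES (1, 1200.0), (1, 300.0), (2, 75.0)")


def customer_order_view():
    """A 'virtual view': federate both silos on demand and return one
    unified result; nothing is replicated or persisted."""
    totals = dict(
        erp.execute("SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id")
    )
    return [
        {"customer_id": cid, "name": name, "open_order_total": totals.get(cid, 0.0)}
        for cid, name in crm.execute("SELECT customer_id, name FROM customers")
    ]


print(customer_order_view())
```

Because the view is resolved only when queried, adding another source is a matter of registering one more connection and extending the view definition, rather than building a new load process.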

High-Quality Information
Ensuring that the information guiding business decisions is high quality and fit for purpose is the second major requirement for business decision agility.

Raw, as-is data in the original source systems is rarely an exact match for what the consuming application and business user need. At a minimum, some degree of format and syntax transformation is required to bridge the gap between source and consumer data models and technologies. In many cases, additional validation and standards-conformance processing is also needed to improve the integrity and consistency of the data delivered to consumers.

Data virtualization supports multiple techniques to ensure delivery of high-quality information. For example, to provide risk managers in the financial services industry with up-to-the-minute, institution-wide views of equity, option, futures, derivative and debt positions, data virtualization transforms the data as structured in the various trading platform sources into a consistent form and format for consumption by the risk management applications.
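The sketch below illustrates the kind of transformation and validation work this involves: position records from two hypothetical trading platforms, each with its own field names and conventions, are mapped into one consistent shape and checked before a risk application consumes them. The field names and rules are invented for illustration.

```python
# Illustrative sketch only: normalize position records from two hypothetical
# trading platforms into one consistent shape for a risk application.
def from_equity_platform(rec):
    """Platform A reports notional in thousands and uses 'ticker'."""
    return {
        "instrument": rec["ticker"],
        "asset_class": "equity",
        "notional_usd": rec["notional_k"] * 1_000,
    }


def from_derivatives_platform(rec):
    """Platform B already reports USD but uses different key names."""
    return {
        "instrument": rec["underlying"],
        "asset_class": rec["product_type"],
        "notional_usd": rec["usd_value"],
    }


def validate(position):
    """Reject records that would corrupt the risk view downstream."""
    if position["notional_usd"] < 0 or not position["instrument"]:
        raise ValueError(f"bad position record: {position}")
    return position


positions = [
    validate(from_equity_platform({"ticker": "XYZ", "notional_k": 250})),
    validate(from_derivatives_platform(
        {"underlying": "XYZ", "product_type": "option", "usd_value": 40_000})),
]
print(positions)
```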

Actionable Information
Finally, the information decision makers require must be actionable. Time-to-action is additive: it combines the time from event-to-insight, insight-to-decision, decision-to-implementation and implementation-to-results.
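A trivial worked example makes the additive point: shaving hours off the first stage shortens the whole chain. The hour values below are made up purely to illustrate the arithmetic.

```python
# Simple arithmetic only: time-to-action is the sum of its stages.
# The hour values are invented for illustration.
stages = {
    "event_to_insight": 2.0,
    "insight_to_decision": 1.0,
    "decision_to_implementation": 4.0,
    "implementation_to_results": 8.0,
}
time_to_action = sum(stages.values())
print(f"time-to-action: {time_to_action} hours")  # 15.0 hours
```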

Yesterday's data, summarized in the warehouse, is not sufficient when having the most current information is the first step in a time-to-action path. For example, understanding the current location and availability of maintenance staff and repair gear is a critical first step in understanding how to respond to an equipment failure in process industries.

Data virtualization provides high-performance query engines and flexible caching to query and deliver source data in near-real time whenever it is requested. This ensures that, when appropriate, decisions are based on the most up-to-date information available.
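As a rough illustration of the caching side of that trade-off, the sketch below puts a time-to-live cache in front of a live source query, so repeated requests stay fast while the data stays close to current. The source function, cache key and 30-second TTL are hypothetical choices, not any product's actual engine.

```python
# Illustrative sketch only: a time-to-live cache in front of a source query.
import time

_cache = {}               # query key -> (timestamp, result)
CACHE_TTL_SECONDS = 30    # assumed freshness window for this example


def query_source(key):
    """Stand-in for a live query against a source system."""
    return {"key": key, "fetched_at": time.time()}


def cached_query(key):
    """Serve from cache if the entry is fresh enough; otherwise re-query."""
    now = time.time()
    if key in _cache and now - _cache[key][0] < CACHE_TTL_SECONDS:
        return _cache[key][1]
    result = query_source(key)
    _cache[key] = (now, result)
    return result


print(cached_query("equipment_status"))  # hits the source
print(cached_query("equipment_status"))  # served from cache within 30 seconds
```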

Conclusion
Making effective business decisions requires knowledge and insight that can only be developed from access to and analysis of complete, high-quality actionable information.

Data virtualization offers virtual data federation functions that enable an organization to integrate its extensive range of internal and external data sources to provide the business with a complete picture. Further, data virtualization supports multiple techniques to ensure delivery of high-quality information. Finally, data virtualization can query and deliver source data in near-real time whenever it is requested to ensure up-to-date, actionable information.

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization.  This series of three articles on How Data Virtualization Delivers Business Agility includes excerpts from the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
