Maximize Your Return on Data: The New Business Imperative

Part 1: Lower the cost and complexity of sustaining your business applications

Application owners and senior IT leaders allocate a certain percentage of their budget to sustaining, enhancing, and transforming their applications. In most organizations, the largest share of the IT budget goes to sustaining applications, essentially "keep the lights on" activities, which leaves little money to enhance them in support of business agility and new business requirements, or to transform them to leapfrog the competition. Organizations that implement a solid data management strategy in support of their applications can maximize the return on their application investments. They can lower the cost of sustaining their applications, releasing budget to do what they should be doing: supporting the needs of the business.

Innovative CIOs are rethinking their application environments, shifting focus from the application itself (the code, middleware, infrastructure, and so on) to the underlying data that supports it. This new focus lets them strip cost and complexity out of sustaining their applications, do a better job of enhancing them, and do new things within their applications that they couldn't do before, giving them the competitive edge they are looking for.

How to Lower the Cost and Complexity of Sustaining Your Business Applications
Of all the things that keep organizations from realizing the best possible return on their data assets, enterprise application environments sit at the very top of the list. Face it: application environments today are large, complex, and generally inflexible constructs. Most companies have dozens of key applications supporting countless business processes. Not only are the data volumes huge and the data types numerous, but the data is often duplicated and the applications redundant across business units. Moreover, the data is frequently hard to get to, the applications are difficult to integrate, and the quality of the data is often questionable. No wonder it is such a challenge to provide a single, comprehensive view of the critical data the business uses every day.

This series of articles is aimed at helping organizations maximize the value of their data and applications, by moving beyond merely sustaining applications to enhancing them to support business agility and to transforming them to drive business innovation and growth. Because many organizations spend far too much of their IT budget on sustaining applications, it is important to first discover ways to lower the costs and complexity, freeing up budget and resources for innovation.

Cutting the Costs of "Keeping the Lights On"
There is a common set of challenges that most companies face around sustaining applications. These include:

  • Application Bloat - Whether the result of mergers and acquisitions, or business units going off and buying their own applications, many companies are rife with redundant applications that soak up maintenance time and money.
  • Data Sprawl - Companies frequently experience diminished application performance as the amount of data within the application grows. This in turn makes it difficult to meet SLAs and forces the purchase of additional hardware, leading to further costs.
  • Proliferating Integration Interfaces - Integrating siloed applications is a major challenge, and the number and complexity of the interfaces required drives costs up further.
  • Security and Privacy - Finally, there are the efforts and costs involved with securing the data in non-production applications, and the specter of fines should you fail to meet regulatory requirements around information privacy.

There are a number of initiatives that companies typically pursue to try to reduce the costs and efforts of sustaining applications. These include rationalizing the application portfolio by sending duplicative or inactive applications to the "application retirement home," archiving inactive data to improve application performance, masking sensitive data to meet security and privacy requirements, and finding ways to reduce the costs and complexity around integrating applications. The hitch is that organizations are trying to do all these things on a fixed budget and with a finite set of resources. Hence these initiatives have to be pursued very intelligently, making use of the best possible technologies to yield the greatest return on effort.

Getting the Most from Application Retirement
Application rationalization can pay major dividends, and a good many organizations are pursuing it. According to Gartner, half of all applications running in data centers as of 2010 will be retired by 2020. If that prediction holds, it represents enormous savings.

What does it take to retire redundant or obsolete applications and still provide seamless access to the archived data? Just because an application has outlived its usefulness doesn't mean its data has, and for certain kinds of data, regulatory mandates demand that it be kept for years. Hence, when an application is retired, it is increasingly necessary to archive all of its data, across all of the data's sources.

Here is what has to happen, in five steps, to support successful application retirement (a minimal code sketch of steps 3 and 4 follows the list):

  1. Mine the source metadata from the legacy application - You want to archive complete business entities, not just the transactional data but also master and reference data, and metadata.
  2. Extract and move the data - You want the ability to extract, move, and archive any data, including documents, attachments, images, and audio files associated with application and database records.
  3. Compress, secure and lock down the archived data - You need to place it into a secure, highly compressed, immutable file for later retrieval.
  4. Define and enforce retention policies - To ensure compliance, you need to be able to assign retention policies to different classes of archived data, apply legal holds to certain data, and so on; and to reduce the cost of managing compliance, you want the ability to automatically purge expired records on a scheduled basis.
  5. Provide easy search access - You need to provide easy search and discovery access to archived data from any BI/reporting tool, such as Crystal Reports, MicroStrategy, or Business Objects, and to maintain access to archived data in database instances from existing application interfaces.
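
To make steps 3 and 4 concrete, here is a minimal Python sketch of the compress-and-retain flow. Every name in it (ArchiveRecord, write_archive, purge_expired, the seven-year default) is illustrative, not any vendor's API:

```python
# A minimal sketch of steps 3 and 4: compress/lock down, then enforce
# retention. All names here are invented for illustration.
import gzip
import json
import time
from dataclasses import asdict, dataclass, field

SECONDS_PER_YEAR = 365 * 24 * 3600

@dataclass
class ArchiveRecord:
    entity: str               # complete business entity, e.g. "purchase_order"
    payload: dict             # transactional data plus master/reference data
    archived_at: float = field(default_factory=time.time)
    retention_years: int = 7  # assigned per record class by policy (step 4)
    legal_hold: bool = False  # legal holds override normal expiry

def write_archive(records, path):
    """Step 3: write records to a compressed file, then treat the file
    as immutable (e.g. make it read-only) for later retrieval."""
    with gzip.open(path, "wt", encoding="utf-8") as f:
        for r in records:
            f.write(json.dumps(asdict(r)) + "\n")

def purge_expired(records, now=None):
    """Step 4: drop records past their retention window, unless a
    legal hold applies."""
    now = now or time.time()
    return [
        r for r in records
        if r.legal_hold
        or now - r.archived_at < r.retention_years * SECONDS_PER_YEAR
    ]
```

In a real archive, the compressed file would also be checksummed and write-protected, and purge runs would be scheduled rather than invoked by hand.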

Improving Application Performance Through Archiving
The same steps, and same archiving technologies, also apply to archiving inactive data from live applications in order to improve their performance and reduce their TCO. This can take several forms, including archiving inactive data to an archive database in order to benefit from faster application response times, or archiving to an Optimized File Archive to effect substantial storage and infrastructure savings.
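
As a simple illustration of archiving from a live application, the sketch below moves inactive rows from a live table to an archive table in a single transaction. The schema (orders, orders_archive, order_date) is invented for the example:

```python
# Illustrative only: relocate rows older than a cutoff from the live
# table to an archive table, copy-then-delete in one transaction.
import sqlite3

def archive_inactive_orders(conn: sqlite3.Connection, cutoff_date: str) -> int:
    """Move orders older than cutoff_date (ISO format, e.g. '2010-01-01')
    into orders_archive; returns the number of rows archived."""
    with conn:  # the with-block commits on success, rolls back on error
        conn.execute(
            "INSERT INTO orders_archive SELECT * FROM orders WHERE order_date < ?",
            (cutoff_date,),
        )
        cur = conn.execute(
            "DELETE FROM orders WHERE order_date < ?", (cutoff_date,)
        )
    return cur.rowcount

# Shrinking the live table this way is what restores query and backup
# speed; the archived rows remain queryable in orders_archive.
```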

Importantly, a truly universal data archiving solution is strongly recommended, not only to support both application retirement and archiving from live applications, but also to ensure that you are able to leverage a single solution to address the archiving needs of all enterprise applications and databases, present and future.

Sub-setting and Masking Data in Non-Production Environments
The use of real data sets in development and test environments is widespread, and for good reason. Frequently, though, this data is confidential or sensitive and subject to compliance requirements, and the costs of not protecting it far outweigh the costs of doing so. Nevertheless, you still need to control data management costs. Hence, when it comes to managing all the data in a test environment, you want the ability to do two things (both are sketched in code after the list):

  • Optimize performance and control costs by data sub-setting - Instead of using full sets of production data in test, you want the ability to create a functionally intact subset of the data, keeping only the data required by your business policies while maintaining all referential integrity. By working with a smaller set of data, you can shorten development cycles and reduce storage costs and the use of system resources.
  • Support compliance through data masking - By masking production data, you obfuscate Personally Identifiable Information and other sensitive data while preserving the data's usefulness in development and test activities.
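
Here is a minimal sketch of both ideas, using invented table layouts and a deterministic hash-based mask so that masked values still join consistently across tables:

```python
# Sketch of both bullets with invented schemas: subset the customer
# table, keep only orders that reference the kept customers
# (referential integrity), and mask PII on the way out.
import hashlib

def mask_email(email: str) -> str:
    """Deterministic mask: the same input always yields the same token,
    so masked values still join consistently across tables."""
    token = hashlib.sha256(email.encode("utf-8")).hexdigest()[:12]
    return f"user_{token}@example.com"

def subset_and_mask(customers, orders, keep_fraction=0.10):
    """customers/orders are lists of dicts; keep a fraction of the
    customers and only the orders belonging to them."""
    kept = customers[: max(1, int(len(customers) * keep_fraction))]
    kept_ids = {c["id"] for c in kept}
    masked = [{**c, "email": mask_email(c["email"])} for c in kept]
    related_orders = [o for o in orders if o["customer_id"] in kept_ids]
    return masked, related_orders
```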

In terms of flexibly protecting data privacy and confidentiality, Dynamic Data Masking technology can take you even further by providing real-time, preventive capabilities. With this technology, flexible protection rules enable different kinds of masks to be applied dynamically to different kinds of data based on user privilege levels, so you can engage in policy-based, selective masking and blocking of production data.
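
A rule engine of that kind can be sketched in a few lines. The rule table, role names, and masking actions below are all hypothetical; the point is simply that the same row is masked differently per privilege level at read time:

```python
# Hypothetical rule-based dynamic masking: the same data is masked
# differently depending on the caller's privilege level.
FULL, PARTIAL, NONE = "full", "partial", "none"

RULES = {  # column -> role -> masking action (all names invented)
    "ssn":   {"admin": NONE, "analyst": PARTIAL, "guest": FULL},
    "email": {"admin": NONE, "analyst": NONE,    "guest": PARTIAL},
}

def apply_mask(value: str, action: str) -> str:
    if action == FULL:
        return "*" * len(value)
    if action == PARTIAL:
        return value[:2] + "*" * max(0, len(value) - 2)
    return value  # NONE: pass through unmasked

def mask_row(row: dict, role: str) -> dict:
    """Apply the policy to every column; unlisted columns pass through."""
    return {
        col: apply_mask(str(val), RULES.get(col, {}).get(role, NONE))
        for col, val in row.items()
    }

# mask_row({"ssn": "123-45-6789", "email": "ada@example.com"}, "guest")
# -> {"ssn": "***********", "email": "ad*************"}
```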

Reducing the Costs of Integrating Applications
For many organizations, much of the cost of keeping the IT lights on revolves around maintaining the "integration hairball": the intricate web of point-to-point interfaces between applications. According to Forrester Research, 87% of respondents to a recent IT survey indicated that they rely on hand coding for integration, and 75% of those admit that writing code for each integration effort leads to increased maintenance costs.

Another cost factor is the use of disparate integration tools, which means there is no standard methodology and little economy of scale, not to mention the occasional difficulty of finding people trained in a particular tool.

The way to substantially reduce the costs and complexity of integrating applications is to implement - and preferably, standardize on - a unified data integration platform with universal connectivity to data sources and targets, combined with the ability to access, transform, and integrate any data type, i.e., structured, unstructured, or semi-structured. To be fully useful, the platform also needs to support the full breadth of data latency requirements found in today's enterprises: batch, real-time, and change data capture.

Importantly, a platform approach to integration lets you leverage a codeless development environment, so that custom-coded point-to-point interfaces and their expensive maintenance requirements become a thing of the past. Instead, development teams can use drag-and-drop development tools, coupled with extensive reuse and sharing of objects such as data mappings and transformations across projects, to speed development cycles and dramatically cut overall data integration costs.
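
As a toy illustration of that reuse point (not any particular vendor's tool), a mapping can be declared once as data and compiled into a reusable transformer, instead of hand-coding each point-to-point interface; the field names below are invented:

```python
# Toy illustration of mapping reuse: a mapping is declared once as
# data and compiled into a reusable record transformer.
from typing import Any, Callable, Dict, Tuple

def make_mapping(spec: Dict[str, Tuple[str, Callable[[Any], Any]]]):
    """Compile {source_field: (target_field, transform)} into a function."""
    def apply(record: dict) -> dict:
        return {target: fn(record[source])
                for source, (target, fn) in spec.items()}
    return apply

# Declared once, shared by every pipeline that moves customer records.
customer_mapping = make_mapping({
    "cust_nm":  ("name",  str.title),
    "cust_eml": ("email", str.lower),
})

# customer_mapping({"cust_nm": "ADA LOVELACE", "cust_eml": "ADA@EXAMPLE.COM"})
# -> {"name": "Ada Lovelace", "email": "ada@example.com"}
```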

Moving Forward Towards Enhancing Applications
The actions prescribed above have been proven to radically reduce the costs of sustaining applications, so that more resources can be applied to enhancing them and to driving innovation.

More Stories By Adam Wilson

Adam Wilson is the General Manager for Informatica’s Information Lifecycle Management Business Unit. Prior to assuming this role, he was in charge of product definition and go-to-market strategy for Informatica’s award-winning enterprise data integration platform. Mr. Wilson holds an MBA from the Kellogg School of Management and an engineering degree from Northwestern University. He can be reached at [email protected] or followed on Twitter at @a_adam_wilson.
