Maximize Your Return on Data: The New Business Imperative

Part 1: Lower the cost and complexity of sustaining your business applications

Application owners and senior IT leaders allocate a certain percentage of their budget to sustaining, enhancing, and transforming their applications. In most organizations, the largest share of IT spend goes to sustaining applications - essentially "keep the lights on" activities - which leaves little money to enhance them in support of business agility and new business requirements, or to transform them to leapfrog the competition. Organizations that implement a solid data management strategy in support of their applications can maximize the return on their application investments. They can lower the cost of sustaining their applications, releasing budget to do what they should be doing: supporting the needs of the business. Innovative CIOs are rethinking their application environments, shifting the focus from the application itself - the code, middleware, infrastructure, and so on - to the underlying data that supports it. This new focus lets them take budget and complexity out of sustaining their apps, become far more effective at enhancing them, and do brand-new things within their applications that give them the competitive edge they are looking for.

How to Lower the Cost and Complexity of Sustaining Your Business Applications
Of all the things that keep organizations from realizing the best possible return on their data assets, enterprise application environments reside at the very top. Face it, application environments today are large, complex and generally inflexible constructs. Most companies have dozens of different types of key applications supporting countless business processes. Not only are the data volumes huge and the different data types numerous, but the data is often duplicative and the applications redundant across various business units. Moreover, the data is frequently hard to get to, the applications difficult to integrate and the quality of the data frequently questionable. No wonder it is such a challenge to provide a single comprehensive view of the critical data that the business uses every day.

This series of articles is aimed at helping organizations maximize the value of their data and applications, by moving beyond merely sustaining applications to enhancing them to support business agility and to transforming them to drive business innovation and growth. Because many organizations spend far too much of their IT budget on sustaining applications, it is important to first discover ways to lower the costs and complexity, freeing up budget and resources for innovation.

Cutting the Costs of "Keeping the Lights On"
There is a common set of challenges that most companies face around sustaining applications. These include:

  • Application Bloat - Whether the result of mergers and acquisitions, or business units going off and buying their own applications, many companies are rife with redundant applications that soak up maintenance time and money.
  • Data Sprawl - Companies frequently experience diminished application performance as the amount of data within the application grows. This in turn makes it difficult to meet SLAs and forces the purchase of additional hardware, leading to further costs.
  • Proliferating Integration Interfaces - Integrating siloed applications is a major challenge, as is managing the number and complexity of the interfaces required, which again increases costs.
  • Security and Privacy - Finally, there are the efforts and costs involved with securing the data in non-production applications, and the specter of fines should you fail to meet regulatory requirements around information privacy.

There are a number of initiatives that companies typically pursue to try to reduce the costs and efforts of sustaining applications. These include rationalizing the application portfolio by sending duplicative or inactive applications to the "application retirement home," archiving inactive data to improve application performance, masking sensitive data to meet security and privacy requirements, and finding ways to reduce the costs and complexity around integrating applications. The hitch is that organizations are trying to do all these things on a fixed budget and with a finite set of resources. Hence these initiatives have to be pursued very intelligently, making use of the best possible technologies to yield the greatest return on effort.

Getting the Most from Application Retirement
Capable of paying major dividends, application rationalization is an initiative being pursued by a good many organizations. Indeed, Gartner predicts that by 2020, half of all applications running in data centers as of 2010 will have been retired. If true, that represents enormous savings.

What does it take to retire redundant or obsolete applications and still provide seamless access to the archived data? Just because the application has outlived its usefulness, that doesn't mean that the data has. And for certain kinds of data, mandates demand that it be kept for years. Hence when an application is retired, it is increasingly necessary to archive all its data across all the data's sources.

Here are the five steps required to support successful application retirement:

  1. Mine the source metadata from the legacy application - You want to archive complete business entities: not just the transactional data, but also master and reference data, and metadata.
  2. Extract and move the data - You want the ability to extract, move, and archive any data, including documents, attachments, images, and audio files associated with application and database records.
  3. Compress, secure, and lock down the archived data - You need to place it into a secure, highly compressed, immutable file for later retrieval.
  4. Define and enforce retention policies - To ensure compliance, you need to be able to assign retention policies to different classes of archived data, apply legal holds to certain data, and so on; and to reduce ongoing compliance management costs, you want the ability to automatically purge expired records on a scheduled basis.
  5. Provide easy search access - You need to provide easy search and discovery access to archived data from any BI/reporting tool, such as Crystal Reports, MicroStrategy, or Business Objects, and maintain access to archived data in database instances from existing application interfaces.
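
To make steps 2 through 4 concrete, here is a minimal sketch in Python, assuming a SQLite source and a simple file-based archive. The retention period, file layout, and function names are invented for illustration; a commercial archiving product also handles metadata mining, attachments, legal holds, and seamless query access, none of which is shown here.

    import gzip
    import hashlib
    import json
    import os
    import sqlite3
    import time

    RETENTION_DAYS = 7 * 365  # illustrative policy: retain for roughly seven years

    def archive_table(db_path, table, archive_dir):
        """Steps 2-3: extract all rows and write them to a compressed, immutable file."""
        conn = sqlite3.connect(db_path)
        conn.row_factory = sqlite3.Row
        rows = [dict(r) for r in conn.execute(f"SELECT * FROM {table}")]
        conn.close()

        payload = json.dumps({"table": table, "rows": rows}).encode("utf-8")
        os.makedirs(archive_dir, exist_ok=True)
        path = os.path.join(archive_dir, f"{table}-{int(time.time())}.json.gz")
        with gzip.open(path, "wb") as f:
            f.write(payload)
        os.chmod(path, 0o444)  # read-only: a crude stand-in for immutability

        # Record a checksum and the retention deadline alongside the archive file.
        meta = {"sha256": hashlib.sha256(payload).hexdigest(),
                "expires": time.time() + RETENTION_DAYS * 86400}
        with open(path + ".meta", "w") as f:
            json.dump(meta, f)
        return path

    def purge_expired(archive_dir, now=None):
        """Step 4: automatically remove archives whose retention period has lapsed."""
        now = now or time.time()
        for name in os.listdir(archive_dir):
            if name.endswith(".meta"):
                meta_path = os.path.join(archive_dir, name)
                with open(meta_path) as f:
                    expired = json.load(f)["expires"] < now
                if expired:
                    data_path = meta_path[:-5]  # strip the ".meta" suffix
                    os.chmod(data_path, 0o644)  # make writable so it can be deleted
                    os.remove(data_path)
                    os.remove(meta_path)

Step 5 is what keeps a sketch like this from being sufficient in practice: the archive must remain queryable by standard BI tools, which is why real products favor structured, self-describing archive stores over ad hoc files.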

Improving Application Performance Through Archiving
The same steps, and same archiving technologies, also apply to archiving inactive data from live applications in order to improve their performance and reduce their TCO. This can take several forms, including archiving inactive data to an archive database in order to benefit from faster application response times, or archiving to an Optimized File Archive to effect substantial storage and infrastructure savings.
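
As a simple illustration of the idea, the following sketch relocates inactive rows from a live table to an archive table in the same database. The orders table and order_date column are invented for the example; real archiving solutions move complete business entities and preserve referential integrity across many tables.

    import sqlite3

    def archive_inactive_orders(db_path, cutoff_date):
        """Move orders older than cutoff_date out of the live table, so
        day-to-day queries scan less data and response times improve."""
        conn = sqlite3.connect(db_path)
        with conn:  # one transaction: copy, then delete
            # Create an empty archive table with the same columns, if needed.
            conn.execute(
                "CREATE TABLE IF NOT EXISTS orders_archive AS SELECT * FROM orders WHERE 0")
            conn.execute(
                "INSERT INTO orders_archive SELECT * FROM orders WHERE order_date < ?",
                (cutoff_date,))
            conn.execute("DELETE FROM orders WHERE order_date < ?", (cutoff_date,))
        conn.close()

    # e.g., archive_inactive_orders("erp.db", "2010-01-01")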

Importantly, a truly universal data archiving solution is strongly recommended, not only to support both application retirement and archiving from live applications, but also to ensure that you are able to leverage a single solution to address the archiving needs of all enterprise applications and databases, present and future.

Sub-setting and Masking Data in Non-Production Environments
The use of real data sets in development and test environments is widespread, and necessary for good reasons. Frequently, this data is confidential or sensitive and subject to compliance requirements, and the costs of not protecting it far outweigh the costs of doing so. Nevertheless, you need to control data management costs. Hence, when it comes to managing all the data in a test environment, you want the ability to:

  • Optimize performance and control costs by data sub-setting - Instead of using full sets of production data in test, you want the ability to create a functionally intact subset of the data, keeping only the data required by your business policies while maintaining all referential integrity. By working with a smaller set of data, you can shorten development cycles and reduce storage costs and the use of system resources.
  • Support compliance through data masking - By masking production data, you obfuscate Personally Identifiable Information and other sensitive data while preserving the data's usefulness in development and test activities.
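
As a rough sketch of both techniques, the following example copies a slice of a production customers table into a test database, masks an email column deterministically, and keeps only the orders rows that reference the retained customers. All table and column names are invented; real tools drive subsetting and masking from policies and metadata rather than hand-written code.

    import hashlib
    import sqlite3

    def mask_email(email):
        """Deterministic mask: the same input always yields the same output,
        so joins and test assertions keep working."""
        digest = hashlib.sha256(email.encode("utf-8")).hexdigest()[:8]
        return f"user_{digest}@example.com"

    def subset_and_mask(prod_path, test_path, customer_limit=1000):
        prod = sqlite3.connect(prod_path)
        test = sqlite3.connect(test_path)
        test.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
        test.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")

        # Keep only a functionally intact slice of customers, masking PII on the way in.
        kept_ids = []
        for cid, name, email in prod.execute(
                "SELECT id, name, email FROM customers LIMIT ?", (customer_limit,)):
            test.execute("INSERT INTO customers VALUES (?, ?, ?)",
                         (cid, f"Customer {cid}", mask_email(email)))
            kept_ids.append(cid)

        # Preserve referential integrity: copy only orders whose parent was kept.
        if kept_ids:
            placeholders = ",".join("?" * len(kept_ids))
            for row in prod.execute(
                    f"SELECT id, customer_id, total FROM orders "
                    f"WHERE customer_id IN ({placeholders})", kept_ids):
                test.execute("INSERT INTO orders VALUES (?, ?, ?)", row)

        test.commit()
        prod.close()
        test.close()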

In terms of flexibly protecting data privacy and confidentiality, Dynamic Data Masking technology can take you even further by providing real-time preventive capabilities. With this technology, flexible protection rules enable different kinds of masks to be applied dynamically to different kinds of data based on user privilege levels so you are able to engage in policy-based, selective masking and blocking of production data.
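
Conceptually, a dynamic masking rule engine behaves something like the sketch below: the same record is returned differently depending on the privilege level of the requester. The roles, fields, and rules are invented for illustration; commercial products apply such rules in-flight, between the application and the database, without changing the stored data.

    FULL_ACCESS = {"dba", "fraud_analyst"}  # illustrative privileged roles

    def apply_masking(record, role):
        if role in FULL_ACCESS:
            return record  # privileged users see the real values
        masked = dict(record)
        masked["ssn"] = "***-**-" + record["ssn"][-4:]  # partial mask
        masked["salary"] = None  # block this field outright
        return masked

    row = {"name": "Pat", "ssn": "123-45-6789", "salary": 90000}
    print(apply_masking(row, "developer"))  # masked view
    print(apply_masking(row, "dba"))        # full view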

Reducing the Costs of Integrating Applications
For many organizations, much of the cost of keeping the IT lights on revolves around maintaining the "integration hairball" - the intricate web of point-to-point interfaces between applications, where the number of potential interfaces grows roughly with the square of the number of applications. According to Forrester Research, 87% of respondents to a recent IT survey indicated that they rely on hand coding for integration, and 75% of those admit that writing code for each integration effort leads to increased maintenance costs.

Another cost factor is the use of disparate integration tools so that there is no standard methodology and little economy of scale, not to mention difficulty sometimes in finding people trained in the use of a particular tool.

The way to substantially reduce the costs and complexity of integrating applications is to implement - and preferably, standardize on - a unified data integration platform with universal connectivity to data sources and targets, combined with the ability to access, transform, and integrate any data type, i.e., structured, unstructured, or semi-structured. To be fully useful, the platform also needs to support the full breadth of data latency requirements found in today's enterprises - batch, real-time, and change data capture.

Importantly, a platform approach to integration will let you leverage a codeless development environment, so that custom-coded point-to-point interfaces and their expensive maintenance requirements become a thing of the past. Instead, development teams can use drag-and-drop development tools, coupled with extensive reuse and sharing of objects such as data mappings and transformations across projects, to speed development cycles and dramatically cut overall data integration costs.
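
The contrast with hand coding can be sketched as follows: rather than one bespoke program per source-target pair, a single generic engine executes declarative mapping definitions, and each new interface is just another mapping. The mapping format, field names, and transformations below are invented for illustration.

    from datetime import datetime

    # Shared, reusable transformations.
    TRANSFORMS = {
        "upper": str.upper,
        "iso_date": lambda s: datetime.strptime(s, "%m/%d/%Y").date().isoformat(),
    }

    # One declarative mapping per interface: target field -> (source field, transform).
    CRM_TO_WAREHOUSE = {
        "customer_name": ("name", "upper"),
        "signup_date": ("created", "iso_date"),
    }

    def run_mapping(mapping, source_row):
        """Generic engine: the same code executes every mapping definition."""
        return {
            target: TRANSFORMS[fn](source_row[src])
            for target, (src, fn) in mapping.items()
        }

    print(run_mapping(CRM_TO_WAREHOUSE, {"name": "Acme Corp", "created": "03/15/2011"}))
    # -> {'customer_name': 'ACME CORP', 'signup_date': '2011-03-15'}

Because the engine, not the interface, owns the execution logic, adding a new source or target means writing another mapping definition rather than another program to maintain.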

Moving Forward Towards Enhancing Applications
The actions prescribed above have been proven to radically reduce the costs of sustaining applications, so that more resources can be applied to enhancing them and to driving innovation.

More Stories By Adam Wilson

Adam Wilson is the General Manager for Informatica’s Information Lifecycle Management Business Unit. Prior to assuming this role, he was in charge of product definition and go-to-market strategy for Informatica’s award-winning enterprise data integration platform. Mr. Wilson holds an MBA from the Kellogg School of Management and an engineering degree from Northwestern University. He can be reached at [email protected] or followed on Twitter at @a_adam_wilson.
