Stop Buying Database Licenses: You Have All the Capacity You Need

Ensure Big Data is getting to the right place at the right time and is managed responsibly

Any organization that has deployed a business application has experienced the joy of procuring database licenses. Most database software licensing models are based on the quantity and type of processing cores in the underlying database server - the more cores per processor and the more processors in the server, the higher the cost of the database software license. Depending on the application and the business's expectations, the tolerance threshold for performance can vary; this is typically considered during the design and testing phases of an application deployment life cycle.

Once the application goes into production and data accumulates in key areas supporting mission-critical business processes, performance starts to take a hit. Performance tuning is an art, requiring the skill and experience of a highly coveted and highly paid performance database administrator (DBA) - an employee who, incidentally, has been identified by industry research as problematic to retain.[1] That performance guru will add database indexes, rearrange queries, or add database objects targeted solely at improving application performance. At some point, though, tuning only gets you so far and returns start to diminish. DBAs then request more processing power to meet SLAs, and more processing power turns into more database licenses - not only in production but for every copy of the data. When data is copied to a data warehouse for reporting, to a test or development environment for product support activities, or to a disaster recovery site, and each environment carries a corresponding performance expectation, the production server upgrade turns into several server upgrades - each with a matching increase in database license costs.
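To see why that multiplication stings, consider a back-of-the-envelope sketch of per-core licensing across environment copies. All figures here are hypothetical placeholders, not actual vendor pricing:

    # Hypothetical sketch: how one server upgrade multiplies across every
    # environment that holds a full copy of production data.
    PRICE_PER_CORE = 10_000   # assumed license cost per core (illustrative)
    CORES_BEFORE = 8          # cores per server before the upgrade
    CORES_AFTER = 16          # cores after the DBA's requested upgrade

    # Each copy of the data with a performance expectation tends to need
    # a matching upgrade.
    environments = ["production", "data warehouse", "test",
                    "development", "disaster recovery"]

    added_cores = CORES_AFTER - CORES_BEFORE
    added_license_cost = added_cores * PRICE_PER_CORE * len(environments)

    print(f"One 8-to-16-core upgrade, replicated across "
          f"{len(environments)} environments, adds "
          f"${added_license_cost:,} in licenses.")

Under these assumed numbers, a single production upgrade quietly becomes a $400,000 licensing event.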

In many cases, data volumes are growing astronomically, requiring next-generation analytical platforms - more fondly referred to as "big data" systems - to keep up with the quest for knowledge. While these database systems offer an incredible opportunity to change the way organizations find value in their information assets, they too carry incremental costs that scale with the size and volume of the data.

The fact that database volumes and their corresponding costs are growing exponentially is not the big insight. Anyone working in IT gets this, and analysts report that "on average, data repositories for large applications grow annually at 65%."[2] What is revealing is that the vast majority of the data in these systems is dormant - industry analysts estimate as much as 80%. These are closed transactions and infrequently queried records, often retained only for compliance purposes. If you knew you could keep all this data online, reduce its footprint by more than 90%, stop the growth in your database licenses, and still be able to restore, manage retention for, or directly report on the data, why wouldn't you? Why keep this dormant data inside your most expensive applications, riding on your most expensive infrastructure, maintained by your most expensive personnel? Stop the madness. Take a good hard look at who is accessing what data over time: there is a good chance that past some inflection point, data is rarely accessed - if it is accessed at all. Why keep buying database licenses for data that doesn't justify the cost?
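As a rough illustration of how that assessment might start, here is a minimal sketch that estimates the dormant share of a table by row age. The table name, schema, and cutoff date are assumptions for the example; a real assessment would also draw on the database's own access statistics (audit logs, dynamic performance views) rather than age alone:

    import sqlite3

    # Minimal sketch: estimate what fraction of rows look dormant,
    # defined here as closed transactions older than an assumed cutoff.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, status TEXT, closed_on TEXT)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, "closed", "2008-03-14"), (2, "closed", "2009-07-02"),
         (3, "closed", "2011-11-30"), (4, "open", "")],
    )

    cutoff = "2009-01-01"  # hypothetical dormancy boundary
    total = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    dormant = conn.execute(
        "SELECT COUNT(*) FROM orders WHERE status = 'closed' AND closed_on < ?",
        (cutoff,),
    ).fetchone()[0]

    print(f"{dormant} of {total} rows ({100 * dormant / total:.0f}%) look dormant")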

While the concept is not new, information life-cycle management (ILM) has traditionally been associated with tiering infrastructure and archiving documents or email. Application ILM takes this a step further, applying tiered services and archiving to databases. The idea is to inventory applications and data warehouses, assessing the value of the data and how end users actually access it. Mapping business process response-time expectations to the underlying infrastructure optimizes operations for both performance and cost. It is a philosophical shift: instead of blindly scaling up or out by adding more capacity or compute power in reaction to missed SLAs, ask the business what it really needs and whether that need changes over time.
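In code terms, an Application ILM policy boils down to a mapping from a data segment's age and access frequency to a service tier. The tier names and thresholds below are assumptions for illustration, not a standard:

    # Illustrative Application ILM policy: map age and access frequency
    # to a service tier. Thresholds and tier labels are hypothetical.
    def ilm_tier(age_days: int, reads_per_month: int) -> str:
        if age_days <= 90 or reads_per_month > 100:
            return "tier 1: production database, full performance SLA"
        if age_days <= 365 * 3 and reads_per_month > 0:
            return "tier 2: nearline archive, queryable, relaxed SLA"
        return "tier 3: compressed archive, retained for compliance only"

    print(ilm_tier(age_days=30, reads_per_month=500))   # hot transactions
    print(ilm_tier(age_days=400, reads_per_month=2))    # occasional lookups
    print(ilm_tier(age_days=2000, reads_per_month=0))   # dormant rows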

If the business doesn't know the answer to that question, a good look at overall business process efficiency may be in order. If there is an answer, capitalize on it. Take stock of the assets supporting that business process and quantify two things: What percentage of the data stored online in the production database could simply be deleted? What percentage has retention requirements - legal or operational - but no performance requirements? That is the opportunity to assess what database licenses are costing your organization and whether you really need them.
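Those two percentages can be computed directly once the inventory exists. The sketch below assumes a hypothetical inventory where each data segment records its row volume, whether a retention requirement applies, and whether it carries a performance SLA:

    # Sketch: quantify the two questions above against an assumed inventory.
    inventory = [
        {"segment": "open orders",      "rows": 2_000_000, "retention": True,  "perf_sla": True},
        {"segment": "closed orders",    "rows": 5_000_000, "retention": True,  "perf_sla": False},
        {"segment": "staging leftovers","rows": 3_000_000, "retention": False, "perf_sla": False},
    ]

    total = sum(e["rows"] for e in inventory)
    deletable = sum(e["rows"] for e in inventory if not e["retention"])
    archivable = sum(e["rows"] for e in inventory
                     if e["retention"] and not e["perf_sla"])

    print(f"deletable: {100 * deletable / total:.0f}% of rows")
    print(f"retention-only (archive candidates): {100 * archivable / total:.0f}% of rows")

With these made-up numbers, 30% of the rows could be deleted outright and another 50% could leave the production database entirely.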

Let's face it: managing databases and procuring database licenses is expensive, and there are no signs that data growth is slowing. With the amount of high-performance compute power and storage capacity wasted on dormant data, there is a big opportunity to control Big Data growth. By removing dormant data - deleting what you don't need and archiving what must be retained for longer periods - latent database capacity is released, ultimately improving application performance and operational efficiency. Taking control of data growth not only avoids the cost of additional database licenses, but also offers further advantages (a sketch of the archive-and-purge step follows this list):

  • Improved database query performance
  • Ability for IT to meet or exceed application SLAs
  • Shortened application upgrade cycles
  • Reduced backup, recovery, refresh, and batch windows
  • Controlled data sprawl, ultimately improving eDiscovery efforts
  • Budgets focused on managing data according to its value, as determined by its age and access frequency
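The core mechanical step behind these benefits is archive-then-purge: copy rows past their performance horizon to cheaper storage, verify the copy, then delete them from production. The sketch below assumes a 'transactions' table with a 'closed_on' date and a made-up retention boundary; for simplicity the "archive" is a second table in the same in-memory database, where a real tool would write to external, lower-cost storage:

    import sqlite3

    # Minimal archive-and-purge sketch under the assumptions stated above.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE transactions (id INTEGER, closed_on TEXT);
        CREATE TABLE transactions_archive (id INTEGER, closed_on TEXT);
        INSERT INTO transactions VALUES (1, '2003-05-01'), (2, '2011-09-15');
    """)

    cutoff = "2005-01-01"  # hypothetical retention boundary
    with conn:  # one transaction: archive first, then delete the same rows
        conn.execute(
            "INSERT INTO transactions_archive "
            "SELECT id, closed_on FROM transactions WHERE closed_on < ?",
            (cutoff,),
        )
        conn.execute("DELETE FROM transactions WHERE closed_on < ?", (cutoff,))

    print(conn.execute("SELECT COUNT(*) FROM transactions").fetchone()[0],
          "rows left online")
    print(conn.execute("SELECT COUNT(*) FROM transactions_archive").fetchone()[0],
          "rows archived")

Running copy and delete inside a single transaction matters: if the archive write fails, the delete rolls back and no data is lost.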

Whether Big Data accumulated over time or arrived overnight, IT has always been in the business of supporting Big Data initiatives. By focusing on the benefits of Application ILM, organizations can be in a better position to ensure not only that Big Data is getting to the right place at the right time, but also that it is managed responsibly.

References:

  1. ESG, 2011 IT Spending Intentions Survey, which identified database administrators as a top area of problematic skill shortages for IT.
  2. Forrester Research, Inc., "TechRadar: Enterprise Data Integration," February 2010.

About the Author

Adam Wilson is the General Manager for Informatica's Information Lifecycle Management business unit. Prior to assuming this role, he was in charge of product definition and go-to-market strategy for Informatica's award-winning enterprise data integration platform. Mr. Wilson holds an MBA from the Kellogg School of Management and an engineering degree from Northwestern University. He can be reached at [email protected] or followed on Twitter at @a_adam_wilson.
