Stop Buying Database Licenses: You Have All the Capacity You Need

Ensure Big Data is getting to the right place at the right time and is managed responsibly

Any organization that has deployed a business application has experienced the joy of procuring database licenses. Most database software licensing models are based on the number and type of processor cores in the underlying database server - the more cores per processor and the more processors in the server box, the higher the cost of the database software license. Depending on the application and the business's expectations, the tolerance threshold for performance can vary; it is typically settled during the design and testing phases of an application deployment life cycle.
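
To see how quickly this adds up, consider a back-of-the-envelope sketch. The per-core price and core-factor multiplier below are made-up placeholders, not any vendor's actual terms; the point is only that licensed cores scale multiplicatively with sockets and cores per socket:

    # Hypothetical core-based licensing math; the price and core factor
    # are illustrative placeholders, not any vendor's actual terms.
    PRICE_PER_CORE = 47_500   # illustrative list price per licensed core
    CORE_FACTOR = 0.5         # illustrative multiplier for a given CPU family

    def license_cost(sockets: int, cores_per_socket: int) -> float:
        # Licensed cores = physical cores x core factor.
        return sockets * cores_per_socket * CORE_FACTOR * PRICE_PER_CORE

    # Upgrading one production server from 2x8 to 2x16 cores doubles the
    # license bill before a single query runs faster.
    print(license_cost(2, 8))    # 380000.0
    print(license_cost(2, 16))   # 760000.0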

Once the application goes into production and data accumulates in the key areas supporting mission-critical business processes, performance starts to take a hit. Performance tuning is an art, requiring the skill and experience of the highly coveted, highly paid performance database administrator (DBA) - an employee who, incidentally, has been identified by industry research as problematic to retain.[1] That performance guru will add database indexes, rearrange queries, or add database objects targeted solely at improving application performance. At some point, though, tuning will only get you so far and returns start to diminish. DBAs will then request more processing power to meet SLAs, and more processing power turns into more database licenses - not only in production but for every copy of the data. When data is copied to a data warehouse for reporting, to a test or development environment for product support, or to a disaster recovery site, and there is a corresponding performance expectation for that environment, the production server upgrade turns into several server upgrades - each with a corresponding increase in database licenses.
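
To make the tuning step concrete, here is a minimal sketch of the classic before-and-after, using Python's built-in sqlite3 module purely for illustration - the table, column, and index names are hypothetical, and real tuning happens on your production RDBMS:

    import sqlite3

    # The classic tuning move: add an index so a frequent lookup stops
    # scanning the whole table. sqlite3 is used here only because it
    # ships with Python; the idea applies to any RDBMS.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
                 "customer_id INTEGER, status TEXT, closed_date TEXT)")

    query = "SELECT * FROM orders WHERE customer_id = ?"

    # Before: the optimizer can only do a full table scan.
    print(conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

    # The DBA's fix: an index on the hot predicate column.
    conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

    # After: the same query is answered via the index.
    print(conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

The first plan reports a full table scan; once the index exists, the same query is answered with an index search. The gain is real, but as noted above, it has limits.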

In many cases, data volumes are growing astronomically, requiring next-generation analytical platforms - more fondly referred to as "big data" systems - to keep up with the quest for knowledge. While these database systems offer an incredible opportunity to change the way organizations find value in their information assets, they too have an incremental cost associated with the size and volume of data.

That database volumes and their corresponding costs are growing exponentially is not the big insight. Anyone working in IT gets this, and analysts indicate that "on average, data repositories for large applications grow annually at 65%."[2] What is revealing is that the vast majority of the data in these systems is dormant: industry analysts estimate that as much as 80% of it consists of closed transactions and infrequently queried records, often retained only for compliance purposes. If you knew you could keep all of this data online, reduce its size by more than 90%, eliminate growth in your database licenses, and still be able to restore it, manage its retention, or report directly on it, why wouldn't you? Why keep dormant data inside your most expensive applications, riding on your most expensive infrastructure, maintained by your most expensive personnel? Stop the madness. Take a good hard look at who is accessing what data over time; there is a good chance that past some inflection point, data is rarely accessed - if it is accessed at all. Why keep buying database licenses for data that doesn't justify the need?
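
One way to find that inflection point is to bucket records by age and count how often each bucket is actually read. A minimal sketch, assuming you can sample an audit or query log into (created, accessed) date pairs - the function and log format are hypothetical:

    from collections import Counter
    from datetime import date

    # Sketch: measure how often data of each age is actually read, to
    # find the inflection point past which it is effectively dormant.
    # Assumes a sampled audit log of (row_created, row_accessed) date
    # pairs; the log format and this helper are hypothetical.
    def dormancy_profile(access_log, today):
        reads_by_age_years = Counter()
        for row_created, _row_accessed in access_log:
            age = (today - row_created).days // 365
            reads_by_age_years[age] += 1
        return dict(sorted(reads_by_age_years.items()))

    sample = [(date(2016, 3, 1), date(2017, 1, 5)),
              (date(2017, 6, 1), date(2017, 6, 2))]
    print(dormancy_profile(sample, today=date(2017, 7, 1)))  # {0: 1, 1: 1}

If reads collapse to near zero past some age, everything older is a candidate for archiving rather than for another round of licenses.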

While the concept is not new, information life-cycle management (ILM) has traditionally been associated with tiering infrastructure and archiving documents or email. Application ILM takes this a step further, applying tiered services and archiving to databases. The idea is to inventory applications and data warehouses, scoring data by its business value and by how end users actually access it. Mapping business-process response-time expectations to the underlying infrastructure optimizes operations for both performance and cost. It is a philosophical shift: instead of blindly scaling up or out in reaction to missed SLAs, ask the business what it really needs and whether that need changes over time.
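
In code form, that mapping might look like the following hypothetical tiering function; the thresholds and tier names are illustrative assumptions, not a standard:

    # Sketch of the Application ILM mapping: assign each dataset a tier
    # from its measured access pattern and stated business need. The
    # thresholds and tier names are illustrative assumptions.
    def assign_tier(age_days: int, reads_last_90_days: int,
                    retention_required: bool) -> str:
        if reads_last_90_days > 0 and age_days < 365:
            return "tier 1: production, full performance SLA"
        if reads_last_90_days > 0:
            return "tier 2: online, relaxed SLA"
        if retention_required:
            return "tier 3: archive, retention only"
        return "purge: no business or legal need"

    print(assign_tier(30, 120, True))   # tier 1: production, full performance SLA
    print(assign_tier(2000, 0, True))   # tier 3: archive, retention only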

If the business doesn't know the answer to that question, a good look at overall business process efficiency may be in order. If there is an answer, capitalize on it. Take stock of the assets supporting that business process and quantify two things: What percentage of the data stored online in the production database could simply be deleted? And what percentage has retention requirements - legal or operational - but no performance requirements? That is the opportunity to assess what database licenses are costing your organization and whether you really need them.
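
Those two percentages translate directly into avoided spend. A back-of-the-envelope sketch, where every figure is a placeholder to be replaced with your own measurements:

    # Back-of-the-envelope savings estimate; every number here is a
    # placeholder to be replaced with your own measurements.
    total_db_tb        = 40      # production plus all copies (DW, test, dev, DR)
    pct_deletable      = 0.15    # expired data with no retention requirement
    pct_archivable     = 0.65    # retained for compliance, no performance need
    cost_per_tb        = 25_000  # fully loaded annual cost on tier-1 infrastructure
    archive_cost_ratio = 0.10    # archive tier cost after ~90% size reduction

    reclaimed_tb = total_db_tb * (pct_deletable + pct_archivable)
    archive_cost = total_db_tb * pct_archivable * archive_cost_ratio * cost_per_tb
    savings = reclaimed_tb * cost_per_tb - archive_cost
    print(f"Tier-1 TB reclaimed: {reclaimed_tb:.1f}, annual savings: ${savings:,.0f}")
    # Tier-1 TB reclaimed: 32.0, annual savings: $735,000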

Let's face it: managing databases and procuring database licenses are expensive, and there are no signs that data growth is slowing. With the amount of high-performance compute power and storage capacity wasted on dormant data, there is a big opportunity to control Big Data growth. Getting rid of what you don't need and archiving what must be kept for longer retention periods releases latent database capacity, ultimately improving application performance and operational efficiency. Taking control of data growth not only avoids the cost of more database licenses, but also offers additional advantages, such as the following (a minimal archiving sketch follows the list):

  • Improved database query performance
  • Improved ability for IT to meet or exceed application SLAs
  • Shorter application upgrade cycles
  • Reduced backup, recovery, refresh, and batch windows
  • Controlled data sprawl, which in turn improves eDiscovery efforts
  • Budgets focused on managing data according to its value, as determined by age and access frequency
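
As promised, here is a minimal sketch of the archive-and-purge pattern itself, using Python's sqlite3 as a stand-in for both the production and archive tiers; the schema, cutoff date, and in-memory databases are illustrative assumptions:

    import sqlite3

    # Minimal sketch of the archive-and-purge pattern behind Application
    # ILM: move closed transactions past their performance horizon out of
    # the production schema into a cheap archive store, keeping them
    # queryable for retention and eDiscovery.
    CUTOFF = "2015-01-01"

    prod = sqlite3.connect(":memory:")
    prod.execute("CREATE TABLE orders (id INTEGER, status TEXT, closed_date TEXT)")
    prod.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [(1, "closed", "2012-07-04"),   # dormant: archive it
                      (2, "closed", "2016-11-30"),   # recent: keep in production
                      (3, "open", None)])            # active: keep in production

    archive = sqlite3.connect(":memory:")  # stands in for the archive tier
    archive.execute("CREATE TABLE orders_archive "
                    "(id INTEGER, status TEXT, closed_date TEXT)")

    dormant = prod.execute("SELECT id, status, closed_date FROM orders "
                           "WHERE status = 'closed' AND closed_date < ?",
                           (CUTOFF,)).fetchall()
    archive.executemany("INSERT INTO orders_archive VALUES (?, ?, ?)", dormant)
    prod.execute("DELETE FROM orders WHERE status = 'closed' AND closed_date < ?",
                 (CUTOFF,))
    remaining = prod.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    print(f"archived {len(dormant)} row(s); {remaining} remain in production")

In practice the archive tier would be compressed, immutable storage with its own retention policies; the point is that dormant rows leave the licensed database while remaining queryable for retention and eDiscovery.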

Whether data accumulates over time or arrives overnight, IT has always been in the business of supporting Big Data initiatives. By focusing on the benefits of Application ILM, organizations can better ensure not only that Big Data is getting to the right place at the right time, but that it is managed responsibly.

References:

  1. ESG, 2011 IT Spending Intentions Survey. The survey identified database administrators as a top area of problematic skill shortages for IT.
  2. Forrester Research, Inc., "TechRadar: Enterprise Data Integration," February 2010.

More Stories By Adam Wilson

Adam Wilson is the General Manager of Informatica’s Information Lifecycle Management Business Unit. Prior to assuming this role, he was in charge of product definition and go-to-market strategy for Informatica’s award-winning enterprise data integration platform. Mr. Wilson holds an MBA from the Kellogg School of Management and an engineering degree from Northwestern University. He can be reached at [email protected] or followed on Twitter at @a_adam_wilson.
