Breaking the Storage Array Lifecycle

Cloud storage appliances can grow to a virtually unlimited storage capacity without the need to ever upgrade

Anyone who purchases storage arrays is familiar with the many advantages of modular storage systems and storage area networks. However, they may also be familiar with one of the less desirable attributes of storage arrays: the typical three- to five-year life cycle that forces decommissions and mandates upgrades on a regular basis. With many organizations growing their storage needs by 20-60% annually [1], outgrowing the capacity of existing storage arrays is a regular occurrence, making upgrade cycles a fact of life.

Although decommissioning and upgrading a storage array may not appear all that daunting, the process includes a number of cumbersome aspects:

  • The migration from old to new storage arrays can last months, creating an overlap during which both the old and new storage arrays must be maintained. In some enterprise environments, the migration can cost thousands of dollars per terabyte [2]
  • A new storage array is sized for anticipated future growth, resulting in an initial capacity "over-purchase" and an underutilized array for most of its life cycle. The result? Pre-payment for capacity that may not be needed for the next two years
  • Software and architectural changes associated with each storage array upgrade may compromise stability or, at a minimum, require staff retraining on new policies and features

The Economics of Storage Array Replacement
Why decommission arrays at all? Why not simply expand storage capacity by adding new storage arrays instead of replacing them? With storage densities and capacities growing every year and cost per gigabyte dropping, the economics provide the answer.

Consider a three-year-old, 10TB storage array that cost $50,000 to purchase and has a total annual maintenance cost (vendor maintenance fee, administration, floor space, power, cooling, etc.) of $25,000 per year. Now let's say the IT manager purchases a new 25TB storage array today for $50,000. Thanks to improved storage density, it may carry the same total annual maintenance cost of $25K per year.

If the manager chooses to keep both arrays active, he would now have data residing on the old storage array at $2.50/GB in annual maintenance cost and data residing on the new storage array at only $1/GB in annual maintenance cost (with plenty of capacity to spare). This economic inefficiency makes a pretty strong business case for decommissioning the old array and moving all data to the new array as quickly as possible.
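A quick back-of-the-envelope calculation makes the per-gigabyte gap explicit. The sketch below uses only the figures from the example above (treating 1TB as 1,000GB); it is illustrative, not tied to any particular vendor's pricing.

```python
# Annual maintenance cost per GB for the old and new arrays in the example.
# All dollar figures come from the text; 1 TB is treated as 1,000 GB.

def annual_cost_per_gb(annual_maintenance_usd: float, capacity_tb: float) -> float:
    """Annual maintenance cost per GB of raw capacity."""
    return annual_maintenance_usd / (capacity_tb * 1_000)

old_array = annual_cost_per_gb(25_000, 10)  # 10TB array -> $2.50/GB/year
new_array = annual_cost_per_gb(25_000, 25)  # 25TB array -> $1.00/GB/year

print(f"Old array: ${old_array:.2f}/GB/year")
print(f"New array: ${new_array:.2f}/GB/year")
```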

Is there any hope of breaking this storage array cycle?

There is hope with on-demand cloud storage. Cloud storage substantially changes the economics and deployment of data storage, and several aspects of the storage array life cycle disappear entirely:

  • The pay-as-you-go cloud model eliminates the need to pre-purchase or over-purchase capacity
  • Cost per GB drops over time, as cloud providers lower prices every 6-12 months
  • Storage capacity is virtually limitless, so there is never a need to decommission or upgrade any hardware

Consider cumulative total cost of ownership (TCO) over time under each model. While most organizations consume storage at a near-linear rate, traditional deployment requires an initial pre-purchase of capacity, producing a disproportionate up-front investment and a spike in TCO at every purchase cycle (Q1 Y1 and Q1 Y4 on a three-year cycle). Under a pay-as-you-go cloud model, TCO instead grows smoothly with actual consumption. Even in a simplistic case where the TCO of both models is identical after three years, the inefficiency of traditional storage is easy to spot: it requires pre-payment for unused capacity.
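To make the comparison concrete, here is a minimal sketch of that TCO model. The purchase price, maintenance cost, cloud rate, and growth rate are illustrative assumptions chosen so the two models land in the same ballpark after three years; they are not figures from any real deployment.

```python
# Simplified three-year, quarter-by-quarter TCO model contrasting an
# up-front array purchase with pay-as-you-go cloud storage. All numbers
# are illustrative assumptions.

QUARTERS = 12                   # three years
ARRAY_PRICE = 50_000            # up-front cost of a pre-purchased array
MAINTENANCE_PER_Q = 6_250       # $25K/year maintenance, paid quarterly
CLOUD_RATE_PER_TB_Q = 700      # assumed cloud price per TB per quarter
GROWTH_TB_PER_Q = 2             # assumed near-linear consumption

trad_total = cloud_total = 0.0
for q in range(1, QUARTERS + 1):
    if q == 1:                  # spike at Q1 Y1: capacity bought up front
        trad_total += ARRAY_PRICE
    trad_total += MAINTENANCE_PER_Q

    used_tb = GROWTH_TB_PER_Q * q
    cloud_total += used_tb * CLOUD_RATE_PER_TB_Q  # pay only for what is used

    print(f"Q{q:2d}: traditional ${trad_total:>9,.0f} | cloud ${cloud_total:>9,.0f}")
```

Run quarter by quarter, the traditional curve jumps immediately to the full array price while the cloud curve starts near zero and catches up only as consumption grows.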

This is a very basic analysis. In practice, there are many additional costs that factor into traditional storage deployments including:

  • Administrative costs of migrating from old arrays to new arrays
  • Potential overlap where both arrays must be maintained simultaneously
  • Potential downtime and productivity loss during the migration process

Furthermore, the analysis does not capture some of the additional cost savings of cloud storage that include:

  • Economies of scale in pricing and administration as a result of leveraging large multi-tenant cloud environments
  • Price per GB erosion over time

Factoring in these additional costs of traditional storage and additional savings of cloud storage may be enough to convince most organizations that it is time to introduce cloud storage into their existing IT infrastructure. However, when it comes to deploying a cloud storage strategy, many companies don't know where or how to begin.

Introducing Cloud Storage into the Organization: A Hybrid Approach

Replacing traditional storage with cloud storage will break the traditional storage array life cycle, but a complete "forklift" replacement may be overkill. While cloud storage may not necessarily meet all on-premise storage needs, it can still augment existing storage infrastructure. A hybrid local-cloud storage environment can streamline storage operations and can even extend the life cycle of traditional storage through selective data offload.

A more conservative approach is to identify data suitable for cloud storage, such as secondary copies, backups, off-site data and/or archives. Interestingly, archives are often stored on traditional onsite storage to make them easily accessible for compliance requirements. By some reports, such as one from ESG, archive data is expected to grow approximately 56% per year.

With hundreds of thousands of petabytes of archives to store over the next few years, the benefits of offloading archives or infrequently accessed data from traditional storage are numerous. Transitioning this data to cloud storage can extend the life cycle of storage arrays beyond the typical 3-5 year time frame. Imagine a 6-10 year storage array life cycle instead: that would halve capital investment in storage infrastructure and introduce a significantly more efficient just-in-time, pay-as-you-go model.
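The capital-savings claim is simple arithmetic; the sketch below assumes a $50,000 array price and a 24-year planning horizon purely for illustration.

```python
# Stretching the refresh cycle from 4 to 8 years halves the number of
# array purchases over the same horizon. Price and horizon are
# illustrative assumptions.

HORIZON_YEARS = 24
ARRAY_PRICE = 50_000

def capex(refresh_years: int) -> int:
    """Total capital spent on arrays over the planning horizon."""
    purchases = HORIZON_YEARS // refresh_years
    return purchases * ARRAY_PRICE

print(f"4-year refresh cycle: ${capex(4):,}")  # 6 purchases -> $300,000
print(f"8-year refresh cycle: ${capex(8):,}")  # 3 purchases -> $150,000
```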

How can businesses leverage tiers of cloud storage in a manner that integrates seamlessly into an existing storage infrastructure?

Connecting the Storage Infrastructure to the Cloud
Given that most storage consumed by businesses is either block-based or NAS-based, an on-premise cloud storage appliance or gateway that presents cloud storage over a block or NAS protocol can greatly simplify deployment. When choosing a solution, keep in mind that, unlike NAS, block-based solutions support both block-based access and any file system protocol layered on top. Block-based iSCSI solutions support petabytes of storage and provide thin provisioning, caching, encryption, compression, deduplication and snapshots, matching the feature sets of sophisticated SAN storage arrays. These solutions can readily reside alongside existing SANs and are available in both software and hardware form factors.
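From the host's point of view, such a gateway looks like any other iSCSI target. As a minimal sketch, the snippet below attaches a Linux host to a gateway using the standard open-iscsi tooling; the portal address and target IQN are hypothetical placeholders for values a real gateway would provide.

```python
# Minimal sketch: attach a Linux host to an iSCSI cloud storage gateway
# using open-iscsi. Portal address and target IQN are hypothetical.

import subprocess

PORTAL = "192.0.2.10:3260"                        # hypothetical gateway address
TARGET = "iqn.2010-01.com.example:cloud-volume0"  # hypothetical target IQN

# Discover the targets the gateway exposes.
subprocess.run(
    ["iscsiadm", "-m", "discovery", "-t", "sendtargets", "-p", PORTAL],
    check=True,
)

# Log in; the cloud-backed volume then appears as an ordinary block
# device (e.g. /dev/sdX) usable alongside existing SAN storage.
subprocess.run(
    ["iscsiadm", "-m", "node", "-T", TARGET, "-p", PORTAL, "--login"],
    check=True,
)
```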

Cloud storage appliances can grow to a virtually unlimited storage capacity without ever requiring an upgrade, eliminating many administrative burdens and risks of the storage array life cycle. And since cloud storage is pay-as-you-go, costs adjust automatically to actual usage, eliminating the economic inefficiencies of the storage array life cycle.

In combination with a cloud storage gateway or appliance, businesses should also consider storage tiering software. Auto-tiering capabilities can be found in storage virtualization products, data classification tools, and even some hypervisors. Businesses that adopt an auto-tiering framework can immediately begin to extend the life cycle of their existing storage arrays and leverage the benefits of cloud storage by selectively offloading infrequently used data.
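What does "selectively offloading infrequently used data" look like in practice? A minimal sketch of an age-based policy appears below; real auto-tiering products use far richer classification, and the mount point and threshold here are hypothetical.

```python
# Minimal sketch of an age-based tiering policy: flag files not accessed
# in N days as candidates for cloud offload. Mount point and threshold
# are hypothetical; real auto-tiering software uses richer criteria.

import time
from pathlib import Path

OFFLOAD_AFTER_DAYS = 180          # assumed cutoff for "infrequently used"
SCAN_ROOT = Path("/mnt/archive")  # hypothetical local storage mount point

def offload_candidates(root: Path, max_age_days: int):
    """Yield files whose last access time is older than the cutoff."""
    cutoff = time.time() - max_age_days * 86_400
    for path in root.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            yield path

if __name__ == "__main__":
    for candidate in offload_candidates(SCAN_ROOT, OFFLOAD_AFTER_DAYS):
        print(f"candidate for cloud tier: {candidate}")
```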

References

  1. IDC: Unstructured data will become the primary task for storage
  2. Hitachi Data Systems: Reducing Costs and Risks for Data Migrations

About the Author

Nicos Vekiarides is the Chief Executive Officer & Co-Founder of TwinStrata. He has spent over 20 years in enterprise data storage, both as a business manager and as an entrepreneur and founder in startup companies.

Prior to TwinStrata, he served as VP of Product Strategy and Technology at Incipient, Inc., where he helped deliver the industry's first storage virtualization solution embedded in a switch. Prior to Incipient, he was General Manager of the storage virtualization business at Hewlett-Packard. Vekiarides came to HP with the acquisition of StorageApps where he was the founding VP of Engineering. At StorageApps, he built a team that brought to market the industry's first storage virtualization appliance. Prior to StorageApps, he spent a number of years in the data storage industry working at Sun Microsystems and Encore Computer. At Encore, he architected and delivered Encore Computer's SP data replication products that were a key factor in the acquisition of Encore's storage division by Sun Microsystems.
