Cloud Expo: Article

Breaking the Storage Array Lifecycle

Cloud storage appliances can grow to a virtually unlimited storage capacity without the need to ever upgrade

Anyone who purchases storage arrays is familiar with the many advantages of modular storage systems and storage area networks. However, they may also be familiar with one of the less desirable attributes of storage arrays: the typical three- to five-year lifecycle that forces decommissions and mandates upgrades on a regular basis. With many organizations expanding their storage needs by 20-60% annually [1], outgrowing the capacity of existing storage arrays is a regular occurrence, making upgrade cycles a fact of life.

Although decommissioning and upgrading a storage array may not appear all that daunting, the process includes a number of cumbersome aspects:

  • The migration process from old to new storage arrays can last months, creating an overlap during which both the old and the new storage array must be maintained. In some enterprise environments, migration can cost thousands of dollars per terabyte [2]
  • When purchasing a new storage array, sizing is based on anticipated future growth, resulting in an initial capacity "over-purchase" and an underutilized storage array for most of its life cycle. The result: pre-payment for capacity that may not be needed for the next two years
  • Software and architectural changes associated with each storage array upgrade may compromise stability or, at a minimum, require staff retraining on new policies and features

The Economics of Storage Array Replacement
Why decommission arrays at all? Why not simply expand storage capacity by adding new storage arrays instead of replacing them? With storage densities and capacities growing every year and cost per gigabyte dropping, the economics provide the answer.

Consider a three-year-old, 10TB storage array that cost $50,000 to purchase and has a total annual maintenance cost (vendor maintenance fee, administration, floor space, power, cooling, etc.) of $25,000 per year. Now suppose the IT manager purchases a new 25TB storage array today for $50,000. Thanks to improved storage density, it may carry the same total annual maintenance cost of $25,000 per year.

If the manager chooses to keep both arrays active, data residing on the old storage array now carries $2.50/GB in annual maintenance cost, while data residing on the new storage array carries only $1.00/GB (with plenty of capacity to spare). This economic inefficiency makes a strong business case for decommissioning the old array and moving all data to the new array as quickly as possible.
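The per-GB figures above follow directly from dividing annual maintenance cost by capacity. A minimal sketch of that arithmetic (using 1 TB = 1,000 GB for simplicity):

```python
def annual_maintenance_per_gb(capacity_tb: float, annual_maintenance: float) -> float:
    """Annual maintenance cost per GB, assuming 1 TB = 1,000 GB."""
    return annual_maintenance / (capacity_tb * 1000)

# The old 10TB array vs. the new 25TB array, both at $25,000/year:
old_array = annual_maintenance_per_gb(10, 25_000)
new_array = annual_maintenance_per_gb(25, 25_000)
print(f"Old array: ${old_array:.2f}/GB, new array: ${new_array:.2f}/GB")
# → Old array: $2.50/GB, new array: $1.00/GB
```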

Is there any hope of breaking this storage array cycle?

There is hope with on-demand cloud storage. Cloud storage substantially changes the economics and deployment of data storage and makes some of the aspects of the storage array life cycle start to disappear entirely.

  • The pay-as-you-go cloud model eliminates the need to pre-purchase or over-purchase capacity
  • Cost per GB drops over time as cloud providers continue to lower costs every 6-12 months
  • Storage capacity is virtually limitless, so there is never a need to decommission or upgrade any hardware

The area chart below illustrates some of the economic inefficiencies of this deployment model. While most organizations consume storage at a near linear rate over time, the required initial pre-purchase of capacity results in a disproportionate up-front investment. The blue area on the chart shows how total cost of ownership (TCO) increases over time with traditional storage deployment - note the spikes at every purchase cycle (Q1 Y1 & Q1 Y4). The green bars on the chart illustrate how TCO increases with a pay-as-you-go cloud storage model. Even in this simplistic case where the TCO of both models is identical after three years, you can easily spot the inefficiencies of traditional storage, which requires pre-payment for unused capacity.
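The two cost curves can be sketched numerically. This is an illustrative model only: the dollar figures below are assumptions chosen so that both models reach the same total after three years, mirroring the article's simplified example, not figures taken from any vendor.

```python
def traditional_tco(quarters, array_price=50_000, quarterly_maint=6_250,
                    refresh_every=12):
    """Cumulative TCO with an up-front array purchase each refresh cycle."""
    tco, total = [], 0.0
    for q in range(quarters):
        if q % refresh_every == 0:        # purchase spike (Q1 Y1, Q1 Y4, ...)
            total += array_price
        total += quarterly_maint
        tco.append(total)
    return tco

def cloud_tco(quarters, quarterly_cost=125_000 / 12):
    """Cumulative pay-as-you-go TCO, growing roughly linearly with usage."""
    return [quarterly_cost * (q + 1) for q in range(quarters)]

# Totals match after 12 quarters (three years), but the traditional model
# front-loads spend: compare cumulative cost at the end of quarter 1.
print(traditional_tco(12)[0], round(cloud_tco(12)[0]))
# → 56250.0 10417
```

The spike at quarter 0 in `traditional_tco` is the pre-purchase inefficiency the chart describes: most of the money is spent before most of the capacity is used.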

This is a very basic analysis. In practice, there are many additional costs that factor into traditional storage deployments including:

  • Administrative costs of migrating from old arrays to new arrays
  • Potential overlap where both arrays must be maintained simultaneously
  • Potential downtime and productivity loss during the migration process

Furthermore, the analysis does not capture some of the additional cost savings of cloud storage that include:

  • Economies of scale in pricing and administration as a result of leveraging large multi-tenant cloud environments
  • Price per GB erosion over time

Factoring in these additional costs of traditional storage and additional savings of cloud storage may be enough to convince most organizations that it is time to introduce cloud storage into their existing IT infrastructure. However, when it comes to deploying a cloud storage strategy, many companies don't know where or how to begin.

Introducing Cloud Storage into the Organization: A Hybrid Approach

Replacing traditional storage with cloud storage will break the traditional storage array life cycle, but a complete "forklift" replacement may be overkill. While cloud storage may not necessarily meet all on-premise storage needs, it can still augment existing storage infrastructure. A hybrid local-cloud storage environment can streamline storage operations and can even extend the life cycle of traditional storage through selective data offload.

A more conservative approach is to identify data suitable for cloud storage, such as secondary copies, backups, off-site data and/or archives. Interestingly, archives are often stored on traditional onsite storage to keep them easily accessible for compliance requirements. By some reports, like the one below from ESG, archive data is expected to grow approximately 56% per year.

With literally hundreds of thousands of petabytes of archives to store over the next few years, the benefits of offloading archives or infrequently accessed data from traditional storage are numerous. In fact, transitioning this data to cloud storage can extend the traditional life cycle of storage arrays beyond the typical 3-5 year time frame. Imagine a 6-10 year storage array life cycle instead. That would cut capital investment in storage infrastructure roughly in half and introduce a significantly more efficient just-in-time, pay-as-you-go model.
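Compounding at the ~56% annual rate cited above makes the scale of the problem concrete. A back-of-the-envelope sketch (the 10 TB starting size is an assumption for illustration):

```python
def projected_archive_tb(start_tb: float, years: int, growth: float = 0.56) -> float:
    """Project archive capacity compounding at the given annual growth rate."""
    return start_tb * (1 + growth) ** years

# A 10 TB archive today, growing 56% per year, after five years:
print(round(projected_archive_tb(10, 5), 1))
# → 92.4
```

Nearly a tenfold increase in five years is exactly the kind of growth that forces early array decommissions when archives stay on primary storage.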

How can businesses leverage tiers of cloud storage in a manner that integrates seamlessly into an existing storage infrastructure?

Connecting the Storage Infrastructure to the Cloud
Given that most storage consumed by businesses is either block-based or NAS-based, an on-premise cloud storage appliance or gateway that converts cloud storage to a block or NAS protocol can greatly simplify deployment. When choosing a solution, keep in mind that, unlike NAS, block-based solutions have the advantage of supporting both block-based access and any file system protocol. Block-based iSCSI solutions support petabytes of storage and provide thin provisioning, caching, encryption, compression, deduplication and snapshots, matching the feature sets of sophisticated SAN storage arrays. These solutions can readily reside alongside existing SANs and are available in both software and hardware form factors.

Cloud storage appliances can grow to a virtually unlimited storage capacity without the need to ever upgrade, eliminating many administrative burdens and risks of the storage array life cycle. Since cloud storage is pay-as-you-go, cost adjustments occur automatically, eliminating the economic inefficiencies of the storage array life cycle.

In combination with a cloud storage gateway or appliance, businesses should also consider storage tiering software. Auto-tiering software can be found in storage virtualization solutions, data classification solutions, and even in some hypervisor solutions. Businesses that choose an auto-tiering framework can immediately begin to extend the life cycle of their existing storage arrays and leverage the benefits of cloud storage by selectively offloading infrequently used data.
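The core of any auto-tiering framework is a placement rule. The sketch below is hypothetical: the tier names, the age thresholds, and the access-time criterion are all assumptions standing in for whatever policy a given tiering product actually exposes.

```python
from datetime import datetime, timedelta
from typing import Optional

def choose_tier(last_access: datetime, now: Optional[datetime] = None) -> str:
    """Route cold data to cloud tiers and keep hot data on the local SAN."""
    now = now or datetime.now()
    age = now - last_access
    if age > timedelta(days=90):
        return "cloud-archive"   # infrequently accessed: offload to cloud
    if age > timedelta(days=30):
        return "cloud-cache"     # lukewarm: cached cloud tier via the gateway
    return "local-san"           # hot: keep on the existing array
```

Running such a rule periodically over a file or volume inventory is what lets the existing arrays hold only hot data, extending their useful life as described above.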

References

  1. IDC: Unstructured data will become the primary task for storage
  2. Hitachi Data Systems: Reducing Costs and Risks for Data Migrations

More Stories By Nicos Vekiarides

Nicos Vekiarides is the Chief Executive Officer & Co-Founder of TwinStrata. He has spent over 20 years in enterprise data storage, both as a business manager and as an entrepreneur and founder in startup companies.

Prior to TwinStrata, he served as VP of Product Strategy and Technology at Incipient, Inc., where he helped deliver the industry's first storage virtualization solution embedded in a switch. Prior to Incipient, he was General Manager of the storage virtualization business at Hewlett-Packard. Vekiarides came to HP with the acquisition of StorageApps where he was the founding VP of Engineering. At StorageApps, he built a team that brought to market the industry's first storage virtualization appliance. Prior to StorageApps, he spent a number of years in the data storage industry working at Sun Microsystems and Encore Computer. At Encore, he architected and delivered Encore Computer's SP data replication products that were a key factor in the acquisition of Encore's storage division by Sun Microsystems.


