Breaking the Storage Array Life Cycle with Cloud Storage: Part I

Why decommission arrays at all?

Anyone who purchases storage arrays is familiar with the many advantages modular storage systems and storage area networks offer. However, they may also be familiar with one of the less desirable attributes of storage arrays: the typical 3- to 5-year life cycle that forces decommissions and mandates upgrades on a regular basis. With many organizations growing their storage needs by 20-60% annually, outgrowing the capacity of existing storage arrays is a regular occurrence, making upgrade cycles a fact of life.

Although decommissioning and upgrading a storage array may not appear all that daunting, the process includes a number of cumbersome aspects:

  • First, the migration from old to new storage arrays can last months, creating an overlap during which both the old and the new storage array must be maintained. In some enterprise environments, the migration can even cost thousands of dollars per terabyte (see the “Reducing Costs and Risks for Data Migrations” research paper by Hitachi Data Systems for more on the topic of costs)
  • Second, a new storage array is sized for anticipated future growth, resulting in an initial capacity “over-purchase” and an underutilized array for most of its life cycle. The result? Pre-payment for capacity you may not need over the next two years (a rough sketch of this follows the list)
  • Third, the software and architectural changes that accompany each storage array upgrade may compromise stability or, at a minimum, require staff retraining on new tools, policies and features
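
To put a rough number on that “over-purchase” point, here is a minimal sketch. All figures are assumptions chosen for illustration (10 TB of data today, 30% annual growth, a new 25 TB array), not vendor data:

```python
# Rough utilization sketch for a new array sized for future growth.
# All figures are assumptions for illustration: 10 TB of data today,
# 30% annual growth, and a new 25 TB array.

array_capacity_tb = 25
data_tb = 10
growth_rate = 0.30

for year in range(4):
    used_tb = data_tb * (1 + growth_rate) ** year
    utilization = used_tb / array_capacity_tb
    print(f"Year {year}: {used_tb:.1f} TB used "
          f"({utilization:.0%} of the new array)")

# Year 0: 10.0 TB used (40% of the new array)
# Year 1: 13.0 TB used (52% of the new array)
# Year 2: 16.9 TB used (68% of the new array)
# Year 3: 22.0 TB used (88% of the new array)
```

Under these assumptions, more than half of the capacity you paid for up front sits idle until well into year two.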

So why decommission arrays at all? Why not simply expand storage capacity by adding new storage arrays instead of replacing them? Well, with storage densities and capacities growing every year and cost per GB dropping every year, the economics provide the answer.

Let’s say you own a 10TB storage array from 3 years ago that cost $50K to purchase and has a total annual maintenance cost (vendor maintenance fee, administration, floor space, power, cooling, etc.) of $25K per year. Now let’s say today you purchase a new 25TB storage array for $50K. Thanks to improved storage density, you may find it carries the same total annual maintenance cost of $25K per year.

If you choose to keep both arrays active, you would now have data residing on the old storage array at $2.50/GB in annual maintenance cost and data residing on the new storage array at only $1/GB in annual maintenance cost (with plenty of capacity to spare). This economic inefficiency makes a pretty strong business case for decommissioning the old array and moving all data to the new array as quickly as possible.
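
Working through that arithmetic, using only the illustrative figures above and the decimal convention of 1 TB = 1,000 GB:

```python
# Cost-per-GB comparison using the example's figures
# (1 TB = 1,000 GB, decimal convention).

def annual_cost_per_gb(annual_maintenance_usd, capacity_tb):
    """Total annual maintenance cost divided by usable capacity in GB."""
    return annual_maintenance_usd / (capacity_tb * 1_000)

old_array = annual_cost_per_gb(25_000, 10)   # 3-year-old 10 TB array
new_array = annual_cost_per_gb(25_000, 25)   # new 25 TB array

print(f"Old array: ${old_array:.2f}/GB per year")   # $2.50/GB
print(f"New array: ${new_array:.2f}/GB per year")   # $1.00/GB
```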

So is there any hope of breaking this storage array life cycle?

There is indeed hope with on-demand cloud storage. Cloud storage substantially changes the economics and deployment of data storage and makes some aspects of the storage array life cycle disappear entirely.

  • With a pay-as-you-go model, there is no longer a need to pre-purchase or over-purchase capacity (see the sketch after this list)
  • The cost per GB keeps dropping, as cloud providers continue to lower prices every 6-12 months
  • Storage capacity is virtually limitless, so there is never a need to decommission or upgrade any hardware
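
As a rough, hypothetical illustration of the pay-as-you-go point, the sketch below assumes a $0.10/GB-month list price that drops 20% per year and the same 10 TB data set growing 30% per year; none of these figures comes from an actual provider’s price list:

```python
# Pay-as-you-go sketch: the bill tracks what you actually store.
# Assumed, illustrative figures only: $0.10/GB-month list price dropping
# 20% per year, 10 TB of data growing 30% per year.

price_per_gb_month = 0.10
data_gb = 10_000
annual_price_drop = 0.20
annual_growth = 0.30

for year in range(1, 4):
    yearly_bill = data_gb * price_per_gb_month * 12
    print(f"Year {year}: ~{data_gb / 1_000:.1f} TB stored, "
          f"~${yearly_bill:,.0f} for the year")
    data_gb *= 1 + annual_growth                   # data keeps growing
    price_per_gb_month *= 1 - annual_price_drop    # provider cuts prices

# There is no up-front purchase and no idle, pre-paid capacity:
# you pay each month only for the GB actually stored.
```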

Sound appealing? Look for part II of this series, where we’ll examine exactly how cloud storage can start to break this storage array life cycle.


More Stories By Nicos Vekiarides

Nicos Vekiarides is the Chief Executive Officer & Co-Founder of TwinStrata. He has spent over 20 years in enterprise data storage, both as a business manager and as an entrepreneur and founder in startup companies.

Prior to TwinStrata, he served as VP of Product Strategy and Technology at Incipient, Inc., where he helped deliver the industry's first storage virtualization solution embedded in a switch. Prior to Incipient, he was General Manager of the storage virtualization business at Hewlett-Packard. Vekiarides came to HP with the acquisition of StorageApps where he was the founding VP of Engineering. At StorageApps, he built a team that brought to market the industry's first storage virtualization appliance. Prior to StorageApps, he spent a number of years in the data storage industry working at Sun Microsystems and Encore Computer. At Encore, he architected and delivered Encore Computer's SP data replication products that were a key factor in the acquisition of Encore's storage division by Sun Microsystems.
