By Nicos Vekiarides
October 3, 2011 11:45 AM EDT
Anyone who purchases storage arrays is familiar with the many advantages of modular storage systems and storage area networks. However, they may also be familiar with one of the less desirable attributes of storage arrays: the typical three- to five-year life cycle that forces decommissions and mandates upgrades on a regular basis. With many organizations expanding their storage needs by 20-60% annually, outgrowing the capacity of existing arrays is a regular occurrence, making upgrade cycles a fact of life.
Although decommissioning and upgrading a storage array may not appear all that daunting, the process includes a number of cumbersome aspects:
- The migration process from old to new storage arrays can last months, creating an overlap during which both the old and the new array must be maintained. In some enterprise environments, migration can cost thousands of dollars per terabyte
- When purchasing a new storage array, sizing is based on anticipated future growth, resulting in an initial capacity "over-purchase" and an underutilized array for most of its life cycle. The result? Pre-payment for capacity that may not be needed for the next two years
- Software and architectural changes associated with each storage array upgrade may compromise stability or, at a minimum, require staff retraining on new policies and features
The Economics of Storage Array Replacement
Why decommission arrays at all? Why not simply expand storage capacity by adding new storage arrays instead of replacing them? With storage densities and capacities growing every year and cost per gigabyte dropping, the economics provide the answer.
Consider a three-year-old, 10TB storage array that cost $50,000 to purchase and has a total annual maintenance cost (vendor maintenance fee, administration, floor space, power, cooling, etc.) of $25,000 per year. Now let's say the IT manager purchases a new 25TB storage array today for $50,000. Thanks to improved storage density, it may carry the same total annual maintenance cost of $25K per year.
If the manager chooses to keep both arrays active, he would now have data residing on the old storage array at $2.50/GB in annual maintenance cost and data residing on the new storage array at only $1/GB in annual maintenance cost (with plenty of capacity to spare). This economic inefficiency makes a pretty strong business case for decommissioning the old array and moving all data to the new array as quickly as possible.
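To make the arithmetic concrete, here is a minimal sketch in Python using the hypothetical figures above:

```python
# Per-GB annual maintenance cost for the two hypothetical arrays above:
# the same $25K/year maintenance bill spread over very different capacities.
GB_PER_TB = 1000

def maint_cost_per_gb(annual_maintenance_usd: float, capacity_tb: float) -> float:
    """Annual maintenance cost per gigabyte of capacity."""
    return annual_maintenance_usd / (capacity_tb * GB_PER_TB)

print(f"Old 10TB array: ${maint_cost_per_gb(25_000, 10):.2f}/GB/year")  # $2.50
print(f"New 25TB array: ${maint_cost_per_gb(25_000, 25):.2f}/GB/year")  # $1.00
```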
Is there any hope of breaking this storage array cycle?
There is hope: on-demand cloud storage. Cloud storage substantially changes the economics and deployment of data storage, and some of the most burdensome aspects of the storage array life cycle disappear entirely:
- The pay-as-you-go cloud model eliminates the need to pre-purchase or over-purchase capacity
- Cost per GB drops over time as cloud providers continue to lower costs every 6-12 months
- Storage capacity is virtually limitless, so there is never a need to decommission or upgrade any hardware
The area chart below illustrates some of the economic inefficiencies of the traditional deployment model. While most organizations consume storage at a near-linear rate over time, the required initial pre-purchase of capacity results in a disproportionate up-front investment. The blue area on the chart shows how total cost of ownership (TCO) grows over time with traditional storage deployment - note the spikes at every purchase cycle (Q1 Y1 and Q1 Y4). The green bars illustrate how TCO grows with a pay-as-you-go cloud storage model. Even in this simplistic case, where the TCO of both models is identical after three years, you can easily spot the inefficiency of traditional storage, which requires pre-payment for unused capacity.
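For readers who want to reproduce the chart's shape, below is a small illustrative model of the two cumulative TCO curves. Every input is an assumption carried over from the example above (a $50K array, $25K/year in maintenance, near-linear consumption growing to 25TB), and the cloud rate is calibrated so that both models meet after three years, as in the simplistic case described:

```python
# Illustrative quarterly TCO model: up-front array purchase vs. pay-as-you-go
# cloud storage. All figures are assumptions from the example above, not
# real vendor pricing.
QUARTERS = 12                    # three-year horizon
ARRAY_PRICE = 50_000             # pre-purchase spike at Q1 Y1
MAINT_PER_QUARTER = 25_000 / 4   # $25K/year maintenance, billed quarterly
PEAK_TB = 25                     # capacity consumed by the end of year 3

# Near-linear consumption growth, per the article.
usage_tb = [PEAK_TB * q / QUARTERS for q in range(1, QUARTERS + 1)]

# Calibrate the cloud rate so both models cost the same after three years.
traditional_total = ARRAY_PRICE + MAINT_PER_QUARTER * QUARTERS
cloud_rate_per_tb_quarter = traditional_total / sum(usage_tb)

trad_cum = cloud_cum = 0.0
for q, tb in enumerate(usage_tb, start=1):
    trad_cum += (ARRAY_PRICE if q == 1 else 0) + MAINT_PER_QUARTER
    cloud_cum += tb * cloud_rate_per_tb_quarter
    print(f"Q{q:2d}: traditional ${trad_cum:>9,.0f} | cloud ${cloud_cum:>9,.0f}")
```

The traditional curve jumps by the full array price in the first quarter, while the cloud curve tracks actual consumption - the gap between the two at any point before year three is the pre-payment for unused capacity.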
This is a very basic analysis. In practice, many additional costs factor into traditional storage deployments, including:
- Administrative costs of migrating from old arrays to new arrays
- Potential overlap where both arrays must be maintained simultaneously
- Potential downtime and productivity loss during the migration process
Furthermore, the analysis does not capture some of the additional cost savings of cloud storage that include:
- Economies of scale in pricing and administration as a result of leveraging large multi-tenant cloud environments
- Price per GB erosion over time
Factoring in these additional costs of traditional storage and the additional savings of cloud storage may be enough to convince most organizations that it is time to introduce cloud storage into their existing IT infrastructure. However, when it comes to deploying a cloud storage strategy, many companies don't know where or how to begin.
Introducing Cloud Storage into the Organization: A Hybrid Approach
Replacing traditional storage with cloud storage will break the traditional storage array life cycle, but a complete "forklift" replacement may be overkill. While cloud storage may not necessarily meet all on-premise storage needs, it can still augment existing storage infrastructure. A hybrid local-cloud storage environment can streamline storage operations and can even extend the life cycle of traditional storage through selective data offload.
A more conservative approach might be to identify data suitable for cloud storage, such as secondary copies, backups, off-site data and/or archives. Interestingly, archives are often stored on traditional onsite storage to keep them easily accessible for compliance requirements. By some reports, including one from ESG, archive data is expected to grow approximately 56% per year.
With literally hundreds of thousands of petabytes of archives to store over the next few years, the benefits of offloading archives or infrequently accessed data from traditional storage are numerous. In fact, transitioning this data to cloud storage can extend the life cycle of storage arrays beyond the typical 3-5 year time frame. Imagine a 6-10 year storage array life cycle instead. That would cut capital investment in storage infrastructure in half and introduce a significantly more efficient just-in-time, pay-as-you-go model.
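The capital-savings claim is simple arithmetic. A quick sketch, reusing the hypothetical $50K array price from the earlier example:

```python
# Annualized array capex under two hypothetical refresh cycles.
ARRAY_PRICE = 50_000

for lifecycle_years in (4, 8):   # typical vs. extended life cycle
    print(f"{lifecycle_years}-year refresh: "
          f"${ARRAY_PRICE / lifecycle_years:,.0f}/year in array capex")
```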
How can businesses leverage tiers of cloud storage in a manner that integrates seamlessly into an existing storage infrastructure?
Connecting the Storage Infrastructure to the Cloud
Given that most storage consumed by businesses is either block-based or NAS-based, an on-premise cloud storage appliance or gateway that presents cloud storage over a block or NAS protocol can greatly simplify deployment. When choosing a solution, keep in mind that, unlike NAS, block-based solutions have the advantage of supporting both block-based access and any file system protocol. Block-based iSCSI solutions support petabytes of storage and provide thin provisioning, caching, encryption, compression, deduplication and snapshots, matching the feature sets of sophisticated SAN storage arrays. These solutions can readily reside alongside existing SANs and are available in both software and hardware form factors.
Cloud storage appliances can grow to a virtually unlimited storage capacity without the need to ever upgrade, eliminating many administrative burdens and risks of the storage array life cycle. Since cloud storage is pay-as-you-go, cost adjustments occur automatically, eliminating the economic inefficiencies of the storage array life cycle.
In combination with a cloud storage gateway or appliance, businesses should also consider storage tiering software. Auto-tiering software can be found in storage virtualization solutions, data classification solutions, and even in some hypervisor solutions. Businesses that choose an auto-tiering framework can immediately begin to extend the life cycle of their existing storage arrays and leverage the benefits of cloud storage by selectively offloading infrequently used data.
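As a rough illustration of how such a policy works, here is a minimal sketch of an age-based offload scan; the 180-day threshold, the scan root, and the hand-off point are hypothetical placeholders, not features of any particular tiering product:

```python
# Minimal sketch of an age-based tiering scan (all thresholds hypothetical).
# Note: access times can be unreliable on volumes mounted with noatime.
import time
from pathlib import Path

COLD_AFTER_DAYS = 180  # assumed policy: untouched for ~6 months = cold

def cold_candidates(root: str):
    """Yield files whose last access time is older than the cold threshold."""
    cutoff = time.time() - COLD_AFTER_DAYS * 86_400
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            yield path

if __name__ == "__main__":
    for f in cold_candidates("/mnt/primary"):   # hypothetical mount point
        print(f"offload candidate: {f}")        # hand off to the cloud gateway here
```

A real tiering product would replace this scan with its own policy engine, but the decision inputs (age, access frequency, data class) are the same.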