By Nicos Vekiarides
October 3, 2011 11:45 AM EDT
Anyone who purchases storage arrays is familiar with the many advantages of modular storage systems and storage area networks. However, they may also be familiar with one of the less desirable attributes of storage arrays: the typical three- to five-year lifecycle that forces decommissions and mandates upgrades on a regular basis. With many organizations expanding their storage needs by 20-60% annually, outgrowing the capacity of existing storage arrays is a regular occurrence, making upgrade cycles a fact of life.
Although decommissioning and upgrading a storage array may not appear all that daunting, the process includes a number of cumbersome aspects:
- The migration process from old to new storage arrays can last months, creating an overlap during which both the old and the new storage array must be maintained. In some enterprise environments, the migration can cost thousands of dollars per terabyte
- When purchasing a new storage array, its sizing is based on anticipated future growth, resulting in an initial capacity "over-purchase" and an underutilized storage array for most of its life cycle. Result? Pre-payment for capacity that may not be needed for the next two years
- Software and architectural changes associated with each storage array upgrade may compromise stability or, at a minimum, require staff retraining on new policies and features
The Economics of Storage Array Replacement
Why decommission arrays at all? Why not simply expand storage capacity by adding new storage arrays instead of replacing them? With storage densities and capacities growing every year and cost per gigabyte dropping, the economics provide the answer.
Consider a three-year-old, 10TB storage array that costs $50,000 to purchase and has a total annual maintenance cost (vendor maintenance fee, administration, floor space, power, cooling, etc.) of $25,000 per year. Now let's say today the IT manager purchases a new 25TB storage array for $50,000. Thanks to improved storage density, it may carry the same total annual maintenance cost of $25K per year.
If the manager chooses to keep both arrays active, he would now have data residing on the old storage array at $2.50/GB in annual maintenance cost and data residing on the new storage array at only $1/GB in annual maintenance cost (with plenty of capacity to spare). This economic inefficiency makes a pretty strong business case for decommissioning the old array and moving all data to the new array as quickly as possible.
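The per-gigabyte math behind this comparison is simple enough to sketch. This is a minimal illustration using the article's own figures (1 TB taken as 1,000 GB for simplicity):

```python
# Annual maintenance cost per GB for the old vs. new array in the example above.
# Figures are taken straight from the article; 1 TB = 1,000 GB for simplicity.

def maintenance_cost_per_gb(annual_maintenance_usd, capacity_tb):
    """Annual maintenance dollars divided by usable gigabytes."""
    return annual_maintenance_usd / (capacity_tb * 1000)

old_array = maintenance_cost_per_gb(25_000, 10)   # 10 TB array
new_array = maintenance_cost_per_gb(25_000, 25)   # 25 TB array

print(f"Old array: ${old_array:.2f}/GB/year")  # $2.50/GB/year
print(f"New array: ${new_array:.2f}/GB/year")  # $1.00/GB/year
```

The 2.5x gap in carrying cost per GB is what drives the decommission decision.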
Is there any hope of breaking this storage array cycle?
There is hope with on-demand cloud storage. Cloud storage substantially changes the economics and deployment of data storage, and some aspects of the storage array life cycle disappear entirely:
- The pay-as-you-go cloud model eliminates the need to pre-purchase or over-purchase capacity
- Cost per GB drops over time as cloud providers continue to lower costs every 6-12 months
- Storage capacity is virtually limitless, so there is never a need to decommission or upgrade any hardware
The area chart below illustrates some of the economic inefficiencies of this deployment model. While most organizations consume storage at a near linear rate over time, the required initial pre-purchase of capacity results in a relatively disproportionate up-front investment. The blue area on the chart shows how total cost of ownership (TCO) increases over time with traditional storage deployment - note the spikes at every purchase cycle (Q1 Y1 & Q1 Y4). The green bars on the chart illustrate how TCO increases with a pay-as-you-go cloud storage model. Even in this simplistic case where the TCO of both models is identical after three years, you can easily spot the inefficiencies of traditional storage, which requires pre-payment for unused capacity.
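The two cost curves can be sketched with a toy model. The figures below are illustrative assumptions, not the article's actual chart data: a $50,000 array bought up front with $25K/year in maintenance, versus a cloud rate deliberately chosen so the two totals converge after three years, mirroring the "identical TCO" simplification above:

```python
# Toy TCO comparison: up-front array purchase vs. pay-as-you-go cloud.
# All numbers are illustrative assumptions, not the article's chart data.

QUARTERS = 12                            # three-year horizon
ARRAY_PRICE = 50_000                     # paid in full at Q1
MAINT_PER_QUARTER = 6_250                # $25K/year maintenance
GROWTH_TB_PER_QUARTER = 10 / QUARTERS    # linear growth to 10 TB consumed

# Traditional: a big spike at purchase, then flat maintenance.
traditional = [ARRAY_PRICE + MAINT_PER_QUARTER] + [MAINT_PER_QUARTER] * (QUARTERS - 1)

# Cloud: pay only for capacity consumed each quarter. Pick a $/TB/quarter
# rate such that both models cost the same after three years.
total_traditional = sum(traditional)
tb_quarters = sum(q * GROWTH_TB_PER_QUARTER for q in range(1, QUARTERS + 1))
cloud_rate = total_traditional / tb_quarters
cloud = [q * GROWTH_TB_PER_QUARTER * cloud_rate for q in range(1, QUARTERS + 1)]

cum_trad = [sum(traditional[:i + 1]) for i in range(QUARTERS)]
cum_cloud = [sum(cloud[:i + 1]) for i in range(QUARTERS)]

# The inefficiency: most of the traditional spend lands in quarter one.
print(f"Q1 spend  - traditional: ${cum_trad[0]:,.0f}, cloud: ${cum_cloud[0]:,.0f}")
print(f"3yr spend - traditional: ${cum_trad[-1]:,.0f}, cloud: ${cum_cloud[-1]:,.0f}")
```

Even though both models land on the same total, the traditional curve front-loads roughly 45% of three years of spend into the first quarter, paying for capacity that sits idle.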
This is a very basic analysis. In practice, there are many additional costs that factor into traditional storage deployments including:
- Administrative costs of migrating from old arrays to new arrays
- Potential overlap where both arrays must be maintained simultaneously
- Potential downtime and productivity loss during the migration process
Furthermore, the analysis does not capture some of the additional cost savings of cloud storage that include:
- Economies of scale in pricing and administration as a result of leveraging large multi-tenant cloud environments
- Price per GB erosion over time
Factoring in these additional costs of traditional storage and additional savings of cloud storage may be enough to convince most organizations that it is time to introduce cloud storage into their existing IT infrastructure. However, when it comes to deploying a cloud storage strategy, many companies don't know where or how to begin.
Introducing Cloud Storage into the Organization: A Hybrid Approach
Replacing traditional storage with cloud storage will break the traditional storage array life cycle, but a complete "forklift" replacement may be overkill. While cloud storage may not necessarily meet all on-premise storage needs, it can still augment existing storage infrastructure. A hybrid local-cloud storage environment can streamline storage operations and can even extend the life cycle of traditional storage through selective data offload.
A more conservative approach is to identify data suitable for cloud storage, such as secondary copies, backups, off-site data and/or archives. Interestingly, archives are often stored on traditional onsite storage to make them easily accessible to meet compliance requirements. By some reports, like the one below from ESG, archive data is expected to grow approximately 56% per year.
With literally hundreds of thousands of petabytes of archives to store over the next few years, the benefits of offloading archives or infrequently accessed data from traditional storage are numerous. In fact, transitioning this data to cloud storage can extend the traditional life cycle of storage arrays beyond the typical 3-5 year time frame. Imagine a 6-10 year storage array life cycle instead. That would result in a reduction of capital investment in storage infrastructure by half and introduce a significantly more efficient just-in-time, pay-as-you-go model.
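The "half the capital investment" claim follows directly from the arithmetic: doubling the refresh cycle halves the number of purchases over any fixed horizon. A quick sketch, with the $50K array price from the earlier example and a 12-year horizon chosen as an assumption so both cycles divide it evenly:

```python
# How doubling the array refresh cycle halves capital spend over a fixed
# horizon. The $50K price is the article's earlier example; the 12-year
# horizon is an assumption chosen so both cycle lengths divide it evenly.

def capex_over_horizon(horizon_years, cycle_years, array_price):
    """Total purchase cost, buying a replacement at the start of each cycle."""
    purchases = horizon_years // cycle_years
    return purchases * array_price

three_year_cycle = capex_over_horizon(12, 3, 50_000)   # 4 purchases
six_year_cycle = capex_over_horizon(12, 6, 50_000)     # 2 purchases

print(f"3-year cycle: ${three_year_cycle:,}")
print(f"6-year cycle: ${six_year_cycle:,}")
```

This ignores maintenance, which runs at roughly the same rate in both cases; the savings come entirely from deferred capital purchases.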
How can businesses leverage tiers of cloud storage in a manner that integrates seamlessly into an existing storage infrastructure?
Connecting the Storage Infrastructure to the Cloud
Given most storage consumed by businesses is either block-based or NAS-based, an on-premise cloud storage appliance or gateway that converts cloud storage to a block or NAS protocol can greatly simplify deployment. When choosing a solution, keep in mind that, unlike NAS, block-based solutions have the advantage of supporting both block-based access and any file system protocol. Block-based iSCSI solutions support petabytes of storage and provide thin-provisioning, caching, encryption, compression, deduplication and snapshots, matching feature sets of sophisticated SAN storage arrays. These solutions can readily reside alongside existing SANs and are available in both software and hardware form factors.
Cloud storage appliances can grow to a virtually unlimited storage capacity without the need to ever upgrade, eliminating many administrative burdens and risks of the storage array life cycle. Since cloud storage is pay-as-you-go, cost adjustments occur automatically, eliminating the economic inefficiencies of the storage array life cycle.
In combination with a cloud storage gateway or appliance, businesses should also consider storage tiering software. Auto-tiering software can be found in storage virtualization solutions, data classification solutions, and even in some hypervisor solutions. Businesses that choose an auto-tiering framework can immediately begin to extend the life cycle of their existing storage arrays and leverage the benefits of cloud storage by selectively offloading infrequently used data.
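The age-based offload policy these tiering frameworks typically apply can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: files untouched for longer than a threshold become candidates for the cloud tier.

```python
# Minimal age-based tiering sketch (hypothetical, not a vendor implementation):
# files not accessed within `max_idle_days` are flagged for cloud offload.
import os
import time

def cloud_tier_candidates(paths, max_idle_days=90, now=None):
    """Return the subset of paths whose last access is older than the cutoff."""
    now = time.time() if now is None else now
    cutoff = now - max_idle_days * 86_400  # seconds per day
    return [p for p in paths if os.stat(p).st_atime < cutoff]

# A real deployment would then move each candidate onto the gateway's cloud
# volume and leave a stub (or update tier metadata) in its place.
```

In practice the policy input would come from the data classification or virtualization layer rather than raw `atime`, but the selection logic is the same: infrequently used data migrates down, and the on-premise arrays keep only the hot working set.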