Why Every IT Organization Should Consider Cloud-Integrated Storage

It's best to plan for the inevitable

You've probably already heard that storage capacity needs are growing rapidly, with IDC projecting the digital universe will exceed 40,000 exabytes (40 billion terabytes) by 2020. According to Gartner, organizations are growing their capacity 40% to 60% year over year on average. Meanwhile, the disk densities of SAN and NAS storage arrays are not keeping pace with data growth, leaving many organizations facing a looming storage capacity sprawl. The potential impact is measured not only in additional floor space, but also in cooling, management overhead and maintenance staff. As a result, many organizations will eventually find their storage needs exceeding their existing on-premise capabilities - the unenviable position of facing additional investments in infrastructure and people.

In fact, a recent survey we conducted found that a full 60% of users feel they are "always running out of storage" - a clear sign that this "future" problem is already being felt on the ground.

So should your organization take measures to prevent this sprawl from occurring? In short, yes, by planning ahead. With a few basic assumptions and calculations, you can examine the progression of storage capacity growth in your organization.

Let's say you have on-premise lab space that accommodated up to 20TB of storage in 2011, but you only needed 12TB at the time - a reasonable ratio, leaving ample room for growth. Now assume your storage needs grow at 40% a year (the low end of the 40-60% range cited above). Of course, disk storage densities are also improving, albeit at a slower rate - close to 20% annually according to IHS iSuppli. These density improvements let you store more data in the same physical footprint, provided you upgrade regularly. So what happens to your storage footprint over the next 5-6 years?

[Chart: projected storage need (40% annual growth) vs. maximum on-premise capacity (20% annual density improvement), 2011-2017]

As plotted in the chart above, your storage needs exceed maximum on-premise capacity sometime in 2014, at a "crossover point" of about 35TB - and that assumes you are riding the technology curve for storage density, since your capacity was only 20TB in 2011. In practice, storage is purchased in 3-5 year cycles, which means deployed densities improve in steps far less often than the chart indicates - pushing the need to deploy more storage even sooner.
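To make the arithmetic behind the chart concrete, here is a minimal Python sketch of the same model. The 12TB/20TB starting points and the two growth rates are the assumptions stated above; everything else (the function names, the continuous-growth simplification) is illustrative:

```python
import math

# Assumptions from the example above (illustrative, continuous-growth model)
NEED_2011_TB = 12.0       # storage needed in 2011
CAPACITY_2011_TB = 20.0   # maximum on-premise capacity in 2011
DEMAND_GROWTH = 0.40      # 40% annual growth in storage needs
DENSITY_GROWTH = 0.20     # ~20% annual density improvement (IHS iSuppli)

def need(years):
    """Projected storage need `years` years after 2011."""
    return NEED_2011_TB * (1 + DEMAND_GROWTH) ** years

def capacity(years):
    """On-premise capacity, assuming upgrades ride the density curve."""
    return CAPACITY_2011_TB * (1 + DENSITY_GROWTH) ** years

# Solve need(t) == capacity(t):  12 * 1.4^t = 20 * 1.2^t
t = math.log(CAPACITY_2011_TB / NEED_2011_TB) / math.log(
    (1 + DEMAND_GROWTH) / (1 + DENSITY_GROWTH))
print(f"Crossover after {t:.1f} years (~{2011 + t:.0f}), at {need(t):.0f} TB")
# -> Crossover after 3.3 years (~2014), at 37 TB
```

The continuous model lands a touch above the chart's "about 35TB," but the timing - mid-2014 - is the same.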

What options does an organization have to prevent a continuously growing storage footprint? With retention and compliance requirements, deleting data is rarely a good practice. To some extent, deduplication and consolidation of copies can help push out this "crossover point" but cannot make it go away.
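To put a rough number on it: in the model above, a hypothetical one-time 30% reduction from deduplication shifts the demand curve down by a factor of 0.7, delaying the crossover by ln(1/0.7) / ln(1.4/1.2) - roughly 2.3 years. Because the 40% growth compounds while the savings are one-time, the curves still cross.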

Inevitably, nearly every organization needs a plan for dealing with storage capacity sprawl - whether that means biting the bullet and continually increasing investment in infrastructure and administrative resources, or adopting technologies such as cloud-integrated storage to lift this burden from the IT organization. The latter approach may seem quite attractive considering the implications of the former.

What makes cloud-integrated storage compelling is that it serves as a buffer to meet unexpected peaks and valleys in capacity needs without any up-front capital investment. Making cloud-integrated storage a key part of your storage strategy gives you the option to scale outside of the proverbial box - capacity is always available and can be increased or decreased on-demand as needed.

Consider a plan that utilizes cloud-integrated storage in a SAN environment, offloading half of the yearly capacity growth to the cloud. This is not an unreasonable expectation, given that much of the data stored on-premise is static. The plan effectively transforms the SAN into a hybrid environment that scales to virtually limitless capacity without additional demands on existing IT staff and resources.

Using the same assumptions as before (a 12TB storage need and 20TB of capacity in 2011), the chart below illustrates how introducing cloud-integrated storage prevents storage requirements from ever exceeding on-premise capacity - there is no longer a "crossover point" to worry about, nor its associated repercussions.
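[Chart: on-premise storage need vs. capacity with half of yearly growth offloaded to the cloud, 2011-2017]

Continuing the earlier sketch (same illustrative assumptions), offloading half of the 40% yearly growth means on-premise need grows at roughly 20% per year - the same rate as the density-driven capacity curve - so the two lines never cross:

```python
# Hybrid model: half of the 40% yearly growth is offloaded to the cloud,
# so on-premise need grows ~20%/yr - the same rate as density improvements.
need_tb, cap_tb = 12.0, 20.0    # illustrative 2011 starting point
ON_PREM_GROWTH = 0.20           # half of the 40% growth stays on-premise
DENSITY_GROWTH = 0.20           # density-driven capacity improvement

for year in range(2011, 2018):
    print(f"{year}: on-premise need {need_tb:5.1f} TB vs capacity {cap_tb:5.1f} TB")
    need_tb *= 1 + ON_PREM_GROWTH
    cap_tb *= 1 + DENSITY_GROWTH
# On-premise need stays at 60% of capacity in every year: no crossover.
```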

While this example is relatively simple, leveraging the cloud for capacity expansion is a far better way to deploy, maintain and scale SAN or NAS storage infrastructure. It keeps IT resources focused on more valuable aspects of your business than growing on-premise infrastructure. Baking cloud-integrated storage into your SAN and NAS strategy can help you avoid unwelcome expansion investments down the road.

More Stories By Nicos Vekiarides

Nicos Vekiarides is the Chief Executive Officer & Co-Founder of TwinStrata. He has spent over 20 years in enterprise data storage, both as a business manager and as an entrepreneur and founder in startup companies.

Prior to TwinStrata, he served as VP of Product Strategy and Technology at Incipient, Inc., where he helped deliver the industry's first storage virtualization solution embedded in a switch. Prior to Incipient, he was General Manager of the storage virtualization business at Hewlett-Packard. Vekiarides came to HP with the acquisition of StorageApps where he was the founding VP of Engineering. At StorageApps, he built a team that brought to market the industry's first storage virtualization appliance. Prior to StorageApps, he spent a number of years in the data storage industry working at Sun Microsystems and Encore Computer. At Encore, he architected and delivered Encore Computer's SP data replication products that were a key factor in the acquisition of Encore's storage division by Sun Microsystems.
