Like “API,” Is “Storage Tier” Redefining Itself?


There is an interesting bit in high-tech that isn’t much mentioned but happens pretty regularly – a good idea gets adapted and moved to new uses, raised a bit in the stack or revised to keep up with the times. The quintessential example of this phenomenon is the progression from “subroutines” to “libraries” to “frameworks” to “APIs” to “Web Services”. The progression is logical and useful, but the assembler and C programmers who were first stuffing things into reusable subroutines could not have foreseen the entire spectrum of what their “useful” idea would become over time. I had the luck of developing through all of those stages. I wrote assembly routines right before they were no longer necessary for everyday development, and wrote web services/SOA routines for the first couple of years they were around.


YES INDEED, THIS IS A STORAGE BLOG

I think we are seeing the same thing happen in storage without quite realizing it yet, which is kind of cool, because we all get a ring-side seat if we know where to look.

When the concept of tiering first came around – I am not certain whether it was first introduced with HSM (Hierarchical Storage Management) or ILM (Information Lifecycle Management); someone can weigh in on that distinction in the comments – it was aimed at the performance difference between disks and arrays of disks. The point was that your more expensive disk wasn’t necessary for everyday tasks, and it was a valid point. Tiering has become a part of most large organizations’ storage management plans simply because it makes sense.

But the one truth about technology over the last twenty or thirty years is that it absolutely does not stand still. The moment you think it has plateaued, something new comes along from left field and changes the landscape. Storage has seen no shortage of this process: the disks in use when tiering was introduced have been replaced by SAS and SATA, then eventually SATA II. The interesting thing about these changes is that the reliability and access-speed differences have shrunk, as a percentage, since the days of SCSI vs. ATA. The disks just keep getting more reliable and faster. And with RAID everywhere, you get increased reliability through data redundancy. The amount of reliability you gain depends upon the level of RAID you choose, but that’s relatively common knowledge at this point, so we won’t get too deep into it here.
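
To make the RAID point concrete, here is a minimal Python sketch of the trade-off between failures survived and usable capacity. These are the textbook definitions of the common RAID levels, not any particular vendor’s implementation, and the drive counts are just examples.

    # Textbook RAID math: fault tolerance vs. usable capacity.
    # Not tied to any vendor's implementation.

    def raid_profile(level: str, drives: int, drive_tb: float) -> dict:
        """Return drive failures survivable and usable capacity in TB."""
        if level == "RAID0":   # striping only: fast, but no redundancy
            return {"failures": 0, "usable_tb": drives * drive_tb}
        if level == "RAID1":   # n-way mirror: max redundancy, one drive's capacity
            return {"failures": drives - 1, "usable_tb": drive_tb}
        if level == "RAID5":   # single parity: survives one failure
            return {"failures": 1, "usable_tb": (drives - 1) * drive_tb}
        if level == "RAID6":   # double parity: survives two failures
            return {"failures": 2, "usable_tb": (drives - 2) * drive_tb}
        raise ValueError(f"unsupported level: {level}")

    for lvl in ("RAID0", "RAID1", "RAID5", "RAID6"):
        print(lvl, raid_profile(lvl, drives=8, drive_tb=1.0))

More parity buys more failures survived at the cost of usable space, which is exactly the knob the RAID level gives you.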



BRING ON THE CHANGE!

And then the first bombshell hit: SSD. The performance difference between SSD and hard disk is astounding and very real. It’s not a case where the gap is so small that you could choose the slower technology (as it is among hard disks); if you need the performance level of SSD for a given application, there are very few options but to bite the bullet and buy SSD. But it’s fast. It’s very fast. And prices are coming down.

Then the second bombshell hit: Cloud Storage. It’s immense. It’s very immense. And with a Cloud Storage Gateway, it looks like all your other storage – or at least all your other NAS storage. Companies like Cirtas and Nasuni are making cloud usable with local caches and interfaces to cloud providers. Some early reports, like this one from Storage Switzerland, claim that they make access “as fast as local storage”, but I’ll wager that’s untrue, simply because the cache IS local storage; everything else has to go out through your WAN link. By definition that means the aggregate is slower than local disk access unless every file operation is a cache hit, which mathematically is highly improbable. But even so, if they speed up cloud storage access and make it enterprise friendly, you now have a huge – potentially unlimited – place to store your stuff. And if my guess is right (it is a guess; I have not tested at all and don’t know of any ongoing testing), our WOM product should make these things perform like LAN storage, thanks to the combination of TCP optimizations, compression, and in-flight de-duplication reducing the burden on the WAN.
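
The “as fast as local storage” claim is easy to sanity-check, because a gateway’s effective access time is just a weighted average of cache hits and WAN round trips. A quick sketch, with purely illustrative latency numbers rather than measurements of any product:

    # Back-of-the-envelope check on the "as fast as local storage" claim.
    # Both latency figures are illustrative assumptions, not measurements.

    LOCAL_MS = 5.0   # assumed local cache/disk access time
    WAN_MS = 80.0    # assumed WAN round trip to the cloud provider

    def effective_latency_ms(hit_rate: float) -> float:
        """Average access time given the fraction of requests served from cache."""
        return hit_rate * LOCAL_MS + (1.0 - hit_rate) * WAN_MS

    for hits in (1.0, 0.95, 0.80):
        print(f"hit rate {hits:.0%}: {effective_latency_ms(hits):.1f} ms")

Only a 100% hit rate matches local storage; every miss drags the average toward the WAN number.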


AND THAT’S WHERE IT GETS INTERESTING

So today’s hard disks are so close in performance and reliability – particularly after taking RAID into account – that the old definitions have blurred. You can run tier one on SATA II disks. No problem; lots of small and medium-sized orgs DO have such an arrangement.

But that implies that what used to be “tier one” and “tier two” have largely merged, the line between them blurring – just in time for two highly differentiated technologies to come along. I have a vision of the future where high-performance, high-volume sites use SSD for more and more of tier one, RAIDed SAS and/or SATA drives for tier two, and cloud storage for backup/replication/tier three. Then tiers have meaning again – tier one is screaming fast, tier two is the old standby, combining fast and reliable, and tier three is cloud storage (be it public or private; others can argue that piece out)…

And that has implications for both budgeting and architecture. SSD is more expensive. Depending upon your provider and usage patterns, cloud is less expensive (than disk, not tape). That implies a shift of dollars from the low end to the high end of your spending patterns. Perhaps, if you have savvy contract negotiators, it means actual overall savings on storage; more likely you’re just smoothing the spending out by paying monthly for cloud services instead of “Oh no, we have to buy a new array”.
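
The smoothing effect is simple arithmetic. A hypothetical comparison – every number below is invented for illustration, not a quote from any vendor:

    # Hypothetical numbers showing cash-flow smoothing, not real pricing.
    ARRAY_COST = 60_000.0      # assumed up-front cost of a new array (USD)
    ARRAY_LIFE_MONTHS = 36     # assumed useful life
    CLOUD_PER_GB_MONTH = 0.15  # assumed cloud fee (USD per GB per month)
    CAPACITY_GB = 10_000

    array_monthly = ARRAY_COST / ARRAY_LIFE_MONTHS
    cloud_monthly = CLOUD_PER_GB_MONTH * CAPACITY_GB

    print(f"array, amortized: ${array_monthly:,.0f}/month after ${ARRAY_COST:,.0f} up front")
    print(f"cloud, pay-as-you-go: ${cloud_monthly:,.0f}/month, no up-front spike")

Roughly the same monthly spend either way; the difference is whether it arrives as one budget-busting purchase or a predictable bill.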


A BRIGHT, TIERFUL FUTURE

But tiering is a lot more attractive if you have three genuinely distinct tiers that serve specific purposes. Many organizations will start with tape as the final destination for backups, but I don’t believe they’ll stay there. Backing up to disk has a long history at this point, and if that backup can go to cloud disk that you keep for as long as you’re willing to pay for it, I suspect archival will become the primary focus of tape going forward. I don’t predict that tape will die; it is just too convenient and too intertwined to walk away from. And it makes sense for archival purposes – “we have to keep this for seven billion years because of government regulation, but we don’t need it” is a valid use for storage that you don’t pay for by the month and that is stable over long periods of time.

Of course I think you should throw an ARX in front of all of this storage to handle the tiering for you, but there are other options out there. Something will have to make the determination, so find what works best for you.
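
Whatever product makes the determination, the logic underneath tends to be rules over file attributes such as last-access time. Here’s a hypothetical sketch of such a rule – this is not ARX’s actual policy engine, and the thresholds are arbitrary examples:

    import os
    import time

    # Hypothetical tiering rule based on last access time.
    # Not ARX's policy engine; thresholds are arbitrary examples.
    DAY = 86_400  # seconds

    def pick_tier(path: str) -> str:
        age_days = (time.time() - os.stat(path).st_atime) / DAY
        if age_days < 7:
            return "tier1-ssd"       # hot: touched within the week
        if age_days < 180:
            return "tier2-sas-sata"  # warm: touched within six months
        return "tier3-cloud"         # cold: candidate for cloud archive

    # e.g. pick_tier("/shares/finance/report.xlsx") might return "tier2-sas-sata"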

Not so long ago, I would have claimed that most organizations didn’t need SSD, and that only heavily stressed databases would actually see the benefit. These days I’m more sanguine about the prospects. As prices drop, ever more uses for SSDs become apparent. As of this writing they’re running $2–$2.50 per gigabyte – a lot more than SATA or even SAS – but most companies don’t need nearly as much tier one storage as they do tier two.
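
That sizing point is the crux: a small, expensive tier one next to a big, cheap tier two. Using the midpoint of the SSD price quoted above and an assumed SATA price and capacities:

    # SSD price is the midpoint of the $2-$2.50/GB quoted above;
    # the SATA price and capacities are assumptions for illustration.
    SSD_PER_GB = 2.25
    SATA_PER_GB = 0.10
    tier1_gb = 500      # small, hot working set on SSD
    tier2_gb = 10_000   # bulk capacity on SATA

    print(f"tier one (SSD):  {tier1_gb} GB -> ${tier1_gb * SSD_PER_GB:,.0f}")
    print(f"tier two (SATA): {tier2_gb} GB -> ${tier2_gb * SATA_PER_GB:,.0f}")

The expensive tier stays affordable because it stays small.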


WATCH FOR IT

That’s the way I see it falling out, too – SSD prices will continue to come down toward SAS/SATA, and you’ll want to back up tier one a lot more (which you should anyway), while cloud storage started pretty inexpensive and will likely continue to get cheaper while the market sorts itself out.

And like the “subroutine”, you’ll soon find traditional hard disks standing alone in the data center only for small or very special-purpose uses. Like the subroutine, they will give way to more specialized collections of storage on one end and “inlined” SSDs on the other.

Until the Next Big Thing comes along, anyway.


More Stories By Don MacVittie

Don MacVittie is founder of Ingrained Technology, a technical advocacy and software development consultancy. He has experience in application development, architecture, infrastructure, technical writing, DevOps, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University and an M.S. in Computer Science from Nova Southeastern University.
