The Evolution of Solid State Arrays

Solid state storage continues to evolve

In the first wave of solid-state storage arrays, we saw commodity-style SSDs (solid-state drives) being added to traditional storage arrays.  This approach provided an incremental performance benefit over spinning hard drives; however, the back-end technology in these arrays was developed up to 20 years ago and was focused purely on extracting performance from the slowest part of the infrastructure – the hard drive.  Of course SSDs are an order of magnitude faster than HDDs, so putting them into traditional arrays pretty much guarantees an underused resource sold at a premium price.

Wave 2 of SSD arrays saw the development of custom hardware, although most vendors continued to use commodity SSDs.  At this point we saw solid state fully exploited, with architectures designed to deliver the full performance of solid-state drives.  These arrays removed unnecessary or bottlenecking features (like cache) and provided much more back-end scalability.  Within the wave 2 group, Nimbus Data have chosen a hybrid approach and developed their own solid-state drives.  This gives them more control over the management functionality of the SSDs and consequently more control over performance and availability.

Notably, some startup vendors have taken a slightly different approach.  Violin Memory have chosen from day one to use custom NAND memory cards called VIMMs (Violin Intelligent Memory Modules).  This removes the need for the NAND to emulate a hard drive and for the path between the processor/memory and the persistent memory (i.e. the NAND) to cross a hard drive interface such as SAS using the SCSI protocol.  Whilst it could be argued that the savings from removing the disk drive protocol are marginal, using NAND that doesn’t emulate hard drives is about much more than that.  SSD controllers have many features to extend the life of the drive itself, including wear levelling and garbage collection, features that can have a direct impact on device performance.  Custom NAND components can, for instance, allow wear levelling to be applied across the entire array, or individual cell failures to be managed more efficiently.
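To make the array-wide wear-levelling point concrete, here is a minimal sketch of the idea in Python.  It is an illustration only, not Violin’s implementation; the EraseBlock and FlashPool structures are hypothetical.  The allocator always writes to the least-worn erase block anywhere in the array, rather than levelling wear within a single drive.

    from dataclasses import dataclass, field

    @dataclass
    class EraseBlock:
        module_id: int       # which memory module the block lives on
        block_id: int        # block index within that module
        erase_count: int = 0

    @dataclass
    class FlashPool:
        """Hypothetical pool of erase blocks spanning every module in the array."""
        blocks: list = field(default_factory=list)

        def allocate(self) -> EraseBlock:
            # Array-wide wear levelling: pick the least-worn block in the
            # whole pool, not just the least-worn block on one drive.
            victim = min(self.blocks, key=lambda b: b.erase_count)
            victim.erase_count += 1
            return victim

    # Four modules of four blocks each; writes spread evenly across all modules.
    pool = FlashPool([EraseBlock(m, b) for m in range(4) for b in range(4)])
    for _ in range(8):
        blk = pool.allocate()
        print(f"wrote to module {blk.module_id}, block {blk.block_id}")

In the same spirit, visibility of individual cell failures would let a pool like this retire a single block rather than swap out an entire drive.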

Building bespoke NAND components isn’t cheap.  Violin have chosen to invest in technology that they believe gives their hardware an advantage – no dependency on SSD manufacturers.  The ability to build advanced functionality into their persistent memory means availability can be increased; components don’t need to be swapped out as frequently, and failing components can still be partially used.

At this point we should give a call-out to Texas Memory Systems, recently acquired by IBM.  They have also used custom NAND components; their RamSan-820 uses 500GB flash modules built from eMLC memory.

I believe the third wave will see many more vendors move away from the SSD form factor and build bespoke NAND components, as Violin have done.  Currently Violin and TMS have the head start: they’ve done the hard work and built the foundations of their platforms.  Their future innovations will probably revolve around bigger and faster devices and replacing NAND with whatever the next generation of persistent memory turns out to be.

Last week, HDS announced their approach to full-flash devices: a new custom-built Flash Module Drive (FMD) that can be added to the VSP platform.  Each module provides 1.6TB or 3.2TB of storage (the higher capacity is due in March 2013), and modules can be stacked 48 to an 8U shelf, for a total of 600TB of flash in a single VSP.  Each FMD matches a traditional SSD in height and width but is much deeper, and it appears to the VSP as a traditional SSD.
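As a quick back-of-the-envelope check (my arithmetic, not a Hitachi configuration guide), the 600TB headline only works out if a single VSP houses several FMD shelves:

    import math

    FMD_CAPACITY_TB = 3.2        # larger module, due March 2013
    FMDS_PER_SHELF = 48          # 8U shelf as announced
    VSP_FLASH_TARGET_TB = 600    # headline figure for a single VSP

    shelf_capacity_tb = FMD_CAPACITY_TB * FMDS_PER_SHELF
    shelves_needed = math.ceil(VSP_FLASH_TARGET_TB / shelf_capacity_tb)

    print(f"Raw capacity per shelf: {shelf_capacity_tb:.1f}TB")    # 153.6TB
    print(f"Shelves needed for ~600TB: {shelves_needed}")          # 4
    print(f"Raw capacity at {shelves_needed} shelves: "
          f"{shelves_needed * shelf_capacity_tb:.1f}TB")           # 614.4TB

So the 600TB figure presumably assumes four fully populated shelves of 3.2TB modules; with today’s 1.6TB modules the same configuration would hold roughly half that.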

The FMD chassis is separate from the existing disk chassis deployed in the VSP, so FMDs can’t be deployed in conjunction with hard drives.  Although this seems like a negative, the flash modules are served by higher-specification back-end directors (to fully utilise the flash performance), which, together with their physical size, explains why they wouldn’t be mixed together.

Creating a discrete flash module provides Hitachi with a number of benefits compared to individual MLC SSDs, including:

  • Higher performance on mixed workloads
  • Inbuilt compression using the onboard custom chips
  • Improved ECC error correction using onboard code and hardware
  • Lower power consumption per TB from higher memory density
  • > 1,000,000 IOPS in a single array

The new FMDs can also be used with HDT (Hitachi Dynamic Tiering) to cater for mixed sub-LUN workloads, and of course Hitachi’s upgraded microcode is already optimised to work with flash devices.
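For anyone less familiar with sub-LUN tiering, the sketch below illustrates the general idea only (it is not HDT’s actual algorithm): the array counts I/O per page and, at each rebalance cycle, keeps the hottest pages on the flash tier and demotes the rest.

    from collections import Counter

    class SubLunTierer:
        """Toy sub-LUN tiering model: promote hot pages to flash, demote cold ones."""

        def __init__(self, flash_pages: int):
            self.flash_pages = flash_pages   # capacity of the flash tier, in pages
            self.io_counts = Counter()       # accesses per page since the last cycle
            self.flash_tier = set()          # pages currently resident on flash

        def record_io(self, page: int):
            self.io_counts[page] += 1

        def rebalance(self):
            # Rank pages by recent activity and keep only the hottest on flash.
            hottest = {p for p, _ in self.io_counts.most_common(self.flash_pages)}
            promote = hottest - self.flash_tier
            demote = self.flash_tier - hottest
            self.flash_tier = hottest
            self.io_counts.clear()           # start a fresh measurement cycle
            return promote, demote

    # Page 7 is hit far more often than the rest, so it ends up on the flash tier.
    tierer = SubLunTierer(flash_pages=2)
    for page in [7, 7, 7, 3, 7, 1, 3, 7]:
        tierer.record_io(page)
    promoted, demoted = tierer.rebalance()
    print("promoted to flash:", promoted)    # {3, 7}

The interesting policy questions (page size, how often to rebalance, how sticky pages should be) are exactly where the vendors’ microcode differentiates.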

The Architect’s View
Solid-state storage continues to evolve.  NAND flash is fast but has its foibles, and these can be overcome with dedicated NAND modules.  Today, only four vendors have moved to dedicated solid-state components, while the others continue to use commodity SSDs.  At scale, performance and availability, viewed in terms of consistency, become much more important.  Many vendors today are producing high-performance devices, but how well will they scale going forward and how resilient will they be?  As the market matures, these differences will be the dividing line between survival and failure.

Disclaimer: I recently attended the Hitachi Bloggers’ and Influencers’ Days 2012.  My flights and accommodation were covered by Hitachi during the trip; however, there is no requirement for me to blog about any of the content presented, and I am not compensated in any way for my time attending the event.  Some materials presented were discussed under NDA and don’t form part of my blog posts, but could influence future discussions.

Comments are always welcome; please indicate if you work for a vendor as it’s only fair. If you have any related links of interest, please feel free to add them as a comment for consideration.
