Do VMs Still Matter in the Cloud?

Virtual Machines encompass “virtual hardware” and very real operating systems

By John Considine

There’s a long-running debate about the true role of Virtual Machines (VMs) in cloud computing. In talking with CTOs at the large vendors, as well as the “Clouderati,” over the last two years, I’ve sensed a desire to eliminate the VM from cloud computing. A colleague of mine, Simeon Simeonov, wrote a blog post a couple of weeks ago making the case for eliminating the VM. While the argument is appealing, and support for the idea is growing, I’d like to argue that there are compelling reasons to keep the Virtual Machine at the core of cloud computing.

Virtual Machines encompass “virtual hardware” and very real operating systems. VMs drive the economics and flexibility of the cloud by allowing complete servers to be created on demand and, in many cases, to share the same physical hardware. A virtual machine provides a complete environment, both hardware and operating system, in which applications run just as they would on their own dedicated server.

Sim and other cloud evangelists would like to see applications developed independently of the underlying operating system and hardware. Implicit in this argument is that developers shouldn’t be constrained by an “outdated” VM construct, but should design from scratch for the cloud and its horizontal scalability. It reminds me of early conversations I had when we were just starting CloudSwitch, which went something like: “If you just design your applications to be stateless, fault-tolerant, and horizontally scalable, then you can run them in the cloud.” The message seemed to be that if you do all the work to make your applications cloud-like, they will run great in the cloud. The motivation is cost savings, flexibility, and almost infinite scalability; the cost is redesigning everything around the limitations and architectures offered by the cloud providers.
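To make the “stateless” part of that prescription concrete, here is a minimal sketch. In a cloud-native design, a request handler keeps nothing on its own instance; all state lives in an external, shared service, so any horizontally scaled replica can serve any request and a failed instance loses nothing. The `KeyValueStore` class is a hypothetical stand-in for a real shared database or cache, not any particular product:

```python
class KeyValueStore:
    """Hypothetical stand-in for an external, shared state service
    (in practice: a database, cache, or object store)."""
    def __init__(self):
        self._data = {}

    def get(self, key, default=0):
        return self._data.get(key, default)

    def put(self, key, value):
        self._data[key] = value


def handle_request(store, user_id):
    """Stateless handler: reads and writes only via the external store,
    so any replica -- or a freshly restarted instance -- can run it."""
    count = store.get(user_id) + 1
    store.put(user_id, count)
    return count


store = KeyValueStore()          # shared by every replica
handle_request(store, "alice")   # one instance can take this request...
handle_request(store, "alice")   # ...and a different instance the next
```

The point of the sketch is the cost the quote glosses over: every piece of state the application used to keep in memory or on local disk has to be redesigned to live behind that external store.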

But why should we require everyone to adapt to the cloud, rather than adapting the cloud to its users? Amazon’s EC2 was the very first “public cloud,” and it was designed with some strange attributes driven by a combination of technology choices and a web-centric view of the world. We ended up with notions of “ephemeral storage” and effectively random IP address assignment, and were told that servers can and will fail without notice or remediation. These properties would never work in an enterprise datacenter; I can’t imagine anyone proposing them, much less a company implementing them.

But somehow it was OK for Amazon to offer this, because users would adjust to the limitations; that is what disruption is really about. The process began with customers selecting web-based applications to put in the cloud. Then a number of startups formed to make this new computing environment easier to use: methods of communicating the changing addresses, ways to persist storage, tools for monitoring and restarting resources in the cloud, and much more.
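The “monitor and restart” pattern those early tools supplied can be sketched in a few lines. This is a generic retry loop, not any specific vendor’s tool; `flaky_task` is a hypothetical workload that fails without notice, just as early cloud instances could:

```python
import time

def run_with_restarts(task, max_restarts=3, delay=0.01):
    """Run a task, restarting it on failure -- the pattern early cloud
    tooling applied to instances that 'can and will fail without notice'."""
    for attempt in range(max_restarts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_restarts:
                raise                # give up after the last restart
            time.sleep(delay)        # back off briefly, then restart


# Hypothetical flaky workload: fails twice, then succeeds.
attempts = {"n": 0}

def flaky_task():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("instance failed without notice")
    return "ok"

result = run_with_restarts(flaky_task)
```

Notice that the burden of fault tolerance sits entirely in user-side tooling here; the cloud itself promises nothing, which is exactly the gap those startups filled.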

As cloud computing continued to evolve, the clouds started offering “better” features. Amazon introduced Elastic Block Store (EBS) to provide persistent, “normal” storage, Virtual Private Cloud (VPC) to allow better IP address management, and a host of other features that let more than just web applications run in the cloud. In the same timeframe, a number of cloud providers entered the market with features and functions more closely aligned with “traditional” computing architectures.

The obvious question is: what is driving these “improvements”? Clearly the early clouds had already captured developers and web applications without these capabilities; just look at the number of startups using the cloud (pretty much all of them). I’d assert that enterprise customers are driving the more recent cloud feature sets, since the enterprise has both serious problems and serious money to spend. If this is true, then we can project forward the likely path both the clouds and the enterprises will follow.

This brings us back to the role of the Virtual Machine. Enterprises have learned over the years that details matter in complex systems. Even as we move toward application development that doesn’t touch hardware or operating system objects, we must recognize that important work is done at this level: hardware control, the creation and management of sockets, memory management, file system access, and so on. No matter how abstract applications become, some form of operating system still works with these low-level constructs. Further, changes at the operating system level can affect the whole system: Windows automatic updates, Linux YUM updates, and new packages or kernel patches have all caused whole systems to fail, which is why enterprises tightly control these updates. This means, in turn, that enterprises need control of their operating systems if they want to apply their own software and management policies, and the way you control your operating system in the cloud is with VMs.
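Even a few lines of ordinary, high-level application code exercise exactly the OS-level constructs named above. The sketch below is illustrative only, but every call in it bottoms out in a kernel service whose behavior an OS update can change:

```python
import os
import socket
import tempfile

# 1. Sockets: created and managed by the kernel's network stack.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))      # the OS assigns an ephemeral port
port = server.getsockname()[1]
server.close()

# 2. File system access: open/write/read are system calls underneath.
fd, path = tempfile.mkstemp()
os.write(fd, b"state the OS manages for us")
os.close(fd)
with open(path, "rb") as f:        # memory buffers, file handles --
    data = f.read()                # all allocated and tracked by the OS
os.remove(path)
```

A kernel patch or automatic update changes the behavior beneath every one of these calls, which is why an enterprise that wants predictable applications also wants control of the OS image underneath them.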

Enterprise requirements are driving the evolution and adoption of the cloud, and this will make VMs even more important than they have been to date. Cloud providers know that enterprise customers are critical to their own success and will make sure they deliver a cloud model that feels familiar and controllable to enterprise IT and developers.


More Stories By Ellen Rubin

Ellen Rubin is the CEO and co-founder of ClearSky Data, an enterprise storage company that recently raised $27 million in a Series B investment round. She is an experienced entrepreneur with a record in leading strategy, market positioning and go-to-market efforts for fast-growing companies. Most recently, she was co-founder of CloudSwitch, a cloud enablement software company, acquired by Verizon in 2011. Prior to founding CloudSwitch, Ellen was the vice president of marketing at Netezza, where as a member of the early management team, she helped grow the company to more than $130 million in revenues and a successful IPO in 2007. Ellen holds an MBA from Harvard Business School and an undergraduate degree magna cum laude from Harvard University.
