
@ContainersExpo: Blog Post

Virtualization Does Not a Cloud Make

Cloud Computing is not about any particular technology, but rather is a new operational model for the delivery of IT services

In a previous post we discussed the positive shift in the cloud computing discourse toward actionable steps rather than philosophical diatribes on definitions. To support that discussion, we offered the following list of missteps to avoid:

  1. Not understanding the business value
  2. Assuming server virtualization is enough
  3. Not understanding service dependencies
  4. Leveraging traditional monitoring
  5. Not understanding internal/external costs

As we continue our discussion of these missteps, in this post we'll address both a mistake and a common misconception.

Cloud computing is not about any particular technology, but rather is a new operational model for the delivery of IT services. Make no mistake: technology and implementation decisions have the potential to radically change your IT financial models by increasing IT efficiency. But this does not mean that specific technologies are requisite components of a Cloud.

Virtualization is one of those technologies frequently associated, and sometimes thought to be synonymous, with cloud computing. Moreover, if you asked a group of 20 IT professionals to define virtualization, the overwhelming majority would reply: "VMware." Together, these misconceptions perpetuate the notion that an organization can realize a cloud delivery paradigm by exclusively leveraging VMware or comparable virtualization technologies. That notion is false. VMware certainly deserves praise for its marketing prowess in the cloud space, and for providing a powerful, cloud-enabling technology. But implemented alone, VMware is not a cloud; it is a recipe for more operational headaches than you already have.

Let's start by refining our understanding of virtualization. VMware, Xen, KVM, and Hyper-V provide server virtualization. They ease the provisioning and movement of application workloads, and they let multiple applications share an individual server's resources. But is that the only thing we consider when we deploy a solution? What about networking and storage? How about middleware and the data consumed by the workload? What about the application itself?

We have been virtualizing at the server level since CP/CMS and VM were invented for the mainframe. We've been virtualizing networks for the past 20 years, and in the past decade we've begun to aggressively virtualize storage platforms as well, so none of those bear further analysis here. However, when we ask people about middleware, data, and application virtualization, the resulting blank stares suggest that more examination is warranted.

Middleware virtualization is an imprecise term without a universally accepted definition. In this context we mean the true decoupling of the APIs that handle scalability, high availability, and similar services from the runtime platform that provides them. If you are familiar with the now-passé grid computing technology, you know exactly what we mean. Why does this matter for the cloud? Scaling web applications horizontally by simply overprovisioning (the current approach for most deployments, cloud or not) is easy, but it is inherently inefficient: continually provisioning and de-provisioning resources is time-consuming, and the underutilized infrastructure in the run-time environment sits idle most of the time. Those familiar with middleware virtualization know that these parameters are configured once at deployment and all subsequent actions are fully automated, so the environment reacts to changes in demand much more quickly.
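The configure-once, automate-thereafter model can be sketched in a few lines. The names and policy shape below are hypothetical illustrations, not any particular middleware product's API:

```python
from dataclasses import dataclass

@dataclass
class ScalingPolicy:
    """Declared once at deployment; never touched again by hand."""
    min_instances: int
    max_instances: int
    target_utilization: float  # e.g. 0.70 = aim to keep servers 70% busy

def desired_capacity(policy: ScalingPolicy, current: int, utilization: float) -> int:
    """Scale toward the target utilization, clamped to the policy bounds."""
    if utilization <= 0:
        return policy.min_instances
    ideal = round(current * utilization / policy.target_utilization)
    return max(policy.min_instances, min(policy.max_instances, ideal))

policy = ScalingPolicy(min_instances=2, max_instances=20, target_utilization=0.70)
print(desired_capacity(policy, current=4, utilization=0.95))   # → 5 (demand spike: grow)
print(desired_capacity(policy, current=10, utilization=0.20))  # → 3 (idle: shrink)
```

Once the policy is declared, an automated controller can evaluate `desired_capacity` on every monitoring tick and act on the difference, instead of a human continually provisioning and de-provisioning.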

Data virtualization is a simpler concept: what happens when your workload needs to access a large data repository to execute? Sure, it runs great when that repository is across the data center, but what about when it's on a different continent? Does it really make sense to leverage additional resources a great distance away if the price you pay in network latency (the speed of light is a bummer) outweighs the performance benefit of adding more servers? At the simplest level, data virtualization is a way to make the data appear local to the computing resource. If the benefits of that aren't immediately obvious, you've probably logged on to the wrong blog.
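A back-of-envelope model makes the latency trade-off concrete. The figures below are illustrative assumptions, not measurements:

```python
def effective_rate(req_per_sec_per_server: float, servers: int,
                   round_trips_per_request: int, rtt_sec: float) -> float:
    """Requests/sec a pool can deliver once each request pays its network
    round trips (a deliberately simplified serial model)."""
    service_time = 1.0 / req_per_sec_per_server
    time_per_request = service_time + round_trips_per_request * rtt_sec
    return servers / time_per_request

# 4 servers next to the repository vs. 8 servers a continent away (~80 ms RTT),
# for a workload that makes 3 round trips to the repository per request.
local = effective_rate(100, servers=4, round_trips_per_request=3, rtt_sec=0.0005)
remote = effective_rate(100, servers=8, round_trips_per_request=3, rtt_sec=0.080)
print(f"local: {local:.0f} req/s, remote: {remote:.0f} req/s")
```

Under these assumptions, doubling the server count still loses badly once every request pays several 80 ms round trips; making the data appear local to the compute removes exactly that penalty.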

Last, but certainly not least, let's explore application virtualization. Like middleware virtualization, it lacks a well-accepted definition, so we'll keep it simple. Application virtualization technologies specialize in packaging and deploying complete application run-time environments (think web, application, and DB tiers) independent of any particular execution platform. By execution platforms, think x86 servers running Linux, Windows, Solaris, or other proprietary Unix platforms, or even a public cloud IaaS service like EC2. Wouldn't it be nice to package once and provision anywhere, and to do so manually or automatically based on time of day or real-time events like a failed resource?
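The "package once, provision anywhere" idea can be sketched as a platform-independent package descriptor plus a deploy step. All names here are hypothetical; no real packaging tool is implied:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AppPackage:
    """One package describing the full run-time stack, built exactly once."""
    name: str
    tiers: tuple  # e.g. ("web", "app", "db")

def provision(package: AppPackage, platform: str, trigger: str) -> str:
    """Deploy the same package to any supported execution platform (sketch).
    trigger: "manual", "time-of-day", or an event like "failed-resource"."""
    supported = {"linux-x86", "windows-x86", "solaris", "ec2"}
    if platform not in supported:
        raise ValueError(f"unknown platform: {platform}")
    return f"{package.name} ({'+'.join(package.tiers)}) -> {platform} [trigger: {trigger}]"

pkg = AppPackage(name="orders", tiers=("web", "app", "db"))
print(provision(pkg, "linux-x86", "manual"))
print(provision(pkg, "ec2", "failed-resource"))  # same package, different platform
```

The point of the sketch is that the package never changes between targets: only the `platform` argument does, whether a human supplies it or an automated event does.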

In closing, we would like to re-emphasize a couple of points:

  • Virtualization does not equal cloud
  • Virtualization is much more than VMware

Virtualization at all levels - what we affectionately term holistic virtualization - can significantly increase resource efficiency and responsiveness to demand fluctuations, while reducing the effort your team must put into supporting your cloud environment.

In our next post we'll explore the next topic on our list: service dependencies.

More Stories By James Houghton

James Houghton is Co-Founder & Chief Technology Officer of Adaptivity. In his CTO capacity Jim interacts with key technology providers to evolve capabilities and partnerships that enable Adaptivity to offer its complete SOIT, RTI, and Utility Computing solutions. In addition, he engages with key clients to ensure successful leverage of the ADIOS methodology.

Most recently, Houghton was the SVP Architecture & Strategy Executive for the infrastructure organization at Bank of America, where he drove legacy infrastructure transformation initiatives across 40+ data centers. Prior to that he was the Head of Wachovia’s Utility Product Management, where he drove the design, services, and offering for SOA and Utility Computing for the technology division of Wachovia’s Corporate & Investment Bank. He has also led leading-edge consulting practices at IBM Global Technology Services and Deloitte Consulting.

