

@CloudExpo: Blog Feed Post

Virtual Machine Density as the New Measure of IT Efficiency

You’re going to need a dynamic infrastructure lest you effectively negate the gains achieved by higher VM densities


In the continuing saga of “do more with less” comes a new phrase being tossed around: VM density. VMware, for example, puts forth the notion that the Total Cost of Ownership (TCO) of virtualization technology must consider VM density, saying, “Density matters in a many-to-one relationship.” VMware illustrates the concept in the context of TCO, but a growing number of solutions now tout not only the benefits of higher VM density but also their own ability to increase it, recognizing it as a valuable measure of efficiency. The general measure is simple: the more virtual machines per physical server, the better.

VM density relates closely to IT efficiency in just about any way you measure it: whether it’s “cost per application” or “cost per user”, the density of virtual machines per server is going to factor into the equation. But so does the cost of managing those virtual machines, and the flexibility inherent in virtualization and cloud computing models comes at a cost: manually managing an increasingly complex network and application network.


The dilemma is that the cost of management across IT rises as VM density grows, which actually decreases efficiency – at least as measured by cost per virtual machine. IDC – like Gartner and just about every other analyst firm – predicts not only rising VM density but also that the number of virtual servers each administrator manages will grow accordingly: in its “Data Center of the Future” (March 2009), IDC projects that the typical 25:1 VM-to-administrator ratio will climb to 35:1 by 2012.
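The arithmetic behind that ratio shift is easy to sketch. The dollar figure below is an invented, illustrative number – not from IDC – but it shows why a higher VM-to-admin ratio looks like a per-VM savings on paper:

```python
# Hypothetical back-of-the-envelope numbers to illustrate the ratio shift.
# The fully loaded administrator cost is an assumption for the example.
ADMIN_COST_PER_YEAR = 100_000

def mgmt_cost_per_vm(vms_per_admin: int) -> float:
    """Annual management labor cost attributable to a single VM."""
    return ADMIN_COST_PER_YEAR / vms_per_admin

today = mgmt_cost_per_vm(25)   # today's typical 25:1 ratio
future = mgmt_cost_per_vm(35)  # IDC's projected 35:1 ratio

print(f"Cost per VM at 25:1 -> ${today:,.0f}")    # $4,000
print(f"Cost per VM at 35:1 -> ${future:,.0f}")   # $2,857
print(f"Savings per VM: {100 * (1 - future / today):.0f}%")  # 29%
```

Which is exactly why the next point matters: that per-VM "savings" only materializes if the surrounding management workload doesn't grow faster than the ratio.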

Unfortunately, the management cost incurred per virtual machine is not decreasing, despite the math that says it should. In fact, the increasing complexity posed by the growth of virtual machines will have a deleterious effect on the cost of managing the overall infrastructure, especially at the network layer. That’s because there’s more involved in this type of change than just the movement of a virtual machine. The underpinnings of the network – DHCP, DNS, IPAM – are all impacted, and not necessarily favorably, by the increase in VM density.
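To make the DHCP/DNS/IPAM point concrete, consider the bookkeeping a single VM move triggers. The in-memory "services" below are toy stand-ins – a real environment would drive vendor DNS and IPAM systems through their own APIs – but every migration forces the same updates in multiple systems of record, and each one done by hand is a chance for drift:

```python
# Toy stand-ins for real DNS and IPAM systems, to show how many records
# a single VM migration touches. Names and addresses are invented.

class Ipam:
    def __init__(self):
        self.reservations = {}            # ip -> owning host

    def release(self, ip):
        self.reservations.pop(ip, None)   # return the address to the pool

    def reserve(self, ip, host):
        self.reservations[ip] = host      # claim the new address


class Dns:
    def __init__(self):
        self.records = {}                 # hostname -> ip

    def update_record(self, host, ip):
        self.records[host] = ip           # keep forward lookups accurate


def move_vm(host, old_ip, new_ip, ipam, dns):
    """Every migration forces updates in at least two systems of record."""
    ipam.release(old_ip)
    ipam.reserve(new_ip, host)
    dns.update_record(host, new_ip)


ipam, dns = Ipam(), Dns()
ipam.reserve("10.0.0.5", "web01")
dns.update_record("web01", "10.0.0.5")

move_vm("web01", "10.0.0.5", "10.0.1.9", ipam, dns)
print(dns.records["web01"])  # the DNS record must follow the VM
```

Multiply that by thousands of moves a day across a high-density environment and the case for automating the network layer makes itself.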

Greg Ness says it well in “Today’s Networks Resemble Yesterday’s Factories”:

“This growing tension between system automation, increasing VLAN density/VMsprawl and rising network manual labor costs sets the stage for massive network innovation in management/automation, security and application delivery. Without innovation the network becomes more expensive to operate and less relevant to the ongoing march of IT.”


Infrastructure 2.0 offers the foundation on which solutions can be built to counter the costs of higher VM density and rising management expense. Whether at the network, application delivery network, or IP address management layer, what’s required is the ability of solutions to collaborate via integration and to take decisive action based on the data exchanged between those integrated components. That collaboration is the glue that will hold together the data center of the future – a data center that would otherwise be so prohibitively expensive to run that it never comes to fruition.

The analogy of today’s network as a factory is all too apt: packets come in one end and a response comes out the other. It’s an assembly line, with every packet and request treated the same as every other. Customization is not possible, and any change to the process is disruptive. Along comes dynamic infrastructure: an infrastructure that takes as a core precept the concept of context. Context gives infrastructure the ability not just to deal with the change inherent in high-density virtualized architectures, but to act upon it in a flexible way. When an infrastructure is dynamic and context-aware, it can treat each request as an individual entity deserving of the “special attention” once afforded only to expensive, custom processing. And it is that flexibility, coupled with programmability, whence innovation comes.
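A minimal sketch of what "context" means in practice: the same request can be routed differently depending on who is asking and what the environment looks like right now. The pool names and thresholds here are invented for the example, not taken from any particular product:

```python
# Toy context-aware routing decision. Pool names and the load threshold
# are hypothetical; a real dynamic infrastructure would draw this context
# from live telemetry and integrated network services.

def pick_pool(request: dict, context: dict) -> str:
    """Route based on who is asking and what the network looks like now."""
    if context["server_load"] > 0.9:
        return "overflow-pool"        # shed load to spare capacity first
    if request.get("client") == "mobile":
        return "compression-pool"     # optimize for constrained links
    return "default-pool"             # the old assembly-line path

print(pick_pool({"client": "mobile"}, {"server_load": 0.4}))  # compression-pool
```

A static, assembly-line network can only ever take the last branch; the per-request branching is what context makes possible.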

While VM density may be the new measure of IT efficiency, part of that metric must include the cost to manage the increasing rate of change it brings with it. In order to keep those costs down it will be necessary to architect a network and application network that takes advantage of the integration capabilities of Infrastructure 2.0 while leveraging the flexibility inherent in this new breed of infrastructure solutions.



Going to VMworld? Interested in seeing automation of the data center like you’ve never seen before? Make sure to visit F5 @ VMworld to see just how far a dynamic infrastructure can take your virtualized architecture.

You can follow F5 on Twitter for details - we’ll make sure you know where to be when it happens.



More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
