Cloud Computing Debate: Booz Allen Hamilton Comments on Recent McKinsey & Co. Report

Two Booz Allen Hamilton Principals offer their take on McKinsey & Co. view of cloud computing.

Cloud Musings

(In a recent discussion document titled "Clearing the air on cloud computing", Will Forrest of McKinsey & Co. offered his view on cloud computing. Unedited comments on the report from Mike Cameron and Rod Fontecilla, Booz Allen Hamilton Principals, are provided below, published at their request.)

The recent McKinsey report on cloud computing, “Clearing the air on cloud computing,” has caused a bit of a stir, primarily because it purports to demonstrate that cloud computing can be twice as expensive as traditional data centers for some applications. Since this report claims to offer an analysis of cloud economics, we would like to weigh in with a few comments regarding the report.

The McKinsey report, as presented, seeks to be the “other voice” and offer a contrarian view of cloud computing. The first thing we noted was the statement, on slide 7, that “Cloud computing can divert IT departments’ attention from technologies that can actually deliver sizeable benefits; e.g., aggressive virtualization.” This view seems to be an underlying motif in the subsequent discussion, yet it is a premise the report never substantiates.

We are also somewhat taken aback that a management consulting firm is proposing an “industry standard definition” for cloud computing, having rejected, for various reasons, the definitions used by the IT vendors and data center owners that are currently creating cloud computing in the industry, as well as by centers of academic excellence (e.g., the computer science department at Berkeley). We are surprised that McKinsey rejected a definition of cloud computing (slide 11) because the definition doesn’t provide “definitive economic implications.” Webster’s dictionary defines “bicycle” without making any economic implications.

Definitions say what something is; economic implications are a value judgment. We do not understand how a definition can fail simply because it omits a value judgment. McKinsey also asserts that the definition fails because it does “not distinguish cloud services from clouds.” Interestingly, on slide 17, a cloud service is defined as having two of the three key requirements of a cloud. This leaves McKinsey’s definition of cloud services to mean “not quite a cloud.” The report does not attempt to define what cloud services are, stating only that they “could run on top of a cloud.”

They state that cloud offerings “are most attractive” to small and medium-sized businesses, and that “there are significant hurdles to the adoption of cloud services by large enterprises.” That would come as quite a shock to Target, Eli Lilly, the New York Stock Exchange, the American Stock Exchange, NASDAQ, Toyota, E*Trade, Computer Associates, and a host of other large enterprises that have been in the cloud for a couple of years.

The “significant hurdles” to cloud adoption by large organizations appear to be McKinsey’s opinions, not conclusions supported by hard data. For example, “business perceptions of increased IT flexibility and effectiveness will have to be properly managed.” What perceptions? Managed by whom?

We are trying to figure out how McKinsey arrived at the numbers they cite, on slide 24, in their comparison of CPU costs per month in the data center versus in the cloud. Taking the $14K/server cited on slide 23, amortizing it over a three-year refresh cycle, costing it out by month, and dividing by 8 to reflect the cost of each processing core, we arrive at roughly $48/month. But that price does not reflect any power, facilities, or labor, so the “Total Cost of Assets” MUST be higher than the figure cited by McKinsey, unless they changed assumptions between examples. They also assume an Amazon large instance (discounted by 25% for reasons that are not provided) and calculate a cost of $270 per month.
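To make that arithmetic explicit, here is a minimal sketch in Python. The $14K server price, three-year refresh cycle, and eight-core divisor are the figures described above; the month-by-month amortization is our own framing, and the result covers hardware only:

```python
# Per-core monthly hardware cost implied by the slide 23 figures,
# as worked through above. Hardware only: no power, facilities, or labor.
SERVER_PRICE = 14_000        # $14K per server (slide 23)
REFRESH_MONTHS = 3 * 12      # three-year refresh cycle
CORES_PER_SERVER = 8         # dividing by 8 processing cores, as above

monthly_server_cost = SERVER_PRICE / REFRESH_MONTHS          # ~$388.89/month
monthly_core_cost = monthly_server_cost / CORES_PER_SERVER   # ~$48.61/month

print(f"Per-core hardware cost: ${monthly_core_cost:.2f}/month")  # -> $48.61
```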

Where this example appears to break down is that, for the data center, they are calculating the cost per core, while for Amazon they are calculating the cost of a Large EC2 instance, which is four cores. On a single-core basis, an EC2 Small instance is only $72/month running non-stop. Assuming the same 10% utilization used in other examples, the comparison should be $48/month for the data center versus $7.20/month for EC2.
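A sketch of that apples-to-apples comparison, assuming the $0.10/hour EC2 Small on-demand rate of the time (which yields the $72/month figure above) and a 720-hour month:

```python
# Single-core comparison at the 10% utilization McKinsey uses elsewhere.
EC2_SMALL_HOURLY = 0.10      # assumed $/hour for an EC2 Small (single core)
HOURS_PER_MONTH = 720        # assumed 30-day month
UTILIZATION = 0.10           # 10% utilization, per the report's other examples
DATA_CENTER_CORE = 48.61     # per-core hardware cost derived above

ec2_nonstop = EC2_SMALL_HOURLY * HOURS_PER_MONTH   # $72.00/month, non-stop
ec2_utilized = ec2_nonstop * UTILIZATION           # $7.20/month at 10% use

print(f"Data center core: ${DATA_CENTER_CORE:.2f}/month")
print(f"EC2 Small at 10% utilization: ${ec2_utilized:.2f}/month")
```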

Their assertion that moving a data center to the cloud provides only a 10-15% savings in labor seems to be well off the mark. In discussions with cloud providers, we learned that labor went from being one of the largest components of cost to an insignificant one, largely because of virtualization (a reduced hardware baseline plus the ease of provisioning logical, rather than physical, devices) and elasticity (automated resource management).


(Thank you. If you enjoyed this article, get free updates by email or RSS - KLJ)

More Stories By Kevin Jackson

Kevin Jackson, founder of the GovCloud Network, is an independent technology and business consultant specializing in mission critical solutions. He has served in various senior management positions including VP & GM Cloud Services NJVC, Worldwide Sales Executive for IBM and VP Program Management Office at JP Morgan Chase. His formal education includes MSEE (Computer Engineering), MA National Security & Strategic Studies and a BS Aerospace Engineering. Jackson graduated from the United States Naval Academy in 1979 and retired from the US Navy earning specialties in Space Systems Engineering, Airborne Logistics and Airborne Command and Control. He also served with the National Reconnaissance Office, Operational Support Office, providing tactical support to Navy and Marine Corps forces worldwide. Kevin is the founder and author of “Cloud Musings”, a widely followed blog that focuses on the use of cloud computing by the Federal government. He is also the editor and founder of “Government Cloud Computing” electronic magazine, published at Ulitzer.com.
