It's Called Cloud Computing Not Cheap Computing

The debate between private and public cloud is ridiculous

The debate between private and public cloud is ridiculous and we shouldn’t even be having it in the first place.

There’s a growing sector of the “cloud” market that is mobilizing to “discredit” private cloud. That ulterior motives exist behind this effort is certain (as followers of the movement would similarly claim about those who continue to support private cloud), and those motives will vary depending on who is leading the charge at any given moment.

The reality, however, is that enterprises are going to build “cloud-like” architectural models whether the movement succeeds or not. While folks like Phil Wainewright can patiently point out that public clouds are less expensive and have a better TCO than any so-called private cloud implementation, he and others miss that it isn’t necessarily about raw dollars. It’s about the relationship between costs, benefits, and risks, and an analysis of that cost-benefit-risk relationship cannot be performed in a generalized, abstract manner. Such business analysis requires careful consideration of, well, the business and its needs, and that can’t be extrapolated into a generalized formula without a lot of fine print, disclaimers, and caveats.

But let’s assume for a moment that, whatever the real cost-benefit analysis of private versus public cloud might show for a given organization, public cloud is less expensive.

So what?

If price were the only factor in IT acquisitions, then a whole lot of us would be out of a job. Face it: just because a cheaper alternative to “leading brand X” exists does not mean organizations buy into it (and vice versa). Organizations have requirements for functionality, support, and compliance with government and industry regulations and standards; they have an architecture into which such solutions must fit, integrate, interoperate, and collaborate; they have both operational and business needs that must be balanced against costs.

Did you buy a Yugo instead of that BMW? No? Why not? The Yugo was certainly cheaper, after all, and that’s what counts, right?

IT organizations are no different. Do they want to lower their costs? Heck yeah. Do they want to do it at the expense of their business and operational requirements? Heck no. IT acquisition is always a balancing act, and while there’s certainly an upper bound for pricing, it isn’t necessarily the deciding factor, nor is it always a deal breaker.

It’s about the value of the solution for the cost. In some infrastructure that’s about performance and port density. In others it’s about features and flexibility. In still others it’s about how well supported the solution is by other application infrastructure. The value of public cloud right now is in cheap compute and storage resources. For some organizations that’s enough; for others, it’s barely breaking the surface. The value of cloud is in its ability to orchestrate – to automatically manage resources according to business and operational needs. Those needs are unique to each organization, and thus the cost-benefit-risk analysis of public versus private cloud must also be unique. Unilaterally declaring either public or private cloud a “better value” is ludicrous unless you’ve factored in all the variables in the equation.
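To make that concrete, here’s a toy sketch of the point. Every factor name, score, and weight below is hypothetical; the only thing the sketch is meant to show is that the same two options produce opposite “winners” once you weight the variables the way a particular organization actually would:

```python
# Illustrative sketch only: a toy weighted-scoring model showing how the
# "better value" of public vs. private cloud flips with an organization's
# own weights. All factor names, scores, and weights are hypothetical.

def weighted_value(scores, weights):
    """Weighted sum of per-factor scores (0-10 scale, higher is better)."""
    return sum(scores[factor] * weights[factor] for factor in weights)

# Hypothetical per-factor scores for each deployment model.
public = {"raw_cost": 9, "compliance": 4, "integration": 3, "flexibility": 8}
private = {"raw_cost": 5, "compliance": 9, "integration": 9, "flexibility": 6}

# A startup that cares mostly about cheap resources...
startup_weights = {"raw_cost": 0.6, "compliance": 0.1,
                   "integration": 0.1, "flexibility": 0.2}
# ...versus a regulated enterprise with a large existing architecture.
enterprise_weights = {"raw_cost": 0.2, "compliance": 0.4,
                      "integration": 0.3, "flexibility": 0.1}

for name, weights in [("startup", startup_weights),
                      ("enterprise", enterprise_weights)]:
    pub = weighted_value(public, weights)
    priv = weighted_value(private, weights)
    winner = "public" if pub > priv else "private"
    print(f"{name}: public={pub:.1f} private={priv:.1f} -> {winner}")
```

Same scores, different weights, opposite conclusions, which is exactly why a generalized “public is cheaper, therefore better” formula doesn’t hold.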

ORGANIZATIONS are not GREENFIELDS

I will, however, grant that public cloud computing offerings almost certainly provide cheaper resources than private clouds. But let’s look at the cost to integrate public cloud-deployed applications with enterprise infrastructure and supporting architectural components versus a private cloud integration effort.

Applications deployed out in the cloud still require things like application access control (a.k.a. ID management), and data stores, and remote access and analytics and monitoring and, well, you get the picture. Organizations have two options if they aren’t moving the entirety of their data center to the public cloud environment:

  1. DUPLICATION Organizations can replicate the infrastructure and supporting components necessary in the public cloud. Additional costs are incurred to synchronize, license, secure, and manage.
  2. INTEGRATION Organizations can simply integrate and leverage existing corporate-bound infrastructure through traditional means or they can acquire emerging “cloud” integration solutions. The former is going to require effort around ensuring security and performance of that connection (don’t want requests timing out on users, that’s bad for productivity) and the latter will incur capital (and ongoing operational) expenses.

Integration of public cloud-deployed applications with network and application infrastructure is going to happen because very few organizations are “green fields.” That means the organization has existing applications and organizational processes and policies that must be integrated, followed, and adhered to by any new application. Applications are not silos, they are not islands, they are not the cheese that stands alone at the end of the childhood game. And because organizations are not green fields, the expense of fork-lifting an entire data center architecture and depositing it in a public cloud (which would be necessary to alleviate the additional costs in effort and solutions associated with cross-internet integration) is far greater than the “benefit” of cheaper resources.

Andi Mann said it so well in a recent blog, “Public Cloud Computing is NOT For Everyone”:

  Public cloud might be logical for most smaller businesses, new businesses, or new applications like Netflix’ streaming video service, but for large enterprises, completely abandoning many millions of dollars of paid-for equipment, and an immeasurable amount of process and skill investment, is frequently unjustifiable. As much as they might want to get rid of internal IT, for large enterprises especially, it simply will not make sense – financially, or to the business.

Whether or not pundits and experts continue to disparage these efforts, enterprise organizations are building such architectural models in their own data centers today. If the results are not as efficient, or as cheap, or as “cloudy” as a public cloud, well, does it really matter, as long as it offers the business and IT organization value and benefits over what they had before?

The constant “put down” of private cloud and organizations actively seeking to implement them is as bad as the constant excuse of security (or lack thereof) in public cloud as a means to avoid them. Public and private cloud computing both aim to reduce costs and increase flexibility and operational efficiency of IT organizations. If that means all public, all private, or some mix of the two then that’s what it takes.

That’s why I’m convinced that hybrid (sorry Randy) cloud computing will, in the end, be the preferred, or perhaps default, architectural model. There are applications for which public cloud computing makes sense in every organization, and applications for which private cloud computing makes sense. And then there are those applications for which cloud computing of any kind makes no sense.

Flexibility and agility are about choice; they’re about “personalization” of architectures and implementations for IT organizations, so that they can build out a data center that meets the needs of the business they support. If you aren’t enabling that flexibility and choice, then you’re part of the problem, not the solution.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
