It's Called Cloud Computing Not Cheap Computing

The debate between private and public cloud is ridiculous

The debate between private and public cloud is ridiculous, and we shouldn’t even be having it in the first place.

There’s a growing sector of the “cloud” market that is mobilizing to “discredit” private cloud. That ulterior motives exist behind this effort is certain (as followers of the movement would similarly claim of those who continue to support private cloud), and those motives will vary based on who is leading the charge at any given moment.

The reality, however, is that enterprises are going to build “cloud-like” architectural models whether the movement succeeds or not. Folks like Phil Wainewright can patiently point out that public clouds are less expensive and have a better TCO than any so-called private cloud implementation, but they miss that it isn’t necessarily about raw dollars. It’s about the relationship between costs, benefits, and risks, and analysis of that cost-benefit-risk relationship cannot be performed in a generalized, abstract manner. Such business analysis requires careful consideration of, well, the business and its needs, and that can’t be extrapolated and turned into a generalized formula without a lot of fine print, disclaimers, and caveats.
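To make that concrete, here is a minimal sketch of such an analysis. Every organization, weight, and figure below is hypothetical; the only thing it demonstrates is that the same two options rank differently once an organization’s own numbers and priorities are plugged in.

```python
# A minimal sketch of an organization-specific cost-benefit-risk comparison.
# Every weight and figure here is hypothetical; the point is that the "winner"
# changes as soon as an organization's own numbers are plugged in.

def score(option, weights):
    """Higher is better: benefit counts for, cost and risk count against."""
    return (weights["benefit"] * option["benefit"]
            - weights["cost"] * option["cost"]
            - weights["risk"] * option["risk"])

# Two hypothetical organizations weigh the same factors very differently.
orgs = {
    "cost-sensitive startup": {"cost": 0.6, "benefit": 0.3, "risk": 0.1},
    "risk-sensitive bank":    {"cost": 0.2, "benefit": 0.3, "risk": 0.5},
}

# Normalized, made-up inputs: public is cheaper, private is lower-risk.
options = {
    "public":  {"cost": 1.0, "benefit": 7, "risk": 6},
    "private": {"cost": 1.8, "benefit": 6, "risk": 2},
}

for org, weights in orgs.items():
    best = max(options, key=lambda name: score(options[name], weights))
    print(f"{org}: {best}")
# -> cost-sensitive startup: public
# -> risk-sensitive bank: private
```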

But let’s assume for a moment that, whatever the real cost-benefit analysis of private versus public cloud might be for a given organization, public cloud is less expensive.

So what?

If price were the only factor in IT acquisitions, then a whole lot of us would be out of a job. Face it: just because a cheaper alternative to “leading brand X” exists does not mean that organizations buy into it (and vice versa). Organizations have requirements for functionality, support, and compliance with government and industry regulations and standards; they have an architecture into which such solutions must fit, integrate, interoperate, and collaborate; they have operational and business needs that must be balanced against costs.

Did you buy a Yugo instead of that BMW? No? Why not? The Yugo was certainly cheaper, after all, and that’s what counts, right?

IT organizations are no different. Do they want to lower their costs? Heck yeah. Do they want to do it at the expense of their business and operational requirements? Heck no. IT acquisition is always a balancing act, and while there’s certainly an upper bound on pricing, it isn’t necessarily the deciding factor, nor is it always a deal breaker.

It’s about the value of the solution for the cost. In some infrastructure that’s about performance and port density. In others it’s about features and flexibility. In still others it’s about how well the solution is supported by other application infrastructure. The value of public cloud right now is in cheap compute and storage resources. For some organizations that’s enough; for others, it barely breaks the surface. The value of cloud is in its ability to orchestrate: to automatically manage resources according to business and operational needs. Those needs are unique to each organization, and thus the cost-benefit-risk analysis of public versus private cloud must also be unique. Unilaterally declaring either public or private the “better value” is ludicrous unless you’ve factored in all the variables in the equation.

ORGANIZATIONS are not GREEN FIELDS

I will, however, grant that public cloud computing offerings almost certainly provide cheaper resources than private cloud. But let’s look at the cost to integrate public cloud-deployed applications with enterprise infrastructure and supporting architectural components versus a private cloud integration effort.

Applications deployed out in the cloud still require things like application access control (a.k.a. ID management), and data stores, and remote access and analytics and monitoring and, well, you get the picture. Organizations have two options if they aren’t moving the entirety of their data center to the public cloud environment:

  1. DUPLICATION Organizations can replicate the necessary infrastructure and supporting components in the public cloud, incurring additional costs to synchronize, license, secure, and manage the duplicates.
  2. INTEGRATION Organizations can integrate with and leverage existing corporate-bound infrastructure through traditional means, or they can acquire emerging “cloud” integration solutions. The former requires effort to ensure the security and performance of that connection (you don’t want requests timing out on users; that’s bad for productivity), and the latter incurs capital (and ongoing operational) expenses. A back-of-the-envelope sketch of this trade-off follows the list.
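Here is that sketch, assuming entirely hypothetical line items and dollar figures; the point is only that duplication or integration overhead has to be netted against any savings from cheaper public resources.

```python
# A back-of-the-envelope model of the two options above. Every line item and
# dollar figure is a hypothetical placeholder; substitute your own quotes.

duplication = {  # Option 1: replicate supporting components in the public cloud
    "duplicate ID management licenses": 40_000,  # per year
    "data synchronization":             25_000,
    "extra security and management":    30_000,
}

integration = {  # Option 2: reach back into the corporate data center
    "secure connectivity (VPN/links)":  20_000,
    "performance engineering":          15_000,  # cross-internet latency work
    "cloud integration tooling":        35_000,  # capital + ongoing operational
}

public_resource_savings = 60_000  # hypothetical: cheaper public compute/storage

for name, items in (("duplication", duplication), ("integration", integration)):
    overhead = sum(items.values())
    net = public_resource_savings - overhead
    print(f"{name}: overhead ${overhead:,}, net savings ${net:,}")
# With these placeholder numbers, either option's overhead more than erases
# the savings from cheaper public resources.
```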

Integration of public cloud-deployed applications with network and application infrastructure is going to happen because very few organizations are “green fields.” That means the organization has existing applications, organizational processes, and policies that any new application must integrate with, follow, and adhere to. Applications are not silos, they are not islands, they are not the cheese that stands alone at the end of the childhood game. And because organizations are not green fields, the expense of fork-lifting an entire data center architecture and depositing it in a public cloud (which is what it would take to eliminate the additional costs and effort of cross-internet integration) is far greater than the “benefit” of cheaper resources.

Andi Mann said it so well in a recent blog post, “Public Cloud Computing is NOT For Everyone”:

Public cloud might be logical for most smaller businesses, new businesses, or new applications like Netflix’ streaming video service, but for large enterprises, completely abandoning many millions of dollars of paid-for equipment, and an immeasurable amount of process and skill investment, is frequently unjustifiable. As much as they might want to get rid of internal IT, for large enterprises especially, it simply will not make sense – financially, or to the business.

Pundits and experts may continue to disparage these efforts, but that will not change the reality that enterprise organizations are building such architectural models in their own data centers today. If the results are not as efficient, as cheap, or as “cloudy” as a public cloud, does that matter, so long as the architecture offers the business and the IT organization more value and benefit than what they had before?

The constant “put down” of private cloud, and of the organizations actively seeking to implement it, is as bad as the constant invocation of security (or the lack thereof) as an excuse to avoid public cloud. Public and private cloud computing both aim to reduce costs and increase the flexibility and operational efficiency of IT organizations. If that means all public, all private, or some mix of the two, then that’s what it takes.

That’s why I’m convinced that hybrid (sorry, Randy) cloud computing will, in the end, be the preferred, or perhaps default, architectural model. There are applications for which public cloud computing makes sense in every organization, and applications for which private cloud computing makes sense. And then there are those applications for which cloud computing of any kind makes no sense.
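As a rough illustration, here is a minimal sketch of how that per-application sorting might look; the attributes, rules, and applications are all hypothetical, and real decisions weigh far more variables.

```python
# A minimal sketch of per-application placement in a hybrid model. The
# attributes and rules are hypothetical; real decisions weigh far more
# variables, and the weights belong to the business, not to a formula.

def place(app: dict) -> str:
    """Return 'public', 'private', or 'none' for a single application."""
    if app["tightly_coupled_to_legacy"]:
        return "none"      # cloud of any kind makes no sense here
    if app["regulated_data"]:
        return "private"   # keep it behind the corporate boundary
    if app["demand"] == "bursty":
        return "public"    # elasticity is where public cloud shines
    return "private"       # default: leverage the paid-for equipment

portfolio = [
    {"name": "batch-billing",  "tightly_coupled_to_legacy": True,
     "regulated_data": True,  "demand": "steady"},
    {"name": "patient-portal", "tightly_coupled_to_legacy": False,
     "regulated_data": True,  "demand": "steady"},
    {"name": "marketing-site", "tightly_coupled_to_legacy": False,
     "regulated_data": False, "demand": "bursty"},
]

for app in portfolio:
    print(f"{app['name']}: {place(app)}")
# One portfolio, three answers -- which is why hybrid becomes the default.
```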

Flexibility and agility are about choice; they’re about the “personalization” of architectures and implementations so that IT organizations can build out a data center that meets the needs of the business they support. If you aren’t enabling that flexibility and choice, then you’re part of the problem, not the solution.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
