Why You Really, Truly Don’t Want a Private Cloud

There’s generally little or no solid business reason to build a private Cloud

I had the pleasure of speaking at two quite different Cloud Computing conferences last week: Opal’s Business of Cloud Computing in Dallas and UBM’s CloudConnect in Bangalore. As the conference names and locations might suggest, the former was the more business-oriented while the latter was chock full of techies. What I didn’t expect, however, was that the business Cloud crowd had a more mature, advanced conception of Cloud than the technical audience. While the techies were still struggling with essential characteristics like elasticity, trying to free themselves from the vendor nonsense that drives such conferences, the business folks generally had a well-developed understanding of what Cloud is really all about, and as a result, focused their discussions on how best to leverage the approach to meet both tactical and strategic business goals.

Perhaps the most interesting contrast between the perspectives of these two audiences was their respective opinions about private Clouds. The techies at the Bangalore conference, having drunk too much of the vendor Kool-Aid, were generally of the opinion that public Clouds were too risky, and that their organizations should thus focus their efforts on the private deployment model. The Dallas business crowd, in contrast, generally held that the public approach was the way to go, with some folks even going so far as to claim that public Cloud was the only true approach to Cloud Computing.

This distinction is remarkable, and aligns with ZapThink’s thinking on this matter as well: the more you focus on the business benefits of Cloud, the more likely you’ll be leaning toward public over private deployment models. Furthermore, this mind shift isn’t all about security risks. We recently debunked the notion that public Clouds are inherently less secure than private ones, and many people at the Dallas conference agreed. But there’s more to this story. Once you work through the issues, you’ll likely come to the same conclusion: there’s generally little or no solid business reason to build a private Cloud.

The Problems with Private Clouds
The best way to understand the limitations of the private deployment model is to take the business perspective. What are the business benefits behind the move to the Cloud, and how can you achieve them?

  • Cloud will shift capital expense to operational expense – instead of having to invest in hardware and software, you can pay as you go for what you need as an operational expense and write it off on your taxes right away. Except, of course, with private Clouds, where you have to build out the entire data center infrastructure yourself. If anything, private Clouds increase capital expenditures.

  • Cloud increases server utilization while dealing with spikes in demand – instead of setting up a data center full of servers that run idle most of the time on the off chance you need them to deal with the occasional Slashdot post or Justin Bieber tweet, the Cloud improves utilization while its elasticity deals with those annoying spikes. Except, of course, in private Clouds, unless your organization is so huge that multiple divisions look to your Cloud to handle many different spikes in demand, spikes you fervently hope arrive at different times. But what if that Kim Kardashian visit to your corporate HQ causes traffic to all your divisions to spike at once? Fugeddaboutit.

  • Cloud keeps infrastructure costs very low for new projects, since they don’t have much traffic yet – again, works much better in a public Cloud. How many such projects do you expect to have at any one time? If the number isn’t in the hundreds or thousands, then private Cloud is massive overkill for this purpose.

  • The elasticity benefit of the Cloud gives us the illusion of infinite capacity – infinite capacity is all well and good, but it’s an illusion. And illusions work fine until, well, until they don’t. Elasticity provides the illusion of infinite capacity only as long as there is always sufficient capacity to meet additional demand for Cloud resources. You’ll never consume all the capacity of a public Cloud, but your private Cloud is another matter entirely. It’s only so big. If one of your developers has the bright idea to provision a thousand virtual machine instances or a petabyte of storage for that Big Data project, and your private Cloud doesn’t have the physical capacity to do so, then bye-bye illusion (the sketch after this list makes the point concrete).

  • We already have a significant investment in our existing data center, so converting it to a private Cloud will save us money while enabling us to obtain the benefits of the Cloud – in your dreams. One essential requirement for building an effective private Cloud is rigorous homogeneity. You want all your physical servers, network equipment, virtualization technology, storage, etc. to be completely identical across every rack. Look at your existing, pre-Cloud data center. Homogeneity isn’t even on your radar.

  • We don’t want to be in the data center business. That’s why we’re moving to the Cloud – guess what? Building a private Cloud puts you in the data center business!

  • Whatever cost efficiencies the public Cloud providers can achieve we can also achieve in our private Cloud – this argument doesn’t hold water either. Not only do the leading public Clouds (Amazon, Microsoft Azure, Rackspace, etc.) have enormous economies of scale, but they’re also operating on razor-thin margins. Furthermore, if they can wring more efficiencies out of the model, they’ll lower their prices. They’re taking this “price war” approach to their margins for all the regular business school reasons: to keep smaller players from being competitive, and to push their larger competitors out of the business. It doesn’t matter how big your private Cloud is, it simply cannot compete on price.
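
To make the utilization and capacity arguments above concrete, here is a minimal Python sketch. It is not from the original article, and every number in it (instance counts, the private capacity ceiling, prices) is invented purely for illustration. It contrasts a private Cloud’s fixed physical ceiling with the pay-as-you-go elasticity of a public Cloud.

    BASELINE_VMS = 50          # steady-state demand, in VM instances
    SPIKE_VMS = 1000           # the burst when every division spikes at once
    PRIVATE_CAPACITY = 200     # physical ceiling of the private Cloud
    PRIVATE_COST_PER_VM_MONTH = 120.0   # amortized capex + operations per owned slot
    PUBLIC_COST_PER_VM_HOUR = 0.10      # hypothetical on-demand rate

    def private_cloud_provision(requested_vms):
        """A private Cloud can only hand out what it physically owns."""
        granted = min(requested_vms, PRIVATE_CAPACITY)
        if granted < requested_vms:
            print("Private Cloud: requested %d, granted %d -- bye-bye illusion."
                  % (requested_vms, granted))
        return granted

    def public_cloud_provision(requested_vms):
        """From a single tenant's point of view, public capacity looks unbounded."""
        return requested_vms

    # A developer asks for a thousand instances for that Big Data project.
    private_cloud_provision(BASELINE_VMS + SPIKE_VMS)   # capped at 200
    public_cloud_provision(BASELINE_VMS + SPIKE_VMS)    # granted in full

    # Utilization: the private Cloud must be sized for the biggest spike it can
    # serve, so at baseline most of it sits idle.
    print("Private utilization at baseline: %.0f%%"
          % (100.0 * BASELINE_VMS / PRIVATE_CAPACITY))

    # Cost: owned capacity is paid for whether or not it is used; on-demand
    # capacity is paid for only while the spike lasts.
    spike_hours = 6
    print("Owned capacity, per month: $%.0f"
          % (PRIVATE_CAPACITY * PRIVATE_COST_PER_VM_MONTH))
    print("On-demand burst, one spike: $%.0f"
          % (SPIKE_VMS * PUBLIC_COST_PER_VM_HOUR * spike_hours))

The numbers matter less than the shape of the result: the private Cloud either caps the spike or must be overbuilt to absorb it, and either way you pay for the ceiling whether or not you ever hit it.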

OK fine, you get it. Private Clouds suck, fair enough. You’ll even buy our arguments that public Clouds may actually be more secure than private ones. But you’re in a regulated industry or otherwise have stringent regulatory requirements about data protection or data movement that the public Cloud providers can’t adequately address. The only way you can move to the Cloud at all is to build a private Cloud.

Not so fast. While it’s true that regulatory compliance drivers and limitations are becoming an increasingly important part of the Cloud story, any regulatory drawbacks to using public Clouds are essentially temporary, as the market responds to this demand. A new class of public Cloud provider – what is shaping up to be the “Enterprise Public Cloud Provider” marketplace – is on the rise. The players in this space are putting together offerings that include rigorous auditing, more transparent and stringent service-level agreements, and overall better visibility for corporate customers with regulatory concerns.

The incumbent public Cloud providers aren’t standing still either. For example, while Amazon built their public Cloud (and with it, the entire industry) on a “one size fits all” model aimed initially at developers, startups, and other small to midsize companies, they have been working on building out their enterprise offerings for a while now. While you may not be able to get solutions from the big players that meet your regulatory needs today, you can be sure it won’t take them long to figure out how to compete in even the most regulated industries. In a few years, if you look back on a decision to build a private Cloud on the basis of regulatory compliance, you’ll likely feel quite foolish: the competitors who waited will have fully compliant public alternatives, while you’re still paying the bills on a private Cloud initiative that has become an expensive money pit.

The ZapThink Take
So, should any organization build a private Cloud? Perhaps, but only the very largest enterprises, and only when those organizations can figure out how to get most or all of their divisions to share those private Clouds. If your enterprise is large enough to achieve similar economies of scale to the public providers, then—and only then—will a private option be a viable business alternative.

In many such cases, those large enterprise private Clouds essentially become community Clouds, as multiple divisions of an enterprise share a single internal Cloud provider that operates much like a public Cloud, albeit for internal use across the enterprise. This community model makes sense, for example, for many federal governments, which can achieve the cost efficiencies of public Clouds while maintaining the control benefits of private Clouds by supporting Cloud initiatives across multiple agencies.

Virtual Private Clouds (VPCs) also give many organizations the best of both worlds, as they leverage the public Cloud but run logically on your private network. Many hybrid Clouds follow the VPC approach, as hybrid on-premise/Cloud models typically leverage private networks. ZapThink predicts this hybrid VPC model will become the predominant deployment model in the enterprise.
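
As a rough illustration of what “runs logically on your private network” can look like in practice, the sketch below uses Amazon’s boto3 SDK. The article does not prescribe any particular provider or tool, and the region, CIDR blocks, ASN, and gateway IP address are placeholders; the point is the pattern, not the product.

    # Hypothetical sketch of the hybrid VPC pattern: a logically private network
    # carved out of a public Cloud and linked back to the corporate network over
    # an IPsec VPN. Region, CIDR blocks, ASN, and the office gateway IP are
    # placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # A private address space inside the public Cloud.
    vpc_id = ec2.create_vpc(CidrBlock="10.20.0.0/16")["Vpc"]["VpcId"]

    # A subnet for workloads that should never be publicly addressable.
    subnet_id = ec2.create_subnet(VpcId=vpc_id,
                                  CidrBlock="10.20.1.0/24")["Subnet"]["SubnetId"]

    # The Cloud-side and data-center-side ends of the VPN tunnel.
    vgw_id = ec2.create_vpn_gateway(Type="ipsec.1")["VpnGateway"]["VpnGatewayId"]
    ec2.attach_vpn_gateway(VpcId=vpc_id, VpnGatewayId=vgw_id)
    cgw_id = ec2.create_customer_gateway(
        Type="ipsec.1", PublicIp="203.0.113.10", BgpAsn=65000
    )["CustomerGateway"]["CustomerGatewayId"]

    # The connection that makes the Cloud subnet behave like one more segment
    # of the private corporate network.
    vpn = ec2.create_vpn_connection(
        Type="ipsec.1", CustomerGatewayId=cgw_id, VpnGatewayId=vgw_id,
        Options={"StaticRoutesOnly": True},
    )
    print("VPN connection:", vpn["VpnConnection"]["VpnConnectionId"])

The workloads run on public infrastructure, but from the network’s point of view they sit inside the corporate perimeter, which is what makes the hybrid VPC model an easier sell to control-minded executives.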

Still not convinced? Well, ask yourself why, and the answer is likely to be a question of control. Many executives will still be uncomfortable about public Clouds, even when we address the security and compliance issues that currently face public Cloud providers, simply because they don’t control the public Cloud. Our answer? Distribution of IT control is essential to the ZapThink 2020 vision, and is at the heart of the Agile Architecture Revolution. The Web doesn’t have centralized control, after all, and it works just fine. The app store model for enterprise IT, the rise of bring your own device (BYOD), and the fundamentally mobility-driven architecture of the Internet of Things are all examples of the broader shift toward decentralized control over IT. Fighting to maintain control is a losing proposition, and as a result, by 2020, private Clouds will be a mostly forgotten bump on the road to the next big thing.

More Stories By Jason Bloomberg

Jason Bloomberg is the leading expert on architecting agility for the enterprise. As president of Intellyx, Mr. Bloomberg brings his years of thought leadership in the areas of Cloud Computing, Enterprise Architecture, and Service-Oriented Architecture to a global clientele of business executives, architects, software vendors, and Cloud service providers looking to achieve technology-enabled business agility across their organizations and for their customers. His latest book, The Agile Architecture Revolution (John Wiley & Sons, 2013), sets the stage for Mr. Bloomberg’s groundbreaking Agile Architecture vision.

Mr. Bloomberg is perhaps best known for his twelve years at ZapThink, where he created and delivered the Licensed ZapThink Architect (LZA) SOA course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, the leading SOA advisory and analysis firm, which was acquired by Dovel Technologies in 2011. He now runs the successor to the LZA program, the Bloomberg Agile Architecture Course, around the world.

Mr. Bloomberg is a frequent conference speaker and prolific writer. He has published over 500 articles, spoken at over 300 conferences, Webinars, and other events, and has been quoted in the press over 1,400 times as the leading expert on agile approaches to architecture in the enterprise.

Mr. Bloomberg’s previous book, Service Orient or Be Doomed! How Service Orientation Will Change Your Business (John Wiley & Sons, 2006, coauthored with Ron Schmelzer), is recognized as the leading business book on Service Orientation. He also co-authored the books XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996).

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting).
