By Jason Bloomberg
June 2, 2012 10:00 AM EDT
The more you focus on the business benefits of Cloud, the more likely you'll be leaning toward public over private deployment models. Furthermore, this mind shift isn't all about security risks. Once you work through the issues, you'll likely come to the same conclusion: there's generally little or no solid business reason to build a private Cloud.
I had the pleasure of speaking at two quite different Cloud Computing conferences last week: Opal’s Business of Cloud Computing in Dallas and UBM’s CloudConnect in Bangalore. As the conference names and locations might suggest, the former was the more business-oriented while the latter was chock full of techies. What I didn’t expect, however, was that the business Cloud crowd had a more mature, advanced conception of Cloud than the technical audience. While the techies were still struggling with essential characteristics like elasticity, trying to free themselves from the vendor nonsense that drives such conferences, the business folks generally had a well-developed understanding of what Cloud is really all about, and as a result, focused their discussions on how best to leverage the approach to meet both tactical and strategic business goals.
Perhaps the most interesting contrast between the perspectives of these two audiences was their respective opinions about private Clouds. The techies at the Bangalore conference, having drunk too much of the vendor Kool-Aid, were generally of the opinion that public Clouds were too risky, and that their organizations should thus focus their efforts on the private deployment model. The Dallas business crowd, in contrast, generally held that the public approach was the way to go, with some folks even going so far as to claim that public Cloud was the only true approach to Cloud Computing.
This distinction is remarkable, and aligns with ZapThink’s thinking on this matter as well: the more you focus on the business benefits of Cloud, the more likely you’ll be leaning toward public over private deployment models. Furthermore, this mind shift isn’t all about security risks. We recently debunked the notion that public Clouds are inherently less secure than private ones, and many people at the Dallas conference agreed. But there’s more to this story. Once you work through the issues, you’ll likely come to the same conclusion: there’s generally little or no solid business reason to build a private Cloud.
The Problems with Private Clouds
The best way to understand the limitations of the private deployment model is to take the business perspective. What are the business benefits behind the move to the Cloud, and how can you achieve them?
Cloud will shift capital expense to operational expense – instead of having to invest in hardware and software, you can pay-as-you-go for what you need as an operational expense, and write it off your taxes right away. Except, of course, with private Clouds, where you have to build out the entire data center infrastructure yourself. If anything, private Clouds increase capital expenditures.
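The CapEx-versus-OpEx argument is easy to see with back-of-the-envelope arithmetic. The sketch below uses entirely made-up numbers (the upfront build-out cost, the per-instance hourly rate, and the fleet size are all assumptions for illustration), but the shape of the comparison is the point: public Cloud is a metered operating expense, while a private Cloud demands the capital outlay before you run a single workload.

```python
# Hypothetical numbers for illustration only: compare a one-time capital
# outlay for private Cloud hardware against pay-as-you-go public Cloud fees.
capex_hardware = 500_000   # assumed upfront data center build-out ($)
hourly_rate = 0.50         # assumed public Cloud price per instance-hour ($)
instances = 20             # assumed steady fleet size
hours_per_year = 8760      # hours in a (non-leap) year

opex_per_year = hourly_rate * instances * hours_per_year
print(f"Public Cloud OpEx per year: ${opex_per_year:,.0f}")
print(f"Years of OpEx the private CapEx alone would cover: "
      f"{capex_hardware / opex_per_year:.1f}")
```

With these assumed figures the upfront capital alone would fund several years of pay-as-you-go usage, before even counting the private Cloud's own ongoing power, staffing, and refresh costs.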
Cloud increases server utilization while dealing with spikes in demand – instead of setting up a data center full of servers that run idle most of the time on the off chance you need them to deal with the occasional Slashdot post or Justin Bieber tweet, the Cloud improves utilization while its elasticity deals with those annoying spikes. Except, of course, in private Clouds, unless your organization is so huge that multiple divisions look to your Cloud to handle many different spikes in demand, spikes that you fervently hope arrive at different times. But what if that Kim Kardashian visit to your corporate HQ causes traffic to all your divisions to spike at once? Fugeddaboutit.
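The staggered-spike assumption can be made concrete with a toy capacity model (all numbers below are invented for illustration): a private Cloud sized to absorb one division's peak works fine while spikes take turns, and fails the moment they coincide.

```python
# Illustrative sketch (made-up numbers): a shared private Cloud sized for
# ONE division spiking at a time, per the "spikes arrive at different
# times" assumption.
baseline = [10, 10, 10]            # steady load per division (servers)
spike = 80                         # extra servers one division needs at peak
capacity = sum(baseline) + spike   # pool sized for a single concurrent spike

def fits(spiking_divisions):
    """True if total demand stays within the private pool."""
    return sum(baseline) + spike * spiking_divisions <= capacity

print(fits(1))   # staggered spikes fit the pool
print(fits(3))   # simultaneous spikes overflow it
```

A public Cloud sidesteps the problem because its pool is shared across thousands of unrelated customers, so their spikes statistically average out.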
Cloud keeps infrastructure costs very low for new projects, since they don’t have much traffic yet – again, works much better in a public Cloud. How many such projects do you expect to have at any one time? If the number isn’t in the hundreds or thousands, then private Cloud is massive overkill for this purpose.
The elasticity benefit of the Cloud gives us the illusion of infinite capacity – infinite capacity is all fine and good, but it’s an illusion. And illusions work fine until, well, until they don’t. Elasticity provides the illusion of infinite capacity only as long as there is sufficient capacity to meet additional demand for Cloud resources. You’ll never consume all the capacity of a public Cloud, but your private Cloud is another matter entirely. It’s only so big. If one of your developers has the bright idea to provision a thousand virtual machine instances or a petabyte of storage for that Big Data project, and your private Cloud doesn’t have the physical capacity to do so, then bye-bye illusion.
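The point at which the illusion breaks can be sketched as a finite resource pool (the pool size and request sizes below are assumptions for illustration, not any real provisioner's API): requests succeed until one exceeds what physically remains.

```python
# Toy model (assumed capacities): elasticity looks infinite only while the
# pool can satisfy every request; a private Cloud's pool is finite.
class CloudPool:
    def __init__(self, total_vms):
        self.free = total_vms

    def provision(self, vms):
        if vms > self.free:
            return False           # illusion broken: request rejected
        self.free -= vms
        return True

private = CloudPool(total_vms=200)   # a modest private Cloud
print(private.provision(50))         # routine request succeeds
print(private.provision(1000))       # the Big Data project hits the wall
```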
We already have a significant investment in our existing data center, so converting it to a private Cloud will save us money while enabling us to obtain the benefits of the Cloud – in your dreams. One essential requirement for building an effective private Cloud is rigorous homogeneity. You want all your physical servers, network equipment, virtualization technology, storage, etc. to be completely identical across every rack. Look at your existing, pre-Cloud data center. Homogeneity isn’t even on your radar.
We don’t want to be in the data center business. That’s why we’re moving to the Cloud – guess what? Building a private Cloud puts you in the data center business!
Whatever cost efficiencies the public Cloud providers can achieve we can also achieve in our private Cloud – this argument doesn’t hold water either. Not only do the leading public Clouds—Amazon, Microsoft Azure, Rackspace, etc.—have enormous economies of scale, but they’re also operating on razor-thin margins. Furthermore, if they can wring more efficiencies out of the model, they’ll lower their prices. They’re taking this “price war” approach to their margins for all the regular business school reasons: to keep smaller players from being competitive, and to push their larger competitors out of the business. It doesn’t matter how big your private Cloud is; it simply cannot compete on price.
OK fine, you get it. Private Clouds suck, fair enough. You’ll even buy our arguments that public Clouds may actually be more secure than private ones. But you’re in a regulated industry or otherwise have stringent regulatory requirements about data protection or data movement that the public Cloud providers can’t adequately address. The only way you can move to the Cloud at all is to build a private Cloud.
Not so fast. While it’s true that regulatory compliance business drivers and limitations are becoming an increasingly important part of the Cloud story, any regulatory drawbacks to using public Clouds are essentially temporary, as the market responds to this demand. A new class of public Cloud provider, what is shaping up to be the “Enterprise Public Cloud Provider” marketplace, is on the rise. The players in this space are putting together offerings that include rigorous auditing, more transparent and stringent service-level agreements, and overall better visibility for corporate customers with regulatory concerns.
The incumbent public Cloud providers aren’t standing still either. For example, while Amazon built their public Cloud (and with it, the entire industry) on a “one size fits all” model aimed initially at developers, startups, and other small to midsize companies, they have been working on building out their enterprise offerings for a while now. While you may not be able to get solutions from the big players that meet your regulatory needs today, you can be sure it won’t take them long to figure out how to compete in even the most regulated industries. In a few years, if you look back on a decision to build a private Cloud on the basis of regulatory compliance, you’ll likely feel quite foolish: the competitors who waited will have fully compliant public alternatives, while you’re stuck paying the bills on a private Cloud initiative that has become an expensive money pit.
The ZapThink Take
So, should any organization build a private Cloud? Perhaps, but only the very largest enterprises, and only when those organizations can figure out how to get most or all of their divisions to share those private Clouds. If your enterprise is large enough to achieve similar economies of scale to the public providers, then—and only then—will a private option be a viable business alternative.
In many such cases, those large enterprise private Clouds essentially become community Clouds, as multiple divisions of an enterprise share a single internal Cloud provider that operates much like a public Cloud, albeit for internal use across the enterprise. This community model makes sense, for example, for many federal governments. They can achieve the cost efficiencies of public Clouds while maintaining the control benefits of private Clouds by supporting the Cloud initiatives across multiple agencies.
Virtual Private Clouds (VPCs) also give many organizations the best of both worlds, as they leverage the public Cloud but run logically on your private network. Many hybrid Clouds follow the VPC approach, as hybrid on premise/Cloud models typically leverage private networks. ZapThink predicts this hybrid VPC model will become the predominant deployment model in the enterprise.
Still not convinced? Well, ask yourself why, and the answer is likely to be a question of control. Many executives will still be uncomfortable about public Clouds, even when we address the security and compliance issues that currently face public Cloud providers, simply because they don’t control the public Cloud. Our answer? Distribution of IT control is essential to the ZapThink 2020 vision, and is at the heart of the Agile Architecture Revolution. The Web doesn’t have centralized control, after all, and it works just fine. The app store model for enterprise IT, the rise of bring your own device (BYOD), and the fundamentally mobility-driven architecture of the Internet of Things are all examples of the broader shift to the notion of decentralized control over IT. Fighting to maintain control is a losing proposition, and as a result, by 2020, private Clouds will be a mostly-forgotten bump on the road to the next big thing.