By Jason Bloomberg
June 2, 2012 10:00 AM EDT
The more you focus on the business benefits of Cloud, the more likely you'll be leaning toward public over private deployment models. Furthermore, this mind shift isn't all about security risks. Once you work through the issues, you'll likely come to the same conclusion: there's generally little or no solid business reason to build a private Cloud.
I had the pleasure of speaking at two quite different Cloud Computing conferences last week: Opal’s Business of Cloud Computing in Dallas and UBM’s CloudConnect in Bangalore. As the conference names and locations might suggest, the former was the more business-oriented while the latter was chock full of techies. What I didn’t expect, however, was that the business Cloud crowd had a more mature, advanced conception of Cloud than the technical audience. While the techies were still struggling with essential characteristics like elasticity, trying to free themselves from the vendor nonsense that drives such conferences, the business folks generally had a well-developed understanding of what Cloud is really all about, and as a result, focused their discussions on how best to leverage the approach to meet both tactical and strategic business goals.
Perhaps the most interesting contrast between the perspectives of these two audiences was their respective opinions about private Clouds. The techies at the Bangalore conference, having drunk too much of the vendor Kool-Aid, were generally of the opinion that public Clouds were too risky, and that their organizations should thus focus their efforts on the private deployment model. The Dallas business crowd, in contrast, generally held that the public approach was the way to go, with some folks even going so far as to claim that public Cloud was the only true approach to Cloud Computing.
This distinction is remarkable, and aligns with ZapThink’s thinking on this matter as well: the more you focus on the business benefits of Cloud, the more likely you’ll be leaning toward public over private deployment models. Furthermore, this mind shift isn’t all about security risks. We recently debunked the notion that public Clouds are inherently less secure than private ones, and many people at the Dallas conference agreed. But there’s more to this story. Once you work through the issues, you’ll likely come to the same conclusion: there’s generally little or no solid business reason to build a private Cloud.
The Problems with Private Clouds
The best way to understand the limitations of the private deployment model is to take the business perspective. What are the business benefits behind the move to the Cloud, and how can you achieve them?
Cloud will shift capital expense to operational expense – instead of having to invest in hardware and software, you can pay-as-you-go for what you need as an operational expense, and write it off your taxes right away. Except, of course, with private Clouds, where you have to build out the entire data center infrastructure yourself. If anything, private Clouds increase capital expenditures.
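The capex-to-opex shift is easy to see with a back-of-the-envelope comparison. The sketch below uses entirely hypothetical figures (server cost, hourly rate, average usage) purely to illustrate the shape of the argument, not any real provider's pricing:

```python
# Back-of-the-envelope capex vs. opex comparison.
# All figures are hypothetical, for illustration only.

def private_cloud_cost(servers, cost_per_server):
    """Upfront capital expense: buy every server on day one."""
    return servers * cost_per_server  # paid before any workload runs

def public_cloud_cost(avg_servers_used, hourly_rate, years):
    """Operational expense: pay only for the capacity actually used."""
    hours = years * 365 * 24
    return avg_servers_used * hourly_rate * hours

# Sized for a peak of 100 servers, but averaging only 15 in use.
capex = private_cloud_cost(servers=100, cost_per_server=5000)
opex = public_cloud_cost(avg_servers_used=15, hourly_rate=0.10, years=3)

print(f"Private (upfront capex): ${capex:,.0f}")
print(f"Public (3-yr opex):      ${opex:,.0f}")
```

With these illustrative numbers, the pay-as-you-go bill over three years is a fraction of the upfront buildout, and it lands on the books as a deductible operating expense rather than a depreciating capital asset.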
Cloud increases server utilization while dealing with spikes in demand – instead of setting up a data center full of servers that run idle most of the time on the off chance you need them to deal with the occasional Slashdot post or Justin Bieber tweet, the Cloud improves utilization while its elasticity deals with those annoying spikes. Except, of course, in private Clouds, unless your organization is so huge that multiple divisions look to your Cloud to handle many different spikes in demand, spikes you fervently hope arrive at different times. But what if that Kim Kardashian visit to your corporate HQ causes traffic to all your divisions to spike at once? Fugeddaboutit.
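The utilization point is simple arithmetic. A quick sketch, using a made-up daily traffic profile with one short spike, shows why infrastructure sized for peak demand sits mostly idle:

```python
# Why peak-sized infrastructure sits idle: average vs. peak demand.
# The traffic profile is hypothetical, for illustration only.

hourly_demand = [8] * 20 + [80, 100, 90, 30]  # one 4-hour spike per day

peak = max(hourly_demand)        # the capacity a private Cloud must buy
average = sum(hourly_demand) / len(hourly_demand)
utilization = average / peak

print(f"Peak demand:    {peak} servers")
print(f"Average demand: {average:.1f} servers")
print(f"Utilization of a peak-sized data center: {utilization:.0%}")
```

Under this (hypothetical) profile, a data center sized for the peak runs at under 20% utilization; an elastic public Cloud charges you for something much closer to the average.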
Cloud keeps infrastructure costs very low for new projects, since they don’t have much traffic yet – again, works much better in a public Cloud. How many such projects do you expect to have at any one time? If the number isn’t in the hundreds or thousands, then private Cloud is massive overkill for this purpose.
The elasticity benefit of the Cloud gives us the illusion of infinite capacity – infinite capacity is all well and good, but it’s an illusion. And illusions work fine until, well, until they don’t. Elasticity provides the illusion of infinite capacity only as long as there is sufficient capacity to meet additional demand for Cloud resources. You’ll never consume all the capacity of a public Cloud, but your private Cloud is another matter entirely. It’s only so big. If one of your developers has the bright idea to provision a thousand virtual machine instances or a petabyte of storage for that Big Data project, and your private Cloud doesn’t have the physical capacity to do so, then bye-bye illusion.
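The failure mode is worth making concrete: a provisioner backed by a finite pool simply cannot honor a request that exceeds what is physically left. The class and numbers below are an illustrative sketch, not any real Cloud API:

```python
# Illustrative sketch: a private Cloud's finite hardware pool breaks
# the illusion of infinite capacity. Not a real Cloud API.

class PrivateCloud:
    def __init__(self, total_vm_slots):
        self.free_slots = total_vm_slots  # hard physical limit

    def provision(self, requested_vms):
        if requested_vms > self.free_slots:
            raise RuntimeError(
                f"Capacity exhausted: {requested_vms} VMs requested, "
                f"only {self.free_slots} physically available")
        self.free_slots -= requested_vms
        return requested_vms

cloud = PrivateCloud(total_vm_slots=400)
cloud.provision(250)          # routine workloads fit fine

try:
    cloud.provision(1000)     # the Big Data project's bright idea
except RuntimeError as err:
    print("bye-bye illusion:", err)
```

A public Cloud has the same kind of limit in principle, but its pool is so many orders of magnitude larger than any single tenant's demand that no individual customer ever hits it.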
We already have a significant investment in our existing data center, so converting it to a private Cloud will save us money while enabling us to obtain the benefits of the Cloud – in your dreams. One essential requirement for building an effective private Cloud is rigorous homogeneity. You want all your physical servers, network equipment, virtualization technology, storage, etc. to be completely identical across every rack. Look at your existing, pre-Cloud data center. Homogeneity isn’t even on your radar.
We don’t want to be in the data center business. That’s why we’re moving to the Cloud – guess what? Building a private Cloud puts you in the data center business!
Whatever cost efficiencies the public Cloud providers can achieve we can also achieve in our private Cloud – this argument doesn’t hold water either. Not only do the leading public Clouds—Amazon, Microsoft Azure, Rackspace, etc.—have enormous economies of scale, but they’re also operating on razor-thin margins. Furthermore, if they can wring more efficiencies out of the model, they’ll lower their prices. They’re taking this “price war” approach to their margins for all the regular business school reasons: to keep smaller players from being competitive, and to push their larger competitors out of the business. It doesn’t matter how big your private Cloud is; it simply cannot compete on price.
OK fine, you get it. Private Clouds suck, fair enough. You’ll even buy our arguments that public Clouds may actually be more secure than private ones. But you’re in a regulated industry or otherwise have stringent regulatory requirements about data protection or data movement that the public Cloud providers can’t adequately address. The only way you can move to the Cloud at all is to build a private Cloud.
Not so fast. While it’s true that regulatory compliance is becoming an increasingly important business driver (and constraint) in the Cloud story, any regulatory drawbacks to using public Clouds are essentially temporary, as the market responds to this demand. A new class of public Cloud provider, what is shaping up to be the “Enterprise Public Cloud Provider” marketplace, is on the rise. The players in this space are putting together offerings that include rigorous auditing, more transparent and stringent service-level agreements, and overall better visibility for corporate customers with regulatory concerns.
The incumbent public Cloud providers aren’t standing still either. For example, while Amazon built their public Cloud (and with it, the entire industry) on a “one size fits all” model aimed initially at developers, startups, and other small to midsize companies, they have been working on building out their enterprise offerings for a while now. While you may not be able to get solutions from the big players that meet your regulatory needs today, you can be sure it won’t take them long to figure out how to compete in even the most regulated industries. In a few years, if you look back on a decision to build a private Cloud on the basis of regulatory compliance, you’ll likely feel quite foolish: the competitors who waited will have fully compliant public alternatives, while you’re stuck paying the bills on a private Cloud initiative that has become an expensive money pit.
The ZapThink Take
So, should any organization build a private Cloud? Perhaps, but only the very largest enterprises, and only when those organizations can figure out how to get most or all of their divisions to share those private Clouds. If your enterprise is large enough to achieve similar economies of scale to the public providers, then—and only then—will a private option be a viable business alternative.
In many such cases, those large enterprise private Clouds essentially become community Clouds, as multiple divisions of an enterprise share a single internal Cloud provider that operates much like a public Cloud, albeit for internal use across the enterprise. This community model makes sense, for example, for many federal governments. They can achieve the cost efficiencies of public Clouds while maintaining the control benefits of private Clouds by supporting the Cloud initiatives across multiple agencies.
Virtual Private Clouds (VPCs) also give many organizations the best of both worlds, as they leverage the public Cloud but run logically on your private network. Many hybrid Clouds follow the VPC approach, as hybrid on premise/Cloud models typically leverage private networks. ZapThink predicts this hybrid VPC model will become the predominant deployment model in the enterprise.
Still not convinced? Well, ask yourself why, and the answer is likely to be a question of control. Many executives will still be uncomfortable with public Clouds, even when we address the security and compliance issues that currently face public Cloud providers, simply because they don’t control the public Cloud. Our answer? Distribution of IT control is essential to the ZapThink 2020 vision, and is at the heart of the Agile Architecture Revolution. The Web doesn’t have centralized control, after all, and it works just fine. The app store model for enterprise IT, the rise of bring your own device (BYOD), and the fundamentally mobility-driven architecture of the Internet of Things are all examples of the broader shift toward decentralized control over IT. Fighting to maintain control is a losing proposition, and as a result, by 2020, private Clouds will be a mostly-forgotten bump on the road to the next big thing.