By Jason Bloomberg
June 2, 2012 10:00 AM EDT
The more you focus on the business benefits of Cloud, the more likely you'll be leaning toward public over private deployment models. Furthermore, this mind shift isn't all about security risks. Once you work through the issues, you'll likely come to the same conclusion: there's generally little or no solid business reason to build a private Cloud.
I had the pleasure of speaking at two quite different Cloud Computing conferences last week: Opal’s Business of Cloud Computing in Dallas and UBM’s CloudConnect in Bangalore. As the conference names and locations might suggest, the former was the more business-oriented while the latter was chock full of techies. What I didn’t expect, however, was that the business Cloud crowd had a more mature, advanced conception of Cloud than the technical audience. While the techies were still struggling with essential characteristics like elasticity, trying to free themselves from the vendor nonsense that drives such conferences, the business folks generally had a well-developed understanding of what Cloud is really all about, and as a result, focused their discussions on how best to leverage the approach to meet both tactical and strategic business goals.
Perhaps the most interesting contrast between the perspectives of these two audiences was their respective opinions about private Clouds. The techies at the Bangalore conference, having drunk too much of the vendor Kool-Aid, were generally of the opinion that public Clouds were too risky, and that their organizations should thus focus their efforts on the private deployment model. The Dallas business crowd, in contrast, generally held that the public approach was the way to go, with some folks even going so far as to claim that public Cloud was the only true approach to Cloud Computing.
This distinction is remarkable, and aligns with ZapThink’s thinking on this matter as well: the more you focus on the business benefits of Cloud, the more likely you’ll be leaning toward public over private deployment models. Furthermore, this mind shift isn’t all about security risks. We recently debunked the notion that public Clouds are inherently less secure than private ones, and many people at the Dallas conference agreed. But there’s more to this story. Once you work through the issues, you’ll likely come to the same conclusion: there’s generally little or no solid business reason to build a private Cloud.
The Problems with Private Clouds
The best way to understand the limitations of the private deployment model is to take the business perspective. What are the business benefits behind the move to the Cloud, and how can you achieve them?
Cloud will shift capital expense to operational expense – instead of having to invest in hardware and software, you can pay-as-you-go for what you need as an operational expense, and write it off your taxes right away. Except, of course, with private Clouds, where you have to build out the entire data center infrastructure yourself. If anything, private Clouds increase capital expenditures.
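To make the capex-versus-opex contrast concrete, here’s a back-of-the-envelope comparison. All the dollar figures are hypothetical assumptions for illustration, not actual market prices:

```python
# Illustrative comparison: up-front private Cloud build-out vs.
# pay-as-you-go public Cloud spend. All numbers are made up.

def private_cloud_cost(years, capex=2_000_000, annual_opex=300_000):
    """Capital expense up front, plus yearly operating expense."""
    return capex + annual_opex * years

def public_cloud_cost(years, monthly_spend=25_000):
    """No up-front investment; spend tracks actual usage."""
    return monthly_spend * 12 * years

for years in (1, 3, 5):
    print(f"Year {years}: private ${private_cloud_cost(years):,} "
          f"vs. public ${public_cloud_cost(years):,}")
```

Under these assumed figures the public option stays well ahead for years, and, crucially, the private number is committed before you know whether you need the capacity.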
Cloud increases server utilization while dealing with spikes in demand – instead of setting up a data center full of servers that run idle most of the time on the off chance you need them to deal with the occasional Slashdot post or Justin Bieber tweet, the Cloud improves utilization while its elasticity deals with those annoying spikes. Except, of course, in private Clouds – unless your organization is so huge that multiple divisions look to your Cloud to handle many different spikes in demand, spikes that you fervently hope arrive at different times. But what if that Kim Kardashian visit to your corporate HQ causes traffic to all your divisions to spike at once? Fugeddaboutit.
Cloud keeps infrastructure costs very low for new projects, since they don’t have much traffic yet – again, works much better in a public Cloud. How many such projects do you expect to have at any one time? If the number isn’t in the hundreds or thousands, then private Cloud is massive overkill for this purpose.
The elasticity benefit of the Cloud gives us the illusion of infinite capacity – infinite capacity is all well and good, but it’s an illusion. And illusions work fine until, well, until they don’t. Elasticity provides the illusion of infinite capacity only as long as there is sufficient capacity to meet additional demand for Cloud resources. You’ll never consume all the capacity of a public Cloud, but your private Cloud is another matter entirely. It’s only so big. If one of your developers has the bright idea to provision a thousand virtual machine instances or a petabyte of storage for that Big Data project, and your private Cloud doesn’t have the physical capacity to do so, then bye-bye illusion.
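The moment the illusion breaks is easy to model. The following toy sketch (capacity and request sizes are invented numbers) shows provisioning requests succeeding until a private Cloud’s fixed physical capacity is exceeded:

```python
# Toy model of the "illusion of infinite capacity": requests are
# granted until the private Cloud's fixed capacity runs out.

class PrivateCloud:
    def __init__(self, capacity_vms):
        self.capacity = capacity_vms  # physical limit: it's only so big
        self.in_use = 0

    def provision(self, vms):
        """Grant the request if it fits; refuse when the illusion breaks."""
        if self.in_use + vms > self.capacity:
            return False
        self.in_use += vms
        return True

cloud = PrivateCloud(capacity_vms=500)
print(cloud.provision(200))   # routine workload: fits
print(cloud.provision(1000))  # the developer's "bright idea": refused
```

A public Cloud’s capacity is, from any one customer’s perspective, effectively unbounded, so the second call never fails in practice; in a private Cloud it fails the first time someone asks for more than you built.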
We already have a significant investment in our existing data center, so converting it to a private Cloud will save us money while enabling us to obtain the benefits of the Cloud – in your dreams. One essential requirement for building an effective private Cloud is rigorous homogeneity. You want all your physical servers, network equipment, virtualization technology, storage, etc. to be completely identical across every rack. Look at your existing, pre-Cloud data center. Homogeneity isn’t even on your radar.
We don’t want to be in the data center business. That’s why we’re moving to the Cloud – guess what? Building a private Cloud puts you in the data center business!
Whatever cost efficiencies the public Cloud providers can achieve we can also achieve in our private Cloud – this argument doesn’t hold water either. Not only do the leading public Clouds—Amazon, Microsoft Azure, Rackspace, etc.—have enormous economies of scale, but they’re also operating on razor-thin margins. Furthermore, if they can wring more efficiencies out of the model, they’ll lower their prices. They’re taking this “price war” approach to their margins for all the regular business school reasons: to keep smaller players from being competitive, and to push their larger competitors out of the business. It doesn’t matter how big your private Cloud is, it simply cannot compete on price.
OK fine, you get it. Private Clouds suck, fair enough. You’ll even buy our arguments that public Clouds may actually be more secure than private ones. But you’re in a regulated industry or otherwise have stringent regulatory requirements about data protection or data movement that the public Cloud providers can’t adequately address. The only way you can move to the Cloud at all is to build a private Cloud.
Not so fast. While it’s true that regulatory compliance business drivers and limitations are becoming an increasingly important part of the Cloud story, any regulatory drawbacks to using public Clouds are essentially temporary, as the market responds to this demand. A new class of public Cloud provider, what is shaping up to be the “Enterprise Public Cloud Provider” marketplace, is on the rise. The players in this space are putting together offerings that include rigorous auditing, more transparent and stringent service-level agreements, and overall better visibility for corporate customers with regulatory concerns.
The incumbent public Cloud providers aren’t standing still either. For example, while Amazon built their public Cloud (and with it, the entire industry) on a “one size fits all” model aimed initially at developers, startups, and other small to midsize companies, they have been working on building out their enterprise offerings for a while now. While you may not be able to get solutions from the big players that meet your regulatory needs today, you can be sure it won’t take them long to figure out how to compete in even the most regulated industries. In a few years, if you look back on your decision to build a private Cloud on the basis of regulatory compliance, you’ll likely feel quite foolish: your competitors who waited will have fully compliant public alternatives, while you’re stuck paying the bills on a private Cloud initiative that has become an expensive money pit.
The ZapThink Take
So, should any organization build a private Cloud? Perhaps, but only the very largest enterprises, and only when those organizations can figure out how to get most or all of their divisions to share those private Clouds. If your enterprise is large enough to achieve similar economies of scale to the public providers, then—and only then—will a private option be a viable business alternative.
In many such cases, those large enterprise private Clouds essentially become community Clouds, as multiple divisions of an enterprise share a single internal Cloud provider that operates much like a public Cloud, albeit for internal use across the enterprise. This community model makes sense, for example, for many federal governments. They can achieve the cost efficiencies of public Clouds while maintaining the control benefits of private Clouds by supporting the Cloud initiatives across multiple agencies.
Virtual Private Clouds (VPCs) also give many organizations the best of both worlds, as they leverage the public Cloud but run logically on your private network. Many hybrid Clouds follow the VPC approach, as hybrid on-premises/Cloud models typically leverage private networks. ZapThink predicts this hybrid VPC model will become the predominant deployment model in the enterprise.
Still not convinced? Well, ask yourself why, and the answer is likely to be a question of control. Many executives will still be uncomfortable about public Clouds, even when we address the security and compliance issues that currently face public Cloud providers, simply because they don’t control the public Cloud. Our answer? Distribution of IT control is essential to the ZapThink 2020 vision, and is at the heart of the Agile Architecture Revolution. The Web doesn’t have centralized control, after all, and it works just fine. The app store model for enterprise IT, the rise of bring your own device (BYOD), and the fundamentally mobility-driven architecture of the Internet of Things are all examples of the broader shift to the notion of decentralized control over IT. Fighting to maintain control is a losing proposition, and as a result, by 2020, private Clouds will be a mostly-forgotten bump on the road to the next big thing.