Public Cloud Computing: Enabling the Elastic Enterprise

The private cloud enables elastic computing but the public cloud enables the elastic enterprise

Sometimes the results of a Google search can be most enlightening for what they don't show. 

For instance, I was wondering how private and public cloud computing compare in terms of the strategic business benefits of each, so I ran a Google search combining the two phrases as exact matches.

That search returned only 24 results, and ten of those were duplicates. Not only is 14 a surprisingly small number, but about half of the hits were cases where the two phrases happened to occur on the same page with no semantic relationship between them, and the remainder all expressed the same view: "Private cloud computing provides no strategic business benefits".

Is that true?  I think it depends on how one defines "private" and "public".  In my view, cloud computing infrastructure and applications operated within an organization's private network and made available only to its employees, agents, and contractors constitute a private cloud.  Conversely, cloud infrastructure and applications made available over the Internet to an organization's operatives and potentially also shared with its partners and customers constitute a public cloud.

Given those definitions, both private and public clouds certainly offer a variety of significant common operational and tactical benefits, including lower cost, faster deployment, and improved scalability.  And the private cloud is often validly sold against the public one on the basis of its superior control, security, and flexibility.

But, what can a private cloud do to directly enhance a business's ability to enter a new market, kill a competitor, or deliver a superior product?  I would have to say, "bupkis".  Private cloud computing makes IT more elastic but it doesn't change how the business functions internally or how it interacts with customers and partners externally.

The public cloud is quite different in this regard.  It is a shared, Internet-based IT context that can make the business itself more elastic by enhancing the richness, flexibility and efficiency of its interaction with customers and partners at all three levels of the cloud - Software, Platform, and Infrastructure.

By using public cloud capabilities at those three levels, companies can, for example:

  • Leverage customer experience for marketing, sales and support
  • Provide inventory and order tracking to their customers and suppliers
  • Jointly pursue larger business opportunities than they could individually
  • Build virtual product and services businesses by combining specialties

The specific capabilities of the public cloud, plus its utility-style economic model, public context, third-party management, and technology transparency, all combine to enable unprecedented business creativity and agility.  Private clouds simply can't do that.

If you are currently looking into how you might use public cloud computing in your business, then you probably already know that there are many different vendors and kinds of offerings in this space.  Some cover very small parts of the spectrum, while others aim to cover most or all of it.

For example, Amazon's EC2 is IaaS, offering virtual servers with some system software, while Microsoft Azure offers similar IaaS features but adds PaaS features for developing, deploying, and managing applications.  Longjump is a pure-play PaaS offering, while Google combines its App Engine PaaS with its Google Apps productivity SaaS, and Salesforce.com does something similar with its Force.com PaaS layer and its extensible SaaS CRM apps.  And finally, there are many pure cloud-based SaaS apps, such as LotusLive.
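
To make the IaaS end of that spectrum concrete, here is a minimal sketch of how a virtual server might be provisioned programmatically against Amazon's EC2 API using the boto3 Python SDK; the region, machine image ID, and instance type shown are placeholders, not recommendations.

    # Minimal sketch: provisioning one virtual server through an IaaS API.
    # Assumes boto3 is installed and AWS credentials are already configured.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

    response = ec2.run_instances(
        ImageId="ami-00000000000000000",  # placeholder machine image ID
        InstanceType="t3.micro",          # placeholder instance size
        MinCount=1,
        MaxCount=1,
    )

    instance_id = response["Instances"][0]["InstanceId"]
    print(f"Launched instance {instance_id}")

PaaS and SaaS offerings sit higher up the stack, so the programmatic surface you consume there is the platform or the application itself rather than raw server provisioning.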

Across the spectrum of different kinds of public cloud software, though, there are several common factors that all prospective buyers should consider carefully before making a commitment:

Security and Compliance

Public cloud environments, like public web sites, can be quite vulnerable to a variety of security threats, including DDoS attacks and various kinds of malware infections.  But, the risk can be compounded by the complexity and scale of some of these services and exacerbated by insufficient security monitoring and response automation.  Also, there are a growing number of regulations governing the security and physical location of sensitive data for which the distributed nature and immaturity of public cloud computing can make compliance a challenge.  Grill prospective vendors on their security and regulatory compliance capabilities.
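
As one illustration of the kind of security monitoring automation worth asking vendors about, the sketch below flags any EC2 security group that accepts inbound traffic from anywhere on the Internet. It assumes boto3 and configured AWS credentials, and it is an example of a single automated check, not a complete security program.

    # Illustrative check: flag security groups open to the whole Internet.
    # Assumes boto3 is installed and AWS credentials are configured.
    import boto3

    ec2 = boto3.client("ec2")

    for group in ec2.describe_security_groups()["SecurityGroups"]:
        for rule in group.get("IpPermissions", []):
            for ip_range in rule.get("IpRanges", []):
                if ip_range.get("CidrIp") == "0.0.0.0/0":
                    print(f"{group['GroupId']} allows inbound from anywhere "
                          f"(ports {rule.get('FromPort')}-{rule.get('ToPort')})")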

Assured Service Level

If you use managed hosting, colocation, or other types of conventional computing and networking services, you know what an SLA is.  A Service Level Agreement is a contract wherein a vendor specifies committed levels of availability, throughput, management oversight, etc., and the remedies and penalties that will apply when those levels are not met.  Many public cloud service providers are not yet offering rigorous and detailed SLAs to their subscribers, and with no SLA, you could end up SOL.
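
The arithmetic behind an availability commitment is worth doing before you sign.  The short sketch below, assuming a 30-day month, shows how much downtime a given "number of nines" actually permits.

    # How much downtime does an availability commitment allow per 30-day month?
    MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes

    for availability in (0.99, 0.999, 0.9999):
        allowed_downtime = MINUTES_PER_MONTH * (1 - availability)
        print(f"{availability:.2%} availability allows "
              f"~{allowed_downtime:.1f} minutes of downtime per month")

    # 99.00% availability allows ~432.0 minutes of downtime per month
    # 99.90% availability allows ~43.2 minutes of downtime per month
    # 99.99% availability allows ~4.3 minutes of downtime per month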

Multi-Party Capabilities

Many public cloud services closely mimic conventional IT and private cloud functionality that assumes a server, database, network address space, application runtime, or other element of the service will only be used by a single organization.  Or, if it is to be shared by more than one organization, that sharing will be done at the edges through things like traditional web services.  If you aim to use SaaS groupware for, say, joint R&D, or you intend to build a supply chain database application to be shared by your suppliers and customers, make sure that the services you are considering can accommodate multiple parties correctly, without the need for specialized glue code or exceptional architecture.  Good examples of this being done properly include the "Salesforce to Salesforce" feature of Force.com, which enables different tenants to easily integrate their applications through a straightforward publish-and-subscribe interface, and the rigorous multi-party "deal room" document sharing features of Watchdox.
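
To illustrate the difference, here is a toy sketch of the publish-and-subscribe idea: each record has an owning tenant, and visibility to another organization is an explicit grant rather than glue code bolted on at the edges.  It is purely illustrative and does not reflect the actual Salesforce to Salesforce or Watchdox APIs.

    # Toy model of multi-party sharing: cross-tenant visibility is an
    # explicit publish/subscribe grant, not an architectural afterthought.
    from dataclasses import dataclass, field

    @dataclass
    class SharedRecord:
        owner: str                                      # owning tenant
        data: dict
        published_to: set = field(default_factory=set)  # tenants granted access

        def publish(self, tenant: str) -> None:
            """The owner explicitly grants another tenant visibility."""
            self.published_to.add(tenant)

        def visible_to(self, tenant: str) -> bool:
            return tenant == self.owner or tenant in self.published_to

    # Example: a supplier shares an order's status with one customer only.
    order = SharedRecord(owner="supplier-co", data={"order_id": 42, "status": "shipped"})
    order.publish("customer-co")
    assert order.visible_to("customer-co")
    assert not order.visible_to("other-co")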

It is still early days for the public cloud, but its importance cannot be overstated.  It provides a unique pathway to new ways of doing business that transcend the century-old models currently in use, and it promises to make private computing, like site-based power generators and PBXs, largely a thing of the past, supplanted by ubiquitous, efficient, and affordable utility services that deliver the same advantages to businesses of all sizes around the world.

More Stories By Tim Negris

Tim Negris is SVP, Marketing & Sales at Yottamine Analytics, a pioneering Big Data machine learning software company. He occasionally authors software industry news analysis and insights on Ulitzer.com and is a 25-year technology industry veteran with expertise in software development, databases, networking, social media, cloud computing, mobile apps, analytics, and other enabling technologies.

He is recognized for his ability to rapidly translate complex technical information and concepts into compelling, actionable knowledge. He is also widely credited with coining the term and co-developing the concept of the “Thin Client” computing model while working for Larry Ellison in the early days of Oracle.

Tim has also held a variety of executive and consulting roles in numerous start-ups and several established companies, including Sybase, Oracle, HP, Dell, and IBM. He is a frequent contributor to a number of publications and sites, focusing on technologies and their applications, and has written a number of advanced software applications for social media, video streaming, and music education.
