An Inconvenient Truth of the NIST Definition of Cloud Computing

The classification and some definitions of the four deployment models are redundant and inconsistent

Amid the many benefits of having NIST SP 800-145 as a tool for understanding cloud computing, the classification and some definitions of the four deployment models are redundant and inconsistent. In particular, the definition of "community cloud" duplicates that of a private cloud, the deployment models are defined with two different sets of criteria, and "hybrid cloud" is a confusing, ambiguous, and extraneous term.

SP 800-145 is the de facto standard in the IT industry for describing what cloud computing is, with five essential characteristics, three service models, and four deployment models. The five essential characteristics specify the qualifications and expected behaviors of anything labeled with the term "cloud." The three service models capture the essence of cloud computing, centered on the concept of a "service." Together, the characteristics and the service models form a solid foundation and present a conceptual model of what cloud computing is about. SP 800-145 becomes inconvenient where the four deployment models, i.e., public, community, private, and hybrid clouds, are defined, as examined below.

The Premise
Reviewing the definitions of the first three deployment models reveals a common theme: among public, community, and private clouds, the classification is based on the intended audience to which a cloud and its resources are dedicated. A public cloud is intended to be consumed by the general public, while a private cloud is dedicated to a single organization, i.e., a targeted group of users. SP 800-145 thus classifies a private cloud and a public cloud with consistent criteria.

It is important to recognize that building a cloud on owned hardware does not by default make it the owner's private cloud, and a cloud that is accessible via the Internet or operated by an Internet service provider is not automatically a public cloud either. Again, the intended audience determines whether it is a private or a public cloud. Although many assume a private cloud is an on-premises deployment on owned hardware, that is not a requirement of a private cloud.

Further "public" here does not suggest that it is free or accessible anonymously. It simply means the cloud is dedicated for the general public to consume, while there can be business or administrative restrictions imposed. Microsoft Office 365, available based on a subscription, and Hotmail, requiring a Live ID to sign, are vivid examples of public cloud offerings with restrictions.

Inconvenience #1: The classification of "community cloud" is extraneous.
According to SP 800-145, a community cloud is a cloud for a specific community of consumers from multiple organizations. As far as a member of that community is concerned, a community cloud is in fact a private cloud for that particular community. The number of organizations and the administrative boundaries encompassing the community are irrelevant, since from a private cloud's viewpoint an authorized user is an authorized user, regardless of which organization that user belongs to. A cloud for a community of users, whether drawn from departments and business units within one company or from partner companies around the world, is essentially a private cloud dedicated to that community.

Inconvenience #2: Using two sets of criteria to define cloud deployment models introduces inconsistency and ambiguity.
As defined in SP 800-145, a hybrid cloud is a composition of infrastructures, yet a private cloud and a public cloud are defined according to their intended audiences. This change of criteria in classifying a hybrid cloud introduces inconsistency and ambiguity into the deployment models presented in SP 800-145. Defining one concept with two different sets of criteria is simply a confusing way to describe an already confusing subject like cloud computing.

Inconvenience #3: "Hybrid cloud" is an ambiguous, confusing, and frequently misused term.
As stated in SP 800-145, a hybrid cloud is a composition of two or more distinct cloud infrastructures (private, community, or public). That is, a hybrid cloud can be a composition of private/private, private/community, private/public, and so on. From a consumer's point of view, these are in essence a private cloud, a private cloud, and a public or private cloud, respectively. Regardless of how a hybrid cloud is constructed, if it is intended for public consumption it is a public cloud, and if it is intended for a particular group of people it is a private cloud, by SP 800-145's own criteria. A composition of clouds is still a cloud, and that cloud is either public or private; it cannot be both at the same time.
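To make the collapse concrete, here is a hypothetical sketch of how any composed "hybrid" still resolves to one of the two models. The function names and audience labels are invented for illustration.

```python
def classify(intended_audience: str) -> str:
    # Classification rests solely on the intended audience, matching how
    # SP 800-145 defines public and private clouds.
    return "public" if intended_audience == "general_public" else "private"

def classify_composition(component_audiences: list, composed_audience: str) -> str:
    # However its component infrastructures were classified individually,
    # the composition is one cloud serving one intended audience.
    return classify(composed_audience)

# A private/public composition dedicated to a single enterprise:
components = ["contoso_staff", "general_public"]
print([classify(a) for a in components])                  # ['private', 'public']
print(classify_composition(components, "contoso_staff"))  # private
```

The second argument, not the component list, decides the result: the "hybrid" label describes construction, not a third deployment model.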

For many enterprise IT professionals, a hybrid cloud means an on-premises private cloud connected with some off-premises resources. Notice that these off-premises resources are not necessarily a cloud at all. In such cases, the result is simply a private cloud with an extended boundary. A cloud is a set of capabilities and must be referenced in the context of the delivered application; merely placing a VM in the cloud, or referencing a database hosted in the cloud, does not make that VM or database a public cloud application.

The key is that a hybrid cloud is a concept derived from clouds. A hybrid can be an integration, modification, or extension of cloud infrastructures, or a combination of all of these. It is nevertheless not a new concept or a different deployment model, and it should not be classified as a unique deployment model alongside the two essential ones, i.e., the public and private cloud models. Based on intended users, a cloud is either public or private; there is no third kind of cloud deployment model.

"Hybrid cloud" is perhaps a great catchy marketing term. For many, a hybrid seems to suggest it is advanced, leading edge, and magical, and therefore better and preferred. The truth is "hybrid cloud" is an ambiguous, confusing, and frequently misused term. It confuses people, interjects noises into a conversation, and only to further confirm the state of confusion and inability to clearly understand what cloud computing is.

More Stories By Yung Chou

Yung Chou is a Technology Evangelist at Microsoft. Within the company, he has served customers in support account management, technical support, technical sales, and evangelism. Prior to Microsoft, he built experience in system programming, application development, consulting services, and IT management. His recent technical focus has been on virtualization and cloud computing, with strong interests in hybrid cloud and emerging enterprise computing architecture. He is a frequent speaker at Microsoft conferences, roadshows, and TechNet events.
