
@CloudExpo: Blog Feed Post

The Gartner Cloud Computing Smell Test

Sniff, sniff…. Yep, it’s a Cloud!

Gartner released five criteria to determine whether the pile you’ve been sniffing is in fact what you think it is. In general, I think this type of shoehorn analysis is dangerous because it never goes deep enough. Criteria like these are invariably so nebulous that my hacker nephew could probably qualify his college project. The problem with top-down research like this is that the devil is always in the details.

I took the liberty of regurgitating the list for you to peruse here:

Service-Based: Consumer concerns are abstracted from provider concerns through service interfaces that are well-defined. The interfaces hide the implementation details and enable a completely automated response by the provider of the service to the consumer of the service. The service could be considered “ready to use” or “off the shelf” because the service is designed to serve the specific needs of a set of consumers, and the technologies are tailored to that need rather than the service being tailored to how the technology works.

The articulation of the service feature is based on service levels and IT outcomes (availability, response time, performance versus price, and clear and predefined operational processes), rather than technology and its capabilities. In other words, what the service needs to do is more important than how the technologies are used to implement the solution.

Scalable and Elastic: The service can scale capacity up or down as the consumer demands at the speed of full automation (which may be seconds for some services and hours for others). Elasticity is a trait of shared pools of resources. Scalability is a feature of the underlying infrastructure and software platforms. Elasticity is associated with not only scale, but also an economic model that enables scaling in both directions in an automated fashion. This means that services scale on demand to add or remove resources as needed.
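Gartner doesn’t say what scaling “at the speed of full automation” actually looks like in practice. A toy version of the scaling decision, assuming a simple proportional rule driven by a utilization target (the function name, thresholds, and bounds here are all illustrative, not from Gartner or any particular cloud):

```python
def desired_replicas(current: int, cpu_util: float,
                     target: float = 0.6,
                     min_n: int = 1, max_n: int = 20) -> int:
    """Proportional autoscaling sketch: pick a replica count so that
    projected utilization lands near the target, clamped to bounds.
    This is the 'scale in both directions, automatically' trait --
    the same rule adds capacity under load and removes it when idle."""
    if cpu_util <= 0:
        return min_n  # nothing running hot; shrink to the floor
    wanted = round(current * cpu_util / target)
    return max(min_n, min(max_n, wanted))
```

A control loop evaluating this every few seconds gives you elasticity; a human filing a ticket to run it does not.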

Shared: Services share a pool of resources to build economies of scale. IT resources are used with maximum efficiency. The underlying infrastructure, software or platforms are shared among the consumers of the service (usually unknown to the consumers). This enables unused resources to serve multiple needs for multiple consumers, all working at the same time.

Metered by Use: Services are tracked with usage metrics to enable multiple payment models. The service provider has a usage accounting model for measuring the use of the services, which could then be used to create different pricing plans and models. These may include pay-as-you go plans, subscriptions, fixed plans and even free plans. The implied payment plans will be based on usage, not on the cost of the equipment. These plans are based on the amount of the service used by the consumers, which may be in terms of hours, data transfers or other use-based attributes delivered.
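The point of the usage accounting model is that one metered quantity can back several pricing plans. A minimal sketch, with made-up rates and plan names purely for illustration:

```python
def bill(hours_used: float, plan: str) -> float:
    """Translate one metered quantity (instance-hours) into a charge
    under a few of the payment models Gartner lists: pay-as-you-go,
    fixed subscription, and a free tier. Rates are illustrative."""
    rates = {
        "pay_as_you_go": lambda h: h * 0.10,                   # flat hourly rate
        "subscription":  lambda h: 50.0,                       # fixed monthly fee
        "free_tier":     lambda h: max(0.0, h - 750) * 0.10,   # first 750 h free
    }
    return round(rates[plan](hours_used), 2)
```

Note what the pricing keys off: hours consumed, not the cost of the box the hours ran on.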

Uses Internet Technologies: The service is delivered using Internet identifiers, formats and protocols, such as URLs, HTTP, IP and representational state transfer Web-oriented architecture. Many examples of Web technology exist as the foundation for Internet-based services. Google’s Gmail, Amazon.com’s book buying, eBay’s auctions and Lolcats’ picture sharing all exhibit the use of Internet and Web technologies and protocols.
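The criterion above boils down to: cloud resources are plain web resources, addressable by URL and manipulated with standard HTTP verbs. A tiny sketch of what that addressing looks like (the base URL and resource path are hypothetical):

```python
from urllib.parse import urlencode

def resource_url(base: str, resource: str, **params) -> str:
    """Build the URL for a cloud resource in REST style: the identifier
    is a URL, the parameters ride in the query string, and the verb
    (GET/PUT/POST/DELETE) supplies the action."""
    query = urlencode(sorted(params.items()))
    return f"{base.rstrip('/')}/{resource}" + (f"?{query}" if query else "")
```

If a provider can’t express its service this way, it may be hosting, but it isn’t cloud by this criterion.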

So 6fusion officially passes the Gartner smell test. Woop-dee-doo! Can a service provider actually use what we cooked up? The Gartner criteria can’t tell you.

What is going to matter most in the coming years is the next set of five criteria, which will measure a cloud service provider’s actual relevance (now there’s a term right out of the dot-bomb era, eh?) to a service organization (outsourced or internal, it doesn’t matter). Here are the five criteria you need to think about for Gartner’s smell test to really help you tell the difference between the piles you are sniffing, especially if you are thinking about using cloud services to offer solutions to real business customers:

Business Value: The service can clearly demonstrate business value to the customer. Two examples of business value are proven and documented (maybe even guaranteed) TCO improvement (or operational efficiency gains), and an SLA with teeth (real financial risk for the service provider if they blow it). A real cloud service must do more than pay these subjects casual lip service.

Data Residency Control & Interoperability: The service provider lets the customer choose where data physically resides in the cloud. Compliance is not something that can be ignored just because Gmail is cool. That may fly during the honeymoon period, but business customers care where their data (such as email or CRM) is sitting and who can potentially get their hands on it (just ask Liquid Motors what they think). In addition, the service provider must support a minimum standard for interoperability. If a customer builds a workload on your cloud today, they should be able to move to some other cloud tomorrow. Proprietary lock-in is so 1999. Examples of supported interoperability include standards like Open Virtualization Format (OVF).
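OVF is worth a concrete look, because portability lives or dies on details like this: a workload is described by an XML envelope that references the disk images it needs, so another provider can pick it up and run it. A toy envelope and reader, assuming a deliberately simplified descriptor (real OVF files namespace their attributes and carry much more metadata):

```python
import xml.etree.ElementTree as ET

OVF_NS = "http://schemas.dmtf.org/ovf/envelope/1"

SAMPLE_OVF = f"""<Envelope xmlns="{OVF_NS}">
  <References>
    <File href="disk1.vmdk"/>
  </References>
  <VirtualSystem id="web-01"/>
</Envelope>"""

def list_disk_files(ovf_xml: str) -> list:
    """Return the file references an OVF envelope points at -- the
    portable pieces a workload needs to move between clouds."""
    root = ET.fromstring(ovf_xml)
    return [f.get("href") for f in root.iter(f"{{{OVF_NS}}}File")]
```

If your provider can emit and consume something like this, the exit door exists. If not, you’ve signed up for 1999.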

Any App Architecture: The service provider must be able to address non-web services applications. Contrary to the folks with SaaS on the brain these days, the vast majority of businesses in fact do *not* run web services applications. In order for a cloud service provider to prove relevance they must be able to enable or assist with the transition from the old paradigm to the new without first selling the customer a shiny new forklift.

Business Process Integration: The services provided cannot be an island. This is not called island computing. This is called cloud computing. Therefore, cloud services must be able to integrate with existing customer business processes (particularly the automated ones). A good indication of the level of sophistication your cloud service provider maintains is the number of use cases it supports. Does their cloud service interweave with your DR plan and your production collaboration suite? Or are they a one-trick pony?

Cost Profiling: The service provider must allow the customer to profile their applications before moving to the cloud service in order to gauge performance, and most importantly, the cost. Metered use is great. But if a customer can’t translate that into what they will really pay for services, it is not enough. Simplicity is the hallmark of cost profiling and automation is a cornerstone.
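In other words, metering answers “what did I use?” while profiling answers “what will I pay?” The arithmetic is simple enough to sketch; the usage dimensions and rates below are hypothetical, stand-ins for whatever a provider actually meters:

```python
def projected_monthly_cost(profile: dict, rates: dict) -> float:
    """Multiply each profiled usage dimension (instance-hours,
    GB transferred, etc.) by its metered rate to turn a workload
    profile into a projected bill. Dimensions with no published
    rate contribute nothing -- which is itself worth noticing."""
    return round(sum(qty * rates.get(dim, 0.0)
                     for dim, qty in profile.items()), 2)

# Example: a workload profiled before migration.
profile = {"instance_hours": 720, "gb_out": 50}
rates = {"instance_hours": 0.10, "gb_out": 0.09}
```

A provider that hands you the meter but makes you build this translation yourself has done half the job.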

Knowing what cloud computing should smell like, so that when it comes along you can recognize it, is obviously important to mainstream adoption of cloud computing. But this alone isn’t enough of a guide as you find your way through the dark. Being able to independently judge a service offering for relevance to your most prized possession (your customers) will prove to be a much more telling test.


More Stories By John Cowan

John Cowan is co-founder and CEO of 6fusion. John is credited as 6fusion's business model visionary, bridging concepts and services behind cloud computing to the IT service channel. In 2008, he and his 6fusion collaborators successfully launched the industry's first single unit of measurement for x86 computing, known as the Workload Allocation Cube (WAC). John is a 12-year veteran of business and product development within the IT and telecommunications sectors and a graduate of Queen's University at Kingston.
