
On Cloud Lock-in, Standards, Decoupling and Why SaaS Does Not Scale

How the three-tier cloud as we know it is changing

With security and legal concerns being slowly addressed by the industry, lock-in and standards are rapidly becoming the biggest concerns regarding cloud computing. If the cloud industry is to make good on its promise, these will need to somehow be addressed. Let’s examine some recent developments.

It was interesting to see how, just a week after my blog on “The Principles and Perils of Vendor Lock-in” *1, several vendors made announcements seemingly supporting my suggested approach. For example, after hinting at the potential benefit of decoupling SaaS and PaaS from the underlying infrastructure (IaaS) layers, Microsoft announced it is making Azure available as a PaaS platform to several large IaaS providers *2. Now, I am sure this had nothing to do with my blog on preventing lock-in and everything to do with a desire in Redmond to increase market share for their PaaS platform, which, ironically, may even increase lock-in if it is too successful. But the move will offer customers who select Microsoft’s PaaS platform a choice of vendors for the underlying infrastructure services (IaaS).

At the same time, NASA and Rackspace announced they are joining forces around an open source platform for private clouds called OpenStack *3. Rackspace’s initiative is no doubt as commercially motivated as Microsoft’s. If Rackspace (in my view correctly) expects that many private clouds in the foreseeable future will start to source additional capacity (cloudburst) from public clouds, then having these private clouds based on the same architecture as its public cloud offering will help Rackspace. NASA’s motives stem from the US government’s cloud stimulus approach *4, a specific stated goal of which is “to accelerate the creation of cloud standards”. If history repeats itself, we can expect industry standards to first produce a “plug compatible cloud market” before a serious “open standards cloud market” takes shape. As NASA is determined to have workable cloud standards a lot faster than the decade or so it took to get a man on the moon, it is understandable that it sees the Rackspace route as a viable shortcut. This is also understandable because agreeing on open cloud standards today would be as difficult as agreeing on 3D TV standards back in the days of the black-and-white moon landing broadcasts. (And, reader beware, if there ever was a time to keep your options open and not lock yourself into what looks to become an early standard, it would be today.)

A decoupled cloud
For many readers, my recommendation to prevent lock-in by decoupling the choice of application (SaaS) and platform (PaaS) vendors from the underlying choice of infrastructure (IaaS) vendors was pure heresy. Their logic was this: if you want control over the underlying layers, you should not embark on cloud computing, because the whole idea of cloud computing is that someone else is responsible for those layers. But that is like saying that if you don’t want to buy clothes made by underage children, you should get a sewing machine and make your own.

Others struggled to imagine what such a decoupled cloud would look like in practice. Luckily, also last week (it was indeed an eventful week for cloud computing), a first real-life example of such a decoupled offering went live. Skygone Inc. announced it is offering a choice of GIS (Geographical Information System) services *5 by aggregating solutions from several GIS software vendors across a choice of infrastructure platforms and vendors. Companies in need of such geographical information (a complex and specialized area, beyond the expertise and interest of most internal IT departments) can now simply source it without locking themselves into a specific vendor or platform. (Disclosure: Skygone uses AppLogic from 3Tera, now owned by my employer CA Technologies, as its underlying platform.)

Several analyst firms predicted early on that this type of “brokering of cloud services” would become an important market force. But in recent months, maybe under the influence of several self-proclaimed 800-pound gorillas entering the cloud market, the analyst community has gone very quiet about the concept. That is a shame, because brokering also addresses the fact that, in an enterprise context, “SaaS does not scale”.

SaaS does not scale?
Now, before readers get all wound up (again): with “SaaS does not scale” I do not mean that SaaS applications cannot scale to serve millions of users. They already do, although some more successfully and reliably than others. I mean that the average enterprise or government organization, which typically has a portfolio of several hundred or even thousands of applications, simply cannot afford to source these from a similar number of SaaS providers. The mandatory auditing of the infrastructure and processes of all these providers would not be feasible, especially as a leading analyst firm just pointed out that a SAS 70 certificate is no replacement for such mandatory due diligence *6a. They did so at about the same time they suggested that for many a SaaS vendor it would make sense to partner with IaaS vendors for delivery of their services *6b, and that the traditional SaaS market may not grow to be as big as many initially expected *6c (which shows once more that predicting developments and/or placing customer bets in a brand new area like cloud computing continues to be a risky business).

A better way
To summarize: the described mix-and-match approach of a decoupled, brokered cloud aims to let enterprises select the applications they need from several SaaS vendors, pick the platforms they like from a choice of PaaS vendors, and deploy these across their choice of selected and audited IaaS vendors, without running into lock-in or scalability issues.
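As a minimal sketch of this mix-and-match idea (the vendor names here are hypothetical and for illustration only, not products endorsed by this post), the key property of decoupling is that each layer is chosen independently: swapping the audited IaaS vendor underneath must not force a change of application or platform.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Provider:
    name: str
    layer: str   # "SaaS", "PaaS", or "IaaS"
    audited: bool  # passed the enterprise's due-diligence audit

def compose_stack(app: Provider, platform: Provider, infra: Provider) -> dict:
    """Compose a deployment from independently chosen, audited providers."""
    for p in (app, platform, infra):
        if not p.audited:
            raise ValueError(f"{p.name} has not passed due diligence")
    return {"SaaS": app.name, "PaaS": platform.name, "IaaS": infra.name}

# Hypothetical catalog entries, for illustration only.
gis_app  = Provider("gis-app", "SaaS", audited=True)
platform = Provider("paas-platform", "PaaS", audited=True)
infra_a  = Provider("infra-vendor-A", "IaaS", audited=True)
infra_b  = Provider("infra-vendor-B", "IaaS", audited=True)

# The same application/platform pair deploys on either infrastructure vendor,
# which is exactly the lock-in escape hatch the brokered model promises.
stack1 = compose_stack(gis_app, platform, infra_a)
stack2 = compose_stack(gis_app, platform, infra_b)
```

The point of the sketch is the shape of the choice space, not the code: the broker audits each provider once, and the enterprise recombines layers at will.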

Now it is important to understand that this approach does not in any way, shape or form resemble the old way IT used to work. Let’s use an analogy from the consumer IT market to describe the difference:

  • IT, the old way: As a consumer you would go to a computer store to pick a software package, let’s say a cooking application. From the 20 available offerings you pick one (likely the one with the nicest picture on the box), only to arrive home and discover your PC has a release of the operating system / database / browser that is not supported. After fixing this (there goes the weekend), you still cannot get it to run. You solicit some consulting from your neighbor/nephew/colleague, while your spouse remarks that at this rate you will be eating takeout for another month (no pressure!). Finally, during week 3, you get it to work, although printing still has its quirks. You learned a lot more about your PC, but little about cooking. One month later you buy a new PC and, strangely, the whole thing stops working again. Luckily the vendor sends you an email offering an upgrade that runs on your new PC. Comparing it to the cost of takeout, you decide to buy the upgrade.
  • IT, the new way: You feel hungry. Without leaving your seat you visit the app store on your phone, which offers 60 cooking applications. You pick the most downloaded one (after reading some of the user comments) and prepare your first dish. It is too salty. You blame the application, remove it, and pick another one. That tastes better. You decide whether to use the free version (which includes an automatically printed shopping list for the supermarket chain sponsoring the app) or to pay 20 cents per recipe cooked.

The decoupled cloud experience we are aiming for should of course feel like the second scenario. Also note how in the first example we talked mainly about technology and in the second mainly about cooking. Somehow we in IT moved from talking about what our companies do (selling soup, soap or insurance) to mainly discussing technologies (like SOA, SOAP and yes: Cloud).

In other words, we need to change from being mainly supply-driven, with IT in the role of a factory manager running the production of services, to a demand-driven IT organization, with IT in the role of a supply chain manager finding the best way to source functionality for the business, preferably without locking the company into a dead-end street. The end goal is to deliver the 20% that really differentiates our company, while sourcing the 80% that is pretty much the same for all companies.

That type of agility is the real promise of cloud computing.

This post originally appeared on July 26 at ITSMportal.com


*1 The Principles and Perils of Vendor Lock-In

*2 Microsoft announced it is making Azure available as a PaaS platform to several large IaaS providers

*3 NASA and Rackspace announced they are joining forces around an open source platform for Private Clouds called OpenStack

*4 The US government investments in cloud computing could be seen as a modern-day industry stimulus package. In my view, current efforts of NASA and the like may have as deep an impact on cloud computing as the Cold War DoD budgets had on the development of computer networks, and as the Apollo project had on technology advancement in general.

*5 Skygone Inc. announced offering a choice of GIS (Geographical Information System) services

*6a SAS 70 is Not Proof of Security, Privacy, or Continuity Compliance

*6b Public Cloud Infrastructure Helps SaaS Vendor Economics

*6c Organizations Need to Re-Evaluate the Rationale for SaaS

More Stories By Gregor Petri

Gregor Petri is a regular expert or keynote speaker at industry events throughout Europe and wrote the cloud primer “Shedding Light on Cloud Computing”. He was also a columnist at ITSM Portal, contributing author to the Dutch “Over Cloud Computing” book, member of the Computable expert panel and his LeanITmanager blog is syndicated across many sites worldwide. Gregor was named by Cloud Computing Journal as one of The Top 100 Bloggers on Cloud Computing.

Follow him on Twitter @GregorPetri or read his blog at blog.gregorpetri.com
