


On Cloud Lock-in, Standards, Decoupling and Why SaaS Does Not Scale

How the three-tier cloud as we know it is changing

With security and legal concerns slowly being addressed by the industry, lock-in and standards are rapidly becoming the biggest worries around cloud computing. If the cloud industry is to make good on its promise, these too will need to be resolved. Let’s examine some recent developments.

Interesting to see how, just a week after my blog on “The Principles and Perils of Vendor Lock-in” *1, several vendors made announcements seemingly supporting my suggested approach. For example, after hinting at the potential benefit of decoupling SaaS and PaaS from the underlying infrastructure (IaaS) layers, Microsoft announced it is making Azure available as a PaaS platform to several large IaaS providers *2. Now I am sure this had nothing to do with my blog on preventing lock-in and everything to do with a desire in Redmond to increase market share for their PaaS platform, which ironically (if too successful) may even increase lock-in. But the move will offer customers who select Microsoft’s PaaS platform a choice of vendors for the underlying infrastructure services (IaaS).

At the same time NASA and Rackspace announced they are joining forces around an open source platform for private clouds called OpenStack *3. Rackspace’s initiative is no doubt as commercially motivated as Microsoft’s. If Rackspace (in my view correctly) expects that many private clouds in the foreseeable future will start to source additional capacity (cloudburst) from public clouds, then having these private clouds be based on the same architecture as its public cloud offering will help Rackspace. NASA’s motives stem from the US government’s cloud stimulus approach *4, a specific stated goal of which is “to accelerate the creation of cloud standards”. If history repeats itself, we can expect to first see industry standards lead to a “plug compatible cloud market” before a serious “open standards cloud market” takes shape. As NASA is determined to have workable cloud standards a lot faster than the decade or so it took to get a man on the moon, it is understandable that it sees the Rackspace route as a viable shortcut. This is also understandable because agreeing on open cloud standards today would be as difficult as agreeing on 3D TV standards back in the days of the black & white moon landing broadcasts. (And, reader beware: if there ever was a time to keep options open and not lock yourself into what looks to become an early standard, it would be today.)
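The cloudbursting pattern mentioned above can be sketched in a few lines. This is a hypothetical illustration of the scheduling rule only (the class and pool names are invented, not any vendor's API): run on the private cloud while it has capacity, and overflow to an architecturally compatible public cloud when it does not.

```python
# Hypothetical sketch of a cloudburst decision: prefer private capacity,
# burst to a compatible public cloud when the private pool is full.

class CloudPool:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity  # total VM slots in this pool
        self.running = 0          # VM slots currently in use

    def has_capacity(self, vms):
        return self.running + vms <= self.capacity

    def provision(self, vms):
        self.running += vms
        return f"{vms} VMs on {self.name}"

def place_workload(vms, private, public):
    """Cloudburst rule: use the private cloud first, overflow to public."""
    if private.has_capacity(vms):
        return private.provision(vms)
    return public.provision(vms)

private = CloudPool("private-openstack", capacity=10)
public = CloudPool("public-openstack", capacity=1000)

print(place_workload(8, private, public))  # fits in the private pool
print(place_workload(5, private, public))  # private pool full, bursts to public
```

Note that the rule only works smoothly because both pools are assumed to run the same architecture, which is exactly the bet Rackspace is making with OpenStack.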

A decoupled cloud
For many readers my recommendation to prevent lock-in by decoupling the choice of application (SaaS) and platform (PaaS) vendors from the underlying choices of infrastructure (IaaS) vendors was pure heresy. Their logic was this: if you want control over the underlying layers, you should not embark on cloud computing, because the whole idea of cloud computing is that someone else is responsible for those layers. But that’s like saying that if you don’t want to buy clothes made by underage children, you should get a sewing machine and make your own clothes.

Others struggled to imagine what such a decoupled cloud would look like in practice. Luckily, also last week (it was indeed an eventful week for cloud computing), a first real-life example of such a decoupled offering went live. Skygone Inc. announced it is offering a choice of GIS (Geographical Information System) services *5 by aggregating solutions from several GIS software vendors across a choice of infrastructure platforms and vendors. Companies in need of such geographical information, which is a complex and specialized area beyond the expertise and interest of most internal IT departments, can now simply source it without locking themselves into a specific vendor or platform. (Disclosure: Skygone uses AppLogic from 3Tera, now owned by my employer CA Technologies, as its underlying platform.)

Several analyst firms predicted early on that this type of “brokering of cloud services” would become an important market force. But in recent months (maybe under the influence of several self-proclaimed 100-pound gorillas entering the cloud market) the analyst community has become very quiet about the concept. That is a shame, because brokering also addresses the fact that in an enterprise context “SaaS does not scale”.

SaaS does not scale?
Now before readers get all wound up (again): by “SaaS does not scale” I do not mean that SaaS applications cannot scale to serve millions of users. They already do, some more successfully and reliably than others. I mean that the average enterprise or government organization, which typically has a portfolio of several hundred or even thousands of applications, simply cannot afford to source these from a similar number of SaaS providers. The mandatory auditing of the infrastructure and processes of all these providers would not be feasible, especially since a leading analyst firm just pointed out that a SAS 70 certificate is no replacement for such mandatory due diligence *6a. They did so at about the same time they suggested that for many a SaaS vendor it would make sense to partner with IaaS vendors for delivery of their services *6b, and that the traditional SaaS market may not grow to be as big as many initially expected *6c (which shows once more that predicting developments and/or placing customer bets in a brand-new area like cloud computing continues to be a risky business).

A better way
Summarizing: the described mix-and-match approach of a decoupled, brokered cloud aims to allow enterprises to select the applications they need from several SaaS vendors, pick the platforms they like from a choice of PaaS vendors, and deploy these across their choice of selected and audited IaaS vendors, without running into lock-in or scalability issues.
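In software terms, this decoupling is the familiar dependency-inversion idea applied to cloud layers: the application programs against an abstract infrastructure interface, so the IaaS vendor can be swapped without touching the layers above. The sketch below is a minimal, hypothetical illustration (the interface and vendor names are invented, not a real broker API):

```python
from abc import ABC, abstractmethod

# Hypothetical abstraction layer: the SaaS/PaaS code depends only on this
# interface, never on a specific IaaS vendor's API.
class InfrastructureProvider(ABC):
    @abstractmethod
    def start_instance(self, image: str) -> str:
        """Start a VM from the given image; return an instance identifier."""

class VendorA(InfrastructureProvider):
    def start_instance(self, image: str) -> str:
        return f"vendor-a:{image}"

class VendorB(InfrastructureProvider):
    def start_instance(self, image: str) -> str:
        return f"vendor-b:{image}"

def deploy_app(image: str, iaas: InfrastructureProvider) -> str:
    # The application deploys the same way regardless of which vendor
    # was selected (and audited) underneath it.
    return iaas.start_instance(image)

# Swapping the infrastructure vendor is a one-argument change, not a rewrite:
print(deploy_app("cooking-app", VendorA()))
print(deploy_app("cooking-app", VendorB()))
```

The broker's job, in this picture, is to maintain the mapping between the abstract interface and the concrete, audited vendors, so the enterprise only audits the broker's shortlist rather than hundreds of individual providers.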

Now it is important to understand that this approach does not in any way, shape or form resemble the old way IT used to work. Let’s use an analogy from the consumer IT market to describe the difference:

  • IT, the old way: As a consumer you would go to a computer store to pick a software package, let’s say a cooking application. From the 20 available offers you pick one (likely the one with the nicest picture on the box), only to arrive home and discover your PC has a release of the operating system / database / browser that is not supported. After fixing this (there goes the weekend), you still cannot get it to run. You solicit some consulting from your neighbor/nephew/colleague, while your spouse remarks that at this rate you will be eating takeout for another month (no pressure!). Finally, during week 3, you get it to work, although printing still has its quirks. You learned a lot more about your PC, but little about cooking. One month later you buy a new PC and strangely the whole thing stops working again. Luckily the vendor sends you an email offering an upgrade that runs on your new PC. Comparing it to the cost of takeout, you decide to buy the upgrade.
  • IT, the new way: You feel hungry. Without leaving your seat you visit the app store on your phone, which offers 60 cooking applications; you pick the one most downloaded (after reading some of the user comments). You prepare your first dish. It is too salty. You blame the application, remove it, and pick another one. That tastes better. You decide whether to use the free version (which includes an automatically printed shopping list for the supermarket chain sponsoring the app) or to pay 20 cents per recipe cooked.

The decoupled cloud experience we are aiming for should of course feel like the second scenario. Also note how in the first example we talked mainly about technology and in the second mainly about cooking. Somehow we in IT moved from talking about what our companies do (selling soup, soap or insurance) to mainly discussing technologies (like SOA, SOAP and yes: Cloud).

In other words, we need to change from being mainly supply-driven, with IT in the role of factory managers running production of services, to a demand-driven IT organization, with IT in the role of a supply chain manager finding the best way to source functionality for the business, preferably without locking the company into a dead-end street. The end goal is being able to deliver the 20% that really differentiates our company, while sourcing the 80% that is pretty much the same for all companies.

That type of agility is the real promise of cloud computing.

This post originally appeared on July 26 at ITSMportal.com


*1 The Principles and Perils of Vendor Lock-In

*2 Microsoft announced it is making Azure available as a PaaS platform to several large IaaS providers

*3 NASA and Rackspace announced they are joining forces around an open source platform for Private Clouds called OpenStack

*4 The US government investments in cloud computing could be seen as a modern day industry stimulus package. In my view current efforts of NASA and the like may have as deep an impact on cloud computing as the cold war DoD budgets had on the development of computer networks and the Apollo project had on technology advancement in general.

*5 Skygone Inc. announced offering a choice of GIS (Geographical Information System) services

*6a SAS 70 is Not Proof of Security, Privacy, or Continuity Compliance

*6b Public Cloud Infrastructure Helps SaaS Vendor Economics

*6c Organizations Need to Re-Evaluate the Rationale for SaaS

More Stories By Gregor Petri

Gregor Petri is a regular expert or keynote speaker at industry events throughout Europe and wrote the cloud primer “Shedding Light on Cloud Computing”. He was also a columnist at ITSM Portal, contributing author to the Dutch “Over Cloud Computing” book, member of the Computable expert panel and his LeanITmanager blog is syndicated across many sites worldwide. Gregor was named by Cloud Computing Journal as one of The Top 100 Bloggers on Cloud Computing.

Follow him on Twitter @GregorPetri or read his blog at blog.gregorpetri.com
