
@CloudExpo: Blog Feed Post

On Cloud Lock-in, Standards, Decoupling and Why SaaS Does Not Scale

How the three-tier cloud as we know it is changing

With security and legal concerns slowly being addressed by the industry, lock-in and standards are rapidly becoming the biggest concerns regarding cloud computing. If the cloud industry is to make good on its promise, these concerns will somehow need to be addressed. Let's examine some recent developments.

It was interesting to see how, just a week after my blog on "The Principles and Perils of Vendor Lock-in" *1, several vendors made announcements seemingly supporting my suggested approach. For example, after hinting at the potential benefit of decoupling SaaS and PaaS from the underlying infrastructure (IaaS) layer, Microsoft announced it is making Azure available as a PaaS platform to several large IaaS providers *2. Now I am sure this had nothing to do with my blog on preventing lock-in and everything to do with a desire in Redmond to increase market share for its PaaS platform, which ironically, if too successful, may even increase lock-in. But the move will offer customers who select Microsoft's PaaS platform a choice of vendors for the underlying infrastructure services (IaaS).

At the same time, NASA and Rackspace announced they are joining forces around an open source platform for private clouds called OpenStack *3. Rackspace's initiative is no doubt as commercially motivated as Microsoft's. If Rackspace expects, in my view correctly, that many private clouds in the foreseeable future will start to source additional capacity (cloudburst) from public clouds, then having these private clouds based on the same architecture as its public cloud offering will help Rackspace. NASA's motives stem from the US government's cloud stimulus approach *4, a specific stated goal of which is "to accelerate the creation of cloud standards". If history is to repeat itself, we can expect industry standards to first lead to a "plug compatible cloud market" before a serious "open standards cloud market" takes shape. As NASA is determined to have workable cloud standards a lot faster than the decade or so it took to get a man on the moon, it is understandable that it sees the Rackspace route as a viable shortcut. This is also understandable because agreeing on open cloud standards today would be as difficult as agreeing on 3D TV standards back in the days of the black & white moon landing broadcasts. (And, reader beware, if there ever was a time to keep options open and not lock yourself into what looks to become an early standard, it would be today.)
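The cloudbursting pattern mentioned above can be sketched in a few lines: a private cloud serves requests from its own capacity and only "bursts" to a public provider when that capacity is exhausted. This is a minimal illustrative sketch, not any vendor's actual API; all names and numbers are hypothetical.

```python
# Minimal sketch of the "cloudburst" pattern: prefer the private cloud,
# fall back to a public provider only when local capacity is exhausted.
from dataclasses import dataclass


@dataclass
class Pool:
    name: str
    capacity: int   # total instances this pool can run
    running: int = 0  # instances currently in use

    def has_room(self) -> bool:
        return self.running < self.capacity

    def provision(self) -> str:
        self.running += 1
        return f"instance on {self.name}"


def provision_instance(private: Pool, public: Pool) -> str:
    """Place a workload on the private pool; burst to public when full."""
    if private.has_room():
        return private.provision()
    if public.has_room():
        return public.provision()
    raise RuntimeError("no capacity in either pool")


private = Pool("private-openstack", capacity=2)
public = Pool("public-provider", capacity=10)

# The first two workloads land privately; the rest burst to the public cloud.
placements = [provision_instance(private, public) for _ in range(4)]
```

The point of sharing an architecture across private and public clouds (as Rackspace anticipates) is that the "burst" step becomes a placement decision rather than a porting exercise.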

A decoupled cloud
For many readers my recommendation to prevent lock-in by decoupling the choice of application (SaaS) and platform (PaaS) vendors from the underlying choice of infrastructure (IaaS) vendors was pure heresy. Their logic was this: if you want control over the underlying layers, you should not embark on cloud computing, because the whole idea of cloud computing is that someone else is responsible for those layers. But that's like saying that if you don't want to buy clothes made by underage children, you should get a sewing machine and make your own clothes.

Others struggled with imagining what such a decoupled cloud would look like in practice. Luckily, also last week (it was indeed an eventful week for cloud computing), a first real-life example of such a decoupled offering went live. Skygone Inc. announced it is offering a choice of GIS (Geographical Information System) services *5 by aggregating solutions from several GIS software vendors across a choice of infrastructure platforms and vendors. Companies in need of such geographical information - which is a complex and specialized area, beyond the expertise and interest of most internal IT departments - can now simply source this without locking themselves into a specific vendor or platform. (Disclosure: Skygone uses AppLogic from 3Tera, now owned by my employer CA Technologies, as its underlying platform.)

Several analyst firms predicted early on that this type of "brokering of cloud services" would become an important market force. But in recent months - maybe under the influence of several self-proclaimed 800-pound gorillas entering the cloud market - the analyst community became very quiet about the concept. That is a shame, because the concept also addresses the fact that in an enterprise context "SaaS does not scale".

SaaS does not scale?
Now before readers get all wound up (again): with "SaaS does not scale" I do not mean that SaaS applications cannot scale to serve millions of users. They do already, although some more successfully and reliably than others. I mean that the average enterprise or government organization, which typically has a portfolio of several hundred or even thousands of applications, simply cannot afford to source these from a similar number of SaaS providers. The mandatory auditing of the infrastructure and processes of all these providers would simply not be feasible, especially as a leading analyst firm just pointed out that a SAS 70 certificate is no replacement for such mandatory due diligence *6a. They did so at about the same time they suggested that for many a SaaS vendor it would make sense to partner with IaaS vendors for delivery of their services *6b, and that the traditional SaaS market may not grow to be as big as many initially expected *6c (which shows once more that predicting developments and/or placing customer bets in a brand new area like cloud computing continues to be a risky business).

A better way
Summarizing: the described mix-and-match approach of a decoupled, brokered cloud aims to allow enterprises to select the applications they need from several SaaS vendors, pick the platforms they like from a choice of PaaS vendors, and deploy these across their choice of selected and audited IaaS vendors, without running into lock-in or scalability issues.
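The mix-and-match idea can be made concrete with a small sketch: each layer (SaaS, PaaS, IaaS) is chosen independently from a catalog of pre-audited vendors, so swapping one layer never forces a change in the others. This is a hypothetical illustration; the catalog entries and function names are invented for the example, not real broker APIs.

```python
# Sketch of a decoupled, brokered cloud: each layer is chosen
# independently from a catalog of audited vendors (all names hypothetical).
AUDITED_CATALOG = {
    "SaaS": {"gis-app-a", "gis-app-b", "crm-app"},
    "PaaS": {"paas-x", "paas-y"},
    "IaaS": {"iaas-east", "iaas-west"},
}


def broker_stack(saas: str, paas: str, iaas: str) -> dict:
    """Validate an independently chosen vendor per layer against the catalog."""
    chosen = {"SaaS": saas, "PaaS": paas, "IaaS": iaas}
    for layer, vendor in chosen.items():
        if vendor not in AUDITED_CATALOG[layer]:
            raise ValueError(f"{vendor} is not an audited {layer} vendor")
    return chosen


stack = broker_stack("gis-app-a", "paas-x", "iaas-east")
# Swapping the IaaS layer requires no change to the SaaS or PaaS choice:
alt = broker_stack("gis-app-a", "paas-x", "iaas-west")
```

The broker's value in this model is exactly the audit step: the enterprise performs due diligence once per catalog entry instead of once per application, which is what makes sourcing hundreds of applications feasible.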

Now it is important to understand that this approach does not in any way, shape or form resemble the old way IT used to work. Let’s use an analogy from the consumer IT market to describe the difference:

  • IT, the old way: As a consumer you would go to a computer store to pick a software package, let's say a cooking application. From the 20 available offerings you pick one (likely the one with the nicest picture on the box), only to arrive home and discover your PC has a release of the operating system/database/browser that is not supported. After fixing this (there goes the weekend), you still cannot get it to run. You solicit some consulting from your neighbor/nephew/colleague, while your spouse remarks that at this rate you will be eating takeout for another month (no pressure!). Finally, during week 3, you get it to work, although printing still has its quirks. You learned a lot more about your PC, but little about cooking. One month later you buy a new PC and, strangely, the whole thing stops working again. Luckily the vendor sends you an email offering an upgrade that runs on your new PC. Comparing it to the cost of takeout, you decide to buy the upgrade.
  • IT, the new way: You feel hungry. Without leaving your seat you visit the app store on your phone, which offers 60 cooking applications. You pick the most downloaded one (after reading some of the user comments) and prepare your first dish. It is too salty. You blame the application, remove it, and pick another one. That tastes better. You decide whether to use the free version (which includes an automatically printed shopping list for the supermarket chain sponsoring the app) or to pay 20 cents per recipe cooked.

The decoupled cloud experience we are aiming for should of course feel like the second scenario. Also note how in the first example we talked mainly about technology and in the second mainly about cooking. Somehow we in IT moved from talking about what our companies do (selling soup, soap or insurance) to mainly discussing technologies (like SOA, SOAP and yes: Cloud).

In other words, we need to change from being mainly supply-driven, with IT in the role of factory managers running the production of services, to a demand-driven IT organization, with IT in the role of a supply chain manager finding the best way to source functionality for the business, preferably without locking the company into a dead-end street. The end goal is being able to deliver the 20% that really differentiates our company, while sourcing the 80% that is pretty much the same for all companies.

That type of agility is the real promise of cloud computing.

This post originally appeared on July 26 at ITSMportal.com


*1 The Principles and Perils of Vendor Lock-In

*2 Microsoft announced it is making Azure available as a PaaS platform to several large IaaS providers

*3 NASA and Rackspace announced they are joining forces around an open source platform for Private Clouds called OpenStack

*4 The US government investments in cloud computing could be seen as a modern day industry stimulus package. In my view current efforts of NASA and the like may have as deep an impact on cloud computing as the cold war DoD budgets had on the development of computer networks and the Apollo project had on technology advancement in general.

*5 Skygone Inc. announced offering a choice of GIS (Geographical Information System) services

*6a SAS 70 is Not Proof of Security, Privacy, or Continuity Compliance

*6b Public Cloud Infrastructure Helps SaaS Vendor Economics

*6c Organizations Need to Re-Evaluate the Rationale for SaaS

More Stories By Gregor Petri

Gregor Petri is a regular expert or keynote speaker at industry events throughout Europe and wrote the cloud primer “Shedding Light on Cloud Computing”. He was also a columnist at ITSM Portal, contributing author to the Dutch “Over Cloud Computing” book, member of the Computable expert panel and his LeanITmanager blog is syndicated across many sites worldwide. Gregor was named by Cloud Computing Journal as one of The Top 100 Bloggers on Cloud Computing.

Follow him on Twitter @GregorPetri or read his blog at blog.gregorpetri.com
