ROI: Justifying the Cloud

You can improve TCO by up to 80% by using applications in a public cloud

A classic use of ROI, or its twin TCO, is the Microsoft "Economics of the Cloud" paper (Nov 2010). Its conclusion is that you can improve TCO by up to 80% by running applications in a public cloud versus an on-premises deployment. The basics of the calculation are:

  • improved utilization (from 10% to 90%) enabled by virtualization/consolidation and elasticity
  • the economies of scale (power, operations, hardware purchases, etc.) of multi-tenant, cloud-scale hosting
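
The utilization side of that calculation can be sketched in a few lines; the workload and cost figures below are illustrative assumptions, not numbers from the paper.

```python
# Back-of-the-envelope sketch of the utilization argument behind the
# "up to 80%" TCO claim. All numbers here are illustrative assumptions.

def servers_needed(peak_workload: float, utilization: float) -> float:
    """Infrastructure required to serve a peak workload at a given utilization."""
    return peak_workload / utilization

def infra_cost(peak_workload: float, utilization: float, cost_per_server: float) -> float:
    return servers_needed(peak_workload, utilization) * cost_per_server

peak = 100.0           # arbitrary workload units
cost_per_server = 1.0  # normalized unit cost

on_prem = infra_cost(peak, 0.10, cost_per_server)  # 10% utilized on-premises
cloud = infra_cost(peak, 0.90, cost_per_server)    # 90% utilized in the cloud

saving = 1 - cloud / on_prem
print(f"Infrastructure cost reduction from utilization alone: {saving:.0%}")
```

Since most datacenter costs track the amount of infrastructure deployed, moving utilization from 10% to 90% alone shrinks the footprint by roughly 89%, which is where most of the headline figure comes from before economies of scale are even applied.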

Given that most costs in a datacenter are directly linked to the amount of infrastructure deployed, improving utilization from 10% to 90% sounds like the primary justification for the 80% improvement.

The misuse of this information is never more evident than when Phil Wainewright writes that the strength of this "research" is enough to put the nail in the coffin of the concept of private cloud. Definitive words indeed. The problem I have with this conclusion is that it is a black-and-white, monolithic view of "what is cloud," combined with TCO/ROI modeling that uses some pretty broad assumptions to underpin the cost model. Polarized views of an issue often make good marketing or publicity, but they do not provide a real-world, executable decision-making capability (read "value") for future consumers of cloud services, public or private. So why do people need to kill the concept of the "private cloud"? James Urquhart tweeted it best (4/12/2011):

  • "I don't hear ANYONE who isn't a public cloud provider or SI who bet the farm on public cloud make any claims about "false clouds". Period."
  • "Oh, wait. There may be one or two startups and journalists in there...all of which stand to gain from public cloud. Sorry about that. :\ "

If you take that approach and, as a result, just build a "cloud" for the sake of "cloud," then you are making the BIG mistake: implementing a framework as a product is doomed to fail. Implement SOA this way and you got disaster; ITIL would equal chaos; PRINCE2 would create inertia; Web 2.0 would have resulted in mega-$$$. These concepts, whether architectural, process or otherwise, are meant to guide execution, not to be implemented blindly.

So how should ROI be used? When people ask "What is the ROI of the cloud?" the issue is not "What is an ROI?" or "What is the benefit of cloud?" or even "What data goes into an ROI calculation?" It is about answering why, what and how to adopt the cloud. Most cloud ROI (return-on-investment) or TCO (total cost of ownership) discussions are like the whitepaper from Microsoft: comparing, side by side, a complex cloud deployment with a traditional infrastructure deployment. In reality, it is too difficult to develop a model that captures "true total cost of ownership"; you quickly have to make broad assumptions and narrow the scope to keep it manageable. If you start your model as a greenfield cloud deployment, it becomes radically inaccurate when you try to apply it to brownfield or legacy enterprises. Start instead from the data of a legacy deployment and you have huge problems dealing with the depreciation of assets. Brownfield models also have the challenge of dealing with the elasticity of the delivery assets and with opportunity costs; for example, the same team may be able to manage 100 or 150 servers, and your existing 20%-utilized asset may support 2x or perhaps as much as 10x the workload. Overlay this with the changing economics of real estate, HVAC and compute, and you end up with a model that can have error factors upward of 100%. It is too complex a problem to solve without a huge dataset to validate the variables, dependencies and so on.
Armada takes a Fast Track approach to solving the problem. You look at cloud as a reference framework that helps you develop a solution that returns business value, and you calculate ROI based on a specific situation and end-state solution. An ROI needs to have a payback of less than a year, so long-term theoretical modeling has no significant value. So how do you do it? Remember three things:

  • You must have a triggering event
  • Use scenario analysis and not lifecycle modeling
  • Apply the 80/20 rule to data, and use only the data that impacts your costs

Triggering Event
Most of the time, being a technical architect in consulting draws looks of skepticism from engineers at enterprise customers. Fair enough; when I was in that seat I felt the same way. When I gave up internal politics for the politics of "revenue/pipeline," "everyone is a salesperson" and "whitepapers and webinars," a few things became crystal clear. The most important: don't waste time doing anything unless there is a pain point, a problem to solve, a triggering event. Wants are good, but needs are better. This matters in ROI calculation because the triggering event is the anchor point for the evaluation and defines where you are looking for the biggest "return" in the ROI. The triggering event can be something specific like:

  • "we will run out of datacenter space in 6 months"
  • "it takes us 6 months to deploy an environment"
  • "we are on CNN if our primary datacenter fails because we have no DR"

Alternatively, it can be softer, described as a business goal or business driver, like:

  • "we need to reduce operational management costs"
  • "we need to improve our infrastructure utilization"

These things are scoping statements for the project, and the ROI is then applied to the return for that project.

Scenario Analysis
You scope the project, but if you try to calculate the return based on lifecycle costs over the long term, you will be scratching your head forever. If the payback is not within 1-3 years, the project probably will not happen; most likely it needs to be less than a year. Scenario analysis is fairly simple, but a little time-consuming. It is, however, a step in the direction of implementation, rather than a detour into developing a business case that will never be used or validated later. You create three (3) scenarios:

  • Business as usual - sometimes this is the "no decision, decision" or just solve the problem the way you have in the past
  • Option 1 - the "go big or go home" scenario, build the pure play cloud solution
  • Option 2 - the "pragmatic solution", or sometimes called the cheap solution. This is often the winner, but generally can be folded into option 1 after a subsequent round of funding.

Gather the requirements, design the end-state architectures for the three options, and price out the implementation and ongoing costs. You are already starting the design, so when the solution is green-lighted, you are ready to go.
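
The comparison across the three scenarios can be sketched as a simple payback calculation; the scenario names follow the list above, but all cost figures are hypothetical.

```python
# Minimal sketch of comparing the three scenarios on up-front cost,
# ongoing run cost, and payback period. All figures are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Scenario:
    name: str
    implementation_cost: float  # one-time cost to build
    annual_run_cost: float      # ongoing operations cost per year

def payback_months(option: Scenario, baseline: Scenario) -> Optional[float]:
    """Months until the option's annual savings repay its extra up-front cost."""
    annual_saving = baseline.annual_run_cost - option.annual_run_cost
    if annual_saving <= 0:
        return None  # never pays back against business-as-usual
    extra_capex = option.implementation_cost - baseline.implementation_cost
    return max(extra_capex, 0.0) / annual_saving * 12

business_as_usual = Scenario("Business as usual", 100_000, 500_000)
go_big = Scenario("Option 1: pure-play cloud", 900_000, 250_000)
pragmatic = Scenario("Option 2: pragmatic", 300_000, 350_000)

for option in (go_big, pragmatic):
    months = payback_months(option, business_as_usual)
    print(f"{option.name}: payback in {months:.1f} months")
```

With numbers like these the pragmatic option pays back soonest, which is consistent with it often being the winner, while the pure-play build can be folded in after a later round of funding.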

80/20 Rule of Data
A basic premise of the Fast Track method is to make decisions based on readily available information. Creating data and models takes time and effort for little return, and in the time it takes, IT services are evolving and changing. So when collecting data for an ROI analysis, use what is available, don't over-process it, and limit yourself to the data that impacts your business. From Gartner and other models we know that the biggest contributors to ROI/TCO are:

  • Hardware Costs (storage, compute, network)
  • Hardware Maintenance/Support
  • Software Licenses (application licenses, tool licenses, etc.)
  • Software Maintenance/Support
  • Management & Operations (people, benefits, etc.)
  • Facilities (real estate, HVAC, security, etc.)
  • Development/Customization/System Integration
  • Opportunity Cost (increased costs in existing infrastructure as you reduce its scale)

Focus on capturing this information to support the scope of your project. If your project is not looking for value in reducing power costs, then don't include power in the model; just deliver the value you have visibility and control over. You should try to be as complete as possible without creating an environment of political inertia. With this approach it is easy to capture a return-on-investment (ROI) calculation. I should add that David Linthicum wrote a very relevant post reinforcing that ROI alone does not make a business case. You also need to include the soft value factors, which for the cloud revolve around agility and time-to-market: hard to define or put a value on, but critical to the final assessment.
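Pulling the two ideas together, an in-scope tally of the categories above against a proposed end state gives the hard-dollar half of the case; the category names mirror the list, but every figure below is a hypothetical placeholder.

```python
# Hypothetical tally of the cost categories above, applying the 80/20 rule:
# categories outside the project's scope (e.g. facilities here) are left out.

IN_SCOPE = [
    "hardware", "hardware_support", "software_license",
    "software_support", "management_operations", "opportunity_cost",
]

def annual_total(costs: dict) -> float:
    # Count only in-scope categories; silently ignore anything else supplied.
    return sum(costs.get(category, 0.0) for category in IN_SCOPE)

current = {  # annual run cost of the legacy deployment (illustrative)
    "hardware": 400_000, "hardware_support": 80_000,
    "software_license": 150_000, "software_support": 30_000,
    "management_operations": 300_000,
}
proposed = {  # annual run cost of the end-state solution (illustrative)
    "hardware": 150_000, "hardware_support": 30_000,
    "software_license": 150_000, "software_support": 30_000,
    "management_operations": 200_000, "opportunity_cost": 40_000,
}

investment = 250_000  # one-time migration / integration cost
annual_saving = annual_total(current) - annual_total(proposed)
roi = (annual_saving - investment) / investment  # simple first-year ROI

print(f"Annual saving: ${annual_saving:,.0f}")
print(f"First-year ROI: {roi:.0%}")
```

The number this produces is only the hard-dollar return; the soft factors, agility and time-to-market, still have to be argued alongside it to make the business case.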

More Stories By Brad Vaughan

Brad Vaughan is a twenty-year veteran consultant who works with companies around the globe to transform technology infrastructure to deliver enhanced business services.
