By Brad Vaughan
April 16, 2011 11:45 AM EDT
A classic use of ROI, or its twin TCO, is the Microsoft "Economics of the Cloud" paper from November 2010. Its conclusion is that you can improve TCO by up to 80% by running applications in the public cloud versus an on-premise deployment. The basics of the calculation are:
- improved utilization (from 10% to 90%) enabled by virtualization/consolidation and elasticity
- the economies of scale (power, operations, hardware purchase, etc.) of multi-tenant, cloud-scale hosting
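The utilization arithmetic behind that first bullet can be sketched in a few lines. This is an illustrative model, not the paper's actual method: it assumes infrastructure cost scales linearly with server count and that a fixed workload simply runs at a higher average utilization; the workload and capacity figures are placeholders.

```python
# Hedged sketch: if cost scales with server count, raising average
# utilization on a fixed workload shrinks the fleet (and cost) proportionally.
# All numbers below are illustrative, not taken from the Microsoft paper.

def servers_needed(workload_units: float, capacity_per_server: float,
                   target_utilization: float) -> float:
    """Servers required to run a workload at a given average utilization."""
    return workload_units / (capacity_per_server * target_utilization)

on_prem = servers_needed(workload_units=100, capacity_per_server=10,
                         target_utilization=0.10)   # 10% utilized
cloud   = servers_needed(workload_units=100, capacity_per_server=10,
                         target_utilization=0.90)   # 90% utilized

savings = 1 - cloud / on_prem
print(f"on-prem servers: {on_prem:.0f}, cloud servers: {cloud:.1f}")
print(f"infrastructure reduction: {savings:.0%}")   # ~89%
```

Note that the utilization jump alone gets you close to the headline number, which is exactly why the broad assumptions behind it deserve scrutiny.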
Given that most costs in a datacenter are directly linked to the amount of infrastructure deployed, improving utilization from 10% to 90% sounds like the primary justification for the 80% improvement.

The misuse of this information is never more evident than when Phil Wainewright writes that the strength of this "research" is enough to put the nail in the coffin of the concept of private cloud. Definitive words indeed. The problem I have with this conclusion is that it is a black-and-white, monolithic view of "what is cloud," combined with TCO/ROI modeling that rests on some pretty broad assumptions. Polarized views of an issue make for good marketing and publicity, but they do not give future consumers of cloud services (public or private) a real-world, executable decision-making capability (read "value").

So why do people need to kill the concept of the "private cloud"? James Urquhart tweeted it best (4/12/2011):
- "I don't hear ANYONE who isn't a public cloud provider or SI who bet the farm on public cloud make any claims about "false clouds". Period."
- "Oh, wait. There may be one or two startups and journalists in there...all of which stand to gain from public cloud. Sorry about that. :\ "
If you take that approach and, as a result, just build a "cloud" for the sake of "cloud," then you are making the BIG mistake. Implementing a framework as a product is doomed to fail. If you had implemented SOA this way it would have been a disaster, ITIL would equal chaos, Prince2 would create inertia, and Web 2.0 would have burned mega-$$$. These concepts, whether architectural, process, or otherwise, are meant to guide execution, not be implemented blindly.

So how should ROI be used? When people ask "What is the ROI of the cloud?" it is not a question of "What is an ROI?" or "What is the benefit of cloud?" or even "What data goes into an ROI calculation?" It is about answering why, what, and how to adopt the cloud. Most Cloud ROI (return-on-investment) or TCO (total cost of ownership) discussions are like the Microsoft whitepaper: a side-by-side comparison of a complex cloud deployment with a traditional infrastructure deployment.

In reality, it is too difficult to build a model that captures the "true total cost of ownership"; you quickly have to fall back on broad assumptions and narrow the scope to make it manageable. If you start your model as a greenfield cloud deployment, it becomes radically inaccurate when you try to apply it to brownfield or legacy enterprises. Start instead from data on a legacy deployment, and you have huge problems dealing with the depreciation of assets. Brownfield models also have to deal with the elasticity of the delivery assets and with opportunity costs: the same team can manage 100 or 150 servers, and your existing 20%-utilized asset might support 2x, or maybe as much as 10x, the workload. Overlay this with the changing economics of real estate, facilities, HVAC, and compute, and you end up with a model whose error factors can run upward of 100%. It is too complex a problem to solve without a huge dataset to validate the variables, dependencies, and so on.
Armada takes a Fast Track approach to solving the problem. You look at cloud as a reference framework to help develop a solution that returns the business value, and you calculate ROI for a specific situation and end-state solution. An ROI needs a payback of less than a year, so long-term theoretical modeling has no significant value. So how do you do it? Remember three things:
- You must have a triggering event
- Use scenario analysis and not lifecycle modeling
- Apply the 80/20 rule to data, and use only the data that impacts your costs
Being a technical architect in consulting often draws looks of skepticism from engineers at enterprise customers. Fair enough; when I was in that seat I felt the same way. When I traded internal politics for the politics of "revenue/pipeline," "everyone is a salesperson," and "whitepapers and webinars," a few things became crystal clear. The most important: don't waste time doing anything unless there is a pain point, a problem to solve, a triggering event. Wants are good, but needs are better. This matters for ROI calculation. The triggering event is the anchor point for the evaluation and defines where you are looking for the biggest "return" in the ROI. The triggering event can be something specific, like:
- "we will run out of datacenter space in 6 months"
- "it takes us 6 months to deploy an environment"
- "we are on CNN if our primary datacenter fails because we have no DR"
Alternatively, it can be softer, described as a business goal or business driver, like:
- "we need to reduce operational management costs"
- "we need to improve our infrastructure utilization"
These statements scope the project; the ROI is then calculated on the return for that project.
You scope the project, but if you try to calculate the return based on lifecycle costs over the long term, you will be scratching your head forever. If the payback is not within 1-3 years, you are probably not doing it; most likely it needs to be less than a year. Scenario analysis is fairly simple, though a little time-consuming. It is, however, a step in the direction of implementation, rather than a detour into developing a business case that will never be used or validated later. You create three scenarios:
- Business as usual - sometimes this is the "no decision, decision" or just solve the problem the way you have in the past
- Option 1 - the "go big or go home" scenario, build the pure play cloud solution
- Option 2 - the "pragmatic solution", or sometimes called the cheap solution. This is often the winner, but generally can be folded into option 1 after a subsequent round of funding.
Gather the requirements, design the end-state architectures for the three options, and price out the implementation and ongoing costs. You are already starting the design, so when the solution is green-lighted, you are ready to go.
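The three-scenario comparison can be reduced to a very small spreadsheet, or to the sketch below. The scenario names follow the list above, but every dollar figure is a hypothetical placeholder; payback is computed against the business-as-usual run rate, consistent with the less-than-a-year rule.

```python
# Illustrative three-scenario comparison with payback against the
# business-as-usual baseline. All figures are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    implementation_cost: float  # one-off ($)
    annual_run_cost: float      # ongoing ($/yr)

scenarios = [
    Scenario("Business as usual",        0,        1_200_000),
    Scenario("Option 1: pure-play cloud", 900_000,   500_000),
    Scenario("Option 2: pragmatic",       250_000,   800_000),
]

baseline = scenarios[0]
for s in scenarios[1:]:
    annual_saving = baseline.annual_run_cost - s.annual_run_cost
    payback_months = 12 * s.implementation_cost / annual_saving
    print(f"{s.name}: saves ${annual_saving:,.0f}/yr, "
          f"payback {payback_months:.1f} months")
```

With these placeholder numbers the pragmatic option pays back within the year while the pure-play build does not, which mirrors the pattern described above: Option 2 often wins first, then folds into Option 1 after a later round of funding.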
80/20 Rule of Data
A basic premise of the Fast Track method is to make decisions based on readily available information. Creating data and models takes time and effort for little return, and while you do it, IT services keep evolving and changing. So when collecting data for an ROI analysis, use what is available, don't over-process it, and limit yourself to the data that impacts your business. From Gartner and other models we know the biggest contributors to ROI/TCO are:
- Hardware costs (storage, compute, network)
- Hardware maintenance/support
- Software licenses (application licenses, tool licenses, etc.)
- Software maintenance/support
- Management and operations (people, benefits, etc.)
- Facilities (real estate, HVAC, security, etc.)
- Development/customization/system integration
- Opportunity cost (increased costs in existing infrastructure from reducing its scale)
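A minimal roll-up over these categories might look like the sketch below. The dollar values are placeholders, and the filter illustrates the scoping rule that follows: if a category (here, facilities/power) is outside your project's scope, drop it from the model rather than estimating it badly.

```python
# Hypothetical annual cost figures for the Gartner-style categories above.
annual_costs = {
    "hardware":          300_000,
    "hardware_support":   45_000,
    "software_license":  200_000,
    "software_support":   40_000,
    "management_ops":    500_000,
    "facilities":        150_000,
    "development_si":    120_000,
    "opportunity_cost":   60_000,
}

# Per the 80/20 rule, drop out-of-scope categories instead of guessing at
# them -- e.g. facilities, if power savings are not part of this project.
in_scope = {k: v for k, v in annual_costs.items() if k != "facilities"}

total = sum(in_scope.values())
for item, cost in sorted(in_scope.items(), key=lambda kv: -kv[1]):
    print(f"{item:18s} ${cost:>9,.0f}  ({cost / total:.0%})")
print(f"{'total':18s} ${total:>9,.0f}")
```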
Focus on capturing this information to support the scope of your project. If your project is not looking for value in reducing power costs, don't include power in the model; deliver only the value you have visibility and control over. Try to be as complete as possible without creating an environment of political inertia. With this approach, capturing a return-on-investment calculation is easy. I should add that David Linthicum wrote a very relevant post reinforcing that ROI alone does not make a business case. You also need to include the soft value factors, which for the cloud revolve around agility and time-to-market. They are hard to define or put a value on, but critical to the final assessment.