About that ‘Unassailable Economic Argument’ for Public Cloud Computing

Turns out that ‘unassailable’ economic argument for public cloud computing is very assailable

The economic arguments are unassailable. Economies of scale make cloud computing more cost effective than running their own servers for all but the largest organisations. Cloud computing is also a perfect fit for the smart mobile devices that are eating into the PC and laptop market. -- Tim Anderson, “Let the Cloud Developer Wars Begin”

Ah, Tim. The arguments are not unassailable and, in fact, it appears you might be guilty of tunnel vision – seeing only the list price and forgetting to factor in the associated costs that make public cloud computing not so economically attractive in many situations. Yes, on a per-hour basis, per CPU cycle, per byte of RAM, public cloud computing is almost certainly cheaper than any other option. But that doesn’t mean the arguments for cloud computing (which is much more than just cheap compute resources) are economically unassailable. Ignoring for a moment that basing a deployment strategy purely on costs isn’t so clear cut, the variability in bandwidth and storage costs – along with other factors that generate both hard and soft costs for applications – must be considered.

MACRO versus MICRO ECONOMICS

The economic arguments for cloud computing almost always boil down to the competing views of micro versus macro economics. Those in favor of public cloud computing are micro-economic enthusiasts, narrowing in on the cost per cycle or hour of a given resource. But micro-economics don’t work for an application because an application is not an island of functionality; it’s an integrated, dependent component that is part of a larger, macro-economic environment in which other factors impact total costs.

The lack of control over resources in external environments can be problematic for IT organizations seeking to leverage cheaper, commodity resources in public cloud environments. Failing to impose constraints on auto-scaling – as well as defining processes for de-scaling – and the inability to track and manage developer instances launched and left running are certainly two of the more common causes of “cloud sprawl.” Such scenarios can certainly lead to spiraling costs that, while not technically the fault of cloud computing or providers, may engender enough concern in enterprise IT to keep from pushing the “launch” button.

The touted cost savings associated with cloud services didn't pan out for Ernie Neuman, not because the savings weren't real, but because the use of the service got out of hand.

When he worked in IT for the Cole & Weber advertising firm in Seattle two and a half years ago, Neuman enlisted cloud services from a provider called Tier3, but had to bail because the costs quickly overran the budget, a victim of what he calls cloud sprawl - the uncontrolled growth of virtual servers as developers set them up at will, then abandoned them to work on other servers without shutting down the servers they no longer need.

Whereas he expected the developers to use up to 25 virtual servers, the actual number hit 70 or so. "The bills were out of control compared with what the business planned to spend," he says.

-- Unchecked usage can kill cost benefits of cloud services

But these are not the only causes of cost overruns in public cloud computing environments and, in fact, uncontrolled provisioning – whether due to auto-scaling or developers’ forgetfulness – is not peculiar to public cloud; it can be a problem in private cloud computing implementations as well. Without the proper processes and policies – and the right infrastructure and systems to enforce them – cloud sprawl will certainly impact even those large enterprises for whom private cloud is becoming so attractive an option.
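The controls described above – a hard cap on concurrent instances and a sweep for abandoned ones – are simple enough to sketch. The following is a provider-agnostic toy model, not any vendor’s actual API; the cap of 25 and the seven-day idle threshold are illustrative assumptions (the cap mirrors the 25-server plan in the Neuman anecdote):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

MAX_INSTANCES = 25                  # budgeted ceiling (assumed, per the anecdote)
ABANDON_AFTER = timedelta(days=7)   # idle age before an instance is flagged (assumed)

@dataclass
class Instance:
    name: str
    owner: str
    launched: datetime
    last_used: datetime

class SprawlGuard:
    """Toy policy engine: caps concurrent instances and flags abandoned ones."""

    def __init__(self):
        self.instances = []

    def launch(self, name, owner, now):
        # Refuse to provision past the budgeted ceiling instead of
        # silently letting the fleet grow from 25 servers to 70.
        if len(self.instances) >= MAX_INSTANCES:
            raise RuntimeError("instance cap reached - launch requires approval")
        self.instances.append(Instance(name, owner, now, now))

    def abandoned(self, now):
        # Instances nobody has touched recently: candidates for de-scaling.
        return [i for i in self.instances if now - i.last_used > ABANDON_AFTER]
```

In a real deployment the same two checks would sit in front of the provider’s provisioning API and feed a chargeback report keyed on the `owner` tag.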

While it’s vastly more difficult to implement the proper processes and procedures automatically in public than in private cloud computing environments – owing to the lack of maturity of infrastructure services in the public arena – there are other, hotter issues in public cloud that will just as quickly burn up an IT or business budget if not recognized and addressed before deployment. And these are issues public cloud cannot necessarily address even by offering infrastructure services, which makes private cloud all the more attractive.

TRAFFIC SPRAWL

Though not quite technically accurate, we’ll use traffic sprawl to describe increasing amounts of unrelated traffic a cloud-deployed application must process. It’s the extra traffic – the malicious attacks and the leftovers from the last application that occupied an IP address – that the application must field and ultimately reject. This traffic is nothing less than a money pit, burning up CPU cycles and RAM that translate directly into dollars for customers. Every request an application handles – good or bad – costs money.
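How fast does that money pit drain a budget? A back-of-the-envelope model makes it concrete. The per-request and per-gigabyte prices below are purely hypothetical placeholders, not any provider’s real rates:

```python
# Hypothetical metered prices for illustration only - not real provider pricing.
PRICE_PER_MILLION_REQUESTS = 0.50   # dollars per million requests handled
PRICE_PER_GB_TRANSFER = 0.10        # dollars per GB of inbound transfer

def monthly_waste(bad_requests_per_day, avg_request_kb):
    """Dollars per month spent fielding traffic the application only rejects.

    Uses a 30-day month and a decimal KB->GB conversion for simplicity.
    """
    monthly_requests = bad_requests_per_day * 30
    request_cost = monthly_requests / 1_000_000 * PRICE_PER_MILLION_REQUESTS
    transfer_gb = monthly_requests * avg_request_kb / 1_000_000
    return request_cost + transfer_gb * PRICE_PER_GB_TRANSFER

# Example: 2 million malicious requests a day at 2 KB apiece.
monthly_waste(2_000_000, 2)  # -> 42.0 dollars of pure waste per month
```

The absolute numbers are small at these assumed rates, but the waste scales linearly with attack volume – and none of it delivers a cent of business value.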

The traditional answer to preventing unnecessary consumption of server resources by malicious or unwanted traffic is a web application firewall (WAF) plus basic firewalling services. Both do, in fact, prevent that traffic from consuming resources on the server: they reject it, so it is never seen by the application. So far so good. But in a public cloud computing environment you’re also going to pay for the resources those services consume. In other words, you’re paying per hour to process illegitimate and unwanted traffic no matter what. Even if IaaS providers were to offer WAF and firewall services, you’d pay for them, and all the unwanted, malicious traffic that comes your way would still cost you, burning up your budget faster than you can say “technological money pit.”

This is not to say that both types of firewall services are a bad idea in a public cloud environment; they are a valuable resource regardless and should be part and parcel of any dynamic infrastructure. But in a public cloud environment they address only the security issues: they are unlikely to redress cost overruns and may instead help you further along the path to budget burnout.
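The asymmetry can be shown with a toy metering model: when the WAF itself runs on metered public cloud resources, every request – rejected or not – shows up on the bill; behind a fixed-cost firewall in a private environment, rejected traffic never reaches the metered tier. The billing rule here is a simplifying assumption, not a description of any provider’s invoice:

```python
def meter(requests, waf_in_cloud=True):
    """Return (served, billed) request counts under a toy metering model.

    Assumption: in public cloud the WAF runs on metered resources, so all
    requests are billed; behind a fixed-cost private firewall, only traffic
    that passes the screen reaches the metered application tier.
    """
    good = [r for r in requests if not r.get("malicious")]
    billed = len(requests) if waf_in_cloud else len(good)
    return len(good), billed

# Ten requests, three of them malicious:
traffic = [{"malicious": False}] * 7 + [{"malicious": True}] * 3
meter(traffic)                      # -> (7, 10): all ten are billed in-cloud
meter(traffic, waf_in_cloud=False)  # -> (7, 7): rejected traffic never metered
```

The application serves seven requests either way; the only thing that changes is how many you pay for – which is precisely the cost-overrun argument above.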

HYBRID WILL DOMINATE CLOUD COMPUTING

I’ve made the statement before and I’ll make it again: hybrid models will dominate cloud computing, primarily due to issues of control – control over processes, over budgets, and over services. The inability to effectively control traffic at the network layer imposes higher processing and server consumption rates in public environments than in private, controlled environments, even when public resources are leveraged through hybrid architectures enabled by virtual private cloud computing technologies. Traffic sprawl caused by shared IP addresses in public cloud environments is simply not a factor in private – and even hybrid-style – architectures where public resources are never exposed via a publicly accessible IP address. In a well-secured and well-architected private environment, malicious traffic is never processed by applications and servers because firewalls and application firewalls screen it out, preventing it from unnecessarily increasing compute and network resource consumption and thereby effectively expanding the capacity of existing resources. The costs of such technology and controls are shared across the organization and are fixed, leading to better forecasting in budgeting and planning and eliminating the concern that these essential services might themselves cause a budget overrun.

Control over provisioning of resources in private environments is more easily achieved through existing and emerging technology, while public cloud computing environments still struggle to offer even the most rudimentary data center infrastructure services. Without the ability to apply enterprise-class controls and limits to public cloud computing resources, organizations are likely to find that the macro-economic costs of cloud end up negating the benefits initially realized by cheap, easy-to-provision resources. A clear strategy with defined boundaries and processes – both technical and people-related – must be in place before making the leap, lest sprawl overrun budgets and eliminate the micro-economic benefits that public cloud computing could otherwise deliver.
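A minimal example of such a boundary is a pre-launch check that refuses to provision once the forecast monthly run-rate would exceed a set budget. The hourly rate, budget, and 730-hour month below are assumptions for illustration:

```python
HOURS_PER_MONTH = 730  # average hours in a month (assumed for forecasting)

def within_budget(current_instances, hourly_rate, monthly_budget, new_instances=1):
    """Would launching new_instances keep the monthly run-rate under budget?

    A toy pre-provisioning gate of the kind argued for above: the forecast
    is instances x hourly rate x hours, checked before the launch button
    is pushed rather than after the bill arrives.
    """
    forecast = (current_instances + new_instances) * hourly_rate * HOURS_PER_MONTH
    return forecast <= monthly_budget

# At a hypothetical $0.10/hour and a $2,000/month budget:
within_budget(24, 0.10, 2000)  # -> True:  25 instances forecast $1,825/month
within_budget(27, 0.10, 2000)  # -> False: 28 instances forecast $2,044/month
```

Wired into a self-service portal, a gate like this turns the budget from a post-hoc surprise into an enforced provisioning limit – exactly the enterprise-class control the paragraph above finds missing in public cloud.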

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
