The Evolution of Cloud Computing


Definitions of cloud computing are easy to find, but a single, authoritative definition is hard to come by. Perhaps the best work in this area was done by Böhm et al., who compiled 17 different scholarly and industrial definitions and distilled from them five primary characteristics of cloud computing, allowing a definition such as: "Cloud computing is a service that delivers scalable hardware and/or software solutions via the Internet or other network on a pay-per-usage basis." (The essential elements: a service, scalable, hardware and/or software, delivery over a network, pay-per-usage.)

Cloud computing can further be broken down into three common types: SaaS, PaaS, and IaaS. SaaS (Software as a Service) lets users log into and use ready-made software that is owned and maintained by the service provider. PaaS (Platform as a Service) gives users tools and languages, owned and maintained by the service provider, for building and deploying customized applications. IaaS (Infrastructure as a Service) supplies storage and processing, giving users full control over how that infrastructure is used. There are other divisions of cloud computing, but these are the most common.
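
To make the IaaS model concrete, here is a minimal sketch (my own illustration, not from the article) of what "full control over infrastructure" looks like in practice, using AWS's boto3 Python SDK; the machine image ID is a placeholder and configured credentials are assumed:

    import boto3

    # IaaS in practice: request raw compute from a provider's API,
    # then take full control of what runs on the resulting server.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder machine image
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = response["Instances"][0]["InstanceId"]
    # From here on the user, not the provider, decides what the machine
    # does; the provider supplies only the storage and processing.
    print(f"Provisioned {instance_id}")

The same pay-per-usage idea applies up the stack: with PaaS you would deploy code instead of requesting machines, and with SaaS you would simply log in.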

Conceptual Origins of Cloud Computing
Looking back, cloud computing seems to have been the end goal of many computer pioneers in the 1960s, or at least of the early experiments that would eventually become the Internet.

Three figures are commonly cited as laying the conceptual framework for cloud computing: John McCarthy, J.C.R. Licklider, and Douglas F. Parkhill.

McCarthy first proposed in 1957 that time-sharing of computing resources might allow companies to sell their excess computation, maximizing utilization of the resource. He even imagined that computation might one day be organized as a public utility.

Licklider, who headed the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA), highlighted some of the promise and challenges of cloud computing in a 1963 memo addressed to the "Members and Affiliates of the Intergalactic Computer Network." Specifically, he described the ability to send a problem to a network of computers that could pool their resources to solve it, and the need for a shared language that would let those computers talk to one another.

In 1966, Parkhill published "The Challenge of the Computer Utility," which identified many of the challenges facing cloud computing, such as scalability and the need for high-bandwidth connections. He also drew an extended comparison with electric utilities.

Why We Are in Cloud Computing Time
If cloud computing has been around for so long conceptually, why does it seem like a revolutionary idea at all? Because only now are we in cloud computing time.

Science fiction scholars commonly use the shorthand "steam engine time" for the phenomenon in which an idea pops up repeatedly but doesn't catch on for many years. They point out that the Romans knew what steam engines were and could build them, but it took another 1,600 years for the technology to come to fruition. The world just wasn't ready for steam engines. The same is true of cloud computing.

Three elements had to be in place before cloud computing could become a reality: very large datacenters, high-speed Internet connectivity, and acceptance of cloud computing as a viable model for supplying IT needs.

Very large datacenters are a crucial piece of the foundation. To offer cloud services at a competitive price, suppliers need datacenters large enough to capture economies of scale, which can cut costs 80-86% relative to the medium-sized datacenters many companies previously ran. Many of these facilities were originally built for internal use by companies that would later become cloud computing providers, such as Amazon, Google, and Microsoft.
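
As a back-of-the-envelope check (my own arithmetic, not a figure from the article), an 80-86% reduction is what a roughly five- to sevenfold unit-cost advantage implies, since a k-fold advantage cuts costs by 1 - 1/k:

    # Percentage savings implied by a k-fold unit-cost advantage: 1 - 1/k
    for k in (5.0, 7.0):
        print(f"{k:.0f}x lower unit cost -> {1 - 1/k:.0%} savings")
    # 5x lower unit cost -> 80% savings
    # 7x lower unit cost -> 86% savings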

Almost universal access to high-speed Internet connectivity is equally crucial. If data is bottlenecked getting to and from the cloud, the cloud simply can't be a practical solution for day-to-day IT needs.

Finally, potential users have to see cloud computing as a viable solution for their IT needs. People need to be able to trust that some ethereal company will meet their urgent IT needs on a daily basis. This cultural work was done by many disparate influences, from massively multiplayer online games to Google, which expanded acceptance of online resources beyond the IT community. Another crucial but oft-neglected part of this cultural work was performed by peer-to-peer computing, which introduced many people to the notion that they could use the resources of other computers via the Internet.

Cloud Computing Timeline: Who, When, and Why
Many good cloud computing timelines exist (several are linked in my resources section), but a basic timeline is still worth giving to show the evolution of cloud computing service offerings:

  • 1999: Salesforce launches its SaaS enterprise applications
  • 2002: Amazon launches Amazon Web Services (AWS), which offers both artificial and human intelligence for problem-solving via the Internet
  • 2006: Google launches Google Docs, a free, web-based competitor to Microsoft Office
  • 2006: Amazon launches Elastic Compute Cloud (EC2) and Simple Storage Service (S3), sometimes described as the first IaaS
  • 2007: Salesforce launches Force.com, often described as the first PaaS
  • 2008: Google launches Google App Engine
  • 2009: Microsoft launches Windows Azure

Armbrust et al. identify many motives that drive companies to launch cloud computing services, including:

  • Profit: By taking advantage of cost savings from very large datacenters, companies can underbid competitors and still make significant profit
  • Leverage existing investment: For example, many of the applications in AWS were developed for internal use first, then sold in slightly altered form for additional revenue
  • Defend a franchise: Microsoft launched Windows Azure to help maintain competitiveness of the Windows brand
  • Attack a competitor: Google Docs was launched partly as an attack on Microsoft's profitable Office products
  • Leverage customer relationships: Windows Azure gives existing clients a branded cloud service that plays up the perceived reliability of the brand, with messaging that constantly emphasizes a "rock-solid" service

These are the motives that bring competitors to offer cloud computing services. But what drives companies and individuals to adopt cloud computing, and what barriers still stand in the way of full cloud implementation?

The Cloud Computing Market: Where It's At, and Where It's Going
According to a study by IT trade group CompTIA, up to 80% of businesses use some form of cloud computing, although the degree of use varies widely. IBM's research shows that while only 8% of businesses believe cloud computing currently has a significant impact on their business, that share is expected to grow to more than 30% over the next three years.

Cloud computing is often sold on price, but according to recent surveys the primary benefit companies seek from it is flexibility. With the huge swings caused by viral phenomena on the Internet, companies can see demand for their sites and services fluctuate wildly in a short period of time. Cloud computing gives them the flexibility to purchase computing resources on demand. A more conventional benefit of this flexibility is the ability to avoid hiring and firing IT personnel for short-term projects.
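
As an illustration of that elasticity (my own sketch, not from the article), the on-demand purchasing described above is often automated: a simple rule maps observed demand to a fleet size and asks the provider for it. The sizing rule and group name below are hypothetical placeholders; the scaling call uses AWS's boto3 SDK as one concrete example:

    import boto3

    def scale_to_demand(requests_per_second: int, group: str = "web-tier") -> int:
        """Map observed demand to a fleet size and apply it.

        Hypothetical rule: one instance per 500 req/s, kept between
        2 and 40 instances to bound both cost and risk.
        """
        desired = max(2, min(40, requests_per_second // 500 + 1))
        boto3.client("autoscaling").set_desired_capacity(
            AutoScalingGroupName=group,  # placeholder group name
            DesiredCapacity=desired,
        )
        return desired

When a traffic spike hits, the same call scales the fleet out in minutes; when the spike passes, the fleet shrinks and the bill shrinks with it, which is precisely the flexibility companies say they are buying.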

One of the major obstacles to full adoption of cloud computing services remains security concerns. Although cloud-based security solutions exist, there is still a perception that cloud computing puts data at risk compared to private datacenters and increases the operational impact of denial-of-service attacks.

Despite these concerns, however, the cloud computing market is expected to thrive in the near future, with revenue in nearly all sectors doubling within the next three to five years.

More Stories By Matthew Candelaria

Dr. Matthew Candelaria is a professional writer with more than five years' experience writing copy in industries such as law, medicine, technology and computer security. For more information about him and his work, visit www.writermc.com.
