The Evolution of Cloud Computing

Conceptual origins of cloud computing

Definitions of cloud computing are easy to find, but a single, authoritative definition is hard to come by. Perhaps the best work in this area was done by Böhm et al. By compiling 17 different scholarly and industrial definitions, the authors identified five primary characteristics of cloud computing, allowing a definition such as: "Cloud computing is a service that delivers scalable hardware and/or software solutions via the Internet or other network on a pay-per-usage basis." (The essential elements of the definition are service, scalability, hardware and/or software, network delivery, and pay-per-usage.)

Cloud computing can further be broken down into three common types: SaaS, PaaS, and IaaS. SaaS (Software as a Service) allows users to log into and utilize preprogrammed software that is owned and maintained by the service provider. PaaS (Platform as a Service) gives users tools and languages owned and maintained by the service provider that can be used to build and deploy customized applications. IaaS (Infrastructure as a Service) provides users with storage and processing, allowing users full control over the use of that infrastructure. There are other divisions of cloud computing, but these are the most common.
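
To make the difference in control concrete, here is a minimal sketch of what using IaaS looks like in practice, assuming the AWS boto3 SDK; the region, machine image, and instance type are placeholders, not details from this article. The provider supplies raw storage and processing, and everything that runs on the machine is the user's responsibility.

    # Minimal IaaS sketch using the AWS boto3 SDK; region, image ID, and
    # instance type below are placeholders, not recommendations.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Under IaaS the user picks the machine image, size, and count; the
    # provider supplies only the underlying storage and processing.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder machine image
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )

    instance_id = response["Instances"][0]["InstanceId"]
    print(f"Launched {instance_id}; installing and running software is now the user's job.")

With SaaS, by contrast, the user never sees a machine at all; they simply sign in to software the provider already runs and maintains.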

Conceptual Origins of Cloud Computing
Looking back, cloud computing appears to have been the end goal of many computer pioneers in the 1960s, or at least of the early experiments that would eventually become the Internet.

There are three main figures commonly cited as laying the conceptual framework for cloud computing: John McCarthy, J.C.R. Licklider, and Douglas F. Parkhill.

McCarthy first proposed in 1957 that time-sharing of computing resources might allow companies to sell excess computing capacity, maximizing utilization of the resource. He even imagined that computation might one day be organized as a utility.

Licklider, who headed the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA), highlighted some of the promise and challenges of cloud computing in a 1963 memo addressed to the "Members and Affiliates of the Intergalactic Computer Network." Specifically, he described the ability to send a problem to a network of computers that could pool their resources to solve it, and the need for a shared language that would allow those computers to talk to one another.

In 1966 Parkhill published "The Challenge of the Computer Utility," which identified many of the challenges facing cloud computing, such as scalability and the need for large-bandwidth connections. He also drew the now-familiar comparison between computing and electric utilities.

Why We Are in Cloud Computing Time
If cloud computing has been around for so long conceptually, why does it seem like a revolutionary idea at all? Because only now are we in cloud computing time.

Science fiction scholars use the shorthand "steam engine time" for the phenomenon in which an idea surfaces repeatedly but doesn't catch on for many years. They point out that the Romans knew what steam engines were and could build them, yet the technology didn't come to fruition until roughly 1,600 years later. The world just wasn't ready for steam engines. The same is true of cloud computing.

Three elements had to be in place before cloud computing could become a reality: very large datacenters, near-universal high-speed Internet connectivity, and acceptance of cloud computing as a viable model for supplying IT needs.

The presence of very large datacenters is a crucial piece of the foundation of cloud computing. To offer cloud services at a competitive price, suppliers must have datacenters large enough to capture economies of scale, which can cut costs 80-86% compared with the medium-sized datacenters many companies previously ran. Many of the companies that would later become cloud providers, such as Amazon, Google, and Microsoft, originally built these very large datacenters for their own use.

Almost universal access to high-speed Internet connectivity is equally crucial to cloud computing. If data is bottlenecked on its way to and from the cloud, the cloud simply can't be a practical solution for a company's IT needs.
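
A rough, purely illustrative calculation shows why; the dataset size and link speeds in the sketch below are assumptions, not figures from any study cited here.

    # How long does it take to move 1 TB to or from a cloud provider?
    # All link speeds here are illustrative assumptions.
    DATASET_BITS = 1 * 10**12 * 8  # 1 TB expressed in bits

    links_bps = {
        "10 Mbps DSL": 10e6,
        "100 Mbps cable": 100e6,
        "1 Gbps fiber": 1e9,
    }

    for name, bps in links_bps.items():
        hours = DATASET_BITS / bps / 3600
        print(f"{name}: about {hours:,.1f} hours")

    # 10 Mbps  -> roughly 222 hours (more than nine days)
    # 100 Mbps -> roughly 22 hours
    # 1 Gbps   -> roughly 2.2 hours

At early-broadband speeds, moving any serious volume of data off-site was measured in days, which is why cloud computing had to wait for fast, widely available connections.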

Finally, it is important for potential users to see cloud computing as a viable solution for IT needs. People need to be able to trust that some ethereal company is going to provide for their urgent IT needs on a daily basis. This cultural work was done by many disparate influences, from massively multiplayer online games to Google, which expanded acceptance of online resources beyond the IT community. Another crucial but oft-neglected part of this cultural work was performed by peer-to-peer computing, which introduced many people to the notion that they could use the resources of other computers via the Internet.

Cloud Computing Timeline: Who, When, and Why
There are many good timelines of cloud computing, several of which are listed in my resources section, but it's still worth laying out a basic timeline to show the evolution of cloud computing service offerings:

  • 1999: Salesforce launches its SaaS enterprise applications
  • 2002: Amazon launches Amazon Web Services (AWS), which offer both artificial and human intelligence for problem solving via the Internet
  • 2006: Google launches Google Docs, a free, web-based competitor to Microsoft Office
  • 2006: Amazon launches Elastic Compute Cloud (EC2) and Simple Storage Service (S3), sometimes described as the first IaaS
  • 2007: Salesforce launches Force.com, often described as the first PaaS
  • 2008: Google launches Google App Engine
  • 2009: Microsoft launches Windows Azure

Armbrust, et al. note many motives that drive companies to launch cloud computing services, including:

  • Profit: By taking advantage of cost savings from very large datacenters, companies can underbid competitors and still make significant profit
  • Leverage existing investment: For example, many of the applications in AWS were developed for internal use first, then sold in slightly altered form for additional revenue
  • Defend a franchise: Microsoft launched Windows Azure to help maintain competitiveness of the Windows brand
  • Attack a competitor: Google Docs was launched partly as an attack on Microsoft's profitable Office products
  • Leverage customer relationships: Windows Azure gives existing clients a branded cloud offering that plays up the perceived reliability of the brand, constantly emphasizing that it is a "rock-solid" cloud service

These are the motives that bring competitors to offer cloud computing services, but what drives companies and individuals to adopt cloud computing, and what barriers still exist to full cloud implementation?

The Cloud Computing Market: Where It's At, and Where It's Going
According to a study by IT trade group CompTIA, up to 80% of businesses use some form of cloud computing, although the degree of use varies widely. IBM's studies show that although only 8% of businesses believe cloud computing currently has a significant impact on their business, that figure is expected to grow to more than 30% over the next three years.

Cloud computing is often sold on price, but according to recent surveys the primary benefit companies are seeking from it is flexibility. With the huge swings caused by viral phenomena on the Internet, companies can see demand for their sites and services fluctuate wildly in a short period of time. Cloud computing gives them the flexibility to purchase computing resources on demand. A more conventional benefit of that flexibility is the ability to avoid hiring and firing IT personnel for short-term projects.
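
A toy comparison makes the flexibility argument concrete; every price and traffic figure in the sketch below is hypothetical, chosen only to show the shape of the trade-off between elastic pay-per-use capacity and owning enough hardware to cover a brief peak.

    # Hypothetical numbers only: handling a three-day viral spike with on-demand
    # cloud capacity versus owning enough servers for the peak all year.
    HOURLY_RATE = 0.10          # assumed cost of one cloud server per hour
    SERVER_YEARLY_COST = 3000   # assumed all-in yearly cost of one owned server

    baseline_servers = 10       # capacity needed for normal traffic
    peak_servers = 100          # capacity needed during the spike
    spike_hours = 72            # the spike lasts three days

    # Cloud: pay for baseline capacity all year, plus extra servers only during the spike.
    cloud_cost = (baseline_servers * HOURLY_RATE * 24 * 365
                  + (peak_servers - baseline_servers) * HOURLY_RATE * spike_hours)

    # Owned hardware: provision for the peak year-round, whether or not the spike recurs.
    owned_cost = peak_servers * SERVER_YEARLY_COST

    print(f"Elastic cloud capacity: ${cloud_cost:,.0f}")   # about $9,400
    print(f"Peak-sized owned fleet: ${owned_cost:,.0f}")   # $300,000

The exact figures matter less than the pattern: paying only for the capacity actually used during a spike is far cheaper than sizing a permanent fleet for the worst case.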

Security concerns remain one of the major obstacles to full adoption of cloud computing services. Although cloud-based security solutions exist, there is still a perception that cloud computing puts data at greater risk than private datacenters do and increases the operational impact of denial-of-service attacks.

Despite these concerns, however, the cloud computing market is expected to thrive in the near future, with revenue in nearly every sector doubling within the next three to five years.

More Stories By Matthew Candelaria

Dr. Matthew Candelaria is a professional writer with more than five years' experience writing copy in industries such as law, medicine, technology and computer security. For more information about him and his work, visit www.writermc.com.
