Cloud Computing: A Comparison of Computing Models

Pros and cons

A simple definition of cloud computing is that it is a service for storing data and software, and for processing them, over a network. With cloud computing, the person or company using the service is not fully aware of the location, storage facilities or configuration of the system that delivers it. A good analogy is the regular power grid that supplies homes and businesses: the end user is hardly aware of the generation and distribution equipment used to deliver the service.

Since its introduction, cloud computing has evolved through several stages, from virtualization and utility computing toward autonomic, self-managing services. It also draws heavily on service-oriented architecture.

There are several computing models that share features with cloud computing, but they are distinct and should not be classified as being the same. These related models include the following:

  1. Mainframe Computer: Mainframe computers are large, powerful computers that are usually used in big organizational settings. Applications of mainframe computers include the processing of voluminous data, as required for resource planning, census statistics, industry and consumer statistics, and large financial transactions.
  2. Grid Computing: This concept describes a form of parallel computing in which work is distributed across a cluster of interconnected computer systems. The networked computers work together to perform tasks that are too large for any one computer, behaving collectively like a single large supercomputer.
  3. Utility Computing: This concept packages computing resources, such as storage and computation, as a metered service, much like a public utility.
  4. Client/Server Model: This concept describes any distributed application that distinguishes between servers (service providers) and clients (service requesters).
  5. Peer-to-Peer: This arrangement does not involve central coordination as is required in other concepts like the client/server model. With a peer-to-peer arrangement, each participating system serves as a resource consumer and supplier.
  6. Autonomic Computing: This concept is used to describe a computer system that is able to manage itself.
  7. Service-Oriented Computing: This concept covers computing methods built around software services. Cloud computing builds on this idea by offering computing capabilities themselves, such as storage and processing, as services.
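The client/server model in item 4 can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (the names `run_server` and the `"report.pdf"` request are invented for the example): a server thread plays the service provider and a socket client plays the service requester.

```python
import socket
import threading

def run_server(sock):
    # Service provider: accept one request and answer it.
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"served: " + data)

# The server listens on a port the OS assigns.
server_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_sock.bind(("127.0.0.1", 0))
server_sock.listen(1)
port = server_sock.getsockname()[1]

t = threading.Thread(target=run_server, args=(server_sock,))
t.start()

# Service requester: the client asks, the server answers.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"report.pdf")
    reply = client.recv(1024)

t.join()
server_sock.close()
print(reply.decode())
```

The point of the model is the asymmetry: the client initiates requests, the server only responds, unlike the peer-to-peer arrangement in item 5, where every participant can do both.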

A major feature of cloud computing is that it is dynamic. Neither the data nor the processing can be pinned to any one particular static place. This is quite different from other models, where you can identify or pinpoint the servers or systems where the processing is done. This feature is what gives the concept the tag "cloud" computing, because the processing and data seem to take place within a cloud. The other models described above can thus be said to supplement the concept of cloud computing.

Cloud computing service delivery relies on a number of software systems and structures. For example, several cloud components interface and interact through application programming interfaces (APIs), typically via web services arranged in a three-tier structure. The principle resembles the UNIX pipeline model, in which several small programs work together at the same time through universal interfaces.
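The UNIX-pipeline idea referenced above can be mimicked with a short Python sketch (the stage names `source`, `grep` and `upper`, and the sample log lines, are made up for illustration): each stage does one job and passes plain lines to the next through a common interface, the way cooperating UNIX programs share a text stream.

```python
def source(lines):
    # First stage: emit raw lines.
    for line in lines:
        yield line

def grep(pattern, lines):
    # Middle stage: keep only matching lines, like the grep utility.
    for line in lines:
        if pattern in line:
            yield line

def upper(lines):
    # Final stage: transform whatever it receives.
    for line in lines:
        yield line.upper()

logs = ["cloud request ok", "local cache hit", "cloud request failed"]
result = list(upper(grep("cloud", source(logs))))
print(result)  # ['CLOUD REQUEST OK', 'CLOUD REQUEST FAILED']
```

Because every stage consumes and produces the same kind of value, stages can be added, removed or reordered without rewriting the others, which is exactly the property that makes loosely coupled cloud components composable.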

Cloud computing components can also be broken down into a back end and a front end. The back end is the part of the system the end user does not see: the computers, servers and data storage devices that make up the "cloud." The front end comprises the computer and applications that the user sees and interacts with.
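The back end/front end split can be sketched with Python's standard library. This is a hedged toy example, not a real cloud service: the `BackEnd` handler and the `stored_files` listing are invented names. The back end is a tiny HTTP service standing in for the cloud's servers and storage; the front end is the client code the user actually runs.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class BackEnd(BaseHTTPRequestHandler):
    # Back end: the hidden side of the system, answering requests for stored data.
    def do_GET(self):
        body = json.dumps({"stored_files": ["notes.txt", "photo.jpg"]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), BackEnd)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Front end: the application the user sees, fetching a file listing from the cloud.
with urllib.request.urlopen(f"http://127.0.0.1:{port}/files") as resp:
    listing = json.load(resp)

server.shutdown()
print(listing["stored_files"])
```

The user only ever sees the front-end call and its result; where the listing is computed and stored is the back end's concern, which is the separation the paragraph describes.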

More Stories By Anne Lee

Anne Lee is a freelance technology journalist, a wife and a mother of two.
