The Difference Between Public and Private Cloud Computing

If the cloud’s origin is essentially private, what is the “public cloud”?

Cloud computing has been essentially private from the beginning. Google, for one, demonstrates this: the world's most successful search engine runs not on an IBM Power, Sun Enterprise, or other massively powerful machine engineered for business by the go-to vendors of computing solutions. More in keeping with its democratic, spare, and ubiquitous engineering ethos, Google built a "cloud" of white box computers: throw-away commodity servers running Linux.

Rather than purchase boxed software or hire a large consulting firm to design and build the infrastructure, Google engineered proprietary software to coordinate search requests and indexing tasks across this "cloud" of white box hardware. Like most ideas that become synonymous with brilliance, the practice of using large numbers of identical machines to process enormous amounts of work isn't new at all. In fact, it isn't even an innovation that originated in computing.

If the cloud's origin is essentially private, what is the "public cloud"? One possible definition is that the public cloud is a disruptive channel through which to package and deliver some of the advantages of the cloud to "the public." Why is the public interested in the cloud? The cloud promises advantages that elude even the largest enterprises today. Public cloud computing services are purchased by individual users, small businesses, medium-sized businesses, and some of the world's largest firms.

What problems does the business community imagine the cloud might solve? Traditional technology solutions take a lot of time and money to implement, yet business needs to move quickly and flexibly to seize new opportunities. The business systems of yesterday seldom adapt easily to new requirements. And the work involved in adding new capabilities to existing systems is almost always significantly greater than the effort to build from scratch. Given this dynamic, the risk of building new systems or modifying existing systems seems high. What makes the current practice even less appealing is the tendency over time for the cost of maintaining existing systems to grow. Some studies show that today maintaining current systems consumes seventy percent of a firm's technology budget.

Given these dynamics, the (public) cloud computing model seems to offer an approach that mitigates some of the issues faced in businesses of all sizes.

  1. Firms would like to purchase focused, low-cost, customizable, and flexible technology services.
  2. Pay for these services when and how they are consumed. For example, some cloud computing vendors offer a metered-rate model in which the firm or individual pays for just the amount and quality of resources required to meet demand.
  3. Provision these services as they are needed. If a company needs to stand up a temporary call center in Asia for three months while consolidating its data centers in the region, the cloud computing model offers the ability to provision, configure, and host the software and desktops to do so. If a 25-person firm decides that a customer relationship management solution seems like a good idea, the firm can provision and use that solution in the cloud.
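The metered, pay-as-you-go model described above can be sketched as a simple cost calculation. The rates and usage figures below are illustrative assumptions, not any vendor's actual pricing:

```python
# Illustrative metered-billing sketch: pay only for resources consumed.
# Rates and usage figures are hypothetical, not any vendor's actual pricing.

RATE_PER_CPU_HOUR = 0.08   # dollars per CPU-hour (assumed)
RATE_PER_GB_MONTH = 0.10   # dollars per GB of storage per month (assumed)

def metered_cost(cpu_hours, storage_gb_months):
    """Return the total charge for metered compute and storage usage."""
    return cpu_hours * RATE_PER_CPU_HOUR + storage_gb_months * RATE_PER_GB_MONTH

# A three-month temporary project: usage scales up, then back down to zero.
monthly_usage = [(2000, 50), (5000, 120), (800, 20)]  # (cpu_hours, gb_months)
total = sum(metered_cost(cpu, gb) for cpu, gb in monthly_usage)
print(f"Total charge for the project: ${total:.2f}")  # $643.00
```

The point of the model is visible in the usage list: when the project ends, the spend ends with it, with no idle hardware left on the books.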

Beyond rapid deployment, the capability to flexibly alter and shape technology services in the cloud infrastructure can help firms design, deploy, and "shape" technology solutions that fit their immediate needs, yet adapt over time as the business evolves. Taken together, compared to traditional technology practices, the financial model of the cloud seems attractive.

CIOs and business owners tend to look at the return on investment for existing and new technology spending. One of the key factors in the ROI model is the length of time, or "payback" period, over which the benefits of the expenditure outweigh the costs. It's not an easy decision because in the traditional model, the CIO purchases equipment, software, services, and training up-front, and then hopes that the benefits can be clearly demonstrated. Yet most firms have difficulty tracking costs and benefits in a way that makes the outcome clear. If the CIO buys too little hardware, or implements a solution that the business users later reject, the whole solution can require additional customization or additional hardware. The cloud computing model mitigates these risks by aligning the cost and benefit flows. Because the building blocks of a cloud solution are much more scalable, many aspects of the solution can be tuned.
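The payback-period reasoning above can be made concrete with a small calculation. All cash-flow figures here are hypothetical, chosen only to contrast a large up-front purchase with a low up-front, pay-as-you-go model:

```python
# Payback period: the first month in which cumulative benefits cover
# cumulative costs. All figures are hypothetical, for illustration only.

def payback_period(upfront_cost, monthly_cost, monthly_benefit):
    """Return the month in which cumulative net benefit turns positive,
    or None if benefits never catch up with costs."""
    if monthly_benefit <= monthly_cost:
        return None  # net cash flow never turns positive
    cumulative = -upfront_cost
    month = 0
    while cumulative < 0:
        month += 1
        cumulative += monthly_benefit - monthly_cost
    return month

# Traditional model: large up-front spend on hardware, software, training.
print(payback_period(upfront_cost=120_000, monthly_cost=2_000,
                     monthly_benefit=12_000))  # 12 months

# Cloud model: little up-front spend, higher metered monthly cost.
print(payback_period(upfront_cost=5_000, monthly_cost=6_000,
                     monthly_benefit=12_000))  # 1 month
```

Under these assumed numbers, shifting spend from capital to operating expense shortens the payback period dramatically, which is the alignment of cost and benefit flows the article describes.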

Most firms choose not to write their own desktop operating system or desktop applications. Firms make these choices every day. Yet much of today's computing expenditure delivers little competitive advantage while consuming scarce and valuable human and capital resources. For a firm like Google, a private cloud of white boxes orchestrated to index and return search results makes sense. For the majority of businesses, however, the public cloud computing model may better align the cost of computing with business value, and make competitive advantage achievable through technology.

More Stories By Brian McCallion

Brian McCallion of Bronze Drum works with executives to develop cloud strategy and Big Data proofs-of-concept, and trains enterprise teams to rethink process and operations. Focus areas include: Enterprise Cloud Strategy and Project Management; Cloud Data Governance and Compliance; Infrastructure Automation.


