Virtual Private Cloud Computing vs. Public Cloud Computing

Which one makes the most sense depends on your requirements

Cloud computing has found its way into many organizations as business leaders and IT departments look to capitalize on the many benefits that cloud offers. As your company considers moving all or part of its IT operation to the cloud, a key decision is whether to rely on public cloud, virtual private cloud, or a combination. Finalizing a cloud strategy must start with understanding your objectives and how they best align with the value each offering can provide.

The public cloud can be characterized by IT resources delivered via the Internet using a standardized, self-service, pay-per-use methodology. Public clouds are designed to provide compute resources virtually at will - similar to a utility. Public clouds are highly standardized, allow limited customization, and their resources can be oversubscribed and massively shared. Workloads that require inexpensive storage or compute cycles, and for which predictable response time to the user community is not critical, can be a fit for the public cloud.
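To make the self-service, pay-per-use model concrete, here is a minimal illustrative sketch using the AWS SDK for Python (boto3) as one example of a public cloud API; the region and image ID are placeholders, not part of the original article:

```python
# Sketch: on-demand provisioning in a public cloud. Assumes boto3 is
# installed and AWS credentials are configured; the AMI ID is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Request a single small instance. Billing starts when it runs and stops
# when it is terminated - the pay-per-use utility model in one API call.
response = ec2.run_instances(
    ImageId="ami-00000000",   # placeholder image ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```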

Virtual private clouds offer scalable compute resources similar to those of public clouds, but in a more controlled environment. Virtual private cloud providers, especially those with managed services around hosted applications, bring insight into workloads and their impact on the infrastructure. Virtual private cloud providers have the flexibility to customize solutions to meet security and performance requirements. They can also identify where customer data is stored, such as in a specific data center or country. This setup allows for more customization and delivers a higher degree of privacy and security.

As you determine which methodology makes the most sense for your business, here are three major assessment areas to consider that can help guide your decision.

Availability Comparison
When it comes to accessing more computing resources, both virtual private and public clouds are designed to provide highly elastic compute power and data storage. When you need more resources, you can request and receive them almost immediately. However, there is a tradeoff: public cloud customers compete for the same pool of resources, which can affect the cloud experience during unexpected bursts in demand or seasonal activity. Virtual private cloud providers can introduce a level of segmentation that protects workloads and keeps the user experience predictable, while still providing the resiliency and flexibility of the cloud.

Like the public cloud, virtual private cloud services rely on virtualized computing resources to provide elasticity and scale. However, each customer is given its own private pool of resources rather than sharing them. Resources can be expanded, but it is done in a more controlled manner.
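A rough sketch of what expansion "in a more controlled manner" might look like follows; the vpc_provider client and its methods are hypothetical, invented here purely for illustration:

```python
# Sketch: controlled scaling inside a customer's dedicated resource pool.
# The "vpc_provider" client and its methods are hypothetical.
POOL_QUOTA_VCPUS = 128  # capacity reserved for this customer alone

def request_capacity(vpc_provider, additional_vcpus):
    """Expand only within the customer's private pool of resources."""
    in_use = vpc_provider.current_vcpu_usage()
    if in_use + additional_vcpus > POOL_QUOTA_VCPUS:
        # Growth beyond the pool goes through the provider's change-control
        # process rather than silently oversubscribing shared hardware.
        return vpc_provider.open_change_request(additional_vcpus)
    return vpc_provider.allocate(additional_vcpus)
```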

Virtual private clouds can offer a degree of elasticity, but also a higher degree of stability than public clouds. This is why virtual private clouds are more attractive for production environments, where the ability to scale is important, but uptime is just as critical.

Another key component of availability is access to the compute resources in the cloud. Traditionally, access to the public cloud is via the Internet. Virtual private cloud providers can be more accommodating for customers that want to leverage the private line wide area networks they already have deployed, with the potential to use the Internet as an alternate path to the environment, dynamically rerouting across a hardware-based VPN solution should any carrier issues arise.
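As a hedged illustration of that failover idea, the sketch below probes a primary private-line gateway and falls back to an Internet VPN path; the addresses and functions are hypothetical, and real deployments would typically handle this at the router or BGP layer rather than in a script:

```python
# Sketch: dynamic reroute from a private line to an Internet VPN path.
# Gateways and logic are illustrative only.
import subprocess

PRIMARY_GATEWAY = "10.1.0.1"  # private-line WAN gateway (example address)
VPN_GATEWAY = "10.2.0.1"      # hardware VPN over the Internet (example address)

def primary_is_healthy():
    # A single ICMP probe with a 2-second timeout; production health
    # checks would be more robust than one ping.
    return subprocess.call(
        ["ping", "-c", "1", "-W", "2", PRIMARY_GATEWAY],
        stdout=subprocess.DEVNULL,
    ) == 0

def select_gateway():
    return PRIMARY_GATEWAY if primary_is_healthy() else VPN_GATEWAY
```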

Security Comparison
Like any utility, public clouds are easily accessible by the masses. Security controls are in place, but there are limits to how much risk they can control. Public clouds can thus be attractive targets for hackers, who enjoy the challenge of breaking in and can then use the compromised resources anonymously to attack other sites.

Virtual private clouds offer more security since computing resources are more logically separated. Where virtual private cloud providers are hosting known applications, tighter security at the network layer can be deployed to further reduce the risk of unnecessary traffic. Security zones and firewall rule sets can be deployed to address multi-tenancy concerns of cloud offerings.
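One way to picture such zone-based rule sets is as data evaluated with a default-deny posture; the zones, ports, and rules below are invented for illustration, and real rules would live in the provider's firewall or SDN layer rather than in application code:

```python
# Sketch: a zone-based firewall rule set expressed as data.
RULES = [
    {"from_zone": "web", "to_zone": "app", "port": 8080, "allow": True},
    {"from_zone": "app", "to_zone": "db",  "port": 1521, "allow": True},
    # Default multi-tenancy posture: nothing else crosses zones.
]

def is_allowed(from_zone, to_zone, port):
    for rule in RULES:
        if (rule["from_zone"], rule["to_zone"], rule["port"]) == (from_zone, to_zone, port):
            return rule["allow"]
    return False  # deny by default

assert is_allowed("web", "app", 8080)
assert not is_allowed("web", "db", 1521)  # the web tier may not reach the database
```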

As noted in the availability comparison, there is also a higher degree of security in how cloud resources are accessed. Reaching the virtual private cloud via virtual private networks or dedicated circuits can be beneficial for firms in highly regulated arenas, where enterprise data must be protected carefully to demonstrate financial and operational stability to regulators and investors.

Control Comparison
By design, public clouds give users direct control over the volume of computing resources provisioned: you simply provision what you need when you need it. But you cannot control what other customers in the resource pool access, which may affect your environment and reduce performance predictability.

Public clouds also make modifications to the underlying infrastructure more challenging. For example, if a technical change is needed, such as a software patch or hardware swap, that change impacts everyone because customers are not isolated from each other. There is also no coordination with the applications running on top of the infrastructure, so updates may unexpectedly affect functionality. In addition, customers must diligently control the level of computing resources they contract for, monitoring what they need and use and requesting that resources be turned off when no longer needed; this burden translates into less control over computing costs.
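As a sketch of that housekeeping burden, the example below (again assuming boto3/AWS, with an invented "lifetime" tag as the marker) stops running instances that were flagged as temporary so the meter stops running:

```python
# Sketch: the cost-control chore public cloud leaves to the customer.
# Assumes boto3/AWS credentials; the tagging scheme is illustrative.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:lifetime", "Values": ["temporary"]},  # invented tag
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

idle_ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
if idle_ids:
    ec2.stop_instances(InstanceIds=idle_ids)  # without this, billing continues
```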

Conversely, a virtual private cloud gives you more control over the performance of the technology environment. Customers can work jointly with virtual private cloud providers to adhere to change control policies that may already be established. Resource allocation and load balancing can be finely tuned based on each customer's environment, usage patterns, and resource consumption.

The environment is also more resilient, as more sophisticated redundancy and failover capabilities can be incorporated. Virtual private clouds can also more easily support varying degrees of data backup for different data retention policies. Customized disaster recovery solutions, based on each customer's recovery point and recovery time objectives, can all be factored into the design criteria.
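As an illustrative sketch, retention tiers and recovery point objectives can be reduced to a simple schedule calculation; the tiers and numbers below are invented for the example, not any provider's actual offering:

```python
# Sketch: deriving a backup schedule from recovery objectives.
RETENTION_TIERS = {
    # tier: (rpo_hours, retention_days) - illustrative values
    "mission_critical": (1, 90),
    "production":       (24, 30),
    "dev_test":         (168, 7),
}

def backup_plan(tier):
    rpo_hours, retention_days = RETENTION_TIERS[tier]
    # To meet the RPO, backups must run at least once per RPO window.
    copies_retained = (retention_days * 24) // rpo_hours
    return {"every_hours": rpo_hours,
            "keep_days": retention_days,
            "copies": copies_retained}

print(backup_plan("production"))  # {'every_hours': 24, 'keep_days': 30, 'copies': 30}
```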

Utility and Consistency Requirements Dictate the Ultimate Choice
If your business requires basic computing resources where uptime and system control are not mission-critical, public clouds can serve as an inexpensive method for rapid provisioning of IT infrastructure. As is the case with most utility companies, public cloud providers offer a serviceable, raw platform at a low cost.

But if you want scalability benefits with more control, virtual private cloud services are much more likely to meet your requirements. Virtual private clouds essentially provide a more consistent experience because providers are more in tune with how their customers use the infrastructure and can plan accordingly. This allows for application performance SLAs, letting customers shift their focus away from managing the infrastructure and concentrate on their business. Customers receive the benefits of scale and can leverage the cost savings the cloud provides without all the management issues.

More Stories By Kjel Hanson

Kjel Hanson is Director of Infrastructure and Engineering Services at Velocity Technology Solutions, where for the last 14 years he has focused on the delivery of JD Edwards hosting and Infrastructure Managed Services. He has participated in over 75 customer ERP migrations to hosting. His areas of responsibility have included the design and operational delivery of all data center and cloud technologies, including network, server platforms, virtualization, and storage.
