
@CloudExpo: Article

Virtual Private Cloud Computing vs. Public Cloud Computing

Which one makes the most sense depends on your requirements

Cloud computing has found its way into many organizations as business leaders and IT departments look to capitalize on the many benefits that cloud offers. As your company considers moving all or part of its IT operation to the cloud, a key decision is whether to rely on public cloud, virtual private cloud, or a combination. Finalizing a cloud strategy must start with understanding your objectives and how they best align with the value each offering can provide.

The public cloud can be characterized by IT resources delivered via the Internet using a standardized, self-service, pay-per-use model. Public clouds are designed to provide compute resources virtually at will, much like a utility. They are highly standardized, allow limited customization, and their resources can be oversubscribed and massively shared. Workloads that need inexpensive storage or compute cycles, and for which predictable response time to the user community is not critical, can be a good fit for the public cloud.

Virtual private clouds offer scalable compute resources similar to those of public clouds, but in a more controlled environment. Virtual private cloud providers, especially those with managed services around hosted applications, bring insight into the workload and its impact on the infrastructure. They have the flexibility to customize solutions to meet security and performance requirements, and they can also identify where customer data is stored, such as in a specific data center or country. This setup allows for more customization and delivers a higher degree of privacy and security.

As you determine which model makes the most sense for your business, here are three major assessment areas that can help guide your decision.

Availability Comparison
When it comes to accessing more computing resources, both virtual private and public clouds are designed to provide highly elastic compute power and data storage. When you need more resources, you can request and receive them almost immediately. However, there is a tradeoff: public cloud customers compete for the same pool of resources, so unexpected bursts in demand or seasonal activity can degrade the experience. Virtual private cloud providers can introduce a level of segmentation that protects each workload for a predictable user experience, while still providing the resiliency and flexibility of the cloud.

Like the public cloud, virtual private cloud services rely on virtualized computing resources to provide elasticity and scale. However, each customer is given its own private pool of resources rather than sharing one. Resources can still be expanded, but in a more controlled manner.

Virtual private clouds can offer a degree of elasticity, but also a higher degree of stability than public clouds. This is why virtual private clouds are more attractive for production environments, where the ability to scale is important, but uptime is just as critical.
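The contention tradeoff described above can be sketched in a few lines. The following is a deliberately simplified simulation (the pool sizes and demands are hypothetical, not drawn from any provider): in a shared public pool, requests are served in arrival order, so one tenant's burst can starve a later arrival; with dedicated per-tenant pools, each tenant is capped at its reservation but never starved by a neighbor.

```python
def request_capacity(pool_free, demand):
    """Grant as much of `demand` as the pool can supply; return (granted, remaining_free)."""
    granted = min(pool_free, demand)
    return granted, pool_free - granted

def simulate(pool_size, tenant_demands):
    """Serve tenants in arrival order from a single pool; return the amount each was granted."""
    free = pool_size
    granted = []
    for demand in tenant_demands:
        got, free = request_capacity(free, demand)
        granted.append(got)
    return granted

# Shared public pool of 100 units: a 70-unit burst leaves the next tenant short.
shared = simulate(100, [70, 50])                      # -> [70, 30]

# Dedicated virtual-private pools of 50 units each: capped, but isolated from neighbors.
private = [simulate(50, [d])[0] for d in [70, 50]]    # -> [50, 50]
```

The point of the sketch is the isolation boundary, not the numbers: the private pools trade some burst headroom for predictability.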

Another key component of availability is access to the compute resources in the cloud. Traditionally, access to the public cloud is over the Internet. Virtual private cloud providers can be more accommodating to customers that want to leverage the private-line wide area networks they have already deployed, while retaining the Internet as an alternate path to the environment, with a dynamic reroute across a hardware-based VPN solution should any carrier issues arise.
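The failover behavior described here amounts to priority-ordered path selection: prefer the private line, and fall back to the VPN-over-Internet path when the primary is down. A minimal sketch of that logic (path names and priorities are illustrative, not any vendor's configuration format):

```python
def select_path(paths):
    """Return the name of the highest-priority path that is currently up, or None."""
    for path in sorted(paths, key=lambda p: p["priority"]):
        if path["up"]:
            return path["name"]
    return None

paths = [
    {"name": "private-wan", "priority": 1, "up": False},  # carrier issue on the private line
    {"name": "ipsec-vpn",   "priority": 2, "up": True},   # Internet path via hardware VPN
]
select_path(paths)  # -> "ipsec-vpn"
```

In practice this decision is made by routing protocols or SD-WAN appliances rather than application code; the sketch only captures the precedence rule.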

Security Comparison
Like any utility, public clouds are easily accessible by the masses. Security controls are in place, but there are limits to how much risk they can mitigate. Public clouds can thus be attractive targets for hackers, who enjoy the challenge of breaking in and can then use compromised resources anonymously to attack other sites.

Virtual private clouds offer more security since computing resources are more logically separated. Where virtual private cloud providers are hosting known applications, tighter security at the network layer can be deployed to further reduce the risk of unnecessary traffic. Security zones and firewall rule sets can be deployed to address multi-tenancy concerns of cloud offerings.
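Security zones and firewall rule sets of the kind mentioned above generally follow a first-match, default-deny evaluation model. The sketch below illustrates that model with two hypothetical zones (an application tier and a database tier, with made-up CIDR ranges); it is not any provider's actual rule syntax:

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

@dataclass
class Rule:
    source: str   # CIDR of the zone allowed to originate traffic
    dest: str     # CIDR of the protected zone
    port: int
    action: str   # "allow" or "deny"

def evaluate(rules, src, dst, port):
    """First matching rule wins; anything unmatched is implicitly denied."""
    for r in rules:
        if (ip_address(src) in ip_network(r.source)
                and ip_address(dst) in ip_network(r.dest)
                and r.port == port):
            return r.action
    return "deny"

# Only the app tier (10.1.0.0/24) may reach the database tier (10.2.0.0/24) on port 5432.
rules = [Rule("10.1.0.0/24", "10.2.0.0/24", 5432, "allow")]

evaluate(rules, "10.1.0.5", "10.2.0.9", 5432)     # -> "allow"
evaluate(rules, "203.0.113.7", "10.2.0.9", 5432)  # -> "deny" (outside the zone)
```

The default-deny posture is what lets a provider hosting known applications keep the rule set tight: anything not explicitly required by the application simply never matches.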

As with availability, there is also a higher degree of security in how the cloud resources are accessed. Reaching the virtual private cloud via virtual private networks or dedicated circuits can be beneficial for firms in highly regulated arenas, where enterprise data must be protected carefully to demonstrate financial and operational stability to regulators and investors.

Control Comparison
By design, public clouds give users direct control over the volume of computing resources provisioned: you simply provision what you need when you need it. But you cannot control what other customers in the resource pool consume, which may affect your environment and reduce performance predictability.

Public clouds also make modifications to the underlying infrastructure more challenging. For example, if a technical change is needed, such as a software patch or hardware swap, that change impacts everyone because customers are not isolated from each other, and there is no coordination with the applications running on top of the infrastructure as to how the updates may impact functionality. In addition, customers must diligently manage the level of computing resources they contract for: monitoring what they need and use, then requesting that resources be turned off when no longer needed. This leaves them with less control over computing costs.

Conversely, a virtual private cloud gives you more control over the performance of the technology environment. Customers can work jointly with virtual private cloud providers to adhere to change control policies that may already be established. Resource allocation and load balancing can be finely tuned based on each customer's environment, usage patterns, and resource consumption.

The environment is also more resilient, as more sophisticated redundancy and failover capabilities can be incorporated. Virtual private clouds can also more easily accommodate varying degrees of data backup for different data retention policies, and customized disaster recovery solutions based on recovery point and recovery time objectives can be built into the design criteria for each customer.
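Recovery point and recovery time objectives reduce to two simple checks: worst-case data loss equals the gap between backups, so the backup interval must not exceed the RPO, and the measured restore time must not exceed the RTO. A sketch with hypothetical numbers:

```python
def meets_rpo(backup_interval_hours, rpo_hours):
    """Worst-case data loss is the time since the last backup,
    so the backup interval must be no longer than the RPO."""
    return backup_interval_hours <= rpo_hours

def meets_rto(restore_hours, rto_hours):
    """The tested restore time must fit within the RTO."""
    return restore_hours <= rto_hours

# A tier with a 4-hour RPO and 8-hour RTO, backed up every 6 hours, restorable in 5:
meets_rpo(6, 4)  # -> False: the backup schedule must be tightened
meets_rto(5, 8)  # -> True
```

Designing the backup cadence and restore path from these two objectives is exactly the kind of per-customer design criterion a virtual private cloud provider can take on.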

Utility and Consistency Requirements Dictate the Ultimate Choice
If your business requires basic computing resources where uptime and system control are not mission-critical, public clouds can serve as an inexpensive method for rapid provisioning of IT infrastructure. As is the case with most utility companies, public cloud providers offer a serviceable, raw platform at a low cost.

But if you want scalability benefits with more control, virtual private cloud services are much more likely to meet your requirements. Virtual private clouds essentially provide a more consistent experience because providers are more in tune with how their customers use the infrastructure and can plan accordingly. This allows for application performance SLAs where customers can shift their focus away from managing the infrastructure and concentrate on their business. Customers receive the benefits of scale and can leverage the cost savings that cloud provides without all the management issues.

More Stories By Kjel Hanson

Kjel Hanson is Director of Infrastructure and Engineering Services at Velocity Technology Solutions, where for the last 14 years he has focused on the delivery of hosting JD Edwards and Infrastructure Managed Services. He has participated in over 75 customer ERP migrations to hosting. Areas of responsibility have included the design and operational delivery of all data center and cloud technologies including network, server platforms, virtualization, and storage.
