Virtual Private Cloud Computing vs. Public Cloud Computing

Which one makes the most sense depends on your requirements

Cloud computing has found its way into many organizations as business leaders and IT departments look to capitalize on the many benefits that cloud offers. As your company considers moving all or part of its IT operation to the cloud, a key decision is whether to rely on public cloud, virtual private cloud, or a combination. Finalizing a cloud strategy must start with understanding your objectives and how they best align with the value each offering can provide.

The public cloud can be characterized as IT resources delivered via the Internet using a standardized, self-service, pay-per-use model. Public clouds are designed to provide compute resources virtually at will - much like a utility. They are highly standardized, allow limited customization, and their resources can be oversubscribed and massively shared. Workloads that need inexpensive storage or compute cycles, and for which predictable response time to the user community is not critical, can be a good fit for the public cloud.

Virtual private clouds offer scalable compute resources similar to those of public clouds, but in a more controlled environment. Virtual private cloud providers, especially those offering managed services around hosted applications, bring insight into the workload and its impact on the infrastructure. These providers have the flexibility to customize solutions to meet security and performance requirements, and they can identify where customer data is stored, such as in a specific data center or country. This setup allows for more customization and delivers a higher degree of privacy and security.

As you determine which model makes the most sense for your business, here are three major assessment areas to consider to help guide your decision.

Availability Comparison
When it comes to accessing more computing resources, both virtual private and public clouds are designed to provide highly elastic compute power and data storage. When you need more resources, you can request and receive them almost immediately. However, there is a tradeoff: public cloud customers compete for the same pool of resources, so unexpected bursts in demand or seasonal activity can degrade the experience. Virtual private cloud providers can introduce a level of segmentation that protects workloads for a predictable user experience, while still providing the resiliency and flexibility of the cloud.

Like the public cloud, virtual private cloud services rely on virtualized computing resources to provide elasticity and scale. However, each customer is given its own private pool of resources rather than sharing them. Resources can still be expanded, but in a more controlled manner.
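To make the elasticity point concrete, here is a minimal sketch of on-demand provisioning through a public cloud API, using AWS's boto3 SDK as one example. The AMI ID, instance type, and tags are hypothetical placeholders, and a virtual private cloud provider would typically mediate this step on the customer's behalf rather than leaving it entirely self-service.

```python
# Illustrative sketch: requesting additional compute on demand from a
# public cloud API (AWS via boto3). The AMI ID, instance type, and tag
# values are hypothetical placeholders, not recommendations.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def scale_out(count):
    """Launch `count` additional worker instances on demand."""
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI
        InstanceType="t3.medium",          # placeholder size
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "burst-worker"}],
        }],
    )
    return [i["InstanceId"] for i in response["Instances"]]

if __name__ == "__main__":
    print(scale_out(2))  # e.g., add two workers for a seasonal burst
```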

Virtual private clouds can offer a degree of elasticity, but also a higher degree of stability than public clouds. This is why virtual private clouds are more attractive for production environments, where the ability to scale is important, but uptime is just as critical.

Another key component of availability is access to the compute resources in the cloud. Traditionally, access to the public cloud is via the Internet. Virtual private cloud providers can be more accommodating for customers that want to leverage the private-line wide area networks they have already deployed, with the Internet available as an alternate path to the environment via a dynamic reroute across a hardware-based VPN should any carrier issues arise.
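As a rough illustration of the dynamic-reroute idea, the sketch below probes the primary private-line gateway and falls back to an Internet VPN path when it stops responding. In practice this logic lives in network hardware (for example, BGP route failover or IP SLA tracking), not in a host script; the gateway addresses and target network here are made up, and the route command assumes Linux.

```python
# Simplified illustration of the dynamic-reroute concept described above.
# Real environments implement this in routers, not in a host script;
# the gateways and target network below are hypothetical.
import subprocess
import time

PRIMARY_GW = "10.0.0.1"       # private-line WAN gateway (hypothetical)
BACKUP_GW = "192.168.1.1"     # Internet VPN gateway (hypothetical)
TARGET_NET = "172.16.0.0/16"  # route to the hosted environment

def path_is_up(gateway):
    """Probe a gateway with a single ping; True if it responds."""
    return subprocess.call(
        ["ping", "-c", "1", "-W", "2", gateway],
        stdout=subprocess.DEVNULL) == 0

def set_route(gateway):
    """Point the route to the hosted environment at `gateway` (Linux)."""
    subprocess.call(["ip", "route", "replace", TARGET_NET, "via", gateway])

while True:
    # Prefer the private line; reroute across the VPN if it is down.
    set_route(PRIMARY_GW if path_is_up(PRIMARY_GW) else BACKUP_GW)
    time.sleep(10)
```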

Security Comparison
Like any utility, public clouds are easily accessible by the masses. Security controls are in place, but there are limits to how much risk they can contain. Public clouds can thus be attractive targets for hackers, who enjoy the challenge of breaking in and can then use the compromised resources anonymously to attack other sites.

Virtual private clouds offer more security because computing resources are logically separated per customer. Where virtual private cloud providers host known applications, tighter security can be deployed at the network layer to further reduce the risk of unnecessary traffic. Security zones and firewall rule sets can be deployed to address the multi-tenancy concerns of cloud offerings.
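As one hedged illustration of what such a rule set might look like when expressed through a cloud API, the sketch below defines a security zone for a hosted application tier using AWS security groups via boto3. The group name, VPC ID, ports, and CIDR ranges are assumptions for the example; a provider hosting a known application would tailor these to its expected traffic.

```python
# Illustrative firewall rule set for a hosted application, expressed as
# an AWS security group via boto3. Names, ports, and CIDRs below are
# hypothetical; only traffic the known workload needs is permitted.
import boto3

ec2 = boto3.client("ec2")

sg = ec2.create_security_group(
    GroupName="app-tier-zone",
    Description="Only expected traffic for the hosted app tier",
    VpcId="vpc-0123456789abcdef0",  # placeholder VPC
)

ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[
        {   # HTTPS allowed only from the customer's corporate network
            "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
            "IpRanges": [{"CidrIp": "203.0.113.0/24"}],
        },
        {   # database port open only to the app subnet, not the world
            "IpProtocol": "tcp", "FromPort": 5432, "ToPort": 5432,
            "IpRanges": [{"CidrIp": "10.0.1.0/24"}],
        },
    ],
)
```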

As noted above for availability, access and connectivity also bring a higher degree of security. Accessing the virtual private cloud via virtual private networks or dedicated circuits can be beneficial for firms in highly regulated arenas, where enterprise data must be protected carefully to demonstrate financial and operational stability to regulators and investors.

Control Comparison
By design, public clouds give users direct control over the volume of computing resources provisioned: you simply provision what you need when you need it. But you cannot control what other customers in the resource pool access, which may affect your environment and reduce performance predictability.

Public clouds also make modifications to the underlying infrastructure more challenging. For example, if a technical change is needed, such as a software patch or hardware swap, that change impacts everyone because customers are not isolated from each other, and there is no coordination with the applications running on top of the infrastructure about how the updates may affect functionality. In addition, customers must diligently control the level of computing resources they contract for, monitoring what they need and use and requesting that resources be turned off when no longer needed, which leaves them with less control over computing costs.
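The sketch below illustrates the kind of housekeeping this shifts onto the customer: scanning for running instances that have sat nearly idle and stopping them. It uses boto3 and CloudWatch metrics; the idle threshold and lookback window are arbitrary assumptions for the example.

```python
# Illustration of the self-service housekeeping burden described above:
# stop instances that have been nearly idle. The threshold and lookback
# window are arbitrary assumptions for this sketch.
from datetime import datetime, timedelta
import boto3

ec2 = boto3.client("ec2")
cw = boto3.client("cloudwatch")

IDLE_CPU_PCT = 5.0          # assumed "idle" threshold
LOOKBACK = timedelta(hours=24)

def avg_cpu(instance_id):
    """Average CPUUtilization over the lookback window, or None."""
    stats = cw.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=datetime.utcnow() - LOOKBACK,
        EndTime=datetime.utcnow(),
        Period=3600,
        Statistics=["Average"],
    )["Datapoints"]
    return sum(d["Average"] for d in stats) / len(stats) if stats else None

for page in ec2.get_paginator("describe_instances").paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]):
    for res in page["Reservations"]:
        for inst in res["Instances"]:
            cpu = avg_cpu(inst["InstanceId"])
            if cpu is not None and cpu < IDLE_CPU_PCT:
                ec2.stop_instances(InstanceIds=[inst["InstanceId"]])
```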

Conversely, a virtual private cloud gives you more control over the performance of the technology environment. Customers can work jointly with virtual private cloud providers to adhere to change control policies that may already be established. Resource allocation and load balancing can be finely tuned based on each customer's environment, usage patterns, and resource consumption.

The environment is also more resilient, as more sophisticated redundancy and failover capabilities can be incorporated. Virtual private clouds can also more easily provide varying degrees of data backup to satisfy different data retention policies, and customized disaster recovery solutions based on recovery point and recovery time objectives (RPO and RTO) can be built into the design criteria.
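As a simple illustration of how a recovery point objective drives design, the sketch below snapshots a volume on an interval that satisfies an assumed one-hour RPO. The volume ID and interval are placeholders, and a real deployment would use a managed backup service or scheduler rather than a loop like this.

```python
# Sketch: a recovery point objective (RPO) bounds how much data loss is
# acceptable, so backups must run at least that often. The RPO value
# and volume ID below are hypothetical.
import time
import boto3

ec2 = boto3.client("ec2")

RPO_MINUTES = 60                     # assumed objective: lose <= 1h of data
VOLUME_ID = "vol-0123456789abcdef0"  # placeholder EBS volume

def snapshot_forever():
    """Snapshot the volume on an interval that satisfies the RPO."""
    while True:
        ec2.create_snapshot(
            VolumeId=VOLUME_ID,
            Description=f"RPO-driven backup, interval {RPO_MINUTES} min",
        )
        time.sleep(RPO_MINUTES * 60)
```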

Utility and Consistency Requirements Dictate the Ultimate Choice
If your business requires basic computing resources where uptime and system control are not mission-critical, public clouds can serve as an inexpensive method for rapid provisioning of IT infrastructure. As is the case with most utility companies, public cloud providers offer a serviceable, raw platform at a low cost.

But if you want scalability benefits with more control, virtual private cloud services are much more likely to meet your requirements. Virtual private clouds provide a more consistent experience because providers are more in tune with how their customers use the infrastructure and can plan accordingly. This enables application performance SLAs, letting customers shift their focus away from managing the infrastructure and concentrate on their business. Customers receive the benefits of scale and can leverage the cost savings the cloud provides without all of the management burden.

More Stories By Kjel Hanson

Kjel Hanson is Director of Infrastructure and Engineering Services at Velocity Technology Solutions, where for the last 14 years he has focused on the delivery of hosted JD Edwards and infrastructure managed services. He has participated in over 75 customer ERP migrations to hosting. His areas of responsibility have included the design and operational delivery of all data center and cloud technologies, including network, server platforms, virtualization, and storage.
