By Kjel Hanson
April 5, 2012 10:30 AM EDT
Cloud computing has found its way into many organizations as business leaders and IT departments look to capitalize on the many benefits that cloud offers. As your company considers moving all or part of its IT operation to the cloud, a key decision is whether to rely on public cloud, virtual private cloud, or a combination. Finalizing a cloud strategy must start with understanding your objectives and how they best align with the value each offering can provide.
The public cloud can be characterized as IT resources delivered via the Internet using a standardized, self-service, pay-per-use model. Public clouds are designed to provide compute resources virtually at will, much like a utility. They are highly standardized, allow limited customization, and their resources can be oversubscribed and massively shared. Workloads that require inexpensive storage or compute cycles, and where predictable response times for the user community are not critical, are a good fit for the public cloud.
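The pay-per-use model can be made concrete with a little arithmetic. The sketch below compares pay-per-use against a fixed-cost alternative; the hourly and monthly rates are illustrative assumptions, not any provider's actual pricing.

```python
# Hypothetical rates for illustration only; real provider pricing varies.
HOURLY_RATE = 0.12        # pay-per-use cost per instance-hour
MONTHLY_FLAT = 450.00     # fixed monthly cost of an equivalent dedicated server

def pay_per_use_cost(instance_hours: float) -> float:
    """Cost under a pay-per-use model: you pay only for hours consumed."""
    return instance_hours * HOURLY_RATE

def cheaper_model(instance_hours: float) -> str:
    """Return which model is cheaper at a given monthly usage level."""
    return "pay-per-use" if pay_per_use_cost(instance_hours) < MONTHLY_FLAT else "flat-rate"

# A bursty workload running 800 instance-hours a month favors pay-per-use;
# a workload pinned near full utilization may not.
print(cheaper_model(800))    # → pay-per-use
print(cheaper_model(4000))   # → flat-rate
```

The crossover point is simply `MONTHLY_FLAT / HOURLY_RATE` instance-hours; below it, the utility model wins.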
Virtual private clouds offer scalable compute resources similar to those of public clouds, but in a more controlled environment. Virtual private cloud providers, especially those with managed services around hosted applications, bring insight into workloads and their impact on the infrastructure. These providers have the flexibility to customize solutions to meet security and performance requirements, and they can identify where customer data is stored, down to a specific data center or country. This setup allows for more customization and delivers a higher degree of privacy and security.
As you determine which model makes the most sense for your business, consider the three major assessment areas below to guide your decision.
Elasticity and Availability
When it comes to accessing more computing resources, both virtual private and public clouds are designed to provide highly elastic compute power and data storage. When you need more resources, you can request and receive them almost immediately. However, there is a tradeoff: public cloud customers compete for the same pool of resources, so unexpected bursts in demand or seasonal activity can degrade the cloud experience. Virtual private cloud providers can introduce a level of segmentation that protects workloads and delivers a predictable user experience, while still providing the resiliency and flexibility the cloud offers for availability.
Like the public cloud, virtual private cloud services rely on virtualized computing resources to provide elasticity and scale. However, each customer is given its own private pool of resources rather than sharing them. Resources can be expanded, but it is done in a more controlled manner.
Virtual private clouds can offer a degree of elasticity, but also a higher degree of stability than public clouds. This is why virtual private clouds are more attractive for production environments, where the ability to scale is important, but uptime is just as critical.
Another key component of availability is access to the compute resources in the cloud. Traditionally, access to the public cloud is via the Internet. Virtual private cloud providers can be more accommodating for customers that want to leverage the private-line wide area networks they have already deployed, with the Internet available as an alternate path via dynamic rerouting across a hardware-based VPN should any carrier issues arise.
Security
Like any utility, public clouds are easily accessible to the masses. Security controls are in place, but there are limits to how much risk they can mitigate. Public clouds can thus be attractive targets for hackers, who enjoy the challenge of breaking into them and can then use them anonymously to attack other sites.
Virtual private clouds offer more security because computing resources are logically separated. Where virtual private cloud providers host known applications, tighter security can be deployed at the network layer to further reduce the risk of unnecessary traffic. Security zones and firewall rule sets can be deployed to address the multi-tenancy concerns of cloud offerings.
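Security zones and firewall rule sets of the kind described can be sketched as a first-match policy evaluation over zone pairs. The zone names, rule fields, and default-deny behavior below are illustrative assumptions, not any provider's actual schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    src_zone: str   # originating security zone ("*" matches any)
    dst_zone: str   # destination security zone ("*" matches any)
    port: int       # destination port (0 matches any)
    action: str     # "allow" or "deny"

# First match wins; the final rule denies all cross-zone traffic by default,
# which is what keeps one tenant's zones unreachable from another's.
RULES = [
    Rule("tenant-a-web", "tenant-a-db", 5432, "allow"),
    Rule("internet", "tenant-a-web", 443, "allow"),
    Rule("*", "*", 0, "deny"),
]

def is_allowed(src_zone: str, dst_zone: str, port: int) -> bool:
    """Evaluate a connection attempt against the ordered rule set."""
    for r in RULES:
        if (r.src_zone in (src_zone, "*")
                and r.dst_zone in (dst_zone, "*")
                and r.port in (port, 0)):
            return r.action == "allow"
    return False

# Tenant A's web tier may reach its own database over 5432, but tenant B's
# zones never match an allow rule.
print(is_allowed("tenant-a-web", "tenant-a-db", 5432))  # → True
print(is_allowed("tenant-b-web", "tenant-a-db", 5432))  # → False
```

The default-deny final rule is the piece that directly addresses multi-tenancy: anything not explicitly permitted between zones is dropped.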
As with availability, access and connectivity to cloud resources also carry a higher degree of security. Accessing the virtual private cloud via virtual private networks or dedicated circuits can be beneficial for firms in highly regulated arenas, where enterprise data must be carefully protected to demonstrate financial and operational stability to regulators and investors.
Control
By design, public clouds give users direct control over the volume of computing resources provisioned: you simply provision what you need when you need it. But you cannot control what other customers in the resource pool consume, which may affect your environment and reduce performance predictability.
Public clouds also make modifications to the underlying infrastructure more challenging. For example, if a technical change is needed, such as a software patch or hardware swap, that change impacts everyone because customers are not isolated from each other. There is also no coordination with the applications running on top of the infrastructure, so updates may unexpectedly impact functionality. In addition, customers must diligently control the level of computing resources they contract for, monitoring what they need and use and requesting that resources be turned off when no longer needed; this leaves them with less control over computing costs.
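The monitoring discipline described above, tracking what you use and turning off what you don't, can be sketched in a few lines. The utilization threshold and instance names below are assumptions for illustration.

```python
# A minimal sketch of usage monitoring: flag provisioned instances whose
# average utilization is so low they should be released to control costs.
IDLE_THRESHOLD = 0.05  # below 5% average utilization, assume the instance is unused

def instances_to_release(utilization: dict) -> list:
    """Return the names of instances falling below the idle threshold, sorted."""
    return sorted(name for name, util in utilization.items()
                  if util < IDLE_THRESHOLD)

# Hypothetical monthly averages per provisioned instance.
usage = {"web-1": 0.62, "web-2": 0.58, "batch-7": 0.01, "test-3": 0.03}
print(instances_to_release(usage))  # → ['batch-7', 'test-3']
```

In a public cloud this review falls entirely on the customer; a managed virtual private cloud provider can perform it as part of the service.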
Conversely, a virtual private cloud gives you more control over the performance of the technology environment. Customers can work jointly with virtual private cloud providers to adhere to change control policies that may already be established. Resource allocation and load balancing can be finely tuned based on each customer's environment, usage patterns, and resource consumption.
The environment is also more resilient, since more sophisticated redundancy and failover capabilities can be incorporated. Virtual private clouds can more easily support varying degrees of data backup to match different data retention policies, and customized disaster recovery solutions based on recovery point and recovery time objectives can be built into the design criteria.
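The relationship between backup frequency and a recovery point objective (RPO) follows from a simple worst case: data written immediately after a backup completes is lost if a failure occurs just before the next one, so maximum data loss roughly equals the backup interval. The values below are illustrative.

```python
# A rough sketch, assuming maximum data loss ≈ the backup interval.
def meets_rpo(backup_interval_hours: float, rpo_hours: float) -> bool:
    """True if backing up every `backup_interval_hours` satisfies the RPO."""
    return backup_interval_hours <= rpo_hours

# Hourly backups satisfy a 4-hour RPO; nightly backups do not.
print(meets_rpo(1, 4))    # → True
print(meets_rpo(24, 4))   # → False
```

Recovery time objectives (how quickly service is restored) are evaluated separately and drive choices like standby capacity and failover automation rather than backup cadence.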
Utility and Consistency Requirements Dictate the Ultimate Choice
If your business requires basic computing resources where uptime and system control are not mission-critical, public clouds can serve as an inexpensive method for rapid provisioning of IT infrastructure. As is the case with most utility companies, public cloud providers offer a serviceable, raw platform at a low cost.
But if you want scalability benefits with more control, virtual private cloud services are much more likely to meet your requirements. Virtual private clouds essentially provide a more consistent experience because providers are more in tune with how their customers use the infrastructure and can plan accordingly. This allows for application performance SLAs where customers can shift their focus away from managing the infrastructure and concentrate on their business. Customers receive the benefits of scale and can leverage the cost savings that cloud provides without all the management issues.