By David Linthicum
August 16, 2011 10:00 AM EDT
While the hype rages around cloud computing, most cloud implementations go the way of the private cloud and avoid the public clouds for now. Private clouds are exactly what they sound like: your own instance of SaaS, PaaS, or IaaS, running in your own data center, all tucked away, protected and cozy. You own the hardware; you can hug your server.
However, what defines a private cloud these days could also mean systems that are remotely hosted but dedicated to a single enterprise, and, in some cases, provided out of a public cloud data center as a virtual private cloud. Thus any cloud infrastructure that's dedicated to a single organization is getting the "private cloud" label. This includes the emerging relabeling of existing enterprise software and hardware solutions, looking to deliver cloud-in-a-box private clouds.
If this sounds confusing, it is. The technology vendors and the hype clearly load up the term "private cloud" with everything and anything. However, the concept of private cloud computing has the potential to bring a huge amount of value to enterprise IT. That is, if we understand the right approach, and how to leverage the right technology to create the building blocks of the private cloud.
Why Go Private?
Most enterprises are eager to leverage cloud computing, but not so eager to place core business processing and critical business data on public clouds. Indeed, there may even be legal restrictions on where data may exist, as we have seen in the financial and health verticals, where some types of data may not exist outside of the enterprise. Or, the risk of compromised or lost data outweighs the value that public cloud computing will bring.
While the regulations are real, most of those who select private over public cloud computing do so around control issues. Many in enterprise IT don't like to give up control of core business systems since that is where they may place their own value. If these systems are controlled and managed by others outside of the enterprise, they feel their value will be diminished. In most cases these are false perceptions.
Security is another reason to go private cloud. Public clouds provide rudimentary security subsystems that have thus far had a good track record. However, most enterprises do not consider public clouds as secure as systems that exist on site or as those remotely hosted but completely under the enterprise's control. While public cloud security is getting better, private clouds do offer fewer security risks.
Finally, there are performance issues with public clouds that include the natural latency of leveraging the Internet. This is a matter of how the applications and systems are designed more than limitations of the clouds, but in some instances these are valid concerns in problem domains with a high amount of data transfer between the data server and the consumer.
What's a Private Cloud?
NIST defines a private cloud as "The cloud infrastructure is operated solely for an organization. It may be managed by the organization or a third party and may exist on premise or off premise." For the most part, that is the definition that many are running with. However, let's go a few steps farther to define the core attributes of private clouds, and cloud computing in general. They are:
- Multitenancy and resource pooling
- Self- or auto-provisioning
- Use-based accounting
- Security
- Governance
First you'll notice that virtualization is not on the list, despite the fact that those who leverage virtualization often call clusters of virtualized servers a private cloud. The reality is that virtualization is often used when building a private cloud, and it is described below as a building block. But simple virtualization does not a private cloud make, and you can choose to leverage it or not. For example, Google's cloud systems do not leverage virtualization, but Amazon's AWS does.
Multitenancy refers to the managed access to resources (such as storage and compute services) in an environment where there is more than one user sharing those resources. This is a critical building block of private cloud computing. We could have hundreds or thousands of users who share the same sets of servers and attached devices. That creates the need to ensure that no particular resource gets saturated by simultaneous access, and that user and application processes stay out of each other's way. The mechanisms and approaches to multitenancy vary greatly from cloud to cloud, but the objectives are much the same.
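Although the mechanisms vary greatly from cloud to cloud, the shared objective can be sketched in a few lines. The class below is a minimal, illustrative model only, assuming an in-memory pool of compute slots with a per-tenant quota; real clouds enforce this with schedulers and hypervisors, not a Python class:

```python
import threading

class ResourcePool:
    """Toy multitenant pool: a fixed set of compute slots shared by
    many tenants, with a per-tenant quota so no single tenant can
    saturate the pool. Purely illustrative."""

    def __init__(self, total_slots, per_tenant_quota):
        self.free = total_slots
        self.quota = per_tenant_quota
        self.held = {}                    # tenant -> slots currently held
        self.lock = threading.Lock()      # serializes concurrent requests

    def acquire(self, tenant, n):
        with self.lock:
            used = self.held.get(tenant, 0)
            if n > self.free or used + n > self.quota:
                return False              # pool saturated or quota exceeded
            self.held[tenant] = used + n
            self.free -= n
            return True

    def release(self, tenant, n):
        with self.lock:
            self.free += n
            self.held[tenant] = max(0, self.held.get(tenant, 0) - n)
```

The lock is what keeps user and application processes "out of each other's way" in this toy version; the quota is what keeps one tenant from saturating the shared resource.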
Related to multitenancy, resource pooling means that the provider's computing resources are pooled to serve multiple consumers using a multitenant model. Different physical and virtual resources are dynamically assigned and reassigned according to consumer demand.
Perhaps the most important concept of private cloud computing is self or auto-provisioning. This is the ability for an application or a user to dynamically allocate resources (such as storage and compute) during operations. This is typically accomplished by invoking a provisioning API, or, in some cases, going to a Web page where the resources can be manually allocated. In some cases the resources are automatically provisioned as needed. In addition, the same mechanisms can de-provision these resources after use.
Because we pay for the minutes of use, even within a private cloud, we can allocate the resources required to perform an operation, and then put them back when done. For instance, we could allocate a hundred servers to perform a database extraction in 10 minutes, and then return those servers to the cloud for others to use. Thus, we're being as efficient as possible with both the resources and the dollars spent. This provisioning mechanism also provides the elasticity that many count in the advantage column of cloud computing: the ability to expand and contract the use of cloud resources as needed to support the application.
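The provision-then-deprovision cycle just described can be sketched from the application's point of view. The client below is an invented, in-memory stand-in; its class and method names are assumptions for illustration, not any vendor's API (a real private cloud would expose this over the network, as with the EC2-compatible API Eucalyptus provides):

```python
class ProvisioningClient:
    """Hypothetical self-provisioning client. Names and parameters are
    invented for illustration; real provisioning APIs differ in detail."""

    def __init__(self):
        self._next_id = 0
        self.active = {}                  # instance handle -> spec

    def provision(self, cpu, memory_gb):
        """Allocate a compute resource and return a handle to it."""
        self._next_id += 1
        handle = f"i-{self._next_id:04d}"
        self.active[handle] = {"cpu": cpu, "memory_gb": memory_gb}
        return handle

    def deprovision(self, handle):
        """Return the resource to the pool after use."""
        return self.active.pop(handle, None) is not None
```

An application would typically pair the two calls, provisioning at the start of a job and deprovisioning in a finally block, so that resources always go back to the pool when the work is done.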
Use-based accounting refers to an automated approach to keeping track of those who leverage a private cloud, and charging them back for the use. Most private clouds are known resources, typically within the same company or government agency, so these are budget dollars. Statements are typically sent that describe the use of resources, duration, and the cost. This is also helpful for understanding how applications and users consume the private cloud resources, and for tracking the times of day when system loads could be an issue.
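The chargeback calculation itself is simple arithmetic over usage records. The sketch below assumes a flat per-server-minute rate, which is an invented simplification; real accounting systems meter storage, compute, and network separately:

```python
def chargeback(usage_records, rate_per_server_minute):
    """Roll usage records up into per-department charges.

    usage_records: list of (department, servers, minutes) tuples.
    The single flat rate is an assumption made for illustration.
    """
    bill = {}
    for dept, servers, minutes in usage_records:
        cost = servers * minutes * rate_per_server_minute
        bill[dept] = bill.get(dept, 0.0) + cost
    return bill
```

At an assumed rate of $0.01 per server-minute, the hundred-server, 10-minute database extraction described above would show up on the statement as $10, a concrete way to see the efficiency argument for returning servers to the pool promptly.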
Security is required to ensure that only those with authorization, including both humans and machines, can leverage the private cloud. Typically this is role-based security using user IDs and passwords. Newer, more sophisticated security models such as federated identity management have proven to be more effective. We'll cover more about security below.
Governance means that we not only secure our private cloud, but we can create and manage policies to control access to resources and services. We can define limits on when and how resources (such as storage, compute, and database services) are accessed by applications and users who leverage the private cloud.
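A governance policy can be thought of as a rule mapping a resource to the roles allowed to use it and the limits on that use. The policy table and field names below are assumptions made for illustration; commercial governance products express policies in far richer terms:

```python
# Toy policy table: resource -> allowed roles and a cap on units
# per request. Entirely illustrative; real policies also cover
# time-of-day limits, approval workflows, and more.
POLICIES = {
    "storage": {"roles": {"analyst", "admin"}, "max_units": 500},
    "compute": {"roles": {"admin"},            "max_units": 100},
}

def is_allowed(role, resource, units):
    """Check a request against the policy table; deny by default."""
    policy = POLICIES.get(resource)
    if policy is None:
        return False                      # unknown resources are denied
    return role in policy["roles"] and units <= policy["max_units"]
```

The deny-by-default behavior for unknown resources reflects the point of governance: access exists only where a policy explicitly grants it.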
Private Cloud Configurations
The latest configurations of private clouds are no longer just for data centers. As introduced above, many private clouds may be outsourced as "virtual private clouds" within public cloud computing providers. Amazon Web Services (AWS) provides just such a service, called Virtual Private Cloud or VPC. Using this service you have the ability to logically group Amazon EC2 instances and assign them private IP addresses, and thus control traffic to and from the servers. AWS also offers an additional layer of security that allows you to create and manage network Access Control Lists (ACLs). Finally, you can connect to the AWS data center using a VPN connection, and thus make the VPC an extension of your enterprise network. The cloud provider maintains the hardware for you, but you don't have physical access to the servers.
In other offerings, public cloud providers may even provide you with access to a dedicated physical server that you never actually see. Of course this is at an additional cost, but many enterprises feel better if their servers only store their data. In a virtualized and multitenant cloud, you're mixed in with everyone else who uses that cloud. Again, you don't have physical access to the hardware, but the maintenance is handled by the cloud provider.
Other private clouds may exist in colocation data centers, or CoLos. These are data center rentals where you own a cage full of servers that are tied directly back to the enterprise. Unlike virtual private clouds or virtual private instances, you have access to the physical hardware when using this configuration. This means you need to maintain the hardware as well.
Another approach is something called "cloud-in-a-box," which is a server or cluster of servers that has been pre-configured to provide most of the private cloud services listed above. You just purchase the thing as a stand-alone server or appliance, install it in the data center, and you have your private cloud. Oracle's Exalogic private cloud solution is clearly an example of a private cloud-in-a-box, one that comes with a million-dollar starting price.
Don't forget there is the traditional approach to private cloud computing, where software is installed and configured on commodity servers that exist within the data center, and that becomes the private cloud. Server-run private cloud software provides most or all of the core private cloud attributes listed above. This is the most popular configuration today, though the configurations above are gaining ground in light of the desire for convenience and speed.
Building Blocks of Private Cloud
The building blocks of private cloud computing include the server virtualization software that many employ as a foundation for creating the private cloud. However, some private cloud solutions don't leverage virtualization, as described above.
A common mistake is to assume that several virtualized servers are a private cloud. Without the addition of multitenancy, use-based accounting, auto or self-provisioning, and other cloudy features we've described above, the private cloud functionality won't be there.
However, many private cloud solutions are ready-made to take advantage of server virtualization, including VMware's vCloud Director, which leverages VMware hypervisors. Or, if you're going open source, Eucalyptus can use a variety of virtualization technologies, including VMware, Xen, and KVM hypervisors, to implement the cloud abstractions it supports.
Private cloud software is mostly purchased as pre-built packages, although it's possible to roll your own using various software components that provide the services defined above. Just as with the public cloud space, we can place private clouds into three core categories: IaaS, PaaS, and SaaS.
IaaS private clouds are perhaps the most popular type of private cloud. They provide self-provisioned access to core infrastructure services including storage and compute. The most popular packaged IaaS systems include VMware's vCloud Director and Eucalyptus Systems, Inc.'s Eucalyptus. However, the popularity of cloud computing is driving newer private cloud software solutions to the market including cloud.com and Nimbula, just to name a few. Moreover, there are private clouds that provide just storage or just database services, but no access to a complete platform of resources.
However, there are also PaaS-based private clouds that are beginning to show up in data centers. Like their public computing counterparts, these platforms provide the benefit of shared application development and deployment platforms. Examples of providers in this space include Microsoft with their private cloud version of Azure.
Finally, there are SaaS versions of private clouds that provide access to common application services using a SaaS model, but deploy from a private cloud. These are typically tactical software instances, such as e-mail and calendaring, but can also be system management and even enterprise applications.
Another building block is cloud service management. Here we leverage mechanisms to manage the private cloud instance, including allocating and de-allocating servers, user management, security management, and other maintenance issues that need to be dealt with during the operations of the private cloud. While you would think that these services would come from the private cloud computing software provider, in some cases they have to be sourced from a third party, such as abstract management of virtualized servers or storage management.
Use-based accounting, as defined above, is the ability to track the usage of the private cloud by humans and machines. Again, in many instances, this feature will be provided by the private cloud software, but third party software can be integrated, or you may even leverage a public cloud service to perform this function.
Security within a private cloud environment is typically pretty basic. To create the proper security solution you need to work from the requirements, which usually involve existing security and compliance policies. While simple role-based security is often fine for most applications, there are requirements for more sophisticated security mechanisms such as advanced encryption, or federated identity solutions that allow for a more granular security configuration. The usual security suspects are where to look here, such as RSA for encryption and IBM and Oracle for federated identity technology.
Governance solutions for private cloud computing are perhaps the most overlooked component of the private cloud solution, but something that most of those who implement private cloud services will require at some point. Again, the concept is to place rules and policies around cloud services, ensuring that they are properly leveraged by authorized clients. There are a few governance solutions that now support private clouds, such as Layer 7, Oracle, and Vordel.
So what does the hardware footprint look like for a private cloud? It's really a matter of the capacity you need to support, and it can be anywhere from one appliance to several dozen racks of servers. They can cost from a few hundred dollars to over a million dollars, depending on the need and configuration.
Best Practices
While private clouds are still very new in our world, some best practices are beginning to emerge around how to define, design, and implement a private cloud.
The first best practice is to focus on the requirements before you begin your journey to a private cloud solution. Many of those tasked with deploying private clouds skip the requirements, take a shot in the dark at the best architecture and technology, and thus often miss the mark. As a rule, make sure to move from the requirements, to the architecture, and then to the solution. While the lure of a private cloud-in-a-box is sometimes too difficult to resist, most solutions require a somewhat more complex planning process to deliver the value.
Also recommended is the use of service oriented architecture (SOA) approaches around the definition and architecture of private clouds. Many find that the use of SOA concepts, which can deliver solutions as sets of services that can be configured into solutions, is a perfect match for those who design, build, and deploy private clouds.
The second best practice is to define the business value of the private cloud before the project begins. There should be a direct business benefit that is gained from this technology. Many private cloud deployments will cost many millions of dollars, and will thus draw questions from management. You need to be prepared to provide solid answers as to the ROI.
The final best practice is to work in small increments. While it may seem a good idea to fill half the data center with your new private cloud (you'll need the capacity at some point, right?), not now. You should only create private cloud instances with the capacity requirements for the next year. If you've designed your private cloud right, and have leveraged the right vendors, increasing capacity should be as easy as adding additional servers as needed.
In Your Future?
Private clouds are really a direct copy of the efficiency of public cloud computing architectures, repurposed for internal use within enterprises. The benefits are somewhat different, as is the technology, architecture, and the way private clouds are deployed. In many respects private clouds are just another internal system, but it's the patterns of use where the value of private clouds really shines through, including access to shared resources that can be allocated on-demand.
Challenges that exist include the confusion around the term "private cloud," which is overused simply as a way to push an existing software or hardware product as something that's now "a cloud," and thus relevant and cool. This cloud washing has been going on for some time, with everything from disk drives to printers and scanners being positioned within the emerging space of the private cloud as "clouds."
The only way to counter this confusion is to stick to our guns in terms of what a private cloud is, including its attributes and building blocks as discussed in this article. Without a clear understanding of the concept of a private cloud, and the best practices and approaches to build a private cloud, it won't provide the value we expect.