By Rajesh Ramchandani
February 19, 2017 05:15 AM EST
Private Clouds - What Enterprises Should Know Before Implementation
Enterprises are rapidly implementing private clouds for sound business reasons. Private clouds offer greater agility, enabling companies to quickly adapt to constantly changing business needs and to innovate faster than the competition. Digital and cloud services can be provided to developers at a pace of innovation that equals any public cloud. Companies can maintain security and compliance, along with the reliability, availability and stability expected of large-enterprise IT systems. Private clouds can also provide better SLAs for new services than most public clouds offer.
While there is a sense of urgency surrounding implementation, enterprises must still carefully choose the best technology stack to deliver their new dynamic cloud infrastructure and services. They must develop a transition plan, set goals for availability, ease of use and developer experience, and identify any potential gaps that could jeopardize their transition and investment in newer cloud-native architectures.
In this article, we discuss some key considerations for choosing the right technology stack to deploy private and hybrid clouds.
The Foundation of a Private Cloud
The cloud journey for any enterprise begins with defining goals, creating a strategy and detailed implementation plan, and then identifying key milestones and metrics to track. A well-thought-out implementation plan that follows the latest trends in cloud computing is critical to success. Enterprises typically benefit from a hybrid cloud strategy, and most use public clouds for certain workloads, whether enterprise-wide or for workgroup-level applications.
Let's look at some of the key criteria for a private enterprise cloud strategy and implementation plan.
Business Case for Private Clouds
Implementing private clouds tends to be expensive in both capital and operational terms, requiring additional layers of technology, infrastructure and resources. From a business point of view, key considerations for using private clouds include:
- Infrastructure agility to enhance developer productivity and speed release of new services
- Migration to the DevOps process, improving services and applications delivery
- Control over infrastructure, with the ability to define SLAs
- Security and compliance maintained internally, rather than depending on third-parties
- Need to leverage existing infrastructure and data centers, optimizing capital investments. For example, leveraging VMware licenses in which enterprises have already invested billions of dollars
- Avoid 100 percent dependence on public cloud or external infrastructure providers, limiting exposure
- Manage capex and opex, with the ability to switch between private and public cloud options as needed.
Private Cloud Technology Stack
From discussions with several enterprises that have embarked on their journey to hybrid clouds, it's quite obvious that the right technology stack to build a private cloud may well decide the success or failure of the implementation project. Choosing a private cloud stack primarily means selecting technology for compute, storage and networking, as well as ensuring easy integration with other cloud services, platform-as-a-service offerings, and CI/CD and DevOps toolsets.
The key criteria to select the technology stack for private clouds include:
Open source vs. closed source technology
Technology for private and hybrid clouds is evolving rapidly, with an unprecedented pace of innovation in next-generation data center technologies and new application architectures. It's imperative to bet on a technology that is open source and has a vibrant developer community supported by one or more large vendors.
Closed source (proprietary) technology, driven by a single vendor, tends to evolve much more slowly, which can hamper the speed of innovation. At the same time, the evolution of the technology must stay focused on solving key enterprise requirements. Some open source projects, especially those with a very large presence of several vendors, tend to become multi-focused and driven by the respective agendas of the vendors themselves. Over-engineered open source technology is often overly broad and can cause unnecessary complexity in the implementation phase.
Key vendors that are core leaders of the open source project must be engaged to help implement hybrid clouds. Private cloud technologies tend to be complex and constantly evolving. Core technology developers can help navigate the intricacies of complex features and integration points, providing guidance for migrating and developing new applications on hybrid clouds.
Support for existing workloads
It's likely that some or most existing workloads will move to a private cloud to improve the efficiency and agility of applications. Identifying the technology stack's capabilities and potential gaps helps make the best decisions on managing existing applications and services. Enterprises should run several proofs of concept to understand how the migration will work and any changes that may be necessary along the way. Not all workloads are easy to migrate, because of how compute, storage and networking are implemented on private clouds compared to legacy data center architectures.
Standardize on IaaS or PaaS
Infrastructure-as-a-service (IaaS) provides flexibility to deploy various types of workloads and architectures, but requires enterprises to manually integrate several point solutions such as monitoring, auto scaling, disaster recovery or failure recovery.
On the other hand, Platform-as-a-Service (PaaS) provides an integrated application platform so applications and architectures can be standardized and developers can focus on writing code, not deploying infrastructure. While PaaS improves developer productivity, it does limit architectural flexibility. Applications must be written to the platform, which may create platform and/or vendor lock-in.
Enterprises should consider all the pros and cons of each platform. It's common to use both IaaS and PaaS, identifying the workload to be deployed on each platform. In such cases, enterprises must choose well-integrated platforms that do not create additional operational complexity and cost.
Operational efficiency and intelligence
The cost and complexity of operating private clouds must be a prime consideration, since the stability and reliability of the cloud determine the value derived from the private cloud initiative. Operational intelligence built into the platform provides a common framework for log file analysis, event and alert pipeline management, risk and compliance assessment, and capacity planning. Embedded deep analytics provide visibility into the operations and utilization of the platform, which affects its overall usability and ROI.
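To make the log-analysis piece of such a framework concrete, the sketch below is a minimal, hypothetical illustration (the log format, field names and threshold are assumptions, not drawn from any specific platform): it counts ERROR entries per service and flags services that cross an alert threshold.

```python
from collections import Counter

# Hypothetical log lines in the form: "<timestamp> <service> <level> <message>"
LOG_LINES = [
    "2017-02-19T05:00:01 api-gateway ERROR upstream timeout",
    "2017-02-19T05:00:02 api-gateway ERROR upstream timeout",
    "2017-02-19T05:00:03 billing INFO invoice generated",
    "2017-02-19T05:00:04 api-gateway ERROR connection reset",
]

ERROR_THRESHOLD = 2  # assumed alert threshold: errors per service

def services_to_alert(lines, threshold=ERROR_THRESHOLD):
    """Count ERROR entries per service and return those at or over threshold."""
    errors = Counter()
    for line in lines:
        _, service, level, *_ = line.split()
        if level == "ERROR":
            errors[service] += 1
    return {svc: n for svc, n in errors.items() if n >= threshold}

print(services_to_alert(LOG_LINES))  # → {'api-gateway': 3}
```

A real operational-intelligence platform would feed such counts into an alert pipeline and correlate them with capacity and compliance data; the point here is only the shape of the analysis.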
Metering, billing and charge-back
For enterprise-wide deployments, the enterprise cost structure requires IT to meter and charge back the utilization of cloud resources. While this feature may not be part of an integrated open source project, commercial distributions typically provide a fully integrated or certified third-party solution.
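The charge-back calculation itself is simple once metering data is available. The sketch below is a hypothetical illustration (the rates, metric names and business units are all assumptions): it multiplies each metered quantity by a unit rate and totals the charge per business unit.

```python
# Hypothetical internal unit rates (per vCPU-hour, RAM GB-hour, storage GB-month)
RATES = {"vcpu_hours": 0.04, "ram_gb_hours": 0.01, "storage_gb_months": 0.10}

# Metered utilization per business unit, as a metering service might report it
usage = {
    "marketing":   {"vcpu_hours": 1200, "ram_gb_hours": 4800,  "storage_gb_months": 500},
    "engineering": {"vcpu_hours": 9600, "ram_gb_hours": 38400, "storage_gb_months": 2000},
}

def charge_back(usage_by_unit, rates):
    """Multiply each metered quantity by its rate and total per business unit."""
    return {
        unit: round(sum(qty * rates[metric] for metric, qty in metrics.items()), 2)
        for unit, metrics in usage_by_unit.items()
    }

print(charge_back(usage, RATES))  # → {'marketing': 146.0, 'engineering': 968.0}
```

An integrated metering solution adds the hard parts around this arithmetic: reliable collection of the usage data, rate management, and invoicing.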
Technology roadmap
Ongoing development, a strong technology roadmap and the foundation of an open source project assure investment protection. Enterprises must invest time to understand the technology roadmap, aligning it with their own planned rollout of cloud and cloud services. Enterprises can invest resources and join open source foundations or project communities to exert influence over the roadmap. Open source projects welcome investment from larger enterprises, and this should be considered a top priority before committing to an open source technology. Failure to invest in and influence the open source project risks the project direction being steered by other, larger enterprises. That risk threatens not only the successful deployment and migration to hybrid clouds, but also leaves the door open for competitors to cause financial damage.
Federation with public clouds
Many enterprises need a hybrid cloud strategy, in which most of their resources and some cloud services are consumed from public clouds while the most sensitive workloads remain on premises, on private clouds. To ensure a seamless transition for developers and IT ops, it may be mandatory to federate the private and public clouds. Federation eliminates the complexity of managing multiple cloud environments, enabling single sign-on, consolidated metering and billing, and easy migration of workloads on demand.
Third-party integrations
The foundational private cloud solution can be extended with additional services and integrations to existing assets through third-party plugins, service brokers, or validated and certified integrations. Pre-tested integrations of third-party solutions help eliminate the cost of custom implementations and professional services. It's highly desirable to ensure that enough relevant integrations are available, including but not limited to networking, storage, load balancers, messaging systems, directory services, login systems, monitoring and other assets that enterprises have already deployed.
The benefits of private clouds are well documented, and while enterprises may be eager to begin implementing one, it's important to take the time to consider open source vs. closed source technology, existing workloads, IaaS vs. PaaS, technology roadmaps and other critical points. It's the time spent at the beginning that may well determine whether the journey to the cloud is a smooth one.