And the Killer App for Private Cloud Computing Is

Automating components is easy. It’s automating processes that’s hard

The premise that you cannot realize an automated, on-demand data center unless your infrastructure is composed solely of Infrastructure 2.0 components is, in fact, wrong. The capabilities of modern hardware that come with Infrastructure 2.0, such as a standards-based API that automation systems can leverage, certainly make the task simpler, but they are not the only way components can be automated. In fact, “legacy” infrastructure has been automated for years using other mechanisms that can certainly be incorporated into the dynamic data center model.

When it’s time to upgrade or purchase new solutions, components enabled with standards-based APIs should certainly be considered before those without, but there’s no reason a hybrid data center replete with both legacy and dynamic infrastructure components cannot be automated in such a way as to form the basis for a “private cloud.” The notion that you must have a homogeneous infrastructure is not only unrealistic, it’s also indicative of too narrow a focus on individual components rather than the systems – and processes – that make up data center operations.

In “The Case Against Private Clouds” Bernard Golden blames the inability to automate legacy infrastructure for a yet-to-occur failure in private cloud implementation:

The key to automating the bottom half of the chart -- the infrastructure portion -- is to use equipment that can be configured remotely with automated measures. In other words, the equipment must be capable of exposing an API that an automated configuration system can interact with. This kind of functionality is the hallmark of up-to-date equipment. Unfortunately, most data centers are full of equipment that does not have this functionality; instead they have a mishmosh of equipment of various vintages, much of which requires manual configuration. In other words, automating much of the existing infrastructure is a non-starter.

The problem here is the claim that legacy infrastructure requires manual configuration and that automating most of the infrastructure is therefore a “non-starter.” In other words, if you have “legacy” infrastructure in your data center, you can’t build a private cloud, because there’s no way to automate its configuration and management.

Identity Management Systems (IDMS) focused on provisioning and process management solved this particular problem long ago, as has a plethora of automation- and scripting-focused vendors that provide automation technology for network and systems management tasks. CMDB (Configuration Management Database) technology, too, has some capabilities around automating the configuration of network-focused devices that could easily be extended to cover a wider variety of network and application network infrastructure.

Any network or systems administrator worth their salt can whip up a script (PowerShell, bash, Korn, whatever) that automatically SSHes into a remote network device or system and launches another script to perform X or Y and Z. This is not rocket science; it isn’t even very hard. We’ve been doing this for as long as we’ve had networked systems that needed management.
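That kind of script is trivially small. Here's a minimal sketch of the pattern – a wrapper that runs a command on a remote device over SSH – where the host name, key path, and remote script paths are hypothetical placeholders, not any real infrastructure (a DRY_RUN switch is included so the wrapper can be exercised without a live device):

```shell
# Hedged sketch: the kind of wrapper any admin can whip up. remote_exec builds
# (or runs) an SSH command line that launches a script on a remote device.
# The key path, admin user, and remote commands are hypothetical placeholders.
remote_exec() {
    host="$1"; shift
    cmd="$*"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        # Dry-run mode: print the command we would execute instead of connecting.
        printf 'ssh -i ~/.ssh/automation_key admin@%s "%s"\n' "$host" "$cmd"
    else
        ssh -i ~/.ssh/automation_key "admin@$host" "$cmd"
    fi
}

# Example: reload the config on a (hypothetical) legacy device, then verify it.
DRY_RUN=1 remote_exec router1.example.net "/opt/scripts/reload_config.sh && /opt/scripts/verify_health.sh"
```

Swap the echo for the real `ssh` invocation and schedule it from cron or an automation framework, and you've automated a "legacy" component with no API in sight.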

What is hard, and what’s going to make “private” clouds difficult to implement, is orchestration and management. That’s hard, and it is largely immature at this stage, because you’re automating processes (i.e. orchestration), not systems.

That’s really the key to a cloud implementation – not the automation of individual components in the network and application infrastructure.


AUTOMATION IS EASY. ORCHESTRATION IS WHAT’S HARD.


Anyone can automate a task on an individual data center component. But automating a series of tasks, i.e. a process, is much more difficult, because it requires not only an understanding of the process but also, essentially, “integration.” And integration of systems, whether on the software side of the data center or on the network and application network side, is painful. It should be a four-letter word – a curse – and though it isn’t considered one, it’s often vocalized with the same tone and intention as an ancient curse.

But I digress. The point is not that integration is hard – everyone knows that – but that it’s the integration and collaboration of components comprising the automation of processes, i.e. orchestration, that makes building a “private cloud” difficult.
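The difference shows up the moment you string tasks together: a process has ordering, dependencies, and – the part component-level automation never has to think about – failure handling across components. A hedged sketch, with entirely hypothetical step names standing in for calls to individual component automations:

```shell
# Hedged sketch: automating a *process* means sequencing component-level tasks
# and unwinding them when one fails -- the integration work that makes
# orchestration hard. Step names are hypothetical placeholders.
run_step() {
    # Stub: in practice each step would call a component's API or remote script.
    echo "run: $1"
    [ "$1" != "${FAIL_STEP:-}" ]   # simulate a failure when FAIL_STEP matches
}

provision_app() {
    steps="allocate_vm configure_network deploy_app register_lb"
    done_steps=""
    for step in $steps; do
        if run_step "$step"; then
            done_steps="$step $done_steps"
        else
            # One component failed: roll back the steps already applied, in reverse.
            for undo in $done_steps; do
                echo "rollback: $undo"
            done
            return 1
        fi
    done
    echo "provisioned"
}

provision_app
```

Run it with `FAIL_STEP=deploy_app` and the first two steps are rolled back in reverse order. Every branch of that unwind logic is integration work between components that, individually, were already automated.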

A management and orchestration solution that can easily integrate both legacy infrastructure and Infrastructure 2.0 – via standards-based APIs and traditional “hacks” requiring secure remote access and remote execution of scripts – is the “killer app” for “private cloud computing.”
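In other words, the orchestration layer presents one interface and dispatches to whichever mechanism a component actually supports. A minimal sketch of that dispatch, assuming made-up device names and endpoints (the commands are echoed rather than executed, since both the REST endpoint and the remote script path are hypothetical):

```shell
# Hedged sketch of the "killer app" idea: one orchestration entry point that
# drives Infrastructure 2.0 devices through a standards-based API and legacy
# gear through remote script execution. Devices and endpoints are made up.
configure_device() {
    device="$1"; setting="$2"
    case "$device" in
        api:*)
            # Modern component: hit its (hypothetical) REST management API.
            echo "curl -X PUT https://${device#api:}/mgmt/config -d '$setting'"
            ;;
        ssh:*)
            # Legacy component: fall back to the traditional remote-script hack.
            echo "ssh admin@${device#ssh:} '/opt/scripts/apply.sh $setting'"
            ;;
    esac
}

configure_device api:lb1.example.net "pool=web timeout=30"
configure_device ssh:switch7.example.net "vlan=42"
```

The process layer above this never needs to know which path was taken – which is exactly what makes a hybrid legacy-plus-Infrastructure-2.0 data center orchestratable.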

It’s already been done in the identity management space (IDMS). It’s already been done in the business and application space (BPM). It should be no surprise that it will, eventually, be “done” in the infrastructure world. Folks watching the infrastructure and cloud computing space just have to stop looking at two layers of the stack and broaden their view a bit to realize that the answer isn’t going to be found solely within the confines of infrastructure. Like the model and applications it hosts, it’s going to be found in a collaborative effort involving components, systems, and people.



More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
