
@CloudExpo: Blog Post

User Control and the Private Cloud

Sacrificing some control may mean big benefits

One of the comparison points between the public and private cloud domains is the difference in the level of control and customization over the cloud-based service. In a public cloud environment, users typically receive highly standardized (and in many cases commoditized) IT services from the provider. In the private cloud, users usually have a higher degree of control and customization over the cloud-based service. I would wager that to many this makes perfect sense: it is simply a manifestation of who controls the means of delivery (a third party in the public domain, end-users in the private domain).

Presumably, those users who are heading down the public cloud route have come to grips with the fact that they will have less control over certain elements. They are happy to get fast, on-demand access to services at the cost of some control. However, I have come to learn that private cloud users often have a much different expectation. They want the features they get from a public cloud, but typically do not want to give up one iota of control or customization capability. Not to be the bearer of bad news, but in most cases this is all but impossible.

While I believe it is obvious that private clouds provide a much higher degree of customization and control than their public counterparts, that does not mean every single thing about a service delivered via a private cloud is configurable by the end-user. In fact, some things that are pre-configured and immutable deliver significant value to the user.

As an example, let's consider the scenario of an enterprise working to build a private cloud to run their application platforms. Part of this effort is likely to include the creation of a service catalog that contains different application platforms they can run in their cloud. A significant portion of the service catalog will consist of vendor-supplied, virtualized application platforms. These virtualized offerings have been pre-packaged, pre-integrated, and optimized by the vendors, and they are ready to simply activate and run within the private cloud.

By getting these virtualized packages from vendors, the company spares the cost of developing and maintaining them over time. This is good for multiple reasons, including:

- The company can focus on core competencies, which we can assume do not (and should not) include application infrastructure administration

- The company benefits from pre-packaged application stacks, which the vendor configures, integrates, and optimizes to achieve the best possible service delivery time and performance.
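To make the idea concrete, a catalog entry for a vendor-supplied virtualized platform might separate the attributes the end-user can still configure from those the vendor has baked into the image. This is a minimal sketch; the field names, paths, and product names are hypothetical and not drawn from any particular cloud product:

```python
# Hypothetical service catalog entry for a vendor-supplied,
# pre-packaged application platform. All names are illustrative.
catalog_entry = {
    "name": "app-server-7.0-virtual",
    "vendor": "ExampleSoft",
    # Immutable: baked into the virtual image by the vendor.
    "fixed": {
        "install_path": "/opt/appserver",
        "os_image": "rhel-6-x86_64",
    },
    # Configurable: the user may override these per service request.
    "configurable": {
        "system_users": ["appadmin"],
        "initial_disk_gb": 20,
    },
}

def provision(entry, overrides=None):
    """Build a service instance: configurable settings accept
    per-request overrides, fixed settings are copied verbatim."""
    overrides = overrides or {}
    for key in overrides:
        if key in entry["fixed"]:
            raise ValueError(f"'{key}' is fixed by the vendor image")
    config = {**entry["configurable"], **overrides}
    # Fixed settings are merged last so they always win.
    return {**config, **entry["fixed"]}

instance = provision(catalog_entry, {"initial_disk_gb": 50})
```

The point of the split is visible in the `provision` function: a request can resize the disk, but an attempt to relocate `install_path` is rejected outright, because that choice was made by the vendor when the image was built.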

Of course, these pre-packaged, virtualized application stacks have another effect as well: the company gives up some control over the exact configuration of the software bits. This may include things like the location of the installed product on disk (which I've learned is not nearly as inconsequential as I thought), default system users, initial disk allocation, etc. Some of these things may be reconfigurable by the end-user (system users, disk allocation), but invariably some remain burned into the virtualized stack for all of eternity (installed product location).

In this case, end-users sacrifice some control and in turn achieve simplicity, rapidity, and standardization. After all, if the system required the user to specify installation locations for each product in the application stack for every service request, the delivery of said service would not be as simple or fast. In addition, there would be a lesser degree of standardization among the running services, since it would be possible for the software inside those instances to have varying installation directories. This may seem minor, but it could have substantial impacts on the ability to centrally administer these cloud-based services.
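A toy illustration of that last point (the hosts, paths, and file names here are invented for the example): with a standardized install location, a central administration task can treat every instance identically, while varying locations force it to maintain and consult a per-instance registry first.

```python
# With a standardized image, every instance shares one install path,
# so a fleet-wide task (e.g., locating a config file) needs no lookup.
STANDARD_INSTALL_PATH = "/opt/appserver"  # fixed by the vendor image

instances = ["host-a", "host-b", "host-c"]

def config_file(host, install_path=STANDARD_INSTALL_PATH):
    # A real tool would fetch the file remotely; here we just build the path.
    return f"{host}:{install_path}/conf/server.xml"

# Uniform paths: one expression covers the whole fleet.
uniform = [config_file(h) for h in instances]

# Non-uniform paths: a registry of per-host locations must be kept in
# sync with reality before any central task can run.
per_host_paths = {
    "host-a": "/opt/appserver",
    "host-b": "/usr/local/as",
    "host-c": "/srv/app",
}
varied = [config_file(h, per_host_paths[h]) for h in instances]
```

The uniform case needs no bookkeeping at all; the varied case adds a registry that becomes one more thing to administer, which is exactly the central-administration cost described above.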

In the end, the very specific use case above highlights general themes regarding control in a private cloud environment. Private cloud environments typically offer more control and customization capability than their public cloud counterparts, but that does not mean they offer the same control capability that users have grown accustomed to in their traditional environments. Users must usually sacrifice some control in order to achieve the benefits of simple, fast, and standardized cloud-based service delivery and administration. When confronted with potentially giving up control over a specific capability, users should consider this simple question:

- Do I need control over this element?

It seems like an obvious question, right? However, I emphasize need because users often merely want control. They are used to a certain culture, or a "that's the way we do it" mode of operation. Making decisions based on culture and standard operating procedure does not always yield the best results for the enterprise.

Of course, the entire onus is not on the users when it comes to this dilemma in the private cloud. Vendors must carefully design and implement private cloud solutions and services so that they offer a high degree of customization and configurability, all the while delivering out-of-the-box value. They cannot be overwhelming to use, and at the same time they should accommodate a wide array of use cases. As you may imagine, this is far from easy and the state of this art is just beginning to evolve.

More Stories By Dustin Amrhein

Dustin Amrhein joined IBM as a member of the development team for WebSphere Application Server. While in that position, he worked on the development of Web services infrastructure and Web services programming models. In his current role, Dustin is a technical specialist for cloud, mobile, and data grid technology in IBM's WebSphere portfolio. He blogs at http://dustinamrhein.ulitzer.com. You can follow him on Twitter at http://twitter.com/damrhein.
