
PaaS Deployment Models

The contained and referential approaches

Rapid deployment capability is table stakes when we are talking about a PaaS solution. Every vendor touts it, and frankly, every user simply expects it to be there. While it is interesting to talk about rapid deployment and perhaps compare the speed of one solution to that of another, it is infinitely more interesting to talk about the mechanics of deployment for a particular solution. That is, the more interesting and important question is 'What deployment style does a particular solution take?'

At a very high, black and white level, I think two primary deployment styles permeate the landscape of PaaS today: contained and referential. I want to compare each approach, but before that, let me use a few words to describe each style.

- Contained: In the contained deployment model, PaaS solutions deploy environments based on packages that contain most, if not all, of the desired configuration as well as the logic to apply that configuration. For instance, if a solution were to deploy a virtual image in the contained model, the virtual machine would have the necessary information and logic embedded to configure itself upon start up. It would not necessarily need to contact external systems or wait for instructions from other actors.

- Referential: In the referential deployment model, PaaS solutions deploy environments using a minimal base package. At some point during the deployment process, the deployed environment communicates with a third party in some fashion to procure the necessary configuration information. Going back to the example above, if a virtual image were deployed in the referential model, the virtual machine would start up and then communicate with a third party service (either by initiating a request or waiting for instructions). This third party service would send down the configuration information and instructions for the environment hosted within the virtual machine.
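To make the contained style above concrete, here is a minimal sketch in Python. The package structure and function names are purely illustrative, not taken from any particular PaaS product; the point is that everything the instance needs is embedded in the package itself, so start-up requires no external call.

```python
# Contained model: the deployment package ships with both its
# configuration and the logic to apply it, so the instance can
# configure itself on start-up without contacting another system.
# All names here are illustrative only.
CONTAINED_PACKAGE = {
    "binaries": ["app-server", "db-driver"],
    "config": {"heap_mb": 2048, "port": 8080},
}

def boot_contained(package):
    """Apply the configuration embedded in the package itself."""
    config = package["config"]  # no external lookup needed
    return f"started on port {config['port']} with {config['heap_mb']}MB heap"

print(boot_contained(CONTAINED_PACKAGE))
```

Because the configuration travels inside the package, a change to `heap_mb` or `port` means rebuilding and redistributing the package, which is exactly the management burden discussed below.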

When comparing the two approaches, it is helpful to understand the advantages and drawbacks of each. A closer look at the contained model reveals an obvious benefit: speed. In this model, the deployment package contains most of what it will require in order to fully configure itself. It does not rely on contacting an external service and having to pull down the necessary binaries and configuration information.

This advantage comes with an obvious drawback: management burden. By building more and more into the deployment package, you increase the amount of content that must be maintained and updated in that package. This is not a huge concern if you only have a handful of discrete packages, but you may not be able to rely on that luxury. After some amount of time, the number of permutations to support may necessitate spending an inordinate amount of time updating deployment packages. If this is the case, you can easily end up in a situation where the benefits of rapid deployment are negated by increased administrative costs.

The referential approach avoids the above pitfall. In this model, the deployment package remains fairly skeletal. Instead of packing in all of the content as in the contained model, the deployment packages in the referential model know just enough to integrate with an external system to get the right configuration information (think Chef and Puppet). This means that you only need to update and maintain configuration data and configuration actions in a single location instead of in each and every deployment package. As the number of different required environments increases, this approach can mean a significant reduction in management burden.
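The referential style can be sketched the same way. In the hypothetical snippet below, the in-memory `CONFIG_SERVICE` dictionary stands in for an external configuration system such as a Chef or Puppet server; the deployment package carries only the knowledge of how to ask for its role's configuration.

```python
# Referential model: the skeletal package contacts a central service
# at deploy time to procure its configuration. CONFIG_SERVICE is a
# stand-in for an external system (e.g., a Chef/Puppet server);
# all names and values are illustrative only.
CONFIG_SERVICE = {
    "web-tier": {"heap_mb": 1024, "port": 80},
    "app-tier": {"heap_mb": 4096, "port": 8080},
}

def boot_referential(role, fetch=CONFIG_SERVICE.get):
    """Ask the central service for this role's configuration."""
    config = fetch(role)  # a network call in a real system
    if config is None:
        raise LookupError(f"no configuration registered for role {role!r}")
    return f"{role} started on port {config['port']}"

print(boot_referential("app-tier"))
```

Note that updating `CONFIG_SERVICE` changes the behavior of every future deployment without touching any package, which is the single-location maintenance benefit described above; the cost is the extra round trip at deploy time.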

There is a flip side to this coin, of course. The referential approach typically results in longer deployment times, depending on how long it takes to install and configure content for your environments. Since the deployment packages contain very little content at deploy time, they must pull or otherwise receive that data at some point during the deployment. This may or may not be a big issue for your particular use case, but it is a potential drawback worth considering.

So which approach is better? It is my opinion, one derived from numerous user experiences, that there is no way to generalize the answer to that question. In cases where content is infrequently updated and the number of environmental permutations is fairly well constrained, the contained deployment model can be extremely effective and efficient. On the other hand, in cases where content is dynamic and ever-changing, the referential deployment model is a virtual requirement. From a user's standpoint, I strongly suggest pursuing solutions that support both kinds of deployment models. Tools should not dictate your approach. Your requirements for a particular usage scenario should!
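One way to express the rule of thumb above, that stable, well-constrained environments favor the contained model while dynamic ones favor the referential model, is as a small decision function. The thresholds below are entirely illustrative assumptions, not recommendations from any vendor.

```python
# Hypothetical per-environment decision rule distilled from the
# discussion above. The numeric thresholds are illustrative only;
# your own requirements should set them.
def choose_model(updates_per_month, permutations):
    """Pick a deployment model for one usage scenario."""
    if updates_per_month <= 1 and permutations <= 5:
        return "contained"    # stable content, few variants: favor speed
    return "referential"      # dynamic content: favor central management

print(choose_model(0, 3))
print(choose_model(12, 40))
```

The larger point stands regardless of the exact thresholds: a tool that supports both models lets each scenario's requirements, not the tool, pick the style.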

More Stories By Dustin Amrhein

Dustin Amrhein joined IBM as a member of the development team for WebSphere Application Server. While in that position, he worked on the development of Web services infrastructure and Web services programming models. In his current role, Dustin is a technical specialist for cloud, mobile, and data grid technology in IBM's WebSphere portfolio. He blogs at http://dustinamrhein.ulitzer.com. You can follow him on Twitter at http://twitter.com/damrhein.
