Accelerating the Move to Cloud-Based Client Computing

Delivering personality into the standardized desktop is essential

For a long time we have known that corporate use of PCs is inefficient and overly expensive: analysts estimate that managing a typical PC over its lifetime costs around three times its purchase price. But, until recently, there has been little that organizations could do to change the situation while still delivering acceptable service. Virtualization has changed this in important ways: the physical PC need no longer be the key delivery mechanism, so images can be hosted pretty much anywhere. Essentially, we can host copies of a client operating system and deliver a display protocol to users over the network. However, as with many things, the devil and the opportunity are in the details. Let's look more closely at how cloud-hosted clients can work today and how changes underway will improve the situation in the future.

Today, we can move desktops into the cloud and manage them in much the same way that we currently manage physical machines. A service provider can build a system to deliver client desktops hosted in the cloud. A customer organization provides copies of its gold build desktops to the provider, who replicates them for each user and allocates an image to every user the first time they connect. From then on, each user is linked to that same image every time they connect. Behind the scenes, the provider takes care of all of the housekeeping, such as the following (sketched in code after the list):

  • Storing the virtual machine for the user
  • Delivering users' VMs to a hypervisor running on one of the service provider's servers, and starting, stopping and storing the virtual machine
  • Dealing with issues of authentication and integration with customer systems
  • Managing security of data and communications with the users' virtual machines
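
To make the allocation model concrete, here is a minimal sketch in Python; the names (GoldImage, FirstStageProvider, connect) are hypothetical and stand in for whatever provisioning API a real provider would expose.

```python
# Minimal sketch of first-stage image allocation: clone the customer's
# gold build on first connect, then return the same image thereafter.
# All class and method names here are illustrative, not a vendor API.
import copy
import uuid
from dataclasses import dataclass


@dataclass
class GoldImage:
    """The customer's master desktop build, replicated per user."""
    name: str
    patch_level: int = 1


@dataclass
class UserImage:
    """A per-user clone; after first connect the user keeps this image."""
    image_id: str
    owner: str
    contents: GoldImage  # snapshot taken at allocation time


class FirstStageProvider:
    def __init__(self, gold: GoldImage) -> None:
        self.gold = gold
        self._allocations: dict[str, UserImage] = {}

    def connect(self, user: str) -> UserImage:
        # First connection: replicate the gold build and bind it to the
        # user. Every later connection returns that same image, which
        # from now on must be patched and managed individually.
        if user not in self._allocations:
            self._allocations[user] = UserImage(
                image_id=str(uuid.uuid4()),
                owner=user,
                contents=copy.deepcopy(self.gold),
            )
        return self._allocations[user]


provider = FirstStageProvider(GoldImage("corp-desktop"))
first = provider.connect("alice")
assert provider.connect("alice").image_id == first.image_id
```

Note the consequence flagged in the comment: once allocated, each image drifts on its own and must be patched individually, which is exactly the cost this model fails to remove.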

The attraction of this system is that it is very similar to the current ways of operating and, therefore, familiar to IT organizations: users own a desktop image, the image is patched with existing tools and, at the end of the virtual machine's life, standard processes can be used for the destruction of the image. Not that implementing such a system is easy. As with any new way of working, there are problems to be solved: in this case, who is going to be responsible for each step of the image lifecycle from creation through patching, maintenance and support to eventual destruction and, ultimately, who takes responsibility for any failures.

The strength of this approach - its similarity to current practices - is also its real weakness: it does not change the model enough to really change the economics of client computing. Looking at the details of such a solution, on the positive side the customer can benefit from the economies of scale a cloud provider brings in offering servers cheaply, and, if users are widely dispersed, the provider's networking strength could deliver a better interactive experience than the customer organization could achieve hosting the servers internally. On the downside, the PC still needs to be managed in much the same way as before. While we have reduced the need for desk-side support, we have added a new layer of administration between the customer organization and the provider. Hence, this solution will play well where there is some additional benefit to moving user desktops out of the customer organization and to a provider, such as large numbers of widely dispersed users outside the LAN, but not more generally.

I like to think of the above as a "first-stage" approach to desktop hosting because, while it can work, it does not deliver the level of benefit needed to become a solution for the majority of organizations or users in those organizations. The key to the next, higher-value, solution is to recognize that virtualization is a far more powerful concept than just providing a way to run multiple virtual machines on a server. Because of the isolation that virtualization provides, we can think of it as separating and keeping separate the different components of a user's desktop. Instead of thinking of each user as having a software image and managing that as an individual unique asset, recognize that all the users' images are basically the same with some select differences. In this way, we get to benefit from economies of scale across all that is similar and just manage the differences that make each user think that their machine is "theirs."

How does this work in practice? Each time users log on, they are given a clean copy of an operating system with a standard set of applications already installed. You can think of this as being similar to the gold image that we might have cloned in a first-stage implementation, except that here we make a fresh image every time the user logs on rather than just once. This has the side benefit of making patch management and delivery far simpler and less error-prone: rather than having to patch each and every user image, many of which may have drifted so far from standard that patching fails, we just patch the gold image, and users get it the next time they log on.
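
A minimal sketch of this clone-on-logon idea, reusing the hypothetical names from the earlier sketch: the provider patches one gold image, and every logon starts from a fresh copy of it.

```python
# Minimal sketch of clone-on-logon: there are no long-lived per-user
# images, so patching the single gold image is enough; each user picks
# up the patch at their next logon. Names are illustrative only.
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class GoldImage:
    name: str
    patch_level: int = 1


class CloneOnLogonProvider:
    def __init__(self, gold: GoldImage) -> None:
        self.gold = gold

    def patch(self) -> None:
        # Patch once, centrally; no drifted per-user images to chase.
        self.gold = replace(self.gold, patch_level=self.gold.patch_level + 1)

    def logon(self, user: str) -> GoldImage:
        # Every logon hands out a fresh copy of the current gold image.
        return replace(self.gold)


provider = CloneOnLogonProvider(GoldImage("corp-desktop"))
before = provider.logon("alice")
provider.patch()
after = provider.logon("alice")
assert after.patch_level == before.patch_level + 1
```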

The standard image contains the applications that the organization wants delivered to that user, with the exception of any hosted or streamed applications, which are delivered into the image in the normal way. It's important not to confuse this completely standardized image with a desktop that the user would find acceptable and productive. At this stage, the desktop is not configured or personalized for the user. That is acceptable as a one-off occurrence, but it would be unacceptable if users had to configure their machines every time they logged on: users would not tolerate the diminished experience, and the business would not want users wasting time each day making the machine productive.

The key is to be able to set up and personalize the standard machine without the user being aware and without taxing IT staff and resources. This is known as delivering the "personality" to the machine on demand. The personality contains everything that makes a machine unique to a user. By managing each personality separately from the underlying operating system and applications, you can standardize the latter while still giving users a familiar working environment. A simple way to envision this is to think of the operating system as the base layer of client computing, with the applications as the layer above and the personality as the third layer on top. Hence, there are three layers to the virtualized desktop, and we can talk in terms of how each is delivered.
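
The three-layer split can be pictured as a simple composition step at logon. This sketch assumes hypothetical layer types; the point is only that the OS and application layers are shared by everyone, while the thin personality layer is per-user.

```python
# Minimal sketch of the three-layer model: one shared OS layer, one
# shared application layer, and a thin per-user personality layer
# composed on top at logon. All type names are hypothetical.
from dataclasses import dataclass


@dataclass
class OsLayer:
    gold_image: str              # standardized, patched centrally


@dataclass
class AppLayer:
    installed: tuple[str, ...]   # apps baked into the standard image
    streamed: tuple[str, ...]    # apps delivered into the image on demand


@dataclass
class PersonalityLayer:
    settings: dict               # everything that makes the machine "theirs"


@dataclass
class Desktop:
    os: OsLayer
    apps: AppLayer
    personality: PersonalityLayer


def compose_desktop(user: str, os_layer: OsLayer, app_layer: AppLayer,
                    personalities: dict[str, PersonalityLayer]) -> Desktop:
    # The bottom two layers are identical for everyone; only the
    # personality differs, so that is all we manage per user.
    return Desktop(os_layer, app_layer, personalities[user])
```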

We have discussed how the operating system is delivered (a fresh copy loaded onto a hypervisor each time the user logs on) and how applications are delivered (installed in the image, or hosted and virtualized), but we have not yet talked about the delivery of personality. For the operating system and applications, it's easy to see how virtualization keeps the layers separate so that they can be delivered independently. To deliver personality, we must first abstract it from the user environment. Once this is done, the personality can be centrally managed and delivered back when the user next logs on. One difference between the bottom two layers and the personality is that personality data is typically more dynamic than the standardized layers, reflecting users' continued use and refinement of their environments. The delivery of personality can be handled effectively by a User Environment Management product, which takes care of personality abstraction, management and delivery across all of the application delivery technologies.
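
As a rough illustration of the capture-and-reapply cycle such a product performs, here is a sketch under the assumption that a personality can be represented as a simple dictionary of settings; real products abstract far richer state, and the names below are invented for this example.

```python
# Minimal sketch of personality abstraction: capture the user-specific
# deltas at logoff, store them centrally, and overlay them onto a fresh
# standardized desktop at the next logon. Purely illustrative.
class PersonalityStore:
    """Central store keeping personality apart from OS and app layers."""

    def __init__(self) -> None:
        self._store: dict[str, dict] = {}

    def capture(self, user: str, session_settings: dict) -> None:
        # At logoff: persist only the user's deltas, never the
        # standardized layers underneath.
        self._store[user] = dict(session_settings)

    def apply(self, user: str, fresh_desktop: dict) -> dict:
        # At logon: overlay the saved personality onto a clean copy of
        # the standard desktop, invisibly to the user.
        return {**fresh_desktop, **self._store.get(user, {})}


store = PersonalityStore()
store.capture("alice", {"wallpaper": "beach.jpg",
                        "signature": "Regards, Alice"})
desktop = store.apply("alice", {"wallpaper": "corp-default.jpg"})
assert desktop["wallpaper"] == "beach.jpg"   # personality survives logoff
```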

The personality contains two different types of information that are necessary to deliver a familiar desktop to users. First are policy items: all the configuration and setup of the machine that is necessary for it to work in the broader environment. For providers, some of this will be infrastructure-specific, such as network configurations, but the majority will be customer-specific and will break down into fine-grained detail about how the machine is to work. Examples of policy items include controlling where data is stored, setting up access to particular email servers, and detailed configuration of applications. Policy also includes restricting capabilities a user does not need, whether for security or broader operational reasons.

The second aspect of personality is personalization: the myriad small changes users have made to their machines that make them comfortable and productive places to work. In some highly regulated environments, users may not be allowed to personalize their machines, but the majority of enterprise users expect to personalize their working environment. For instance, they expect to be able to make comfort changes such as setting desktop images, keeping a favorites list in their browser, and having an IM client that logs on automatically. Productivity personalization covers a very wide range, but a representative sample includes email signature blocks, toolbar positions in applications, language selections, and a variety of preferences across all their applications.
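
One way to picture the two halves is as separate setting maps merged with policy taking precedence; the keys below are hypothetical examples drawn from the text, not any product's schema.

```python
# Minimal sketch of the two halves of a personality: policy items set
# by IT, personalization captured from the user. On conflict, policy
# wins, so users can personalize anything IT has not locked down.
POLICY = {
    "mail_server": "mail.example.com",        # customer-specific setup
    "data_store": r"\\corp\home\%username%",  # control where data lives
    "allow_registry_edit": False,             # restrict unneeded rights
}

PERSONALIZATION = {
    "wallpaper": "beach.jpg",                 # comfort change
    "browser_favorites": ["intranet", "hr-portal"],
    "email_signature": "Regards, Alice",      # productivity setting
    "im_auto_logon": True,
}


def build_personality(policy: dict, personalization: dict) -> dict:
    # Later keys override earlier ones, so listing policy second
    # enforces IT's settings over the user's choices.
    return {**personalization, **policy}
```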

Short term, cloud-delivered desktops fit the "first-stage" model, where each user is allocated an image once and takes that image forward, much like a traditional PC. However, this model will not deliver sufficient benefits for more general, wide-scale deployment. The key to delivering desktops from the cloud is to exploit the economies of scale that come from standardizing the deliverables across as many users as possible. That scaling is only possible by taking a component-based view of client computing and assembling those components dynamically for the user. However, in the move to standardize we must remember that we are delivering a product - a user's desktop - that is personal to that user. Delivering personality into the standardized desktop is essential to gaining user acceptance of cloud-delivered desktops. It is only with "standard plus personality" that we will see real success and adoption of cloud-hosted desktops.

More Stories By Martin Ingram

Martin Ingram is vice president of strategy for AppSense, where he's responsible for understanding where the entire desktop computing market is going and deciding where AppSense should direct its products. He is recognized within the industry as an expert on application delivery. Martin has been with AppSense since 2005, previously having built companies around compliance and security including Kalypton, MIMEsweeper, Baltimore Technologies, Tektronix and Avid. He holds an electrical engineering degree from Sheffield University.
