The Role of Trusted Infrastructure in Application Deployment

Users need to treat trust as an operating cost of the cloud

While there is much talk about how to secure data and systems in cloud computing, many gaps remain in implementing that security. As anyone responsible for physical security can attest, it takes only one unsecured door or window to give an intruder easy access and cost the owner valued property. With data in the cloud, the situation is worse: the same data can be stolen by more than one intruder, and the cost of a security breach can easily run into the millions of dollars, far more than most thieves could carry off in hardware.

The root problem of cloud security is users' inability to directly verify the trusted computing base. In the cloud, users can either verify the infrastructure to establish trust before deploying services, or simply trust it from the outset and hope for the best. The latter approach is not recommended. The recommended initial verification relies on standards, technology and certification; deeper verification uses integrity measurement and remote attestation of the components of the cloud computing base.

Using legacy applications in the shared IT environment of cloud computing requires replacing physical boundaries with virtual ones. This is the infrastructure consolidation challenge for cloud users and providers. Today, most products and existing standards address specific devices or functions within the overall end-to-end process. While many standards and products solve portions of the problem, no comprehensive framework exists to describe the needs of various businesses and validate the compliance of an entire solution. As applications migrate into the cloud, and as the service-oriented components of composite applications are hosted on infrastructure that is shared with other tenants and potentially supplied by multiple providers, the integrity of the overall composite application depends on establishing a trust domain that defines a coherent policy for the infrastructure services that become part of the application domain.

Trusted Computing Group (TCG) experts have determined that in a Trusted Multi-Tenant Infrastructure (TMI), potential users are looking for protection of processing and information in motion and at rest, as well as the ability to share physical platforms among tenant domain components or shared services. In addition, users want visibility and auditability of actions across the enterprise. This imposes a few constraints on the solution.

Among the constraints are the ability to manage physical resources independently of domain resources and the ability to run legacy workloads unmodified within a secure context. The solution must also control the flow of information between tenant domains within policy constraints. A further implication is a loosely coupled architecture managed through the application of appropriate policy and trust considerations. Finally, the solution must accommodate various security models to protect the integrity and confidentiality of services and data exchanges within the enterprise.

Closing Security Gaps
Today, the classic approach to deploying SOA application services involves separate infrastructure for each component or client. This approach makes end-to-end visibility, security and management very difficult. Existing solutions also increase space, power usage and cooling requirements. In contrast, the target multi-tenant solution supports more efficient use of space, power and hardware, as well as a consistent security and management policy across shared infrastructure and providers.

The trusted multi-tenant solution is being developed by the Trusted Computing Group (TCG), whose more than 100 member companies include key providers of hardware, software and services as well as a number of security-conscious users. TCG has developed several specifications to establish and improve trust within organizations. The initial specification defined the Trusted Platform Module (TPM), a hardware root of trust for the computing platform. Today, TPM 1.2 is an ISO standard.

The TPM is security hardware resident on the motherboard. Based on open specifications from TCG, the TPM resists tampering and software attacks. It is an integral part of almost all enterprise PCs, but it is off by default, requiring users to opt in. It provides secure key storage, cryptographic functions, integrity checking and remote attestation. As a result, the TPM is used for strong user and machine authentication, secure storage, and trusted/secure boot.

TCG specifications also define a chain of trust architecture that enables attestation of trusted platform properties. Figure 1 shows a visual example of the various TCG standards that have been established for trusted systems. One missing element, especially relevant for cloud computing, is the reference model that brings these existing standards together.

Figure 1: Trusted Computing Group Standards address key data security issues within small to large enterprises.
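
To make the chain of trust concrete, consider how the TPM's Platform Configuration Registers (PCRs) accumulate boot-time measurements: each stage hashes the next component and "extends" the digest into a register, so the final value summarizes everything that was loaded. The minimal Python sketch below shows only the extend operation; the component names are invented for illustration, and the 20-byte SHA-1 registers mirror TPM 1.2's PCR banks rather than any real firmware flow.

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM PCR extend: new PCR value = H(old PCR value || measurement)."""
    return hashlib.sha1(pcr + measurement).digest()

# A TPM 1.2 PCR is a 20-byte SHA-1 register that starts at all zeros.
pcr = bytes(20)

# Each boot stage measures (hashes) the next component before handing control to it.
boot_chain = [b"firmware image", b"bootloader", b"hypervisor", b"guest kernel"]
for component in boot_chain:
    pcr = pcr_extend(pcr, hashlib.sha1(component).digest())

# A verifier that knows the expected components can recompute this value and
# compare it against a TPM quote (the PCR value signed by the TPM's attestation
# key) to decide whether the platform booted as expected.
print(pcr.hex())
```

Because each extend folds the previous register value into the new digest, a verifier can reproduce the final value only if every measurement in the chain matches, which is what makes remote attestation of that value meaningful.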

To address the missing piece and close the gap, TCG is describing the overall framework for integrated Trusted Multi-Tenant Infrastructure solutions. Its Trusted Multi-Tenant Infrastructure (TMI) solution-oriented work group (WG) focuses on business and mission outcomes, building on the output of TCG's technical work groups. The intent is to produce a logical reference model consisting of components, interfaces, use cases, standards alignment and gaps, and compliance validation. This open model will be used by TMI WG industry experts as a framework for building secure enterprise solutions.

In addition to identifying and addressing gaps in existing standards, the TMI work group's objectives include developing a standards framework for implementing shared infrastructures and multi-provider infrastructures as well as developing reference models and implementation guidance.

As part of its somewhat different approach to standards, the TMI WG includes a new class of membership that brings in IT product vendors and integrators. The goal of the group's efforts is an IT solution based on multi-tenant shared infrastructure that establishes trustworthiness in the provider of the IT services. Additionally, the solution will address establishing and monitoring compliance with changing IT policies, as well as assessing compliance against both policy and performance objectives, all in a multi-sourced, multi-supplier ecosystem.

Use Cases within the TMI Architecture
The use cases that have been developed for the TMI reference architecture are based on a small set of core primitive capabilities. The first core function establishes a trusted context for interaction across the multi-tenant domain by aligning the Public Key Infrastructure (PKIv3) security architecture with the TPM. This function establishes a level of trust, including the identity of the components in a solution, integrity measurements and compliance attestations, that determines the degree and type of information to be accepted between parties. The second core function is the exchange of information within the established trusted context, which ensures that information passes between parties only within the boundaries of the trusted relationship.

Finally, the third core function requires definition and application of consistent policy governing the state, configuration, interaction and management of components within the TMI.
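
As a rough illustration of how these three primitives decompose, the sketch below models them as separate Python functions. The type and function names are hypothetical and not drawn from the TMI specification; certificate validation, TPM quote checking, encryption, routing and auditing are all elided.

```python
from dataclasses import dataclass
from typing import Dict


@dataclass
class TrustedContext:
    """Core function 1 result: a peer identity plus the integrity and
    compliance attestations accepted between the two parties."""
    peer_identity: str
    attestations: Dict[str, str]


def establish_trusted_context(peer_identity: str,
                              attestations: Dict[str, str]) -> TrustedContext:
    """Core function 1: align PKI identity with TPM-backed attestations.
    A real implementation would validate certificates and TPM quotes here."""
    return TrustedContext(peer_identity, attestations)


def exchange_within_context(context: TrustedContext, payload: bytes) -> bytes:
    """Core function 2: information flows only inside the boundaries of the
    established trust relationship."""
    if not context.attestations:
        raise PermissionError("no trusted context has been established")
    return payload


def apply_policy(context: TrustedContext, required: Dict[str, str]) -> bool:
    """Core function 3: consistent policy over state and configuration; here,
    every required attribute must be matched by the peer's attestations."""
    return all(context.attestations.get(k) == v for k, v in required.items())
```

In practice the primitives are composed: a context is established first, policy is applied to it, and only then is information exchanged.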

In a broad sense, the TMI WG is defining use cases in the vocabulary of these core primitive functions. The use cases allow information to be transmitted from a confidential source to a recipient in another tenant domain with assurance that the information can be trusted as reliable. They also allow users and providers to determine whether workload from a tenant domain can be provisioned to an external cloud provider in accordance with the policies of both the provider and the consumer of services.

Figure 2 shows an example of a generic use case from TCG Trusted Multi-Tenant Infrastructure Use Cases Version 1.0. The use case identifies the relationship between the various components involved in the provisioning of a Peripheral Device in a provider's environment while maintaining compliance with the published policies of the Trusted Systems Domain.

Figure 2. A generic use case identifies how to provision a peripheral device within the Trusted Systems Domain.

With the use cases established, the next step is to derive a reference model. A logical view of the TMI reference architecture is shown in Figure 3. In this figure, the User Access Device (UAD) supports connections to one or more concurrent domains. Servers represent a federated data center whose servers can host multiple independent domains. Exchange represents the logical components, both physical and virtual, that define cross-domain information flow rules. Storage is likewise federated, and the network represents devices that can carry data from multiple domains.

Figure 3: The logical view of the TMI reference architecture shows the separation of the consumers in various cloud aspects.

The management view of the TMI reference model framework shows that both consumers and providers have management responsibilities within the TMI. Specifically, the consumer manages its policy and the resources allocated within its domain. The provider, in contrast, manages the platform and the allocation of resources to consumers. A consumer has no direct control over the underlying platform, and the provider has no insight into the information and processing within the resources allocated to the consumer. In addition, the provider establishes trust relationships with other providers and consumers to enable migration, bursting and interaction between tenant consumers. Figure 4 shows these relationships.

Figure 4: The management view of the TMI reference architecture shows the consumer and provider management responsibilities within the TMI.

A New Paradigm for Trusted Applications
A company that wants to add cloud resources to its existing infrastructure without compromising security usually has several specific requirements. Chief among them is a means to interrogate the infrastructure and obtain answers with enough integrity to determine that the responses are true. The information exchanged includes platform information, assured compliance with agreed policies and measurement of platform state.

With the TMI solution, patterns are brought together to establish a trust relationship with a potential provider and then overlaid with the three core functions identified previously. This provides a process for conducting a policy discussion and establishing a trust domain. The set of policies to be enforced is established through the use cases. Using this approach, a user can interview potential suppliers and verify which ones are willing to provide resources that comply with the organization's policies. With these security policies in place, the user can be confident that cloud security is essentially the same as the security within the organization.
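
As a hedged illustration of that interview step, the short sketch below filters candidate providers by comparing each one's attested capabilities against the organization's policy. The policy keys and provider entries are invented for the example and do not come from the TMI use cases.

```python
# Hypothetical organizational policy and provider attestations.
org_policy = {"trusted_boot": True, "dedicated_hardware": True, "geography": "US"}

providers = {
    "provider-a": {"trusted_boot": True, "dedicated_hardware": True, "geography": "US"},
    "provider-b": {"trusted_boot": True, "dedicated_hardware": False, "geography": "US"},
    "provider-c": {"trusted_boot": True, "dedicated_hardware": True, "geography": "EU"},
}

def complies(attested: dict, policy: dict) -> bool:
    """A provider complies if it attests to every value the policy requires."""
    return all(attested.get(key) == value for key, value in policy.items())

eligible = [name for name, attested in providers.items()
            if complies(attested, org_policy)]
print(eligible)  # only provider-a satisfies every requirement
```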

Building on these concepts, the TMI reference model describes a set of requirements and implementation patterns that can be used to construct solutions. Each requirement is mapped to a set of patterns that describe ways the requirement can be met. Each pattern is then mapped to industry standards and practices, which in turn map to products that implement those standards. This mapping allows architects to construct valid solutions within a shared-infrastructure cloud.
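
The requirement-to-pattern-to-standard-to-product mapping can be pictured as a set of nested lookups, as in the sketch below. The entries are illustrative only and are not taken from the TMI documents.

```python
# Hypothetical excerpt of the mapping described by the reference model.
requirement_to_patterns = {
    "verify platform integrity": ["measured boot", "remote attestation"],
}

pattern_to_standards = {
    "measured boot":      ["TCG TPM 1.2 (ISO standard)"],
    "remote attestation": ["TCG TPM 1.2 (ISO standard)", "TCG attestation specifications"],
}

standard_to_products = {
    "TCG TPM 1.2 (ISO standard)":     ["TPM-equipped server", "attestation service"],
    "TCG attestation specifications": ["attestation service"],
}

def candidate_products(requirement: str) -> set:
    """Walk requirement -> patterns -> standards -> products."""
    products = set()
    for pattern in requirement_to_patterns.get(requirement, []):
        for standard in pattern_to_standards.get(pattern, []):
            products.update(standard_to_products.get(standard, []))
    return products

print(candidate_products("verify platform integrity"))
```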

For example, a consumer of cloud services may need virtual machines to host the components of a service-oriented application. Some of the services must run on dedicated hardware; others can share hardware but must be located in a specific geography; still others may share hardware only if strong separation is enforced by the hypervisor. The hardware must be able to prove that it has not been compromised by malware, through attestation of a trusted boot and integrity measurement of key modules within the BIOS, hypervisor and OS. The application runtime environment must be at a certain patch level. These policies are sent to the environment provider, who returns signed attestations of compliance within the trusted context established between provider and consumer. The consumer then determines that the application components can be provisioned on the resources and devices for which the provider has attested policy compliance, and may choose to re-validate compliance at various times to ensure nothing has changed.
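
A sketch of the verification step at the end of that exchange is shown below. In a real TMI deployment the attestation would be signed with keys rooted in the TPM and validated through the PKI; here an HMAC over the policy document stands in for that signature, and every key name, policy field and value is invented for the example.

```python
import hashlib
import hmac
import json

# Placeholder for the provider's attestation key material; a real exchange
# would use asymmetric keys anchored in the TPM, not a shared secret.
ATTESTATION_KEY = b"example-shared-secret"

consumer_policy = {
    "dedicated_hardware": True,
    "geography": "US",
    "trusted_boot": True,
    "runtime_patch_level": "2011-06",
}

def sign_attestation(policy: dict, compliant: bool, key: bytes) -> dict:
    """Provider side: return a signed statement of compliance with the policy."""
    statement = {"policy": policy, "compliant": compliant}
    body = json.dumps(statement, sort_keys=True).encode()
    return {"statement": statement,
            "signature": hmac.new(key, body, hashlib.sha256).hexdigest()}

def verify_attestation(attestation: dict, key: bytes) -> bool:
    """Consumer side: accept only if the signature checks out and the
    provider claims compliance with the submitted policy."""
    body = json.dumps(attestation["statement"], sort_keys=True).encode()
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, attestation["signature"])
            and attestation["statement"]["compliant"])

attestation = sign_attestation(consumer_policy, compliant=True, key=ATTESTATION_KEY)
if verify_attestation(attestation, ATTESTATION_KEY):
    print("provision application components on the attested resources")
```

The periodic re-validation mentioned above amounts to repeating this check against freshly issued attestations.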

While not yet a part of the TMI reference model, there is no reason why the same process could not be applied to application components using code signing and attestation of integrity measurement of running application services.

The value of using a trusted infrastructure for service-oriented application deployment should be clear. Even the best-written applications can be compromised if the platform itself is maliciously subverted or the contents of memory or storage are altered. If an application can validate the integrity of the platform on which its code executes, the decision to use shared resources or cloud hosting to deploy and scale the application becomes much easier. Combine this with trusted coding and application integrity measurement, and the decision for a consumer to entrust data to a cloud-based application becomes easier still.

Looking at Trust Differently than in the Past
According to the Ponemon Institute's Cost of a Data Breach 2010 study, the average cost of a data breach is about $3.44 million, ranging from $1.83 million to $6.75 million across regions of the world. Enterprises must treat the cost of protecting business data, whether in the internal data center or in the cloud, just as they treat the cost of stronger door locks or additional security staff.

Instead of considering trust as an afterthought, users need to treat trust as an operating cost of the cloud. Standards are one way of minimizing this cost. While a trusted, secure cloud may not be the lowest-cost option initially, in the long run that cost must be weighed against the risk of compromising the information a user puts into the cloud. TCG's Trusted Multi-Tenant Infrastructure WG has made significant progress toward providing the industry with a standard for IT managers.

The TMI solution working group encourages vendors and end-user organizations to get involved in defining the requirements for the TMI specification.

About the Author

Michael Donovan is the Chief Technologist for Strategic Capabilities with HP Enterprise Services, responsible for framework implementation to support capabilities and offering development for clients across the U.S. Public Sector. His responsibilities include harvesting existing solutions for re-use and developing new capabilities to meet the complex needs of federal, state and local governments, leveraging the best of HP's account and corporate capabilities and those supported by its partner ecosystem and HP Labs. He also co-chairs the Trusted Multi-Tenant Infrastructure Work Group of the Trusted Computing Group.
