
COTS Cloud security reference design and related NIST workshop

By Bob Gourley

Since the beginning of the modern Cloud movement (which we trace to November 2006; see here if you want to know why), technologists have been seeking ways to mitigate key risks. At the top of our list are:

1) The increased risk due to multi-tenancy

2) The mission need for availability (including the need for an always-available path to resources)

3) New and at times nuanced challenges regarding data confidentiality

4) New challenges regarding the integrity of data.

There are many other policy-related risks that planners must consider, including how to establish the best user authentication methods and how to ensure compliance with the regulations and laws of the geography that holds the data. But for a technologist, the four above are a continual concern, and mitigating those technical concerns makes the other concerns much easier to deal with.

That is why we read with such great pleasure a recent announcement that NIST is continuing to work with industry to ensure advancements are being made in cloud security. The NIST National Cybersecurity Center of Excellence (NCCoE) in Rockville, MD is a focal point for many great industry/government interactions, including a January 14 workshop at their facility that we are especially excited about.

This workshop is on the topic of Trusted Geolocation in the Cloud. It is a proof-of-concept implementation that uses technology proven to be among the most scalable on the globe: Intel processors. Technologists presenting and discussing these developments come from Intel, EMC-RSA, NIST and the NCCoE. This will be a great workshop that includes hands-on demonstrations of this technology, and we believe it will show ways to help mitigate all four of the challenges we outlined above.

Following the workshop, the NCCoE will hold a two-day cloud computing event (details can be found here).

From the workshop flyer:

An upcoming workshop will be held at the NIST National Cybersecurity Center of Excellence (NCCoE) facility in Rockville, MD on Monday, January 14th, on Trusted Geolocation in the Cloud: Proof of Concept Implementation.

NIST and private industry are offering a very interesting workshop to a technical audience next week, on Monday the 14th, on a cloud use case that addresses the security challenges involving Infrastructure as a Service (IaaS) cloud computing technologies and geolocation.

The motivation behind this use case is to improve the security of cloud computing and accelerate the adoption of cloud computing technologies by establishing an automated hardware root of trust method for enforcing and monitoring geolocation restrictions for cloud servers. A hardware root of trust is an inherently trusted combination of hardware and firmware that maintains the integrity of the geolocation information and the platform. This information is accessed using secure protocols to assert the integrity of the platform and confirm the location of the host.

At the heart of the solution is a reference design built from commercial off-the-shelf (COTS) products provided by Intel, VMware and RSA Archer. The use case is of significant relevance to US Federal agencies in solving the security problem in question: improving the security of virtualized infrastructure cloud computing technologies by enforcing geolocation restrictions.

NIST now moves in conjunction with private industry in a workshop specific to this research, explaining in detail how to implement this trusted cloud solution, on January 14th at the NIST National Cybersecurity Center of Excellence (NCCoE).

Audience 

This workshop and the accompanying IR document have been created for security researchers, cloud computing practitioners, system integrators, and other parties interested in techniques for solving the security problem in question: improving the security of virtualized infrastructure cloud computing technologies by enforcing geolocation restrictions.

Agenda:

2:00 PM – 2:15 PM  NCCoE Introduction (NIST)
2:15 PM – 2:30 PM  Trusted Cloud Description (NIST)
2:30 PM – 2:45 PM  Trusted Geolocation in the Cloud Implementation – Trusted Measurement and Remote Attestation (Intel Corporation)
2:45 PM – 3:00 PM  Trusted Geolocation in the Cloud – Monitoring of Measurements in a Governance, Risk, and Compliance Dashboard (EMC-RSA)
3:00 PM – 3:15 PM  Trusted Cloud Demonstration (Intel, EMC-RSA, and NIST)
3:15 PM – 4:00 PM  Questions and Answers / Hands-on Session (Intel, EMC-RSA, and NIST)

 

Participation from all parties is welcome. To register for this workshop, please send an email with the attendee’s name, affiliation, and email address in the body of the message to [email protected], with the subject “Trusted Location in the cloud,” by January 13, 2013.

This workshop is now part of NIST’s Big Data and Cloud Computing Workshop, to be held at NIST HQ in Gaithersburg, MD on January 15-17: http://www.nist.gov/itl/cloud/cloudbdworkshop.cfm

The importance of this secure cloud computing proof of concept can be seen in the draft NIST publication linked below, which details this reference design and clearly delineates how to stand up this secure cloud structure. The NIST Interagency Report (NISTIR) is a public/private collaboration with co-authors from both NIST and private industry, and it is now taking public comments: http://csrc.nist.gov/publications/drafts/ir7904/draft_nistir_7904.pdf

____________________________________________________________________________________

Background Information taken from NISTIR 7904:

Shared cloud computing technologies are designed to be very agile and flexible, transparently using whatever resources are available to process workloads for their customers. However, there are security and privacy concerns with allowing unrestricted workload migration. Whenever multiple workloads are present on a single cloud server, there is a need to segregate those workloads from each other so that they do not interfere with each other, gain access to each other’s sensitive data, or otherwise compromise the security or privacy of the workloads. Imagine two rival companies with workloads on the same server; each company would want to ensure that the server can be trusted to protect their information from the other company.

Another concern with shared cloud computing is that workloads could move from cloud servers located in one country to servers located in another country. Each country has its own laws for data security, privacy, and other aspects of information technology (IT). Because the requirements of these laws may conflict with an organization’s policies or mandates (e.g., laws, regulations), an organization may decide that it needs to restrict which cloud servers it uses based on their location. A common desire is to only use cloud servers physically located within the same country as the organization. Determining the approximate physical location of an object, such as a cloud computing server, is generally known as geolocation. Geolocation can be accomplished in many ways, with varying degrees of accuracy, but traditional geolocation methods are not secured and they are enforced through management and operational controls that cannot be automated and scaled, and therefore traditional geolocation methods cannot be trusted to meet cloud security needs.
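To make the enforcement idea concrete, here is a minimal sketch of a placement filter that admits only cloud servers whose attested location falls within an approved set. This is our illustration, not code from the NISTIR; the host records, field names, and allowed-country list are all hypothetical, and in the real design the location information would come from hardware-backed attestation rather than a plain data structure.

```python
# Hypothetical sketch: restrict workload placement by attested geolocation.
# In the NISTIR 7904 design this data would come from a hardware root of
# trust via remote attestation, not from an ordinary dictionary.

ALLOWED_COUNTRIES = {"US"}  # e.g., "only use servers in our own country"

def placement_candidates(hosts, allowed=ALLOWED_COUNTRIES):
    """Keep only hosts whose attested country is in the allowed set."""
    return [h for h in hosts if h.get("attested_country") in allowed]

hosts = [
    {"name": "cloud-host-1", "attested_country": "US"},
    {"name": "cloud-host-2", "attested_country": "DE"},
]

print(placement_candidates(hosts))  # only cloud-host-1 survives the filter
```

The point of the automation argument above is that a check like this can run on every migration and placement decision, which manual management and operational controls cannot do at cloud scale.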

The motivation behind this use case is to improve the security of cloud computing and accelerate the adoption of cloud computing technologies by establishing an automated hardware root of trust method for enforcing and monitoring geolocation restrictions for cloud servers. A hardware root of trust is an inherently trusted combination of hardware and firmware that maintains the integrity of the geolocation information and the platform. The hardware root of trust is seeded by the organization, with the host’s unique identifier and platform metadata stored in tamperproof hardware. This information is accessed using secure protocols to assert the integrity of the platform and confirm the location of the host.
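The chaining behind a hardware root of trust is worth a quick illustration. The sketch below mimics, in plain Python, the measure-and-extend pattern used by trusted platform hardware: each boot component (ending with a provisioned geolocation tag) is hashed into a running register value, and a verifier compares the result against a known-good value recorded when the host was seeded. This is purely illustrative; the measurement names and golden-value handling are our hypothetical stand-ins, and a real trusted platform performs the extend operation in tamper-resistant hardware.

```python
import hashlib

def extend(register: bytes, measurement: bytes) -> bytes:
    """New register value = hash(old register value || hash(measurement))."""
    return hashlib.sha256(register + hashlib.sha256(measurement).digest()).digest()

# Boot-time measurements, ending with the organization-seeded geolocation tag.
measurements = [b"firmware-image", b"hypervisor-image", b"geotag:US"]

register = bytes(32)  # measurement registers start zeroed at power-on
for m in measurements:
    register = extend(register, m)

# The verifier holds the expected ("golden") value captured at provisioning;
# any change to any measurement, including the geotag, breaks the chain.
golden = register  # stand-in for the value stored at seeding time
assert register == golden
```

Because the register can only be extended, never rewritten, a host cannot claim a geolocation it was not provisioned with without the mismatch showing up at attestation time.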



For media interviews and comments, please contact:

Kevin Fiftal

Intel Corporation

 


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com.
