By Elad Yoran
November 19, 2012 08:00 AM EST
Cloud computing has dramatically altered how IT infrastructure is delivered and managed, as well as how IT functionality is consumed. However, security and privacy concerns remain major inhibitors to the adoption of cloud computing by risk-conscious organizations - whether infrastructure as a service, software as a service applications or email as a service.
Cloud service providers, in response, have made strategic decisions about how much to invest in directly addressing these concerns in order to encourage broader adoption of cloud-based services. By implementing controls and processes to further improve security, cloud service providers increasingly aim to deliver more safeguards for the cloud environment than an individual customer could within an on-premises environment. However, a significant consideration for many organizations as they look to best exploit the benefits of the cloud is whether they can retain ownership and control of data processed by third-party services.
Defining Roles, Responsibilities and Data Control Borders
The value proposition delivered by cloud service providers lies in managing IT infrastructure in a more flexible, scalable and cost-efficient manner than an organization could do independently. The basic roles and responsibilities of the cloud service provider therefore should focus on the security, resiliency, scalability and manageability of its service. Security encompasses not only physical datacenter security, but also the means to limit administrator access across a multi-tenant environment and customer instances based on the principle of least privilege. However, at best, the cloud service provider can only provide a set of tools and options for customers looking to encrypt data in place.
Maintaining ownership and control of data is distinct from the underlying security and processes implemented by the cloud service provider. Even though the data resides on their infrastructure, cloud service providers are compelled to maintain that an organization retains responsibility for its own data. The not-for-profit Cloud Security Alliance notes in its most recent Email Security Implementation Guidance that it is critical that the customer - not the cloud service provider - be responsible for the security and encryption protection controls necessary to meet the organization's requirements.
By contrast, the roles and responsibilities of an organization with regard to its corporate data remain the same no matter where that data resides or is processed: specifically, maintaining ownership and direct control of that data. When corporate data moves from on-premises systems to the cloud, compliance and security requirements dictate that the organization cannot relinquish ownership or control of its data. Moreover, losing visibility into who has access to that data means it can be subpoenaed and handed over to law enforcement agencies without the organization's knowledge.
Principal Business Challenges of Migrating Data to the Cloud
The principal business challenges that organizations typically face when migrating data to the cloud encompass data security, regulatory compliance, unauthorized data disclosure and access, and international privacy/data residency regulations. These issues need to be resolved to the satisfaction of the legal team, as well as security and compliance officers, before moving an organization's data to the cloud.
Data Security and Risk Mitigation
In cloud computing applications, data is frequently stored and processed at the cloud provider in the clear - unless customers themselves encrypt the data at rest and in use. This raises numerous data ownership and control concerns for an organization.
From a structural perspective, cloud-based services pose a challenge to traditional methods of securing data. Traditionally, encryption has been used to secure data resident on internal systems, or to protect data moving from one point to another. Ensuring that data remains encrypted in place within a third-party provider's environment and throughout the data lifecycle, yet remains seamlessly available to authorized users, presents a new set of technical challenges.
To satisfy the new requirements introduced by migration to cloud-based services, cloud data must remain in encrypted ciphertext form. Data should also be encrypted before it leaves the corporate or trusted network in order to meet data residency and privacy requirements. And to maintain control of data that is no longer resident on a trusted network, the encryption keys must remain under the organization's control and ownership.
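The pattern described above - encrypt before the data leaves the trusted network, and keep the keys on-premises - can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; it assumes the widely used third-party `cryptography` package, and the function names are our own.

```python
# Sketch: encrypt records client-side before they leave the trusted network,
# so the cloud provider only ever receives ciphertext. The key is generated
# and held inside the organization's own environment, never shared.
from cryptography.fernet import Fernet

# Illustrative: in practice this key would live in an on-premises key
# manager or HSM under the organization's control.
local_key = Fernet.generate_key()
cipher = Fernet(local_key)

def encrypt_before_upload(record: bytes) -> bytes:
    """Return ciphertext that is safe to hand to a cloud provider."""
    return cipher.encrypt(record)

def decrypt_on_retrieval(ciphertext: bytes) -> bytes:
    """Decrypt data fetched back from the cloud, using the local key."""
    return cipher.decrypt(ciphertext)

blob = encrypt_before_upload(b"customer record: account 0017")
assert decrypt_on_retrieval(blob) == b"customer record: account 0017"
```

Because only ciphertext crosses the network boundary, the provider's copy is unreadable without the locally held key - which is exactly the ownership boundary the compliance requirements demand.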
Regulatory Compliance Requirements for Safeguards on Sensitive Data
Organizations are subject to a broad array of regulatory requirements, including federal laws such as Sarbanes-Oxley, varying state data protection measures, the USA PATRIOT Act and vertical-specific regulations (HIPAA, HITECH, Basel II, GLBA and PCI DSS), in addition to potential international data privacy and residency requirements such as the EU Data Protection Directive.
Although the specifics vary according to the compliance regime, a common stipulation is that organizations retain control over their data and maintain mechanisms to prevent unauthorized access. For instance, HIPAA's technical safeguards make each covered entity responsible for ensuring that the data within its systems has not been changed or erased in an unauthorized manner. GLBA mandates that financial institutions in the US protect against any anticipated threats or hazards to the security or integrity of customer records and information. Likewise, PCI Data Security Standards require that stored cardholder data be protected by strong encryption.
Unauthorized Data Disclosure and Access
In the US, personal information is protected by the Fourth Amendment. However, once it is shared, it is no longer protected. Until legal guidelines are established for how the Fourth Amendment applies to cloud computing, uploaded data is not considered private.
Cloud service providers are compelled by law to comply with subpoenas and other requests by the government to turn over customer data, including data subject to attorney-client privilege and other protected data. Often, cloud providers will only notify customers that data was turned over to the government after the fact, if at all. In some instances, they may even be expressly prohibited from notifying customers. This risk prevents many organizations from migrating sensitive data to the cloud.
International Privacy/Data Residency Regulations
Data protection laws and privacy regulations mandate the direct control of an organization's information and safeguards for moving data outside of defined jurisdictions. These laws are broad, and are continually being implemented in a growing number of countries across the globe -- making it difficult for some organizations to fully realize the promise of cloud computing.
To comply with specific data protection laws and international privacy regulations, organizations often pay cloud providers a premium to add costly infrastructure in each location of interest, resulting in a sharp increase in costs and decrease in efficiency. Furthermore, most providers are unwilling to duplicate infrastructure in all locations, making it difficult for customers to comply with these regulations.
Implementing Best Practices for Cloud Data Control: Data-in-Use Encryption
Encryption of data-in-transit and data-at-rest has long been recognized as a best practice for enforcing the security and privacy of data, regardless of where it resides. However, these two states of encryption are no longer sufficient, as they do not protect data while it is being processed in the cloud.
According to the Cloud Security Alliance's Encryption Implementation Guidance, organizations should implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use). To prevent unauthorized access and maintain the state of encryption even when processed in a third-party environment, enterprise IT should retain ownership of the encryption keys. As a result, the cloud provider never has access to customer data in an unencrypted form, and an organization's cloud data remains unreadable if an unauthorized third-party attempts access -- or even if the data is disclosed in response to a government request.
Figure 1: The not-for-profit industry association, the Cloud Security Alliance, recommends that organizations implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use).
Traditionally, when cloud-hosted data was encrypted, basic server-side operations such as indexing, searching and sorting records became impossible. Once ciphertext was put into a SaaS application, some of the features of the program no longer worked, and the user experience suffered as a result. The implementation of data-in-use encryption supports dynamic operations such as search, sort and index of encrypted data in the cloud. Even as the data is processed by a cloud-based service, the IT department of the organization that owns the data, or a trusted third party, retains control of the encryption keys. As a result, application functionality is preserved and decryption is policy-driven and automated.
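One common technique behind equality search over encrypted data is a "blind index": the client stores a keyed digest of each field alongside the ciphertext, so the server can match search queries without ever seeing plaintext. The sketch below is a simplified illustration of that general technique, not the specific product described in this article; the key and data values are invented for the example.

```python
# Sketch: a blind index lets a server answer equality searches over
# encrypted fields without decrypting anything. The HMAC key is held
# only by the data owner, so the server sees opaque tokens.
import hmac
import hashlib

index_key = b"org-held-index-key"  # illustrative; kept on-premises

def blind_index(value: str) -> str:
    """Deterministic keyed digest: equal plaintexts yield equal tokens."""
    return hmac.new(index_key, value.lower().encode(), hashlib.sha256).hexdigest()

# The client uploads (ciphertext, token) pairs; the server stores them opaquely.
server_rows = [
    {"ciphertext": "<encrypted record A>", "token": blind_index("alice@example.com")},
    {"ciphertext": "<encrypted record B>", "token": blind_index("bob@example.com")},
]

# To search, the client derives the token for the query and sends only that.
query_token = blind_index("Alice@Example.com")
matches = [r["ciphertext"] for r in server_rows if r["token"] == query_token]
assert matches == ["<encrypted record A>"]
```

The trade-off is the one the article describes: deterministic tokens leak equality patterns, so real systems layer additional protections, but server-side search keeps working while the plaintext and keys never leave the organization.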
The Implementation of Data-in-Use Encryption Enables Organizations to Seamlessly Harness the Power of the Cloud
By addressing the concerns associated with control and ownership of proprietary data residing on third-party cloud-based servers, data-in-use encryption technology directly addresses material concerns related to compliance requirements, separation of data controls through key retention, data residency and unauthorized disclosure of data in response to a government request.
Data-in-use encryption is of particular value for organizations that want to independently manage data disclosure requests from law enforcement agencies. Equally, cloud service providers are not eager to be in the undesirable position of being compelled to disclose customer data. The cloud provider will still turn over customer data when presented with a subpoena or other government request, because it has no choice but to comply. However, because all of the data was encrypted before the cloud provider received it, and the organization holds the encryption keys, the provider cannot decrypt that data. Therefore, when complying with an order, the cloud provider can only turn over ciphertext. If the government wants to decrypt the data, it must go directly to the organization that owns the data.
Figure 2: Sample of an authorized/unauthorized view of sensitive data in a hosted Exchange email application.
In geographically distributed environments, smart encryption also creates a paradigm shift: instead of requiring the data itself to remain local, only the encryption keys must remain local. Organizations with multiple data residency requirements can deploy and maintain an instance of the encryption appliance in each jurisdiction. Once the data is encrypted with keys maintained in that jurisdiction, the encrypted data can lawfully reside in any location.
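The per-jurisdiction key model can be pictured as a simple routing table: each record is encrypted under the key held by the appliance in its home jurisdiction, and only that key's location is constrained. The mapping below is a hypothetical sketch; the jurisdiction names and key labels are invented for illustration.

```python
# Sketch: select the locally held key for a record's jurisdiction.
# Only the keys are pinned to a location; the ciphertext may be stored
# in any cloud region once encrypted.
JURISDICTION_KEYS = {
    "EU": "key-held-by-eu-appliance",
    "US": "key-held-by-us-appliance",
}

def key_for(jurisdiction: str) -> str:
    """Return the key reference for a record's home jurisdiction."""
    try:
        return JURISDICTION_KEYS[jurisdiction]
    except KeyError:
        # No appliance deployed there yet: the record cannot be onboarded.
        raise ValueError(f"no key appliance deployed for {jurisdiction}")

assert key_for("EU") == "key-held-by-eu-appliance"
```

The design point is that adding a jurisdiction means deploying one more key appliance, not duplicating the provider's entire storage infrastructure in that country.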
The addition of encryption-in-use empowers the organization to retain full ownership and control during the entire process, including when the data is out of its network and in the cloud, while ensuring maximum security and regulatory compliance.
Industry analysts agree. According to Ahmad Zeffirelli, Industry Analyst at Frost & Sullivan, "This solution with its ability to encrypt data-in-use, data-in-transit, and data-at-rest, would bring immense benefits to a vast majority of organizations concerned about data security while leveraging cloud computing."
Building Commercially Viable Encryption
One of the most difficult technical challenges in developing encryption for commercial applications running in the cloud is striking the right balance between the competing goals of encryption/security on one hand and features/performance on the other. In commercial markets, especially in the cloud, introducing additional steps for users to follow in order to address security requirements both undermines the ease-of-use value proposition of cloud-based services and increases the likelihood that users will look for ways to circumvent controls.
The entire process should be transparent to the end-user. Specifically, the security functionality should not require the installation of an application or agent on the end user's client device or mobile phone. Also, there should be no impact to the end-user experience in terms of functionality, performance, or task workflow. Furthermore, commercially viable encryption capabilities should not interfere with standard email security features such as malware and anti-virus protection.
By effectively addressing data control, compliance and security requirements - while preserving application functionality, including search, sort and index capabilities, and a seamless user experience - technology that encrypts data-at-rest, data-in-transit and data-in-use within the cloud environment acts as an enabler of cloud adoption for organizations worldwide.