Encryption of Data-in-Use to Harness the Power of the Cloud

Enabling cloud adoption for organizations worldwide

Cloud computing has dramatically altered how IT infrastructure is delivered and managed, as well as how IT functionality is consumed. However, security and privacy concerns remain major inhibitors to cloud adoption for risk-conscious organizations - whether for infrastructure as a service, software-as-a-service applications or email as a service.

In response, cloud service providers have made strategic decisions about the investments they make to address these concerns directly and encourage broader adoption of cloud-based services. By implementing controls and processes to further improve security, cloud service providers increasingly aim to deliver more safeguards for the cloud environment than individual customers could within their own on-premise environments. However, a significant consideration for many organizations as they look to exploit the benefits of the cloud is whether they can retain ownership and control of data processed by third-party services.

Defining Roles, Responsibilities and Data Control Borders
The value proposition delivered by cloud service providers is in managing IT infrastructure in a more flexible, scalable and cost-efficient manner than an organization could do independently. The basic roles and responsibilities of the cloud service provider therefore should focus on the security, resiliency, scalability and manageability of their service. Security encompasses not only physical datacenter security, but also the means to limit administrator access across a multi-tenant environment and customer instances based on the principle of least privilege. However, at best, the cloud service provider can only provide a set of tools and options for customers looking to encrypt data in place.

Maintaining ownership and control of data is distinct from the underlying security and processes implemented by the cloud service provider. Even though the data resides on their infrastructure, cloud service providers maintain that an organization retains responsibility for its own data. The not-for-profit Cloud Security Alliance notes in its most recent Email Security Implementation Guidance that it is critical that the customer - not the cloud service provider - be responsible for the security and encryption protection controls necessary to meet the organization's requirements.

By contrast, an organization's roles and responsibilities with regard to corporate data remain the same no matter where that data resides or is processed: specifically, maintaining ownership and direct control of it. When corporate data is moved from on-premise systems to the cloud, compliance and security requirements dictate that the organization cannot relinquish ownership or control of its data. Moreover, the loss of visibility into who has access to that data means it can be subpoenaed and handed over to law enforcement agencies without the organization's knowledge.

Principal Business Challenges of Migrating Data to the Cloud
The principal business challenges that organizations typically face when migrating data to the cloud encompass data security, regulatory compliance, unauthorized data disclosure and access, and international privacy/data residency regulations. These issues need to be resolved to address the requirements of the legal team, as well as security and compliance officers, before moving an organization's data to the cloud.

Data Security and Risk Mitigation
In cloud computing applications, data is frequently stored and processed at the cloud provider in the clear - unless customers themselves encrypt the data-at-rest and data-in-use. This raises numerous data ownership and control concerns for an organization.

From a structural perspective, cloud-based services pose a challenge to traditional methods of securing data. Traditionally, encryption has been used to secure data resident on internal systems, or to protect data moving from one point to another. Ensuring that data remains encrypted in place within a third-party provider's environment and throughout the data lifecycle, yet stays seamlessly available to authorized users, presents a new set of technical challenges.

To satisfy the new set of requirements introduced by migration to cloud-based services, cloud data must remain in encrypted (ciphertext) form. Data should also be encrypted before it leaves the corporate or trusted network in order to meet data residency and privacy requirements. And to maintain control of data that is no longer resident on a trusted network, the encryption keys must remain under the organization's ownership and control, as sketched below.
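
The following minimal sketch illustrates this flow, assuming the widely used Python cryptography package and a hypothetical upload_to_cloud() helper standing in for any cloud storage API: data is encrypted inside the trusted network with a key the organization keeps, and only ciphertext ever leaves.

```python
# Minimal sketch: encrypt data client-side before it leaves the trusted
# network, keeping the key under the organization's control.
from cryptography.fernet import Fernet

# Key is generated and stored inside the organization's own key store;
# it is never transmitted to the cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"employee_id=1042;salary=85000"
ciphertext = cipher.encrypt(record)   # data leaves the network only in this form

# upload_to_cloud("hr-records/1042", ciphertext)   # hypothetical helper; provider stores ciphertext only

# Decryption is possible only where the key is held.
assert cipher.decrypt(ciphertext) == record
```

In practice the key would live in an on-premise key manager or hardware security module rather than in application memory, but the control point is the same: the provider never receives it.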

Regulatory Compliance Requirements for Safeguards on Sensitive Data
Organizations are subject to a broad array of regulatory requirements, including federal laws such as Sarbanes-Oxley, varying state data protection measures, the USA PATRIOT Act and vertical-specific regulations (HIPAA, HITECH, Basel II, GLBA and PCI DSS), in addition to potential international data privacy and residency requirements such as the EU Data Protection Directive.

Although the specifics vary according to the compliance requirements in question, a common stipulation is that organizations retain control over their data and maintain mechanisms to prevent unauthorized access. For instance, HIPAA requires technical safeguards under which each covered entity is responsible for ensuring that the data within its systems has not been changed or erased in an unauthorized manner. The GLBA mandates that financial institutions in the US protect against any anticipated threats or hazards to the security or integrity of customer records and information. Likewise, the PCI Data Security Standard requires that stored cardholder data be protected by strong encryption.

Unauthorized Data Disclosure and Access
In the US, personal information is protected by the Fourth Amendment. However, once it is shared, it is no longer protected. Until legal guidelines are established to address the application of the Fourth Amendment to cloud computing, uploaded data is not considered private.

Cloud service providers are compelled by law to comply with subpoenas and other requests by the government to turn over customer data, including data subject to attorney-client privilege and other protected data. Often, cloud providers will only notify customers that data was turned over to the government after the fact, if at all. In some instances, they may even be expressly prohibited from notifying customers. This risk prevents many organizations from migrating sensitive data to the cloud.

International Privacy/Data Residency Regulations
Data protection laws and privacy regulations mandate direct control of an organization's information and impose safeguards on moving data outside of defined jurisdictions. These laws are broad, and they are continually being implemented in a growing number of countries across the globe -- making it difficult for some organizations to fully realize the promise of cloud computing.

To comply with specific data protection laws and international privacy regulations, organizations often pay cloud providers a premium to add costly infrastructure in each location of interest, resulting in a sharp increase in costs and decrease in efficiency. Furthermore, most providers are unwilling to duplicate infrastructure in all locations, making it difficult for customers to comply with these regulations.

Implementing Best Practices for Cloud Data Control: Data-in-Use Encryption
Encryption of data-in-transit and data-at-rest has long been recognized as a best practice for enforcing the security and privacy of data, regardless of where it resides. However, these two states of encryption are no longer sufficient, as they do not protect data while it is being processed in the cloud.

According to the Cloud Security Alliance's Encryption Implementation Guidance, organizations should implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use). To prevent unauthorized access and maintain the state of encryption even when data is processed in a third-party environment, enterprise IT should retain ownership of the encryption keys. As a result, the cloud provider never has access to customer data in an unencrypted form, and an organization's cloud data remains unreadable if an unauthorized third party attempts access -- or even if the data is disclosed in response to a government request.

Figure 1: The not-for-profit industry association, the Cloud Security Alliance, recommends that organizations implement encryption of data-in-use to ensure that data is secured for the entire duration of its lifecycle (at-rest, in-transit and in-use).

Traditionally, if cloud-hosted data was encrypted, basic server-side operations such as indexing, searching and sorting records became impossible. Once ciphertext was put into a SaaS application, some of the features of the program no longer worked, and the user experience suffered as a result. Data-in-use encryption supports dynamic operations such as search, sort and index on encrypted data in the cloud. Even as the data is processed by a cloud-based service, the IT department of the organization that owns the data, or a trusted third party, retains control of the encryption keys. As a result, application functionality is preserved and decryption is policy-driven and automated.
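
The article does not specify how server-side search over encrypted data is implemented; one common, generic technique is a keyed "blind index," sketched below under the assumption of the Python cryptography package. A keyed hash of each searchable token is stored alongside the ciphertext, so the service can match equality queries without ever seeing plaintext. This illustrates the general idea only; it is not a description of any particular product.

```python
# Illustrative sketch: a keyed "blind index" lets the cloud service answer
# equality searches over encrypted records without access to plaintext.
import hashlib
import hmac
from cryptography.fernet import Fernet

data_cipher = Fernet(Fernet.generate_key())      # data key held by the organization
index_key = b"separate-secret-held-on-premise"   # index key also held by the organization

def blind_index(token: str) -> str:
    """Keyed, deterministic fingerprint of a searchable value."""
    return hmac.new(index_key, token.lower().encode(), hashlib.sha256).hexdigest()

# What the cloud service stores: ciphertext plus opaque index values.
stored = []
for sender in ["alice@example.com", "bob@example.com"]:
    stored.append({
        "ciphertext": data_cipher.encrypt(f"mail from {sender}".encode()),
        "sender_idx": blind_index(sender),
    })

# Server-side search: the client sends only the fingerprint of the query term.
query = blind_index("alice@example.com")
matches = [row for row in stored if row["sender_idx"] == query]
print(len(matches))  # 1 -- matched without exposing plaintext to the provider
```

Sorting and ranged queries require additional techniques (such as order-preserving tokens), each with its own security trade-offs; the point here is simply that functionality can be retained while the provider holds only ciphertext.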

The Implementation of Data-in-Use Encryption Enables Organizations to Seamlessly Harness the Power of the Cloud
Data-in-use encryption technology addresses the concerns associated with control and ownership of proprietary data residing on third-party cloud-based servers: compliance requirements, separation of data controls through key retention, data residency, and unauthorized disclosure of data in response to a government request.

Data-in-use encryption is of particular value for organizations that want to manage data disclosure requests from law enforcement agencies independently. Equally, cloud service providers are not eager to be in the undesirable position of being compelled to disclose customer data. The cloud provider will still turn over customer data when presented with a subpoena or other government request, because it has no choice but to comply. However, because all of the data was encrypted before the cloud provider received it, and the organization holds the encryption keys, the provider cannot decrypt that data. Therefore, when complying with an order, the cloud provider can only turn over ciphertext. If the government wants to decrypt the data, it must go directly to the organization that owns it.

Figure 2: Sample of an authorized/unauthorized view of sensitive data in a hosted Exchange email application.

In geographically distributed environments, smart encryption also creates a paradigm shift: instead of requiring the data itself to remain local, only the encryption keys must remain local. Organizations with multiple data residency requirements can deploy and maintain an instance of the encryption appliance in each jurisdiction. Once the data is encrypted with keys that are maintained in that jurisdiction, the encrypted data can lawfully reside in any location, as the sketch below illustrates.
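
A minimal sketch of the "keys stay local, ciphertext can travel" idea, again assuming the Python cryptography package; the jurisdiction names and in-country key handling are illustrative, not a real deployment:

```python
# Each jurisdiction keeps its own key (in practice, in an in-country key
# appliance or KMS); data is encrypted with the owner's jurisdictional key
# before it is replicated to any global storage region.
from cryptography.fernet import Fernet

jurisdiction_keys = {
    "DE": Fernet(Fernet.generate_key()),   # key held in Germany
    "CA": Fernet(Fernet.generate_key()),   # key held in Canada
}

def encrypt_for(jurisdiction: str, payload: bytes) -> bytes:
    """Encrypt with the key that never leaves the given jurisdiction."""
    return jurisdiction_keys[jurisdiction].encrypt(payload)

ciphertext = encrypt_for("DE", b"customer record subject to German residency rules")
# The ciphertext itself may now be stored or processed in any region;
# decryption still requires the key that stays in Germany.
```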

The addition of encryption-in-use empowers the organization to retain full ownership and control during the entire process, including when the data is out of its network and in the cloud, while ensuring maximum security and regulatory compliance.

Industry analysts agree. According to Ahmad Zeffirelli, Industry Analyst at Frost & Sullivan, "This solution with its ability to encrypt data-in-use, data-in-transit, and data-at-rest, would bring immense benefits to a vast majority of organizations concerned about data security while leveraging cloud computing."

Building Commercially Viable Encryption
One of the most difficult technical challenges in developing encryption for commercial applications running in the cloud is establishing the right balance between the competing goals of encryption and security on the one hand, and features and performance on the other. In commercial markets, especially in the cloud, introducing additional steps for users to follow in order to address security requirements both undermines the ease-of-use value proposition of cloud-based services and increases the likelihood that users will look for ways to circumvent controls.

The entire process should be transparent to the end user. Specifically, the security functionality should not require the installation of an application or agent on the end user's client device or mobile phone. There should also be no impact on the end-user experience in terms of functionality, performance, or task workflow. Furthermore, commercially viable encryption capabilities should not interfere with standard email security features such as malware and anti-virus protection.
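
One architectural pattern consistent with these requirements (the article does not name a specific mechanism) is an inline encryption gateway that sits between users and the cloud service, encrypting sensitive fields on the way out and decrypting them on the way in, so no client-side software is needed. The sketch below, again assuming the Python cryptography package, uses illustrative field names and gateway functions:

```python
# Hedged sketch of an inline-gateway pattern: traffic between users and the
# cloud service passes through a gateway that encrypts sensitive fields
# outbound and decrypts them inbound, so no agent is needed on the client.
from cryptography.fernet import Fernet

gateway_cipher = Fernet(Fernet.generate_key())   # key stays on-premise
SENSITIVE_FIELDS = {"subject", "body"}           # example policy

def outbound(message: dict) -> dict:
    """Applied by the gateway before the message reaches the cloud service."""
    return {k: gateway_cipher.encrypt(v.encode()).decode() if k in SENSITIVE_FIELDS else v
            for k, v in message.items()}

def inbound(message: dict) -> dict:
    """Applied by the gateway before the response reaches the end user."""
    return {k: gateway_cipher.decrypt(v.encode()).decode() if k in SENSITIVE_FIELDS else v
            for k, v in message.items()}

mail = {"to": "cfo@example.com", "subject": "Q3 forecast", "body": "Draft attached."}
stored_in_cloud = outbound(mail)          # the provider only ever sees ciphertext fields
assert inbound(stored_in_cloud) == mail   # users see plaintext, with no client software
```

Because the transformation happens at the network edge, the user's mail client, browser and workflow are unchanged, which is the transparency requirement described above.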

Conclusion
Technology that enables the encryption of data-at-rest, data-in-transit and data-in-use within the cloud environment effectively addresses data control, compliance and security requirements while preserving application functionality, including search, sort and index capabilities, and a seamless user experience. In doing so, it functions as an enabler of cloud adoption for organizations worldwide.

More Stories By Elad Yoran

Elad Yoran is the CEO of Vaultive, Inc. He is a recognized expert on information security market and technology trends. Yoran has 20 years of experience in the cyber security industry as an executive, consultant, investor, investment banker and several-time successful entrepreneur. He is also a member of a number of technology, security and community boards, including the FBI Information Technology Advisory Council (ITAC); the Department of Homeland Security Advisory Board for Command, Control and Interoperability for Advanced Data Analysis (CCICADA); and the Cloud Security Alliance New York Metro Chapter.
