By Gilad Parann-Nissany
June 4, 2014 12:00 PM EDT
Suppose you want to hire someone to do a job for you, but you suspect she might leak sensitive details of your business to your enemies. The immediate solution: don't hire her. But real life is often not so clear-cut. Your prospective employee may be very good at a particular job, and you don't want to lose her skills; the risk that she turns bad may be low, and you need the job done.
This general real life problem has had many solutions over the centuries. People have written contracts with confidentiality clauses, threatened court action, come up with rules for the workplace, tried to instill pride in their workforce, and sometimes they’ve even begged for good behavior.
One particularly nice concept: suppose you could give the job to the talented lady, but seal it in an opaque black sack. She wouldn’t be able to see inside, but would manipulate pieces by putting her hands into the dark sack to get the job done. This way, she would use her talents without discovering your secrets.
The disadvantage is obvious: very few jobs can actually be done blindly, without knowledge and mutual trust. But there are real-life examples of the black sack approach. For example, spies work in "cells" designed so that they know as little as possible about other "cells."
Cloud computing and homomorphic encryption
Cloud computing can suffer from some of the problems we described above. Suppose you have sensitive information: health data, financial data, or military secrets. And imagine you need to use a cloud "as a service" solution that takes your data and produces a report. The goal is to compare all the pieces of data and find the "important flukes." In healthcare, an important fluke could be overpayment for a patient's treatment. In finance, it could be a credit card fraud event. In the military, it could be data about a cyber-attack hiding between normal pieces of data.
In each of these cases, the “normal” data is sensitive: health, money, “normal” military activity. Still, for valid reason, you cannot do the analysis yourself in your own “secure” space. Somebody else has the expertise to do this particular analysis for you, and is using cloud computing because the job involves massive amounts of computational power.
You find yourself with sensitive data in a cloud, on servers you do not own, processed by programs which you may not own either.
Cloud encryption is always about hiding secrets.
You take a bunch of readable data and make it unreadable by scrambling the characters and numbers in the data so they don't make sense. Only someone who has the "encryption key" can reverse the process, decrypt the data, and make it readable again. A typical decryption key could be 64 random characters long, yet by using known encryption techniques (like AES, the "Advanced Encryption Standard"), that key can render terabytes completely scrambled and unreadable. Only if you have those 64 characters, the key, can you reverse the process and make the terabytes readable again.
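The key-reverses-the-scramble idea can be sketched with a toy stream cipher in Python. This is an illustration only, not AES; the key, the secret message, and the keystream construction are all hypothetical, and real systems should use a vetted AES implementation:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from the key (toy counter construction).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream scrambles the data; the same call decrypts it,
    # because XOR-ing twice with the same keystream cancels out.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

secret = b"patient record: treatment cost 4200"
key = b"0123456789abcdef" * 4  # a 64-character key, as in the text above

ciphertext = encrypt(key, secret)   # unreadable without the key
recovered = encrypt(key, ciphertext)  # readable again, key holder only
```

Without the key, the ciphertext is just scrambled bytes; with it, the original data comes back exactly.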
In our case, however, there is a twist. We want the data to be completely unreadable, but we want the program mentioned before (the one running on servers you do not own and doing a job that requires expertise and computing power the cloud provides) to be able to do its job.
Ideally, we want the program to do its job on data that is always encrypted (and remains encrypted) and successfully come up with a useful result.
Homomorphic encryption makes this possible. Homomorphic encryption is a set of encryption techniques that will scramble the data based on a key, but allow a program to do some useful work on the unreadable encrypted data. The program cannot understand the data, but it can do something useful on it, without giving it away: an opaque black sack for cloud computing.
Steps to homomorphic encryption: examples of “something useful”
People have started figuring out useful things that can be done to encrypted data by taking a practical approach. Instead of saying that anything can be done in the black sack, people are starting out by saying that something can be done. Here are some examples of “something:”
- A specific statistical function, perhaps taking the average of two numbers.
Both numbers are encrypted and the result is also encrypted. Only the holder of the encryption key can know the real values of the numbers and the result. However, a machine in the cloud can take the encrypted values and average them, sending an encrypted result back to the “owner.” The owner can then decrypt that result back at home.
- Searching a person’s genome data to find whether that person has a specific gene that is especially harmful to health while protecting privacy.
The data that describes the genome is encrypted as is the description of the specific gene being searched. Both are uploaded to the cloud servers and the result comes back: the gene is either found or not. But only the person who owns the encryption key knows the genome (and the identity of the human being with that genome); and only they know what they were looking for. Was it a gene that may signal cancer? Or perhaps a gene that may signal high blood pressure? Only they know.
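The first example above, averaging encrypted numbers, can be sketched with a toy additively homomorphic scheme in the style of Paillier. The primes and plaintexts below are tiny and hypothetical, chosen only to make the arithmetic visible; real deployments use keys of 2048 bits or more:

```python
import math
import random

# Toy Paillier-style keypair (illustration only; the primes are far too small
# to be secure).
p, q = 293, 433
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)  # valid because we use generator g = n + 1

def encrypt(m: int) -> int:
    # Enc(m) = (1 + n)^m * r^n mod n^2, with r random and coprime to n.
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    # Only the key holder, who knows lam and mu, can run this step.
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Cloud side: multiplying ciphertexts adds the hidden plaintexts,
# without the cloud ever seeing 40 or 60.
a, b = encrypt(40), encrypt(60)
encrypted_sum = (a * b) % n2

# Owner side: decrypt the sum at home, then finish the average in the clear.
average = decrypt(encrypted_sum) // 2
```

The cloud only ever handles ciphertexts; the division by two happens back at home, after decryption, which is exactly the "partial" character of such schemes.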
Techniques like these have been described in the academic literature. They have not yet been commercialized, but they can really work. Notice that the researchers have been smart: they focused on a useful, specific problem that can most likely be solved. This is sometimes called "partially homomorphic encryption" and sometimes "somewhat homomorphic encryption," because the scheme is constructed with one focused goal in mind. It doesn't try to solve everything.
Fully homomorphic encryption
A more ambitious approach is fully homomorphic encryption (FHE). The Holy Grail is to encrypt any data for any purpose, so that a program running on cloud servers could do almost anything to the data and generate useful results. But only the owner of the data could actually understand the data and understand the result, because only the owner has the key. The running cloud servers would run on encrypted data and produce an encrypted result.
This would be great to have. Craig Gentry made a breakthrough in this field in 2009 by showing that any function made up of additions and multiplications can be computed on homomorphically encrypted data, in what is now known as the Gentry scheme.
"Any combination of additions and multiplications" really is quite a lot: almost any mathematical function can be broken down into additions and multiplications, so Gentry's result is quite strong.
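A minimal sketch of why additions and multiplications go so far: over single bits (working mod 2), addition is XOR and multiplication is AND, and any boolean circuit, hence any function on fixed-size data, can be built from those two gates plus the constant 1:

```python
# Over bits (mod 2), addition is XOR and multiplication is AND, so any
# boolean circuit can be expressed using only additions and multiplications.
def xor(a: int, b: int) -> int:
    return (a + b) % 2   # addition mod 2

def and_(a: int, b: int) -> int:
    return (a * b) % 2   # multiplication mod 2

def not_(a: int) -> int:
    return (a + 1) % 2   # addition of the constant 1

# Example: bit equality, eq(a, b) = NOT(a XOR b), built from + and * only.
def eq(a: int, b: int) -> int:
    return not_(xor(a, b))
```

Run this same circuit on homomorphically encrypted bits instead of plain ones, and you have, in principle, the Gentry picture: arbitrary computation inside the black sack.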
However, there are some practical limitations that have thus far stopped us from using FHE. One is performance: FHE can be billions of times slower than the "somewhat" or "partial" approaches. The gap is being worked on, but it remains large.
Another limitation of FHE is the kind of problem it can solve. Suppose you wanted to use FHE with a medical expert system. Such systems work more with questions and answers than with addition and multiplication. They try to diagnose a disease by gathering patient information, applying some logical rules, and then asking more questions. This is a logical process, and a major part of it is interactive: asking questions and learning. The best of these systems also constantly change their rules as they learn more about how patients and diseases really work. Can a learning system based on rules and interactive back-and-forth questioning be easily broken down into additions and multiplications?
So FHE has limitations based on performance and on the type of problem it can solve.
Are there any commercial implementations of homomorphic encryption?
The world’s first commercial implementation is Homomorphic Key Management, a “partially homomorphic” approach. Homomorphic Key Management is focused on the specific problem of encrypting the encryption keys themselves and keeping them safe and secret in the cloud, while still being able to use them for encrypting data. This work was done by Porticor in 2011-2012 and a commercial product entered the market at the end of 2012.
Several academic teams are now working on bringing additional "somewhat homomorphic" approaches to market, though as far as we know, none other has yet been commercialized. We hope the coming years will produce commercial results relevant to many fields, such as genomics and specific computations within databases.
In parallel, academia continues to work hard on making FHE more efficient.
The post Homomorphic Encryption and Cloud Security: The Practicalities appeared first on Porticor Cloud Security.