
@CloudExpo: Blog Feed Post

Securing Your ‘Data at Rest’ in the Cloud

Threat vectors in private, hybrid and public clouds

We’re all hungry for best practices and tips for securing data in the cloud, and for guidance on how shared computing resources can and should work to ensure privacy and protection. The focus here is on data security, especially data at rest.

Cloud computing is all about increased scalability and productivity. However, it also introduces new security threats – such as “snapshotting” a virtual disk – that did not exist when data was stored and secured within the four walls of a datacenter.

With cloud computing, multiple customers may share a single physical disk, although logically separated from each other. In theory, one can share the same physical disk with a competitor – without data crossing over. The same is true for physical servers. Equally, within a single cloud account, different projects may be sharing the same physical disks or physical servers. This virtualized approach is at the heart of cloud computing; it provides many of its benefits.

Best practices for such logical separation do exist and are well implemented by the best cloud providers. While essential, they do require the cloud user – the customer – to take some responsibility for securing data. To understand why, consider some of the threat vectors that do remain, even when logical separation is done right.

Threat vectors in private, hybrid and public clouds
Cloud technology is very powerful. It allows legitimate cloud customers to manage all of their disks and servers through a browser; for example, it allows customers to easily copy, or “snapshot,” their disks with a single cloud command. But consider a hacker who has obtained web access to a cloud account by stealing the Web User Interface (WUI) credentials or exploiting a web vulnerability. That hacker can now also “snapshot” the disks, and walk away with a full copy of your data.

Threat vectors can also exist because cloud accounts may contain multiple projects. This is a subtle point, because it applies to private clouds as well as public clouds.

People sometimes have pat answers for difficult questions, and the “private vs. public” cloud debate is such a case. Some claim that private clouds answer all security questions. Private clouds are good, but the claim is exaggerated, and this particular threat vector shows why.

You are a responsible professional and have properly secured your project in your company’s cloud account (whether public or private). However, your colleague at a different division has also set up a project, and – unfortunately – his virtual servers have a vulnerability. A hacker gains access to those servers and now that hacker is in your company’s cloud account. Depending on the exact details of the exploit, the damage to your project’s data security could be small or large. Obviously, this can happen even in a private cloud.

People often talk of the “insider” threat. Malicious insiders are rare, and the threat is perhaps overplayed, but they are a painful possibility and can cause huge damage when they do occur. Insiders may be at your own company or at the cloud provider. Just as with hackers, malicious insiders could misuse account credentials to access and misuse your data, or exploit their own projects within your shared environment for malicious ends.

So what is an IT administrator going to do to secure data stored in the cloud?

The practicalities of trust and control for data in the cloud
One way to think of the cloud is as a great environment for outsourcing. Even private clouds mean that you are outsourcing your computing environment to some IT group inside your company, and public clouds are an obvious outsourcing situation.

Like any outsourcing situation, you want to outsource the hassle but keep the control. This is basically the attitude to take when going to the cloud.

It is important to remember that keeping control must happen at the project level – your project must be under your control. Segregating projects and defining how each project is protected independently of other projects is a good way to avoid many threat vectors.

Some of the rules for enforcing data security in the cloud hold in any data center scenario, virtualized or not; others are new, or at least take on a new emphasis.

Make sure your cloud, your software and your security tools allow you to enforce these general rules.

Define who has administrative access to the project. These are the people with the power to make big, sweeping changes. Make sure it is not just one person (what if that person is on vacation?), but it must not be many people either, and each person should have clear rules about what he or she can do.

Define who has user-level access to the project. Users may have access to data that administrators may not see. In fact, that is best practice. Make sure you manage your users and their rights to data.

Define fine-grained network controls. Make sure you can segregate your projects using networking techniques such as sub-nets and firewalls.
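Subnet-based segregation can be checked mechanically. Here is a minimal sketch in Python using the standard ipaddress module; the project names and address ranges are illustrative assumptions, not taken from this article:

```python
import ipaddress

# Hypothetical per-project subnets; names and ranges are illustrative only.
PROJECT_SUBNETS = {
    "billing":   ipaddress.ip_network("10.0.1.0/24"),
    "analytics": ipaddress.ip_network("10.0.2.0/24"),
}

def allowed(source_ip: str, project: str) -> bool:
    """Permit traffic only when the source address lies inside the project's subnet."""
    return ipaddress.ip_address(source_ip) in PROJECT_SUBNETS[project]

print(allowed("10.0.1.17", "billing"))  # True: host is inside the billing subnet
print(allowed("10.0.2.5", "billing"))   # False: an analytics host cannot reach billing
```

In a real cloud this logic lives in firewalls and security groups rather than application code; the point is that segregation rules should be this explicit and this testable.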

Define fine-grained authorization and authentication. Users and administrators should have clear identities, so you can control access to the resources they need; it should be possible to define fine-grained permissions for these identities. Some projects may require defining permissions for disk access, others for files, and still others for tables, rows or columns in a database.
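The same rule can be made concrete with a toy permission table mapping identities to fine-grained grants. A minimal sketch; the identities, resources and grants are invented for illustration:

```python
# Toy permission table: identity -> set of (resource, action) grants.
# Note the column-level granularity: bob may read names but never SSNs.
GRANTS = {
    "alice": {("customers.ssn", "read"), ("customers.ssn", "write")},
    "bob":   {("customers.name", "read")},
}

def authorize(identity: str, resource: str, action: str) -> bool:
    """Deny by default; permit only explicitly granted (resource, action) pairs."""
    return (resource, action) in GRANTS.get(identity, set())

print(authorize("alice", "customers.ssn", "read"))  # True: explicitly granted
print(authorize("bob", "customers.ssn", "read"))    # False: no grant for this column
```

Real systems express this through IAM policies or database privileges, but the deny-by-default shape of the check is the same.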

Encrypt all sensitive data. Data in the cloud must be encrypted for security. There is simply no way to counter the threat vectors mentioned above if the data is not encrypted; cloud data security experts accept this as fact.
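As a minimal illustration of encrypting data before it rests on a cloud disk, here is a toy stream cipher built from SHA-256 in counter mode. This is for illustration only; a real deployment should use a vetted cipher such as AES-GCM from an established cryptography library.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: SHA-256 in counter mode generates a keystream,
    which is XORed with the data. The same call decrypts. Illustrative only."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

key, nonce = secrets.token_bytes(32), secrets.token_bytes(16)
ciphertext = keystream_xor(key, nonce, b"patient record #4711")
assert ciphertext != b"patient record #4711"   # data at rest is unreadable
assert keystream_xor(key, nonce, ciphertext) == b"patient record #4711"  # round-trips
```

A “snapshotted” disk that holds only the ciphertext is useless to an attacker, which is exactly why the next point, managing the keys, becomes the crux of the matter.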

Manage encryption keys carefully; otherwise, achieving confidentiality can prove difficult in the cloud. This is actually a major gotcha in cloud data security. The essential point is that encryption keys are the key to the kingdom, and you cannot trust anyone with your encryption keys.

It is obvious that you should not save your encryption keys in plaintext on a disk in the cloud. But beyond that, if you want confidentiality, you cannot hand your encryption keys to your cloud provider. The best providers will tell you this themselves, in a frank and helpful way, so ask them.

When it comes to encryption and encryption keys, you cannot trust anyone. There are several offerings on the market that will try to tell you that you can trust them with your keys, so be wary.

There are several other offerings on the market that do key management responsibly and well, but only by taking all of your encryption keys back to your physical data center, which rather defeats the purpose of doing a cloud project.

Keeping trust and control while outsourcing complexity
While securing data at rest in cloud projects is entirely possible, it’s also a lot of work. Ideally, you’d like a solution which packages all this complexity.

The technological breakthroughs that enable pure cloud key management are split-key encryption and homomorphic key encryption. They provide the only way, currently on the market, to maintain complete confidentiality of data while staying 100 percent in the cloud.
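Split-key encryption, at its simplest, means that no single party ever holds the whole key. Below is a minimal two-share XOR sketch of that idea; it is not Porticor’s actual scheme, and the variable names are illustrative:

```python
import secrets

def split_key(master_key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; either share alone is statistically random
    and reveals nothing about the master key."""
    share1 = secrets.token_bytes(len(master_key))
    share2 = bytes(a ^ b for a, b in zip(master_key, share1))
    return share1, share2

def join_key(share1: bytes, share2: bytes) -> bytes:
    """Recombine the two shares to recover the master key."""
    return bytes(a ^ b for a, b in zip(share1, share2))

master = secrets.token_bytes(32)
s1, s2 = split_key(master)   # e.g. the customer holds s1, a key server holds s2
assert s1 != master and s2 != master
assert join_key(s1, s2) == master
```

In this model, a provider or hacker who obtains one share learns nothing; only the combination of both shares unlocks the data, which is how confidentiality can be kept while staying in the cloud.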

This makes a wide variety of projects possible and secure, from disaster recovery and cloud bursting to pure cloud solutions. These types of encryption also help maintain compliance with standards such as HIPAA, PCI DSS, SOX and SOC 2.

But you need more than technology. Vendors need to be integrated with leading clouds and operating systems. This way, you can leverage valuable tools appropriate for the cloud of your choice, such as firewalls, virtual private networks, security groups and roles, authentication and authorization – together with your encryption and key management solution. A full solution is truly possible.

Securing a variety of data storage technologies
When it comes to data storage in the cloud, customers can choose from a wide range of options: “plain” virtual disks, file systems (for Windows, Linux and UNIX), relational databases (Oracle, MySQL, MS SQL, IBM DB2 and others), and new and unique cloud options such as distributed storage (e.g., Simple Storage Service) or “NoSQL” databases (e.g., MongoDB). Furthermore, when it comes to databases, there is a choice to be made between encrypting the entire database and encrypting at a finer granularity: at the table, row or column level, for example.

Such wide-ranging support requires “plugging in” to the cloud operating system at a very deep level, so that the solution is transparent and fits well with almost anything using the cloud environment. It also requires a cloud-enabled Application Programming Interface (API). A convenient user interface doesn’t hurt either.

Gilad Parann-Nissany is the founder and CEO of Porticor (Ramat Hasharon, Israel). www.porticor.com


More Stories By Gilad Parann-Nissany

Gilad Parann-Nissany, Founder and CEO at Porticor is a pioneer of Cloud Computing. He has built SaaS Clouds for medium and small enterprises at SAP (CTO Small Business); contributing to several SAP products and reaching more than 8 million users. Recently he has created a consumer Cloud at G.ho.st - a cloud operating system that delighted hundreds of thousands of users while providing browser-based and mobile access to data, people and a variety of cloud-based applications. He is now CEO of Porticor, a leader in Virtual Privacy and Cloud Security.
