Securing Your ‘Data at Rest’ in the Cloud

We’re all hungry for best practices and tips for securing data in the cloud, and for an understanding of how shared computing resources can and should be made to ensure privacy and protection. The focus here is on data security, especially data at rest.

Cloud computing is all about increased scalability and productivity. However, new security threats, such as “snapshotting” a virtual disk, are emerging; they expose private data to risks that did not exist when data was stored and secured within the four walls of a data center.

With cloud computing, multiple customers may share a single physical disk, each logically separated from the others. In theory, you can share the same physical disk with a competitor – without data crossing over. The same is true for physical servers. Equally, within a single cloud account, different projects may share the same physical disks or physical servers. This virtualized approach is at the heart of cloud computing and provides many of its benefits.

Best practices for such logical separation do exist and are well implemented by the best cloud providers. While essential, they are not sufficient: the cloud user – the customer – must still take some responsibility for securing data. To understand why, consider some of the threat vectors that remain even when logical separation is done right.

Threat vectors in private, hybrid and public clouds
Cloud technology is very powerful. It allows legitimate cloud customers to manage all of their disks and servers through a browser; for example, it allows customers to easily copy or “snapshot” their disks with a single cloud command. But consider a hacker who has obtained web access to a cloud account by stealing the Web User Interface (WUI) credentials or exploiting a web vulnerability. That hacker can now also “snapshot” the disks, as the sketch below illustrates.
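
To make the point concrete, here is a minimal sketch, assuming AWS and the boto3 SDK; the volume ID is a hypothetical placeholder. Anyone holding valid account credentials – legitimate administrator and intruder alike – can produce a full, restorable copy of a disk with one API call.

```python
import boto3

# Any principal with the ec2:CreateSnapshot permission can do this --
# a legitimate administrator or an attacker with stolen credentials.
ec2 = boto3.client("ec2")

# vol-0123456789abcdef0 is a hypothetical volume ID.
response = ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",
    Description="a complete, restorable copy of the disk",
)
print(response["SnapshotId"])  # the copy now exists, ready to attach elsewhere
```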

Threat vectors can also exist because cloud accounts may contain multiple projects. This is a subtle point, because it applies to private clouds as well as public clouds.

People sometimes have pat answers for difficult questions; the “private vs. public” cloud debate is such a case. Some people claim private clouds answer all security questions. Private clouds are good, but the claim is exaggerated, and this specific threat vector shows why. Here is how it goes.

You are a responsible professional and have properly secured your project in your company’s cloud account (whether public or private). However, your colleague at a different division has also set up a project, and – unfortunately – his virtual servers have a vulnerability. A hacker gains access to those servers and now that hacker is in your company’s cloud account. Depending on the exact details of the exploit, the damage to your project’s data security could be small or large. Obviously, this can happen even in a private cloud.

People often like to talk of the “insider” threat. Malicious insiders are rare – the insider threat is perhaps overplayed – but they are a painful possibility, and these cases can cause huge damage when they do occur. Insiders can be at your own company or at the cloud provider. Just like external hackers, malicious insiders could misuse account credentials to access your data, or exploit their existing projects within your shared environment for malicious ends.

So what is an IT administrator going to do to secure data stored in the cloud?

The practicalities of trust and control for data in the cloud
One way to think of the cloud is as a great environment for outsourcing. Even private clouds mean that you are outsourcing your computing environment to some IT group inside your company, and public clouds are an obvious outsourcing situation.

Like any outsourcing situation, you want to outsource the hassle but keep the control. This is basically the attitude to take when going to the cloud.

It is important to remember that keeping control must happen at the project level – your project must be under your control. Segregating projects and defining how each project is protected independently of other projects is a good way to avoid many threat vectors.

Some of the rules for enforcing data security in the cloud hold in any data center scenario, virtualized or not; others are new, or at least take on a new emphasis.

Make sure your cloud, your software and your security tools allow you to enforce these general rules.

Define who has administrative access to the project. These are the people with the power to make big, sweeping changes. Make sure it’s not just one person – what if that person is on vacation? – but it should not be many people either, and each should have clear rules on what he or she can do.

Define who has user-level access to the project. Users may have access to data that administrators may not see. In fact, that is best practice. Make sure you manage your users and their rights to data.

Define fine-grained network controls. Make sure you can segregate your projects using networking techniques such as subnets and firewalls, as in the sketch below.
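
As an illustration, here is a minimal sketch assuming AWS and boto3; the VPC ID and CIDR range are hypothetical placeholders. It creates a security group for one project and admits traffic only from that project’s own subnet, so neighboring projects in the same account cannot reach it.

```python
import boto3

ec2 = boto3.client("ec2")

# vpc-0abc123def4567890 is a hypothetical VPC ID.
sg = ec2.create_security_group(
    GroupName="project-a-app",
    Description="Project A application tier only",
    VpcId="vpc-0abc123def4567890",
)

# Allow HTTPS only from Project A's own subnet; all else is denied by default.
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "10.0.1.0/24"}],
    }],
)
```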

Define fine-grained authorization and authentication. Users and administrators should have clear identities so you can control access to the resources they need, and it should be possible to define fine-grained permissions for these identities. Some projects may require defining permissions for disk access, others for files, and still others for tables, rows or columns in a database. The sketch below shows one way such a permission can be expressed.
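
For instance, here is a minimal sketch assuming AWS IAM and boto3; the user name, bucket and prefix are hypothetical placeholders. The inline policy grants one user read access to a single project folder and nothing else.

```python
import json
import boto3

iam = boto3.client("iam")

# A hypothetical least-privilege policy: read-only, one prefix, one bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": "arn:aws:s3:::project-a-data/reports/*",
    }],
}

# Attach it inline, so analyst-alice can read reports/ and nothing else.
iam.put_user_policy(
    UserName="analyst-alice",
    PolicyName="project-a-reports-read-only",
    PolicyDocument=json.dumps(policy),
)
```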

Encrypt all sensitive data. Data in the cloud must be encrypted for security; there is simply no way to deal with the threat vectors mentioned above if data is not encrypted. This is a fact accepted by cloud data security experts. A minimal example follows below.
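
As a minimal sketch of what encryption at rest looks like in practice – assuming authenticated encryption with AES-GCM from the widely used Python cryptography package; the plaintext is a placeholder – note that a “snapshot” of the resulting disk would yield only ciphertext.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# A fresh 256-bit data-encryption key; keeping it safe is the hard part (see below).
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

plaintext = b"customer records ..."  # placeholder for sensitive data
nonce = os.urandom(12)  # AES-GCM needs a unique nonce for every encryption

# Encrypt and authenticate; store nonce + ciphertext, never the plaintext.
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Decryption fails loudly if the ciphertext has been tampered with.
assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
```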

Manage encryption keys carefully; otherwise, achieving confidentiality can prove difficult in the cloud. This is actually a major gotcha in cloud data security. The essential point is that encryption keys are the key to the kingdom, and you cannot trust anyone with your encryption keys.

It is obvious that you should not save your encryption keys in plaintext on a disk in the cloud. But more than that: if you want confidentiality, you cannot give your encryption keys to your cloud provider. The best providers will tell you this themselves, in a frank and helpful way – so ask them.
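
One common pattern that respects this rule is customer-side envelope encryption: each data-encryption key is itself encrypted under a key-encrypting key that never leaves the customer. Here is a minimal sketch, again assuming the Python cryptography package; the key-handling policy around the code is the real point.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The key-encrypting key (KEK) stays with the customer: never written to
# cloud storage and never handed to the cloud provider.
kek = AESGCM.generate_key(bit_length=256)

# The data-encryption key (DEK) encrypts the actual data in the cloud.
dek = AESGCM.generate_key(bit_length=256)

# Wrap the DEK under the KEK; only this wrapped blob is stored in the cloud.
nonce = os.urandom(12)
wrapped_dek = AESGCM(kek).encrypt(nonce, dek, None)

# To use the data, unwrap the DEK in memory and discard it afterwards.
recovered_dek = AESGCM(kek).decrypt(nonce, wrapped_dek, None)
assert recovered_dek == dek
```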

When it comes to encryption and encryption keys, you cannot trust anyone. There are several offerings on the market whose vendors will try to tell you that you can trust them with your keys, so beware.

There are other offerings on the market that do key management responsibly and well, but only by taking all of your encryption keys back to your physical data center. That rather defeats the purpose of running a cloud project.

Keeping trust and control while outsourcing complexity
While securing data at rest in cloud projects is entirely possible, it is also a lot of work. Ideally, you would like a solution that packages away all this complexity.

The technological breakthroughs that enable pure cloud key management are split-key encryption and homomorphic key encryption. They provide the only way, currently on the market, to maintain complete confidentiality of data while staying 100 percent in the cloud. The toy sketch below illustrates the split-key idea.
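
To illustrate the split-key principle only – this is a toy sketch of standard XOR-based secret splitting, not the vendor’s actual algorithm – a key is divided into two shares, each useless on its own, so that no single party, cloud provider included, ever holds the whole key.

```python
import os

def split_key(master_key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; each share alone reveals nothing."""
    share_a = os.urandom(len(master_key))  # a uniformly random share
    share_b = bytes(x ^ y for x, y in zip(share_a, master_key))
    return share_a, share_b

def join_key(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine the shares; both are required to recover the key."""
    return bytes(x ^ y for x, y in zip(share_a, share_b))

master = os.urandom(32)          # a 256-bit master key
a, b = split_key(master)         # e.g., one share per party
assert join_key(a, b) == master  # only both shares together recover it
```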

This makes a large variety of projects possible and secure, including everything from disaster recovery and cloud bursting to pure cloud solutions. These types of encryption also help maintain compliance with standards such as HIPAA, PCI DSS, SOX and SOC 2.

But you need more than technology. Vendors need to be integrated with leading clouds and operating systems. This way, you can leverage valuable tools appropriate for the cloud of your choice, such as firewalls, virtual private networks, security groups and roles, authentication and authorization – together with your encryption and key management solution. A full solution is truly possible.

Securing a variety of data storage technologies
When it comes to data storage in the cloud, there is a wide range of options customers can choose from: “plain” virtual disks, file systems (for Windows, Linux and UNIX), relational databases (Oracle, MySQL, MS SQL, IBM DB2 and others), and new and unique cloud options such as distributed storage (e.g., Simple Storage Service) or “NoSQL” databases (e.g., MongoDB). Furthermore, when it comes to databases, there is a choice to be made between encrypting the entire database and encrypting at a finer granularity – at the table, row or column level, for example. The sketch below shows the column-level end of that spectrum.
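
As an illustration of the finer-granularity option, here is a toy sketch of client-side column-level encryption, reusing the AES-GCM helper from earlier and Python’s built-in sqlite3 purely as a stand-in for whatever database is in use. Only the sensitive column becomes ciphertext; the rest stays queryable in the clear.

```python
import os
import sqlite3
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # managed per the key rules above
aesgcm = AESGCM(key)

db = sqlite3.connect(":memory:")  # stand-in for any relational database
db.execute("CREATE TABLE patients (name TEXT, diagnosis BLOB)")

# Encrypt only the sensitive column before it ever reaches the database.
nonce = os.urandom(12)
secret = nonce + aesgcm.encrypt(nonce, b"hypertension", None)
db.execute("INSERT INTO patients VALUES (?, ?)", ("J. Doe", secret))

# Non-sensitive columns remain queryable; the sensitive one is opaque ciphertext.
name, blob = db.execute("SELECT name, diagnosis FROM patients").fetchone()
print(name, aesgcm.decrypt(blob[:12], blob[12:], None))  # J. Doe b'hypertension'
```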

Such wide-ranging support requires “plugging in” to the cloud operating system at a very deep level, where the solution is transparent and fits well with almost anything running in the cloud environment. It also requires a cloud-enabled Application Programming Interface (API). A convenient user interface doesn’t hurt either.

Gilad Parann-Nissany is the founder and CEO of Porticor (Ramat Hasharon, Israel). www.porticor.com
