Answering Common Cloud Security Questions from CIOs

From InfoQ.com: With the news stories of possible data breaches at enterprises like Target, and the current trend of companies migrating to cloud environments for the flexibility, scalability, agility, and cost-effectiveness they offer, CIOs have been asking hard questions about cloud security.

As CIO, protecting your data (and your users) is one of your key responsibilities. Whether you already have some cloud projects running or are starting your first cloud project, these questions and answers may provide you with solutions and introduce some new techniques.

InfoQ: Is the cloud safe?

Gilad: The cloud is not inherently more or less safe than your own data center. As an interesting note, the recent media storm around the NSA, which started as a “cloud computing security” story, has morphed into a more general discussion. It turns out the NSA is able to eavesdrop on physical servers in physical data centers, and has actually done so at many of the world’s most secure organizations.

Today, cloud computing has proven safe and effective for a wide range of projects and data types across most vertical industries and market niches. Regulated, sensitive sectors such as finance, health, legal, retail, and government are all in various stages of moving to the cloud.

However, just like certain security precautions are taken in the physical world, cloud security also entails taking the appropriate precautions.

InfoQ: How does migrating to the cloud change my risks?

Gilad: Migrating applications and data to the cloud obviously shifts some responsibilities from your own data center to the cloud provider. It is an act of outsourcing. As such, it always involves a shift of control. Taking back control involves procedures and technology.

Cloud computing may be seen – in some aspects – as revolutionary; yet in other aspects it is evolutionary. Any study of controlling risks should start out by understanding this point. Many of the things we have learned in data centers evolve naturally to the cloud. The need for proper procedures is unchanged. Many of the technologies are also evolving naturally.

You should therefore start by mapping out your current procedures and current security-related technologies, and see how they evolve to the cloud. In many cases you’ll see a correspondence.

You’ll find however, that some areas really are a revolution. Clouds do not have walls, so physical security does not map well from the data center to the cloud. Clouds involve employees of the cloud service provider, so you need to find ways to control people who do not work for you. These are significant changes, and they require new technology and new procedures.

InfoQ: What are the most important aspects of a cloud security policy?

Gilad: Continuing the themes of evolution and revolution, some aspects of cloud security will seem familiar. Firewalls, antivirus, and authentication are all evolving to the world of cloud computing. You will find that your cloud provider often offers you solutions in these areas, and traditional vendors are evolving their solutions as well.

Some aspects may change your current thinking. Since clouds do not have walls, and cloud employees could see your data, you must create metaphoric walls around your data. In cloud scenarios, data encryption is the recognized best practice for these new needs.

Incidentally, data encryption also helps with a traditional data center need – most data breaches happen from the inside, so the threat is not just from cloud employees. However, there is no question that the threat from cloud insiders has shined a new spotlight on the need for data encryption.

InfoQ: What is the best practice for encrypting cloud data?

Gilad: You should encrypt data at rest and in motion. Encrypting “in motion” is already well known to you – the standards of HTTPS/SSL and IPSEC apply equally well in the data center and in the cloud.

Encrypting “at rest” means that the data must be encrypted when it resides on a disk, in a database, on a file system, in storage, and of course if it is backed up. In the real world, people have not always done this in data centers – often relying on physical security as a replacement. In the cloud, physical security is no alternative – you must encrypt sensitive data.

This actually means data must be encrypted constantly as it is being written, and decrypted only when it is going to be used (i.e. just before a specific calculation, and only in memory). Standards such as Advanced Encryption Standard (AES) are commonly used for data encryption at rest.
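As a minimal sketch of AES encryption at rest (using the widely available third-party `cryptography` Python package; this is an illustrative assumption, not Porticor's product), a sensitive record is encrypted before it is written anywhere and decrypted only in memory, just before use:

```python
import os

# Assumes the third-party `cryptography` package (pip install cryptography).
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit AES master key
aesgcm = AESGCM(key)

record = b"customer=4711;card=XXXX-1111"    # hypothetical sensitive record
nonce = os.urandom(12)                      # must be unique per encryption
ciphertext = aesgcm.encrypt(nonce, record, None)

# Only ciphertext ever touches disk, database, or backup media.
# Decrypt just before a specific computation, and only in memory.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
```

AES-GCM is used here because it authenticates as well as encrypts, so tampering with stored ciphertext is detected at decryption time.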

InfoQ: Does cloud encryption singlehandedly protect data?

Gilad: If data is properly encrypted it is, in a sense, locked and cannot be used if it falls into the wrong hands. Unless, of course, those hands have a key.

Proper management of encryption keys is as important as the encryption itself. In fact, if you keep your encryption keys to yourself – you keep ownership of your data. This is an interesting and fundamental point – in the cloud you are outsourcing your infrastructure, but you can maintain ownership by keeping the encryption keys.

If encryption keys are stored alongside the data, any breach that discloses the data will also disclose the key to access it. If encryption keys are stored with cloud providers, they own your data.

Think of your data like a safe deposit box – would you leave your key with the banker? What if he gets robbed? What if his employees are paid to make copies of your key?

A best practice is split key encryption. With this method, your data is encrypted (e.g. with AES), and then the encryption key is split into parts. One part is managed with a cloud security provider and one part stays only with you. This way, only you control access to your data.

Even if your encrypted data is compromised, the perpetrators will not be able to decrypt it and it will be useless to them.
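The split-key idea can be illustrated with a toy two-way XOR split (standard library only; this is a simplified sketch, and commercial split-key products use more elaborate schemes):

```python
import secrets

def split_key(master_key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; both shares are required to rebuild it."""
    share_a = secrets.token_bytes(len(master_key))               # random share
    share_b = bytes(x ^ y for x, y in zip(master_key, share_a))  # XOR complement
    return share_a, share_b

def join_key(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine the two shares into the original key."""
    return bytes(x ^ y for x, y in zip(share_a, share_b))

master = secrets.token_bytes(32)    # e.g. a 256-bit AES key
cloud_share, customer_share = split_key(master)

# Either share alone is indistinguishable from random noise;
# only the combination recovers the master key.
assert join_key(cloud_share, customer_share) == master
```

Because one share is a uniformly random string, an attacker who steals it (or the other share) alone learns nothing about the master key.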

InfoQ: How can encryption keys be protected while they are in use?

Gilad: Keys in use in the cloud do not have to be vulnerable. They can be protected using homomorphic key management. This cryptographic technique gives the application access to the data store without ever exposing the master encryption keys in an unencrypted state. It also ensures that even if such (encrypted) keys are stolen, they can never be used to access your data store.

InfoQ: Is cloud data encryption in compliance with regulations?

Gilad: Regulations like the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and many others (GLBA, FINRA, PIPEDA, et al.) require or encourage cloud data to be properly encrypted and encryption keys to be properly managed. Some of these regulations even provide a sort of “safe harbor”: if your data is breached, but you can prove that you took the necessary steps to encrypt it and maintain control of the encryption keys, you may be spared the financial burden, the bureaucratic reporting requirements, and the reputational damage involved in such an event.

InfoQ: Is cloud security cost-prohibitive and will it harm system performance?

Gilad: The cloud is often chosen for its lower operational overhead, and sometimes for actual dollar savings, compared with traditional data centers. Securing a cloud project need not negate the cloud’s ease of use or make the project prohibitively expensive.

There are security solutions that require no hardware and, therefore, no large cap-ex investment. Pay-as-you-go business models make it easy to scale security up (or down) with the size of your project, as you add (or remove) virtual machines and data.

Performance can also be good. Modern cloud security virtual appliances and virtual agents are optimized for cloud throughput and latency, and you’ll be able to dial up performance as your cloud project scales. Take data encryption as a concrete example: good solutions will include the capability to stream data as it is encrypted (or decrypted), and to do so inside your cloud account. With such approaches, the virtual CPUs available in your cloud can handle your performance needs with low latency.
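Streaming encryption can be sketched with AES in CTR mode (again using the third-party `cryptography` package as an assumed stand-in for a vendor solution): data is encrypted chunk by chunk rather than buffered whole, keeping memory use and latency low.

```python
import os

# Assumes the third-party `cryptography` package (pip install cryptography).
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, nonce = os.urandom(32), os.urandom(16)          # AES-256 in CTR mode

# Encrypt a large object chunk by chunk instead of buffering it whole.
encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
chunks = [b"chunk-one ", b"chunk-two ", b"chunk-three"]
ciphertext = b"".join(encryptor.update(c) for c in chunks) + encryptor.finalize()

# Decryption streams the same way, chunk by chunk.
decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
recovered = decryptor.update(ciphertext) + decryptor.finalize()
```

CTR mode is chosen for the sketch because it is a stream-friendly mode that preserves data length; a production solution would also add authentication (e.g. an HMAC or an AEAD construction).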

InfoQ: Is there a way to protect cloud backups and disaster recovery?

Gilad: Data must be secured throughout its lifecycle. Properly encrypting data while it is in use, but then offering hackers unencrypted replicas as backups, defeats the purpose of encrypting in the first place. You must encrypt, and own the encryption keys for, every point in the lifecycle of your information. Fortunately, solutions built for the cloud do exist, and they should cover backups as well as primary copies.

InfoQ: What is more secure: a public cloud or a private cloud?

Gilad: Public and private clouds each have pros and cons in terms of ownership, control, cost, convenience and multi-tenancy. We have found that private clouds often require security controls similar to public ones. Use cases may involve users external to your company; or large “virtual” deployments with multiple internal projects, each with a need for strong security segregation. Your data can be properly encrypted, your keys can be properly managed, and you can be safe in all the major cloud scenarios: private, public, or hybrid.

InfoQ: If my data is in the cloud, my security is in the cloud, and my backup is in the cloud, what do I control?

Gilad: If you use encryption properly and maintain control of the encryption keys, you have replaced your physical walls with mathematical walls. You will own your data. Even though you do not control the physical resources, you maintain control of what they contain. This is one reason why encryption in the cloud is the best practice.

By properly using multiple regions or even multiple cloud providers, you can also ensure that you always have availability and access to your project and your data.

By combining such techniques, you do take back control. As CIO and owner of your data, you must always control your data – from beginning to end. Your control does not need to be sacrificed when you migrate to the cloud, though it may need to be managed differently.


The post Answering Common Cloud Security Questions from CIOs appeared first on Porticor Cloud Security.


More Stories By Gilad Parann-Nissany

Gilad Parann-Nissany, Founder and CEO at Porticor, is a pioneer of cloud computing. He built SaaS clouds for medium and small enterprises at SAP (as CTO, Small Business), contributing to several SAP products and reaching more than 8 million users. More recently he created a consumer cloud at G.ho.st - a cloud operating system that delighted hundreds of thousands of users while providing browser-based and mobile access to data, people, and a variety of cloud-based applications. He is now CEO of Porticor, a leader in Virtual Privacy and Cloud Security.
