By Gilad Parann-Nissany
February 13, 2014 12:00 PM EST
from InfoQ.com: With the news stories of possible data breaches at enterprises like Target, and the current trend of companies migrating to cloud environments for the flexibility, scalability, agility, and cost-effectiveness they offer, CIOs have been asking hard questions about cloud security.
As CIO, protecting your data (and your users) is one of your key responsibilities. Whether you already have some cloud projects running or are starting your first cloud project, these questions and answers may provide you with solutions and introduce some new techniques.
InfoQ: Is the cloud safe?
Gilad: The cloud, by definition, is not more or less safe than your own data center. As an interesting note, the recent media storm around the NSA, which started as a “cloud computing security” story, has morphed into a more general discussion. It turns out the NSA is able to eavesdrop on physical servers in physical data centers and has actually done so at many of the world’s most secure organizations.
Today, cloud computing has proven safe and effective for a wide range of projects and data types, across most vertical industries and market niches. Regulated, sensitive areas such as finance, health, legal, retail and government are all in various stages of moving to the cloud.
However, just like certain security precautions are taken in the physical world, cloud security also entails taking the appropriate precautions.
InfoQ: How does migrating to the cloud change my risks?
Gilad: Migrating applications and data to the cloud obviously shifts some responsibilities from your own data center to the cloud provider. It is an act of outsourcing. As such, it always involves a shift of control. Taking back control involves procedures and technology.
Cloud computing may be seen – in some aspects – as revolutionary; yet in other aspects it is evolutionary. Any study of controlling risks should start out by understanding this point. Many of the things we have learned in data centers evolve naturally to the cloud. The need for proper procedures is unchanged. Many of the technologies are also evolving naturally.
You should therefore start by mapping out your current procedures and current security-related technologies, and see how they evolve to the cloud. In many cases you’ll see a correspondence.
You’ll find however, that some areas really are a revolution. Clouds do not have walls, so physical security does not map well from the data center to the cloud. Clouds involve employees of the cloud service provider, so you need to find ways to control people who do not work for you. These are significant changes, and they require new technology and new procedures.
InfoQ: What are the most important aspects of a cloud security policy?
Gilad: Continuing the themes of evolution and revolution, some aspects of cloud security will seem familiar. Firewalls, antivirus, and authentication are all evolving to the world of cloud computing. You will find that your cloud provider often offers you solutions in these areas, and traditional vendors are evolving their solutions as well.
Some aspects may change your current thinking. Since clouds do not have walls, and cloud employees could see your data – you must create metaphoric walls around your data. In cloud scenarios, data encryption is the recognized best practice for these new needs.
Incidentally, data encryption also helps with a traditional data center need – most data breaches happen from the inside, so the threat is not just from cloud employees. However, there is no question that the threat from cloud insiders has shined a new spotlight on the need for data encryption.
InfoQ: What is the best practice for encrypting cloud data?
Gilad: You should encrypt data at rest and in motion. Encrypting “in motion” is already well known to you – the standards of HTTPS/SSL and IPsec apply equally well in the data center and in the cloud.
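The “in motion” half mostly comes down to disciplined TLS usage. As a minimal illustration (a sketch using Python’s standard `ssl` module, not anything prescribed by the interview), a default client context already enforces certificate validation and hostname checking; the only extra step shown here is pinning a protocol-version floor:

```python
import ssl

# A default client context: verifies server certificates and hostnames,
# and rejects known-broken legacy protocol versions.
context = ssl.create_default_context()

# These are the defaults -- the connection fails unless the server
# presents a certificate chaining to a trusted CA for the right host.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname

# Pin a floor on the protocol version (TLS 1.2 or newer).
context.minimum_version = ssl.TLSVersion.TLSv1_2
```

Wrapping a socket with this context (`context.wrap_socket(sock, server_hostname=...)`) then gives the encrypted channel the answer refers to, whether the peer is in your data center or in the cloud.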
Encrypting “at rest” means that the data must be encrypted when it resides on a disk, in a database, on a file system, in storage, and of course if it is backed up. In the real world, people have not always done this in data centers – often relying on physical security as a replacement. In the cloud, physical security is no alternative – you must encrypt sensitive data.
This actually means data must be encrypted constantly as it is being written, and decrypted only when it is going to be used (i.e. just before a specific calculation, and only in memory). Standards such as Advanced Encryption Standard (AES) are commonly used for data encryption at rest.
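The encrypt-on-write, decrypt-on-read pattern can be sketched in a few lines. AES itself is not in Python’s standard library, so this runnable stand-in builds a toy CTR-style stream cipher from HMAC-SHA256 purely to show the data flow; in practice you would use real AES from a vetted library, not this construction:

```python
import hmac, hashlib, secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy CTR-style cipher: HMAC-SHA256(key, nonce||counter) as keystream.
    Symmetric, so the same call both encrypts and decrypts."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        counter = (offset // 32).to_bytes(8, "big")
        ks = hmac.new(key, nonce + counter, hashlib.sha256).digest()
        out.extend(b ^ k for b, k in zip(data[offset:offset + 32], ks))
    return bytes(out)

key = secrets.token_bytes(32)    # kept away from the data store
nonce = secrets.token_bytes(16)  # may be stored beside the ciphertext

plaintext = b"sensitive record"
ciphertext = keystream_xor(key, nonce, plaintext)          # written to disk
assert keystream_xor(key, nonce, ciphertext) == plaintext  # decrypted on read
```

The point of the sketch is the shape, not the cipher: data exists in plaintext only in memory, immediately before use, while everything persisted is ciphertext.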
InfoQ: Does cloud encryption singlehandedly protect data?
Gilad: If data is properly encrypted it is, in a sense, locked and cannot be used if it falls into the wrong hands. Unless, of course, those hands have a key.
Proper management of encryption keys is as important as the encryption itself. In fact, if you keep your encryption keys to yourself – you keep ownership of your data. This is an interesting and fundamental point – in the cloud you are outsourcing your infrastructure, but you can maintain ownership by keeping the encryption keys.
If encryption keys are stored alongside the data, any breach that discloses the data will also disclose the key to access it. If encryption keys are stored with cloud providers, they own your data.
Think of your data like a safe deposit box – would you leave your key with the banker? What if he gets robbed? What if his employees are paid to make copies of your key?
A best practice is split key encryption. With this method, your data is encrypted (e.g. with AES), and then the encryption key is split into parts. One part is managed with a cloud security provider and one part stays only with you. This way, only you control access to your data.
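The split itself can be as simple as XOR-based secret sharing. This sketch (hypothetical function names, Python stdlib only; the interview does not specify Porticor’s actual scheme) shows the essential property: each share alone is uniformly random and reveals nothing about the key, yet the two together recover it exactly:

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two XOR shares; each share alone is pure noise."""
    share_a = secrets.token_bytes(len(key))               # e.g. stays with you
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # e.g. held by provider
    return share_a, share_b

def join_key(share_a: bytes, share_b: bytes) -> bytes:
    """Recombine the two shares to recover the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

master_key = secrets.token_bytes(32)
a, b = split_key(master_key)
assert join_key(a, b) == master_key  # both shares together recover the key
assert a != master_key and b != master_key
```

Because decryption requires both halves, a breach of either party alone yields nothing usable.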
Even if your encrypted data is compromised, the perpetrators will not be able to decrypt it and it will be useless to them.
InfoQ: How can encryption keys be protected while they are in use?
Gilad: Keys in use in the cloud do not have to be vulnerable. They can be protected using homomorphic key management. This cryptographic technique gives the application access to the data store without ever exposing the master encryption keys in an unencrypted state. It also ensures that even if such (encrypted) keys are stolen, they can never be used to access your data store.
InfoQ: Is cloud data encryption in compliance with regulations?
Gilad: Regulations like Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and many others (GLBA, FINRA, PIPEDA, et al) require or encourage cloud data to be properly encrypted and encryption keys to be properly managed. Some of these regulations even provide for a sort of “safe harbor” – that is, if your data is breached, but you can prove that you took the necessary steps to encrypt it and maintain control of the encryption keys, you may save the financial burden, the bureaucratic reporting requirements, and the damage to reputation involved with such an event.
InfoQ: Is cloud security cost-prohibitive and will it harm system performance?
Gilad: The cloud is often chosen for its lower operational overhead, and sometimes for actual dollar savings, compared with traditional data centers. Securing a cloud project does not need to negate the cloud’s ease of use nor make the project prohibitively expensive.
There are security solutions that require no hardware and, therefore, no large cap-ex investment. Pay-as-you-go business models make it easy to scale security up (or down) with the size of your project, as you add (or remove) virtual machines and data.
Performance can also be good. Modern cloud security virtual appliances and virtual agents are optimized for cloud throughput and latency. You’ll be able to dial up performance as your cloud project scales up. To take a concrete example – data encryption – good solutions will include the capability to stream data as it is being encrypted (or decrypted), and to do so inside your cloud account. Such approaches mean that the virtual CPUs available in your cloud can handle your performance needs with low latency.
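The streaming point reduces to chunked processing: transform fixed-size chunks as they arrive instead of buffering the whole object in memory. A generic sketch (the trivial `encrypt_chunk` below is a placeholder standing in for a real cipher call, not a real one):

```python
import io

CHUNK = 64 * 1024  # process 64 KiB at a time: memory use stays constant

def encrypt_chunk(chunk: bytes) -> bytes:
    # Placeholder transform standing in for a real AES call; it is
    # symmetric, so applying it twice returns the original bytes.
    return bytes(b ^ 0x5A for b in chunk)

def stream_encrypt(src, dst) -> None:
    """Read, encrypt, and write chunk by chunk, with no full-file buffering."""
    while chunk := src.read(CHUNK):
        dst.write(encrypt_chunk(chunk))

src = io.BytesIO(b"x" * 200_000)  # stands in for a 200 KB object
dst = io.BytesIO()
stream_encrypt(src, dst)
assert len(dst.getvalue()) == 200_000
```

Because each chunk is encrypted and written as soon as it is read, latency stays low and throughput scales with the CPUs available, which is the behavior the answer describes.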
InfoQ: Is there a way to protect cloud backups and disaster recovery?
Gilad: Data must be secured throughout its lifecycle. Properly encrypting data while it is in use, but then offering hackers unencrypted replicas as backups defeats the purpose of encrypting in the first place. You must encrypt and own the encryption keys for every point of the lifecycle of your information. Fortunately solutions that are built for the cloud do exist, and they should cover backups as well as primary copies.
InfoQ: Which is more secure: a public cloud or a private cloud?
Gilad: Public and private clouds each have pros and cons in terms of ownership, control, cost, convenience and multi-tenancy. We have found that private clouds often require security controls similar to public ones. Use cases may involve users external to your company; or large “virtual” deployments with multiple internal projects, each with a need for strong security segregation. Your data can be properly encrypted, your keys can be properly managed, and you can be safe in all the major cloud scenarios: private, public, or hybrid.
InfoQ: If my data is in the cloud, my security is in the cloud, and my backup is in the cloud, what do I control?
Gilad: If you use encryption properly and maintain control of the encryption keys, you have replaced your physical walls with mathematical walls. You will own your data. Even though you do not control the physical resources, you maintain control of what they contain. This is one reason why encryption in the cloud is the best practice.
By properly using multiple regions or even multiple cloud providers, you can also ensure that you always have availability and access to your project and your data.
By combining such techniques, you do take back control. As CIO and owner of your data, you must always control your data – from beginning to end. Your control does not need to be sacrificed when you migrate to the cloud, though it may need to be managed differently.
The post Answering Common Cloud Security Questions from CIOs appeared first on Porticor Cloud Security.