
How to Make Public and Private Clouds Secure

What controls can you put in place to restrict data loss?

With Gartner predicting $150 billion in cloud-related revenues by 2013, the march toward "the cloud" is not abating. As the market grows, "Is the cloud secure?" is a familiar refrain from corporate management. While those in IT will tell you no environment is completely secure, there are measures you can take when moving to the cloud to mitigate security risks to reasonable levels. Transitioning to the cloud can often be more secure than a traditional in-house solution. Cloud providers have collectively invested billions in implementing standards and procedures to safeguard data. They must compete not only on price but also on the breadth of their security, driving innovation that benefits the customer.

In a public cloud environment the end user has a solution that is highly automated. Customers can put their applications in the cloud and control all of the individual attributes of their services. If you develop products and services in a testing or development environment, the high level of scalability offered by an on-demand computing solution makes it easy to clone server images and make ad-hoc changes to your infrastructure.

The public cloud, of course, lacks the visibility and control of a private model. Choosing the public cloud also means giving up a measure of control over where the processing takes place. With a single-tenant private cloud, you have more specialized control with fewer people sharing resources. Each environment poses security challenges that can be managed by following standards and choosing the right partners.

Ensuring Security
A sophisticated identity management system is crucial for protection against password hacking. Instituting and following a real password management system with true randomization is essential for data security. While it seems like a relic from the 1990s, it is shocking how many IT staff and administrators still use "123456" or "admin" as a password, a practice that should be ruthlessly weeded out. Consider using a secure password management service that protects user ID and password data and can flag users who repeat passwords across systems. Using LDAP controls to centrally administer credentials keeps access information from being scattered around. Additional controls, such as running scripts to remove access when employees leave the organization, are also recommended for identity management security.
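To illustrate, here is a minimal Python sketch of the kind of checks such a password management service might run. The generate_password and flag_weak helpers and the weak-password list are illustrative assumptions, not a description of any particular product.

```python
import secrets
import string

# A tiny sample of known-weak passwords; a real service would use a large corpus.
WEAK_PASSWORDS = {"123456", "admin", "password", "qwerty"}

def generate_password(length: int = 16) -> str:
    """Generate a cryptographically random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def flag_weak(credentials: dict) -> list:
    """Return the user IDs whose current password appears on the weak list."""
    return [user for user, pw in credentials.items() if pw in WEAK_PASSWORDS]

creds = {"alice": generate_password(), "bob": "123456"}
print(flag_weak(creds))  # -> ['bob']
```

Note the use of the secrets module rather than random: secrets draws from the operating system's cryptographically secure source, which is what "true randomization" requires.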

Once your internal processes such as identity management are implemented and followed, turn your attention to the best practices of your various outsourcers. You may be working with several different SaaS providers. Do they follow your preferred procedures for identity management security? If possible, centralizing these practices under your review can provide an added measure of security. When choosing a solution provider, ask not only about identity management practices but also about hiring and background-check procedures for administrators and about how access to data is controlled.

Over time, as cloud technology evolves, providers are standardizing policies that dictate where data physically resides. You might see user-defined policies with restrictions on crossing certain state or country boundaries as companies become increasingly globalized.
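As a sketch of what such a user-defined residency policy might look like, the following Python fragment models an allowed-regions check. The region names and the ResidencyPolicy class are hypothetical, not any provider's API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ResidencyPolicy:
    """A user-defined restriction on where data may physically reside."""
    allowed_regions: frozenset

    def permits(self, region: str) -> bool:
        return region in self.allowed_regions

def place_data(region: str, policy: ResidencyPolicy) -> str:
    """Refuse any placement that would cross the policy's boundaries."""
    if not policy.permits(region):
        raise ValueError(f"placement in {region} violates residency policy")
    return f"data stored in {region}"

# Example: data may not leave the United States.
policy = ResidencyPolicy(allowed_regions=frozenset({"us-east", "us-west"}))
print(place_data("us-east", policy))  # -> data stored in us-east
```

The point of enforcing the policy at placement time, rather than auditing after the fact, is that data that never crosses a border never creates a jurisdiction problem.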

Specifically in public environments, your data typically resides in close proximity to other customers' data. While providers encrypt this data, it is still important to look at how it is segregated. Ask your solution provider to follow encryption best practices so your data is both safe and usable. When your data is no longer needed, the cloud provider should take verifiable steps to delete it. In addition, you want a provider that offers managed services, including firewalls and up-to-date intrusion detection systems.
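One common segregation technique is to encrypt each tenant's data under its own key, so that destroying the key renders the data unrecoverable (sometimes called crypto-shredding). The Python sketch below illustrates the idea with a toy SHA-256 counter-mode keystream; it is not production cryptography (use a vetted library for real workloads), and the tenant_keys structure is hypothetical.

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream: SHA-256 over key+counter, for illustration only."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    """XOR data against a keystream; applying it twice recovers the original."""
    return bytes(a ^ b for a, b in zip(data, ks))

# Each tenant's data is encrypted under its own key.
tenant_keys = {"tenant-a": secrets.token_bytes(32)}
record = b"customer record"
ciphertext = xor(record, keystream(tenant_keys["tenant-a"], len(record)))

# Crypto-shredding: once the key is destroyed, the ciphertext is unrecoverable,
# even if backup copies of it linger on shared storage.
del tenant_keys["tenant-a"]
```

This is one reason to ask a provider how per-tenant keys are managed: deletion guarantees are only as strong as the key-destruction procedure behind them.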

Another important consideration is the legal ramifications and jurisdiction that cover data standards. While your data containing PII (Personally Identifiable Information) might be considered "secure" in one country, it may fall under different regulations in another. European governments often have very strict rules for privacy protections compared to other countries. When choosing a cloud solution provider, you need to make sure your data can be quickly relocated in case your service agreement ends. Knowing the physical location of your data is important for such recovery efforts.

Availability and uptime are of course important for end customer satisfaction, but as a client, you need guarantees for data availability. Will internal IT staff have consistent access to company data to perform their daily job functions? Defined service-level agreements should detail availability and include penalty clauses if the agreement's terms are not upheld.
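When evaluating such clauses, it helps to translate an uptime percentage into a concrete downtime budget. A short Python sketch (the function name is ours, not standard SLA terminology):

```python
def downtime_budget_hours(uptime_pct: float, days: float = 365.0) -> float:
    """Hours of allowed downtime per period implied by an uptime guarantee."""
    return (1 - uptime_pct / 100) * days * 24

for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% uptime -> {downtime_budget_hours(sla):.2f} h/year")
# 99.9% uptime -> 8.76 h/year
```

The gap between "three nines" (under nine hours a year) and "four nines" (under an hour) is exactly the kind of difference a penalty clause should be written around.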

According to Gartner research, 40 or more states have formal regulations governing how companies must protect PII. Beyond the traditional focus on credit card, bank account, and other finance-centric information, there are also concerns around Social Security numbers and any other data that warrants privacy restrictions. The implication is that customers should choose an established cloud solution provider that places system controls on the movement of PII within its cloud network. If sensitive data ends up on servers outside the United States, it can create serious legal issues. Beyond PII, many companies risk exposing their own intellectual property or other trade secrets.

Companies need their cloud providers to implement and follow strict controls that are routinely checked by independent auditors. Auditors validate reporting and make sure procedures are in place to protect PII and other data. By performing thorough reviews of physical and logical access controls, auditors can proactively alert companies to security holes before there is a data breach. They can flag background checks that were skipped or not completed properly. Backup procedures for customer data are also intensely scrutinized: Who has access to backup data? Does more than one organization touch it?

As companies rely on more and more SaaS solutions to handle business needs, standards such as SAS 70 are becoming prevalent across industries. As a flexible auditing standard that can be tailored to the needs of SaaS providers, SAS 70 has become a de facto standard in the space. While it is a little disingenuous for companies to dictate their own control objectives to the auditing firm, those that take the audit seriously can proactively find and fix minor issues before they become massive problems.

Choosing the Right Vendor and Managing Outsourcers
The barriers to entry for cloud solution providers are quite low. Less-established players might not be as fastidious about where your data travels or who has access to analyze it. You can't choose on cost alone if the tradeoff is weak security oversight or a greater risk of the company going under.

You need to ask potential solution providers a lot of questions, digging beneath their standard marketing literature. What about business continuity? Is there a documented process for this? If one of their data centers is destroyed, what does that mean for your business? Do they only have one location? If so, you need to explore their backup and disaster recovery procedures, as well as the security risks of those procedures. Another important consideration is the company's actions after a security breach. Do you trust them to tell you security has been compromised so you can take steps to mitigate damage?

Negotiating with the provider can afford extra levels of protection. Strengthened layers of encryption and set standards of data storage can be put in the contract as a safeguard.

You also need to look beyond the cloud provider to any other SaaS provider you use, whether a CRM solution or any other kind. A complete cloud solution and other business processes are often enabled by a chain of outsourcers. Customers that manage very sensitive data should request independent security audits of those outsourcers, for instance any hosting companies used by the cloud provider.

Nightmare scenarios develop when an outsourcer at the fourth degree of separation exposes confidential information. You need to review the data standards of all of these outsourcers and reserve the right to bar unreliable ones from any contact with your data. All of these SaaS companies have an obligation to enforce and monitor where customer data goes and how it is accessed.

Outsourcers should follow defined password-assignment standards that decrease the likelihood of password hijacking. With multi-tenant cloud environments the risks are greater, so the vendor needs to demonstrate the controls it has put in place to provide separation between tenants.

Putting It Together
Maintaining optimal security is a two-step process: first, outline data requirements in terms of privacy and user access; and second, vet the right solution provider that can implement both technical and philosophical strategies to minimize risks. With the rate of technological innovation across all sectors, new tools to protect and manage cloud-based data are being researched and developed. As these strategies move beyond development into the implementation stage, cloud providers will have additional weapons to safeguard customer data and ensure security.

More Stories By Lucas Roh

Lucas Roh is CEO and President of Hostway. From Hostway's very beginning, he has been the driving force in the company's dedication to reliability and easy-to-use services through an ongoing commitment to emerging technology. Since starting the company in 1998, he has charted Hostway's growth to achieve an international presence, rank as one of the top-five Web hosting companies globally and remain profitable every quarter since its founding. Recognizing this success, the Chicago Business Hall of Fame named Roh Chicago's Young Business Leader of the Year in 2004.

Previously, Roh was a computer scientist at Argonne National Laboratory, conducting pioneering research in the emerging field of computer technologies integrated into mathematics. For six years, he worked in software and hardware engineering for several companies, including Tektronix and Hewlett-Packard. In the late 1980s, he founded Wavetech Corp, a software company. He has taught at Colorado State University and has authored more than 20 academic papers.

Roh received a Ph.D. in computer science from Colorado State University in 1995 and an undergraduate degree in physics from the University of Chicago in 1988.
