How to Make Public and Private Clouds Secure

What controls can you put in place to restrict data loss?

With Gartner predicting $150 billion in cloud-related revenues by 2013, the march toward "the cloud" is not abating. As the market grows, "Is the cloud secure?" is a familiar refrain from corporate management. While those in IT will tell you that no environment is completely secure, there are measures you can take when moving to the cloud to reduce security risks to reasonable levels. Transitioning to the cloud can often be more secure than a traditional in-house solution: cloud providers have collectively invested billions in standards and procedures to safeguard data, and because they must compete not only on price but on the breadth of their security, that investment drives innovation to the benefit of the customer.

In a public cloud environment the end user has a solution that is highly automated. Customers can put their applications in the cloud and control all of the individual attributes of their services. If you develop products and services in a testing or development environment, the high level of scalability offered by an on-demand computing solution makes it easy to clone server images and make ad-hoc changes to your infrastructure.

The public cloud, of course, lacks the visibility and control of a private model: choosing it means giving up a measure of control over where processing takes place. With a single-tenant private cloud, you have more specialized control and fewer people sharing resources. Each environment poses security challenges that can be managed by following standards and choosing the right partners.

Ensuring Security

A sophisticated identity management system is crucial for protection against password attacks. Instituting and following a real password management system with true randomization is essential for data security. While it seems like a relic from the 1990s, it is shocking how many IT staff and administrators still use "123456" or "admin" as a password, a practice that should be ruthlessly weeded out. Consider a secure password management service that protects user ID and password data and can flag users who repeat passwords across systems. Centralizing credentials under LDAP keeps access information from being scattered around. Additional controls, such as running scripts to remove access when employees leave the organization, are also recommended for identity management security.
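
As a sketch of that kind of identity-management hygiene, the fragment below audits a small credential inventory, flagging blocklisted passwords and passwords reused across systems. The record format, the sample inventory, and the blocklist are illustrative assumptions; a real audit would pull password hashes from an LDAP directory or a password-management service rather than hard-coding them.

```python
import hashlib
from collections import defaultdict

# Hypothetical inventory of (user, system, password_hash) records; in
# practice these would come from LDAP or a password-management service.
credentials = [
    ("alice", "crm", hashlib.sha256(b"s7#kQz!90xLp").hexdigest()),
    ("alice", "erp", hashlib.sha256(b"s7#kQz!90xLp").hexdigest()),
    ("bob",   "crm", hashlib.sha256(b"admin").hexdigest()),
]

# Known-weak passwords that should be weeded out entirely.
BLOCKLIST = {hashlib.sha256(p).hexdigest() for p in (b"123456", b"admin")}

def audit(records):
    """Flag blocklisted passwords and passwords reused across systems."""
    findings = []
    seen = defaultdict(set)  # (user, password_hash) -> systems using it
    for user, system, pw_hash in records:
        if pw_hash in BLOCKLIST:
            findings.append((user, system, "blocklisted password"))
        seen[(user, pw_hash)].add(system)
    for (user, _), systems in seen.items():
        if len(systems) > 1:
            findings.append((user, ",".join(sorted(systems)), "password reused"))
    return findings
```

Comparing hashes rather than plaintext means the audit never needs to see the passwords themselves.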

After your internal processes such as identity management are better implemented and followed, you should turn your attention to the best practices of your various outsourcers. You may be at the point where you are working with several different SaaS providers. Do they follow your preferred procedures for identity management security? If possible, the centralization of these practices under your review can provide an added measure of security. Also, when choosing a solution provider, you should ask not only about their identity management practices, but also hiring and background check procedures for their administrators and about how access to data is controlled.

Over time, as cloud technology evolves, providers are standardizing policies that dictate where data physically resides. You might see user-defined policies with restrictions on crossing certain state or country boundaries as companies become increasingly globalized.
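
A user-defined residency policy of that kind can be sketched in a few lines. The region names and the inventory format below are hypothetical placeholders for whatever a given provider's inventory API actually returns:

```python
# A minimal sketch of a data-residency policy check, assuming the
# provider can report the region hosting each stored object.
ALLOWED_REGIONS = {"us-east", "us-west"}  # e.g. PII must stay in the US

def placement_violations(objects):
    """Return objects stored outside the regions the policy permits.

    `objects` is an iterable of (object_id, region) pairs, as a
    provider's inventory API might return them.
    """
    return [(oid, region) for oid, region in objects
            if region not in ALLOWED_REGIONS]

inventory = [("customer-db-backup", "us-east"),
             ("invoice-archive", "eu-central")]
# placement_violations(inventory) flags the archive hosted in eu-central.
```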

Specifically in public environments, your data typically resides in close proximity to other customers' data. While providers encrypt this data, it is still important to look at how it is segregated. Ask your solution provider to follow encryption best practices so your data is both safe and usable, and confirm that when your data is no longer needed, the provider takes the right steps to delete it. You also want a provider that offers managed services, including firewalls and up-to-date intrusion detection systems.
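
One common pattern that addresses both segregation and deletion is per-tenant encryption keys: each tenant's data is unreadable under any other tenant's key, and destroying a key renders the ciphertext unrecoverable ("crypto-shredding") when the data is no longer needed. The sketch below uses Fernet from the third-party `cryptography` package and an in-memory dict as the key store; a real provider would hold keys in an HSM or a managed KMS, so treat this purely as an illustration of the idea.

```python
from cryptography.fernet import Fernet

# Per-tenant keys: segregation comes from each tenant having its own
# key; deletion comes from destroying that key.
keys = {"tenant-a": Fernet.generate_key(), "tenant-b": Fernet.generate_key()}

def encrypt_for(tenant, plaintext):
    """Encrypt bytes under the named tenant's key."""
    return Fernet(keys[tenant]).encrypt(plaintext)

def decrypt_for(tenant, ciphertext):
    """Decrypt bytes; fails for any other tenant's ciphertext."""
    return Fernet(keys[tenant]).decrypt(ciphertext)

def shred(tenant):
    """Delete the tenant's key; existing ciphertext becomes unrecoverable."""
    del keys[tenant]
```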

Another important consideration is the legal ramifications and jurisdiction that cover data standards. While your data containing PII (Personally Identifiable Information) might be considered "secure" in one country, it may fall under different regulations in another. European governments often have very strict rules for privacy protections compared to other countries. When choosing a cloud solution provider, you need to make sure your data can be quickly relocated in case your service agreement ends. Knowing the physical location of your data is important for such recovery efforts.

Availability and uptime are of course important for end customer satisfaction, but as a client, you need guarantees for data availability. Will internal IT staff have consistent access to company data to perform their daily job functions? Defined service-level agreements should detail availability and include penalty clauses if the agreement's terms are not upheld.
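
The arithmetic behind an availability clause is worth making concrete before you sign: this small sketch converts an SLA percentage into the downtime budget it implies over a 30-day month.

```python
def allowed_downtime_minutes(availability_pct, days=30):
    """Monthly downtime budget implied by an availability percentage."""
    total_minutes = days * 24 * 60  # 43,200 minutes in a 30-day month
    return total_minutes * (1 - availability_pct / 100)

# 99.9% ("three nines") permits about 43.2 minutes of downtime per
# 30-day month; 99.99% permits only about 4.3 minutes.
```

The gap between "three nines" and "four nines" is the kind of detail a penalty clause should hinge on.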

According to Gartner research, 40 or more states have formal regulations in place governing how companies can protect PII. In addition to the traditional focus on credit card, bank account, and other finance-centric information, there are also concerns around Social Security numbers and any other type of data that warrants privacy restrictions. The implications are that a customer should choose an established cloud solution provider that places system controls on the movement of PII within their cloud network. If sensitive data ends up on servers outside of the United States, it can create serious legal issues. Beyond PII, many companies run the risk of exposing their own intellectual property or other trade secrets.

Companies need their cloud providers to implement and follow strict controls that are routinely checked by independent auditors. Auditors validate reporting to make sure procedures are in place to protect PII and other data. By performing thorough reviews of physical and logical access controls, auditors can proactively alert companies to security holes before there is a data breach. They can also flag background checks that were skipped or completed improperly. Backup procedures for customer data are intensely scrutinized as well: Who has access to backup data? Does more than one organization touch it?

As companies adopt more SaaS solutions to handle business needs, standards such as SAS 70 are becoming prevalent across multiple industries. As a flexible auditing standard that can be tailored to the custom needs of SaaS providers, SAS 70 has become a de facto standard in the space. While it is a little disingenuous for companies to dictate their own control objectives to the auditing firm, those that take the audit seriously can proactively find and fix minor issues before they become massive problems.

Choosing the Right Vendor and Managing Outsourcers
The barriers to entry for cloud solution providers are quite low. Less-established players might not be as fastidious about where your data travels or who has access to analyze it. You can't choose on cost alone if the tradeoff is weak security oversight or the broader risk of the provider going under.

You need to ask potential solution providers a lot of questions, digging beneath their standard marketing literature. What about business continuity? Is there a documented process for this? If one of their data centers is destroyed, what does that mean for your business? Do they only have one location? If so, you need to explore their backup and disaster recovery procedures, as well as the security risks of those procedures. Another important consideration is the company's actions after a security breach. Do you trust them to tell you security has been compromised so you can take steps to mitigate damage?

Negotiating with the provider can afford extra levels of protection. Strengthened layers of encryption and set standards of data storage can be put in the contract as a safeguard.

You also need to look beyond the cloud provider to any other SaaS-type provider, whether a CRM solution or any other kind. A complete cloud solution and the business processes around it are often enabled by a chain of outsourcers. Customers that manage very sensitive data should request independent security audits of those outsourcers, for instance any hosting companies used by the cloud provider.

Nightmare scenarios develop when an outsourcer in the fourth degree of separation exposes confidential information. You need to properly review the data standards for all of these outsourcers and have the right to refuse certain unreliable outsourcers from having any contact with your data. All of these SaaS companies have an obligation to enforce and monitor where customer data goes and how it is accessed.

Outsourcers should follow defined password-assignment standards that decrease the likelihood of password hijacking. Multi-tenant cloud environments carry greater risks, so the vendor needs to demonstrate the controls it has put in place to provide separation between tenants.
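
A password-assignment standard with true randomization can be as simple as the sketch below, which uses Python's `secrets` module (designed for cryptographic use, unlike `random`). The alphabet and default length are illustrative choices, not a prescription:

```python
import secrets
import string

# Character set for vendor-assigned initial passwords (illustrative).
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def assign_password(length=16):
    """Generate a random initial password for a newly provisioned account."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```

Pair this with a forced change on first login so the assigned value never outlives provisioning.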

Putting It Together
Maintaining optimal security is a two-step process: first, outline data requirements in terms of privacy and user access; and second, vet the right solution provider that can implement both technical and philosophical strategies to minimize risks. With the rate of technological innovation across all sectors, new tools to protect and manage cloud-based data are being researched and developed. As these strategies move beyond development into the implementation stage, cloud providers will have additional weapons to safeguard customer data and ensure security.

More Stories By Lucas Roh

Lucas Roh is CEO and President of Hostway. From Hostway's very beginning, he has been the driving force in the company's dedication to reliability and easy-to-use services through an ongoing commitment to emerging technology. Since starting the company in 1998, he has charted Hostway's growth to achieve an international presence, rank as one of the top-five Web hosting companies globally and remain profitable every quarter since its founding. Recognizing this success, the Chicago Business Hall of Fame named Roh Chicago's Young Business Leader of the Year in 2004.

Previously, Roh was a computer scientist at Argonne National Laboratory, conducting pioneering research in the emerging field of computer technologies integrated into mathematics. For six years, he worked in software and hardware engineering for several companies, including Tektronix and Hewlett-Packard. In the late 1980s, he founded Wavetech Corp, a software company. He has taught at Colorado State University and has authored more than 20 academic papers.

Roh received a Ph.D. in computer science from Colorado State University in 1995 and an undergraduate degree in physics from the University of Chicago in 1988.
