
The Cloud Challenge: Security

It's the end-user’s duty to understand what processes and methodologies the cloud vendor is using


Safeguarding a cloud infrastructure from unmonitored access, malware and intruder attacks grows more challenging for service providers as their operations evolve. And as a cloud infrastructure grows, so too does the presence of unsecured privileged identities – those so-called super-user accounts that hold elevated permission to access sensitive data, run programs, and change configuration settings on virtually every IT component. Privileged identities exist on all physical and virtual operating systems, on network devices such as routers, switches, and firewalls, and in programs and services including databases, line-of-business applications, Web services, middleware, VM hypervisors and more.
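Finding these accounts is the first step in securing them. As a minimal illustration (the sample data and the UID-0 heuristic below are assumptions for demonstration; a real discovery pass would query each operating system, device and application's own account store), here is a sketch that flags superuser entries in Unix passwd-format records:

```python
def find_superuser_accounts(passwd_lines):
    """Return account names whose UID is 0, i.e. full root privileges."""
    privileged = []
    for line in passwd_lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        fields = line.split(":")  # name:passwd:UID:GID:gecos:home:shell
        if len(fields) >= 3 and fields[2] == "0":
            privileged.append(fields[0])
    return privileged

# Hypothetical sample records; a second UID-0 account is a classic red flag.
sample = [
    "root:x:0:0:root:/root:/bin/bash",
    "daemon:x:1:1:daemon:/usr/sbin:/usr/sbin/nologin",
    "toor:x:0:0:legacy:/root:/bin/sh",
]
print(find_superuser_accounts(sample))  # ['root', 'toor']
```

The same idea generalizes beyond operating systems: any database role, middleware service account or hypervisor login with elevated permissions belongs in the inventory.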

Left unsecured, privileged accounts leave an organization vulnerable to IT staff members who have unmonitored access to sensitive customer data and can change configuration settings on critical components of the infrastructure through anonymous, unaudited access. Unsecured privileged accounts can also lead to financial loss from failed regulatory audits against standards such as PCI DSS, HIPAA and SOX that require privileged identity controls.

One of the largest challenges for consumers of cloud services is attaining transparency into how a public cloud provider is securing its infrastructure. For example, how are identities being managed and secured? Many cloud providers won’t give their customers much more of an answer than a SAS 70 certification. How can we trust in the cloud if the vendors of cloud-based infrastructures neglect to implement both the process and technology to ensure that segregation of duties is enforced and that customer and vendor identities are secured?

The Cloud Vendor’s Challenge: Accountability
Cloud computing has the potential to transform business technology, but it brings security issues that IT organizations should consider before trusting their sensitive data to the cloud. These issues should cause security experts and auditors to rethink many fundamental assumptions about Privileged Identity Management in terms of who is responsible for managing these powerful privileged accounts, how they manage them, and who exactly is in control.

Historically, IT data centers have always been in secured physical locations. Now, with cloud computing, those locations are no longer maintained directly by the IT organization. So the questions are these: how do you get accountability for management of physical assets that are no longer under your physical control, and exactly what control mechanisms are in place? Can you trust your cloud vendor to secure your most sensitive data? Moreover, if there’s a security breach in the cloud, who is to blame? Is it the cloud vendor that disclaims all legal liability in its contract, or an enterprise that relinquishes control of its sensitive data in the first place?

Cloud computing promises to make IT more efficient and deliver more consistent service levels. However, there’s a paradox: when it comes to security (and control over privileged identities in particular), cloud services are often among the least efficient. Many cloud service providers’ processes – based on ad-hoc techniques such as scripted password changes – are slow, expensive and unreliable. And that’s dangerous.
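To make the contrast concrete, here is a sketch of the minimum a managed rotation should do that ad-hoc scripts typically skip: cryptographically strong password generation, failure before audit (nothing is recorded as done unless the change succeeded), and an audit entry for every rotation. The `update_target` callable and the in-memory audit trail are hypothetical stand-ins for whatever mechanism actually changes the password (SSH, WinRM, a database `ALTER USER`, and so on):

```python
import secrets
import string
from datetime import datetime, timezone

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def rotate_password(account, update_target, audit_log, length=24):
    """Rotate one privileged credential: generate, push, then audit."""
    new_password = "".join(secrets.choice(ALPHABET) for _ in range(length))
    # If the target update raises, we never log the rotation as complete.
    update_target(account, new_password)
    audit_log.append({
        "account": account,
        "rotated_at": datetime.now(timezone.utc).isoformat(),
    })
    return new_password

# Hypothetical target system and audit trail, for demonstration only.
store = {}
log = []
rotate_password("db-admin", lambda acct, pw: store.update({acct: pw}), log)
print(len(store["db-admin"]), log[0]["account"])  # 24 db-admin
```

Even this toy version shows why one-off scripts fall short at scale: every target type needs its own `update_target`, and without the audit step a rotation is invisible to compliance reporting.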

Fortunately the industry is starting to move beyond paralyzing discussions about the security and compliance problems that arise from cloud computing to address them head on. One example is the Trusted Cloud Initiative, which was launched at RSA Conference 2010. The goal of the initiative is “to help cloud providers develop industry-recommended, secure and interoperable identity, access and compliance management configurations, and practices.” However, only time will tell whether it will help standardize cloud computing or turn out to be a technology certification of little use.

Several major cloud vendors and ISPs have begun the task of integrating security solutions capable of managing the large number of privileged identities that span their infrastructure (hardware, VM hosts, VM image operating systems, application stacks). This breaks the traditional model of the IT organization being in sole control of security, and has started to blur the line between vendor and customer when it comes to managing it.

Today, some privileged identity management frameworks are capable of managing “from iron to application,” giving cloud customers a full measure of control over credentials used in each physical and virtual layer of the stack and the potential to gain full visibility into who has access. In contrast, scripts and other ad-hoc methods to manage privileged identities can no longer keep pace or meet regulatory requirements in fast-changing and highly virtualized cloud computing environments.
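As a rough sketch of the check-out/check-in workflow such frameworks provide (this is a toy model, not any vendor’s actual API; real products add encryption at rest, approval workflows and per-layer connectors), the key properties are that every disclosure is audited and that a credential is rotated the moment it is returned, so a password seen by a human never stays valid:

```python
import secrets
from datetime import datetime, timedelta, timezone

class CredentialVault:
    """Toy model of privileged-credential check-out and check-in."""

    def __init__(self):
        self._secrets = {}    # account -> current password
        self._checkouts = {}  # account -> (user, expiry)
        self.audit = []

    def store(self, account, password):
        self._secrets[account] = password

    def checkout(self, account, user, ttl_minutes=30):
        expiry = datetime.now(timezone.utc) + timedelta(minutes=ttl_minutes)
        self._checkouts[account] = (user, expiry)
        self.audit.append(("checkout", account, user))
        return self._secrets[account]

    def checkin(self, account, user):
        self._checkouts.pop(account, None)
        # Rotate on check-in: the disclosed password is no longer valid.
        self._secrets[account] = secrets.token_urlsafe(24)
        self.audit.append(("checkin", account, user))

vault = CredentialVault()
vault.store("hypervisor-root", "initial-password")
pw = vault.checkout("hypervisor-root", "alice")
vault.checkin("hypervisor-root", "alice")
print(pw != vault._secrets["hypervisor-root"])  # True: old password dead
print([entry[0] for entry in vault.audit])      # ['checkout', 'checkin']
```

“From iron to application” simply means the vault holds connectors for every layer – iLO/DRAC cards, hypervisors, guest operating systems, databases and line-of-business applications – behind this one audited workflow.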

In addition, cloud vendors must move to become identity providers of authentication services, multi-tenancy control, and X.509 certificate issuance for applications, end-points, users, and encrypted sessions. It is inappropriate for cloud vendors to expect their customers to use disconnected and third party providers of certificate services for what should be an inherent and integrated feature of every cloud vendor’s offering.

The End User’s Challenge: Transparency
In my opinion, the cloud is a really good, compelling idea. It can reduce the cost of IT dramatically. Given that cloud computing is available, the idea of building new data centers these days seems like a last-century way of doing things. And since many organizations lack the appropriate personnel to manage the IT resources they have, they’re willing to forego seeing and touching their own systems in their secured data centers – and the corresponding feeling of control – and have turned to outsourcing. Cloud computing is essentially the next generation of outsourcing, so we’re not only reducing manpower but also getting rid of our hard assets entirely. By moving these services to data centers anywhere on the planet we’re offered the potential for service delivery that costs far less than the alternatives. And the idea of outsourcing security and liability is extraordinarily compelling.

However, enterprises should ask the right questions of their cloud providers before taking the leap into the cloud and blindly assuming that their data is safe there. You should ask your cloud service provider to meet every point of compliance that your IT organization is required to meet, and should ask your cloud service provider every question that your IT auditors ask you.

Auditors, too, share a responsibility to verify that client organizations are able to track the usage and control of their data and resources inside the cloud. In keeping with major regulatory mandates, auditors are obligated to confirm segregation of duties and the enforcement of “need to know” and “need to access” policies. And, potential cloud customers should ask what provisions have been made to provide the required trail of access to the user’s auditors on demand – and what provisions are in place to allow the sharing of privileged control between cloud vendor and user for appropriate reporting and verification.
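One way to make such an access trail verifiable rather than merely asserted is to hash-chain it, so that any after-the-fact edit to a record breaks every subsequent hash. The sketch below is illustrative (a production system would also sign records and anchor the chain externally), but it shows the property auditors care about – tamper evidence:

```python
import hashlib
import json

GENESIS = "0" * 64

def append_event(chain, event):
    """Append an event, chaining it to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Re-derive every hash; an edited or reordered record fails."""
    prev = GENESIS
    for record in chain:
        body = {"event": record["event"], "prev": record["prev"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if record["prev"] != prev or record["hash"] != expected:
            return False
        prev = record["hash"]
    return True

trail = []
append_event(trail, {"who": "vendor-admin", "did": "read", "what": "customer-db"})
append_event(trail, {"who": "alice", "did": "checkout", "what": "root@host1"})
print(verify(trail))                      # True
trail[0]["event"]["who"] = "nobody"       # tampering...
print(verify(trail))                      # False: detected
```

Handing auditors a chain like this – rather than an editable log file – is the kind of provision customers should be asking their cloud vendor about.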

Because today’s cloud vendors offer little transparency and scant information, don’t be surprised if you don’t like the answers you get. Most cloud vendors would say that for security purposes, it’s on a “need to know” basis and you don’t need to know. Others state that they’re SAS 70 compliant, but the provider itself defines the controls being tested. And because each measure of security adds to cloud vendor costs, it is appropriate for consumers of cloud services to demand to know precisely what measures are in place – and what auditing processes are supported – as part of the service agreement.

Be persistent. What kind of security does the cloud service provider have in place to protect your privileged accounts and most sensitive data? Does it have Privileged Identity Management technology in place? How does it control the privileged accounts used within its cloud infrastructure to manage sensitive systems and data? How does it manage cloud stacks at the physical and application layers? What access will you have to audit records?

Whatever regulatory standards your organization must meet, so too must your cloud vendor. So if you think that by venturing into the cloud you’re saving yourself from regulatory headaches, think again.

Security is the greatest barrier to cloud adoption, and it’s no great surprise that cloud security was a major theme at this year’s RSA Conference. Unfortunately, improvements in cloud security won’t be treated as a priority until a major breach has a significant impact on one or more cloud service vendors and their customers. This needs to change. When it comes to cloud security, it is the end-user’s duty to understand what processes and methodologies the cloud vendor is using to protect the customer’s most sensitive assets.

More Stories By Philip Lieberman

Philip Lieberman is President & CEO of Lieberman Software. You can reach him and learn more about Privileged Identity Management in the cloud by contacting Lieberman Software.


Most Recent Comments
douglas.barbin 03/31/10 06:09:00 PM EDT


Very good article and a comprehensive view of the assurance issues surrounding identity management in the cloud. One clarification (and I can see what you were getting at, so it’s not as if you misconstrued it): SAS 70 is not a self-certification.

First, SAS 70 is not a certification at all, although I agree with you that technology marketers love to issue press releases saying that it is. Second, you are correct that there are no prescriptive standards and that what is being tested are the control activities and objectives set by the provider.

That said, the two do have to interrelate for a CPA to render an unqualified opinion. For instance, if the (high-level) control objective provides reasonable assurance against unauthorized access and the (detailed) control activities tested by the auditor were only paper-based (policies) with no technical preventive or detective controls, the result would likely be a qualified or adverse opinion on that objective if not the broader controls.

The bottom line is that while, yes, the cloud provider dictates what the objectives and activities are, it won’t get an unqualified (sometimes called “clean”) opinion if the controls are not suitably designed and/or fairly presented.

Best Regards,
