By Philip Lieberman
March 29, 2010 05:00 PM EDT
Cloud Security on Ulitzer
Safeguarding a cloud infrastructure from unmonitored access, malware and intruder attacks grows more challenging for service providers as their operations evolve. And as a cloud infrastructure grows, so too does the presence of unsecured privileged identities: the so-called super-user accounts that hold elevated permissions to access sensitive data, run programs, and change configuration settings on virtually every IT component. Privileged identities exist on all physical and virtual operating systems, on network devices such as routers, switches, and firewalls, and in programs and services including databases, line-of-business applications, Web services, middleware, VM hypervisors and more.
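The breadth of that footprint can be made concrete with a simple inventory model. This is only a sketch; the account names and layers below are illustrative, not drawn from any particular environment.

```python
from dataclasses import dataclass

# Hypothetical inventory entries: each record ties an account
# to the layer of the stack it lives on.
@dataclass
class Account:
    name: str
    layer: str       # "os", "network", "database", "hypervisor", ...
    privileged: bool

inventory = [
    Account("root", "os", True),
    Account("enable", "network", True),      # router/switch enable account
    Account("sa", "database", True),         # database sysadmin login
    Account("vpxuser", "hypervisor", True),  # VM host service account
    Account("webapp", "application", False), # ordinary service identity
]

# Group privileged accounts by layer to see how widely they spread.
by_layer: dict[str, list[str]] = {}
for acct in inventory:
    if acct.privileged:
        by_layer.setdefault(acct.layer, []).append(acct.name)

print(by_layer)  # every layer contributes at least one super-user account
```

Even this toy inventory shows why the problem compounds as a cloud grows: each new host, device, or image adds privileged entries at several layers at once.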
Left unsecured, privileged accounts leave an organization vulnerable to IT staff members who have anonymous, unaudited access to sensitive customer data and can change configuration settings on critical components of the infrastructure. Unsecured privileged accounts can also lead to financial loss from failed audits against regulatory standards such as PCI DSS, HIPAA and SOX, all of which require privileged identity controls.
One of the largest challenges for consumers of cloud services is gaining transparency into how a public cloud provider secures its infrastructure. For example, how are identities being managed and secured? Many cloud providers won’t give their customers much more of an answer than a SAS 70 certification. How can we trust the cloud if the vendors of cloud-based infrastructures neglect to implement both the process and the technology to assure that segregation of duties is enforced, and that customer and vendor identities are secured?
The Cloud Vendor’s Challenge: Accountability
Cloud computing has the potential to transform business technology, but it brings security issues that IT organizations should consider before trusting their sensitive data to the cloud. These issues should cause security experts and auditors to rethink many fundamental assumptions about Privileged Identity Management in terms of who is responsible for managing these powerful privileged accounts, how they manage them, and who exactly is in control.
Historically, IT data centers have always been in secured physical locations. Now, with cloud computing, those locations are no longer maintained directly by the IT organization. So the questions are these: how do you get accountability for management of physical assets that are no longer under your physical control, and exactly what control mechanisms are in place? Can you trust your cloud vendor to secure your most sensitive data? Moreover, if there’s a security breach in the cloud, who is to blame? Is it the cloud vendor that disclaims all legal liability in its contract, or an enterprise that relinquishes control of its sensitive data in the first place?
Cloud computing promises to make IT more efficient and deliver more consistent service levels. Yet there’s a paradox: when it comes to security, and control over privileged identities in particular, cloud services are often among the least efficient. Many cloud service providers’ processes, based on ad-hoc techniques like scripting of password changes, are slow, expensive and unreliable. And that’s dangerous.
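The kind of ad-hoc rotation script being criticized here can be sketched in a few lines. The host list and the simulated failure are hypothetical; the point is what such a script does not do: retry, record, or verify.

```python
import secrets

# Hypothetical host list; a real script would target live systems.
hosts = ["web01", "web02", "db01"]

def change_password(host: str, new_password: str) -> bool:
    """Stand-in for a remote call; db01 simulates a transient failure."""
    return host != "db01"

new_password = secrets.token_urlsafe(16)
results = {host: change_password(host, new_password) for host in hosts}

# The failure mode the article warns about: a partial rollout leaves the
# estate with two different passwords and no record of which host has which.
out_of_sync = [h for h, ok in results.items() if not ok]
print("out of sync:", out_of_sync)
```

One unhandled failure and the fleet silently diverges, which is exactly why scripted rotation at cloud scale tends to be slow, expensive and unreliable.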
Fortunately, the industry is starting to move beyond paralyzing discussions about the security and compliance problems that arise from cloud computing and to address them head on. One example is the Trusted Cloud Initiative, launched at the 2010 RSA Conference. The goal of the initiative is “to help cloud providers develop industry-recommended, secure and interoperable identity, access and compliance management configurations, and practices.” However, only time will tell whether it will help standardize cloud computing or turn out to be a technology certification of little use.
Several major cloud vendors and ISPs have begun the task of integrating security solutions capable of managing the large number of privileged identities spread across their infrastructure (hardware, VM hosts, VM image operating systems, application stacks). This shift has broken the fundamental model of IT being in control of security and has started to blur the line between vendor and customer when it comes to the management of security.
Today, some privileged identity management frameworks are capable of managing “from iron to application,” giving cloud customers a full measure of control over credentials used in each physical and virtual layer of the stack and the potential to gain full visibility into who has access. In contrast, scripts and other ad-hoc methods to manage privileged identities can no longer keep pace or meet regulatory requirements in fast-changing and highly virtualized cloud computing environments.
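The checkout-with-audit pattern such frameworks provide can be sketched minimally as follows. The `Vault` class and its method names are invented for illustration and mirror no vendor’s API.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Vault:
    """Toy credential store that records every privileged checkout."""
    secrets: dict = field(default_factory=dict)
    audit_log: list = field(default_factory=list)

    def checkout(self, account: str, requester: str, reason: str) -> str:
        # Every disclosure of a privileged credential leaves a record:
        # who asked, for which account, when, and why.
        self.audit_log.append({
            "ts": time.time(),
            "account": account,
            "requester": requester,
            "reason": reason,
        })
        return self.secrets[account]

vault = Vault(secrets={"root@vmhost7": "s3cr3t"})
vault.checkout("root@vmhost7", "alice", "patch hypervisor")

# Auditors can now answer "who accessed which privileged account, and why?"
for entry in vault.audit_log:
    print(entry["requester"], "->", entry["account"], ":", entry["reason"])
```

The contrast with the scripted approach is the audit trail: access is disclosed per request and attributed to a person, rather than shared anonymously.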
In addition, cloud vendors must move to become identity providers of authentication services, multi-tenancy control, and X.509 certificate issuance for applications, end-points, users, and encrypted sessions. It is inappropriate for cloud vendors to expect their customers to use disconnected, third-party providers of certificate services for what should be an inherent and integrated feature of every cloud vendor’s offering.
The End User’s Challenge: Transparency
In my opinion, the cloud is a compelling idea: it can reduce the cost of IT dramatically. Given that cloud computing is available, building new data centers these days seems like a last-century way of doing things. And since many organizations lack the personnel to manage the IT resources they already have, they’re willing to forgo seeing and touching their own systems in their secured data centers, along with the corresponding feeling of control, and have turned to outsourcing. Cloud computing is essentially the next generation of outsourcing: we’re not only reducing manpower but also shedding our hard assets entirely. By moving these services to data centers anywhere on the planet, we’re offered the potential for service delivery that costs far less than the alternatives. And the idea of outsourcing security and liability is extraordinarily compelling.
However, enterprises should ask the right questions of their cloud providers before taking the leap into the cloud and blindly assuming that their data is safe there. You should ask your cloud service provider to meet every point of compliance that your IT organization is required to meet, and should ask your cloud service provider every question that your IT auditors ask you.
Auditors, too, share a responsibility to verify that client organizations are able to track the usage and control of their data and resources inside the cloud. In keeping with major regulatory mandates, auditors are obligated to confirm segregation of duties and the enforcement of “need to know” and “need to access” policies. And, potential cloud customers should ask what provisions have been made to provide the required trail of access to the user’s auditors on demand – and what provisions are in place to allow the sharing of privileged control between cloud vendor and user for appropriate reporting and verification.
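One way a provider could make such an access trail verifiable on demand is to hash-chain the records so that after-the-fact edits are detectable. This is a sketch of a generic technique, not a description of any vendor’s product.

```python
import hashlib
import json

def append_record(chain: list, record: dict) -> None:
    """Append a record whose hash also covers the previous record's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(record, sort_keys=True) + prev
    chain.append({
        "record": record,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })

def verify(chain: list) -> bool:
    """Recompute every link; any edited record breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        payload = json.dumps(entry["record"], sort_keys=True) + prev
        if hashlib.sha256(payload.encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

trail = []
append_record(trail, {"who": "vendor-admin", "what": "read", "object": "customer-db"})
append_record(trail, {"who": "alice", "what": "config-change", "object": "firewall"})
assert verify(trail)

trail[0]["record"]["who"] = "someone-else"  # tampering is detected
assert not verify(trail)
```

A chain like this lets the customer’s auditors check the integrity of the shared trail themselves, rather than taking the provider’s word for it.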
Because today’s cloud vendors offer little transparency and even less information, don’t be surprised if you don’t like the answers you get. Most cloud vendors will say that, for security purposes, it’s on a “need to know” basis and you don’t need to know. Others state that they’re SAS 70 compliant, but that’s really just a self-certification. And because each measure of security adds to the cloud vendor’s costs, it is appropriate for consumers of cloud services to demand to know precisely what measures are in place, and what auditing processes are supported, as part of the service agreement.
Be persistent. What kind of security does the cloud service provider have in place to protect your privileged accounts and most sensitive data? Do they have Privileged Identity Management technology in place? How do they control the privileged accounts used in the cloud infrastructure to manage sensitive systems and data? How do they manage cloud stacks at the physical and application layers? What access will you have to audit records?
Whatever regulatory standards your organization must meet, so too must your cloud vendor. So if you think that by venturing into the cloud you’re saving yourself from regulatory headaches, think again.
Security is the greatest barrier to adoption of the cloud, and it’s no great surprise that cloud security was a major theme at this year’s RSA Conference. Unfortunately, improvements in cloud security won’t be treated as a priority until a major breach has a significant impact on one or more cloud service vendors and their customers. This needs to change. When it comes to cloud security, it is the end user’s duty to understand what processes and methodologies the cloud vendor is using to protect the customer’s most sensitive assets.
douglas.barbin 03/31/10 06:09:00 PM EDT
Very good article and a very comprehensive view of the assurance issues surrounding identity management in the cloud. One clarification (and I could see what you were getting at, so it’s not as if you misconstrued it): SAS 70 is not a self-certification.
First, SAS 70 is not a certification at all although I agree with you that technology marketers love to issue press releases saying that it is. Second, you are correct in that there are no prescriptive standards and that what is being tested are the control activities and objectives set by the provider.
That said, the two do have to interrelate for a CPA to render an unqualified opinion. For instance, if the (high-level) control objective provides reasonable assurance against unauthorized access and the (detailed) control activities tested by the auditor were only paper-based (policies) with no technical preventive or detective controls, the result would likely be a qualified or adverse opinion on that objective if not the broader controls.
The bottom line is that while, yes, the cloud provider dictates what the objectives and activities are, you won’t get an unqualified (what some refer to as a clean) opinion if the controls are not suitably designed and/or fairly presented.