By Philip Lieberman
March 29, 2010 05:00 PM EDT
Cloud Security on Ulitzer
Safeguarding a cloud infrastructure from unmonitored access, malware and intruder attacks grows more challenging for service providers as their operations evolve. And as a cloud infrastructure grows, so too does the presence of unsecured privileged identities – those so-called super-user accounts that hold elevated permission to access sensitive data, run programs, and change configuration settings on virtually every IT component. Privileged identities exist on all physical and virtual operating systems, on network devices such as routers, switches, and firewalls, and in programs and services including databases, line-of-business applications, Web services, middleware, VM hypervisors and more.
Left unsecured, privileged accounts leave an organization vulnerable to IT staff members who have unmonitored access to sensitive customer data and can change configuration settings on critical components of the infrastructure through anonymous, unaudited access. Unsecured privileged accounts can also lead to financial loss from failed audits against regulatory standards such as PCI DSS, HIPAA, and SOX, all of which require privileged identity controls.
One of the largest challenges for consumers of cloud services is attaining transparency into how a public cloud provider secures its infrastructure. For example, how are identities being managed and secured? Many cloud providers won’t give their customers much more of an answer than a SAS 70 certification. How can we trust the cloud if vendors of cloud-based infrastructures neglect to implement both the process and the technology to assure that segregation of duties is enforced and that customer and vendor identities are secured?
The Cloud Vendor’s Challenge: Accountability
Cloud computing has the potential to transform business technology, but it brings security issues that IT organizations should consider before trusting their sensitive data to the cloud. These issues should cause security experts and auditors to rethink many fundamental assumptions about Privileged Identity Management in terms of who is responsible for managing these powerful privileged accounts, how they manage them, and who exactly is in control.
Historically, IT data centers have always been in secured physical locations. Now, with cloud computing, those locations are no longer maintained directly by the IT organization. So the questions are these: how do you get accountability for management of physical assets that are no longer under your physical control, and exactly what control mechanisms are in place? Can you trust your cloud vendor to secure your most sensitive data? Moreover, if there’s a security breach in the cloud, who is to blame? Is it the cloud vendor that disclaims all legal liability in its contract, or an enterprise that relinquishes control of its sensitive data in the first place?
Cloud computing promises to make IT more efficient and deliver more consistent service levels. However, there’s a paradox: when it comes to security (and control over privileged identities in particular), cloud services are often among the least efficient. Many cloud service providers’ processes – based on ad-hoc techniques such as scripting password changes – are slow, expensive and unreliable. And that’s dangerous.
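To make the fragility of ad-hoc scripting concrete, here is a minimal sketch of the kind of naive rotation loop such scripts amount to. The host and account names are invented for illustration, and `push_password` stands in for whatever remote mechanism a provider might script against; the point is that a single mid-run failure leaves some systems on the old password with nothing but a best-effort log to say which ones.

```python
import secrets
import string

# Hypothetical inventory of privileged accounts -- names are invented
# for illustration; a real cloud estate would hold thousands of these.
ACCOUNTS = [
    {"host": "vmhost01", "user": "root"},
    {"host": "dbnode02", "user": "sa"},
    {"host": "fw-edge1", "user": "admin"},
]

def new_password(length=16):
    """Generate a random password from letters and digits."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

def rotate_all(accounts, push_password):
    """Naive rotation loop typical of ad-hoc scripts: if one push fails
    mid-run, that system is left on its old password, and only this
    in-memory result dict records the fact."""
    results = {}
    for acct in accounts:
        pwd = new_password()
        try:
            push_password(acct["host"], acct["user"], pwd)
            results[(acct["host"], acct["user"])] = "rotated"
        except Exception:
            results[(acct["host"], acct["user"])] = "FAILED: old password still live"
    return results
```

A purpose-built privileged identity management platform would instead vault every credential, retry failed pushes, and report the exact state of each account; the sketch above shows what is missing when rotation is just a script.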
Fortunately the industry is starting to move beyond paralyzing discussions about the security and compliance problems that arise from cloud computing and to address them head on. One example is the Trusted Cloud Initiative, launched at RSA Conference 2010. The goal of the initiative is “to help cloud providers develop industry-recommended, secure and interoperable identity, access and compliance management configurations, and practices.” However, only time will tell whether it will help standardize cloud computing or turn out to be a technology certification of little use.
Several major cloud vendors and ISPs have begun integrating security solutions capable of managing the large number of privileged identities across their infrastructure (hardware, VM hosts, VM image operating systems, application stacks). This shift has broken the fundamental model of IT being in control of security and has started to blur the lines between vendor and customer when it comes to managing security.
Today, some privileged identity management frameworks are capable of managing “from iron to application,” giving cloud customers a full measure of control over credentials used in each physical and virtual layer of the stack and the potential to gain full visibility into who has access. In contrast, scripts and other ad-hoc methods to manage privileged identities can no longer keep pace or meet regulatory requirements in fast-changing and highly virtualized cloud computing environments.
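The “iron to application” control described above rests on a check-out/check-in pattern: a vault holds the credential for each layer and attributes every disclosure to a named requester. The sketch below is an illustrative assumption of how that pattern works, not any vendor’s actual API; the layer names, targets, and in-memory store are all invented.

```python
from datetime import datetime, timezone

class CredentialVault:
    """Minimal sketch of the check-out pattern used by privileged
    identity management frameworks. Illustrative only: a real product
    would encrypt storage, rotate secrets, and enforce approvals."""

    def __init__(self):
        self._store = {}    # (layer, target, account) -> secret
        self.audit_log = []

    def register(self, layer, target, account, secret):
        self._store[(layer, target, account)] = secret

    def check_out(self, layer, target, account, requester):
        # Every disclosure is attributed to a named person -- the
        # opposite of an anonymous shared password baked into a script.
        secret = self._store[(layer, target, account)]
        self.audit_log.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "who": requester,
            "what": f"{layer}/{target}/{account}",
            "action": "check_out",
        })
        return secret

# Credentials registered at each layer of the stack, from iron up.
vault = CredentialVault()
vault.register("iron", "blade07-ilo", "Administrator", "s3cret-1")
vault.register("hypervisor", "esx-cluster-a", "root", "s3cret-2")
vault.register("application", "billing-db", "sa", "s3cret-3")

# A named administrator checks out the hypervisor root credential.
vault.check_out("hypervisor", "esx-cluster-a", "root", "alice@provider")
```

The payoff is the `audit_log`: it answers “who has access” at every layer, which is exactly the visibility the scripts and ad-hoc methods cannot provide.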
In addition, cloud vendors must move to become identity providers of authentication services, multi-tenancy control, and X.509 certificate issuance for applications, endpoints, users, and encrypted sessions. It is inappropriate for cloud vendors to expect their customers to use disconnected, third-party providers of certificate services for what should be an inherent, integrated feature of every cloud vendor’s offering.
The End User’s Challenge: Transparency
In my opinion, the cloud is a really good, compelling idea. It can reduce the cost of IT dramatically. Given that cloud computing is available, the idea of building new data centers these days seems like a last-century way of doing things. And since many organizations lack the appropriate personnel to manage the IT resources they have, they’re willing to forego seeing and touching their own systems in their secured data centers – and the corresponding feeling of control – and have turned to outsourcing. Cloud computing is essentially the next generation of outsourcing, so we’re not only reducing manpower but also getting rid of our hard assets entirely. By moving these services to data centers anywhere on the planet we’re offered the potential for service delivery that costs far less than the alternatives. And the idea of outsourcing security and liability is extraordinarily compelling.
However, enterprises should ask the right questions of their cloud providers before taking the leap into the cloud and blindly assuming that their data is safe there. You should ask your cloud service provider to meet every point of compliance that your IT organization is required to meet, and should ask your cloud service provider every question that your IT auditors ask you.
Auditors, too, share a responsibility to verify that client organizations are able to track the usage and control of their data and resources inside the cloud. In keeping with major regulatory mandates, auditors are obligated to confirm segregation of duties and the enforcement of “need to know” and “need to access” policies. And, potential cloud customers should ask what provisions have been made to provide the required trail of access to the user’s auditors on demand – and what provisions are in place to allow the sharing of privileged control between cloud vendor and user for appropriate reporting and verification.
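One way a shared access trail can stay trustworthy to both vendor and customer is to make it tamper-evident: each record carries a hash that chains to the previous entry, so neither party can silently rewrite history. The following is a sketch of that idea under invented field names, not a production audit format or any provider’s actual mechanism.

```python
import hashlib
import json

def append_record(log, who, resource, action):
    """Append a privileged-access record whose hash chains to the
    previous entry. Field names are illustrative assumptions."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {"who": who, "resource": resource,
            "action": action, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})
    return log

def verify_chain(log):
    """Recompute every hash in order; any edited or reordered entry
    breaks the chain and the function returns False."""
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev"] != prev:
            return False
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if recomputed != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

Handing auditors the log plus the verification routine gives them the on-demand trail of access discussed above, without requiring them to trust either party’s word that the records are intact.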
Because today’s cloud vendors offer little transparency and scant information, don’t be surprised if you don’t like the answers you get. Most cloud vendors would say that for security purposes, it’s on a “need to know” basis and you don’t need to know. Others state that they’re SAS 70 compliant, but that’s really just a self-certification. And because each measure of security adds to cloud vendor costs, it is appropriate for consumers of cloud services to demand to know precisely what measures are in place – and what auditing processes are supported – as part of the service agreement.
Be persistent. What kind of security does the cloud service provider have in place to protect your privileged accounts and most sensitive data? Do they have Privileged Identity Management technology in place? How do they control the privileged accounts used in the cloud infrastructure to manage sensitive systems and data? How do they manage cloud stacks at the physical and application layers? What access do you have to audit records?
Whatever regulatory standards your organization must meet, so too must your cloud vendor. So if you think that by venturing into the cloud you’re saving yourself from regulatory headaches, think again.
Security is the greatest barrier to cloud adoption, and it’s no great surprise that cloud security was a major theme at this year’s RSA Conference. Unfortunately, improvements in cloud security won’t be seen as a priority until a major breach has a significant impact on one or more cloud service vendors and their customers. This needs to change. When it comes to cloud security, it is the end user’s duty to understand what processes and methodologies the cloud vendor is using to protect the customer’s most sensitive assets.
douglas.barbin 03/31/10 06:09:00 PM EDT
Very good article and a very comprehensive view of the assurance issues surrounding identity management in the cloud. One clarification (and I could see what you were getting at, so it's not as if you misconstrued it): SAS 70 is not a self-certification.
First, SAS 70 is not a certification at all although I agree with you that technology marketers love to issue press releases saying that it is. Second, you are correct in that there are no prescriptive standards and that what is being tested are the control activities and objectives set by the provider.
That said, the two do have to interrelate for a CPA to render an unqualified opinion. For instance, if the (high-level) control objective provides reasonable assurance against unauthorized access and the (detailed) control activities tested by the auditor were only paper-based (policies) with no technical preventive or detective controls, the result would likely be a qualified or adverse opinion on that objective if not the broader controls.
The bottom line is while yes, the cloud provider dictates what the objectives and activities are, you won't get an unqualified (some refer to as clean) opinion if the controls are not suitably designed and/or fairly presented.