By Philip Lieberman
March 29, 2010 05:00 PM EDT
Safeguarding a cloud infrastructure from unmonitored access, malware and intruder attacks grows more challenging for service providers as their operations evolve. And as a cloud infrastructure grows, so too does the presence of unsecured privileged identities – those so-called super-user accounts that hold elevated permission to access sensitive data, run programs, and change configuration settings on virtually every IT component. Privileged identities exist on all physical and virtual operating systems, on network devices such as routers, switches, and firewalls, and in programs and services including databases, line-of-business applications, Web services, middleware, VM hypervisors and more.
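As a concrete illustration of how widespread these accounts are, the short Python sketch below enumerates root-equivalent users and admin-group members on a single Linux host. It is a minimal sketch only: a real inventory would also span the network devices, databases, applications and hypervisors named above, and the group names checked here are assumptions that vary by distribution.

    import grp
    import pwd

    def find_privileged_accounts():
        """Return local account names that are root-equivalent or in an admin group."""
        privileged = set()
        # Any UID 0 account is root-equivalent, whatever it is named.
        for entry in pwd.getpwall():
            if entry.pw_uid == 0:
                privileged.add(entry.pw_name)
        # Membership in these groups typically grants elevated permission;
        # the exact group names are an assumption and differ by distribution.
        for group_name in ("sudo", "wheel", "admin"):
            try:
                privileged.update(grp.getgrnam(group_name).gr_mem)
            except KeyError:
                pass  # group not present on this system
        return sorted(privileged)

    if __name__ == "__main__":
        print(find_privileged_accounts())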
Left unsecured, privileged accounts leave an organization vulnerable to IT staff members who hold anonymous, unaudited access to sensitive customer data and can silently change configuration settings on critical components of the infrastructure. Unsecured privileged accounts can also lead to financial loss from failed audits against PCI-DSS, HIPAA, SOX and other regulatory standards that require privileged identity controls.
One of the largest challenges for consumers of cloud services is attaining transparency into how a public cloud provider secures its infrastructure. For example, how are identities being managed and secured? Many cloud providers won’t give their customers much more of an answer than a SAS 70 certification. How can we trust the cloud if the vendors of cloud-based infrastructures neglect to implement both the process and the technology to assure that segregation of duties is enforced and that customer and vendor identities are secured?
The Cloud Vendor’s Challenge: Accountability
Cloud computing has the potential to transform business technology, but it brings security issues that IT organizations should consider before trusting their sensitive data to the cloud. These issues should cause security experts and auditors to rethink many fundamental assumptions about Privileged Identity Management in terms of who is responsible for managing these powerful privileged accounts, how they manage them, and who exactly is in control.
Historically, IT data centers have always been in secured physical locations. Now, with cloud computing, those locations are no longer maintained directly by the IT organization. So the questions are these: how do you get accountability for management of physical assets that are no longer under your physical control, and exactly what control mechanisms are in place? Can you trust your cloud vendor to secure your most sensitive data? Moreover, if there’s a security breach in the cloud, who is to blame? Is it the cloud vendor that disclaims all legal liability in its contract, or an enterprise that relinquishes control of its sensitive data in the first place?
Cloud computing promises to make IT more efficient and deliver more consistent service levels. There’s a paradox, however: when it comes to security, and control over privileged identities in particular, cloud services are often among the least efficient. Many cloud service providers’ processes – based on ad-hoc techniques like scripting of password changes – are slow, expensive and unreliable. And that’s dangerous.
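To make that concrete, the ad-hoc rotation scripting criticized here often amounts to something like the sketch below. The hostnames and password are obviously hypothetical; what matters is the pattern’s weaknesses, noted in the comments.

    import subprocess

    NEW_PASSWORD = "Spring2010!"        # plaintext secret embedded in the script
    HOSTS = ["web01", "web02", "db01"]  # stale, hand-maintained inventory

    for host in HOSTS:
        try:
            subprocess.run(
                ["ssh", host, "echo 'root:{}' | chpasswd".format(NEW_PASSWORD)],
                check=True, timeout=30,
            )
        except Exception:
            # A failed host is now out of sync with the rest -- and nothing
            # is logged, so no one knows which password it still holds.
            pass

Every weakness in that sketch – the plaintext secret, the silent failures, the missing audit trail – is precisely what purpose-built privileged identity management is meant to eliminate.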
Fortunately, the industry is starting to move beyond paralyzing discussions about the security and compliance problems that arise from cloud computing and to address them head-on. One example is the Trusted Cloud Initiative, launched at RSA Conference 2010. The goal of the initiative is “to help cloud providers develop industry-recommended, secure and interoperable identity, access and compliance management configurations, and practices.” Only time will tell whether it helps standardize cloud computing or turns out to be a technology certification of little use.
Several major cloud vendors and ISPs have begun integrating security solutions capable of managing the large number of privileged identities across their infrastructure (hardware, VM hosts, guest operating systems, application stacks). This shift has broken the fundamental model of IT alone being in control of security, and has started to blur the lines between vendor and customer when it comes to managing it.
Today, some privileged identity management frameworks are capable of managing “from iron to application,” giving cloud customers a full measure of control over credentials used in each physical and virtual layer of the stack and the potential to gain full visibility into who has access. In contrast, scripts and other ad-hoc methods to manage privileged identities can no longer keep pace or meet regulatory requirements in fast-changing and highly virtualized cloud computing environments.
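The core of such a framework is a vault that brokers every privileged credential. The sketch below is a toy illustration of that check-out/check-in workflow, not any vendor’s actual API: every access is attributed and logged, and the credential is rotated the moment it is returned.

    import secrets
    import string
    from datetime import datetime, timezone

    class PasswordVault:
        """Toy in-memory vault: check credentials out, log the access, rotate on return."""

        def __init__(self):
            self._store = {}   # (system, account) -> current password
            self.audit_log = []

        def checkout(self, system, account, requester, reason):
            # Every disclosure of a privileged credential is attributed and logged.
            self.audit_log.append((datetime.now(timezone.utc), requester,
                                   system, account, reason))
            return self._store.setdefault((system, account), self._generate())

        def checkin(self, system, account):
            # Rotating on check-in ensures no one retains a working credential.
            self._store[(system, account)] = self._generate()

        @staticmethod
        def _generate(length=24):
            alphabet = string.ascii_letters + string.digits
            return "".join(secrets.choice(alphabet) for _ in range(length))

    vault = PasswordVault()
    password = vault.checkout("db01", "root", requester="alice", reason="patch window")
    # ... privileged session takes place here ...
    vault.checkin("db01", "root")  # credential rotated; the audit trail remains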
In addition, cloud vendors must move to become identity providers in their own right, offering authentication services, multi-tenancy controls, and X.509 certificate issuance for applications, endpoints, users, and encrypted sessions. It is inappropriate for cloud vendors to expect their customers to rely on disconnected, third-party certificate services for what should be an inherent and integrated feature of every cloud offering.
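As a sketch of what issuance involves, the snippet below builds and signs an X.509 certificate with the widely used Python cryptography package (pip install cryptography). It self-signs for brevity, and the hostname is hypothetical; an integrated cloud offering would sign with the provider’s CA key and handle distribution and revocation as well.

    import datetime
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa

    # Generate a key pair for the subject (a tenant application, say).
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Self-signed for brevity, so subject and issuer are the same name.
    subject = issuer = x509.Name([
        x509.NameAttribute(NameOID.COMMON_NAME, u"tenant-app.example.com"),
    ])

    cert = (
        x509.CertificateBuilder()
        .subject_name(subject)
        .issuer_name(issuer)
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(datetime.datetime.utcnow())
        .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=90))
        .sign(key, hashes.SHA256())
    )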
The End User’s Challenge: Transparency
In my opinion, the cloud is a genuinely compelling idea. It can reduce the cost of IT dramatically. Given that cloud computing is available, building new data centers seems like a last-century way of doing things. And since many organizations lack the personnel to manage the IT resources they have, they’re willing to forgo seeing and touching their own systems in their own secured data centers – and the corresponding feeling of control – and have turned to outsourcing. Cloud computing is essentially the next generation of outsourcing: we’re not only reducing manpower but also getting rid of our hard assets entirely. By moving these services to data centers anywhere on the planet, we gain the potential for service delivery that costs far less than the alternatives. And the idea of outsourcing security and liability along with the hardware is extraordinarily compelling.
However, enterprises should ask the right questions of their cloud providers before taking the leap and blindly assuming their data is safe there. Require your cloud service provider to meet every point of compliance your own IT organization is required to meet, and ask it every question your IT auditors ask you.
Auditors, too, share a responsibility to verify that client organizations can track the usage and control of their data and resources inside the cloud. In keeping with major regulatory mandates, auditors are obligated to confirm segregation of duties and the enforcement of “need to know” and “need to access” policies. Potential cloud customers should therefore ask what provisions exist to produce the required trail of access for the customer’s auditors on demand – and what provisions allow cloud vendor and customer to share privileged control for reporting and verification.
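What that trail should contain is worth specifying up front. The sketch below shows one plausible shape for a privileged-access audit event; the field names are illustrative rather than any standard’s schema, but each one maps to a question an auditor will ask: who acted, in what role, on which system and account, and when.

    import json
    from datetime import datetime, timezone

    def audit_event(actor, role, system, account, action):
        """Build one privileged-access audit record as a JSON string."""
        return json.dumps({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "actor": actor,              # the named individual, never just "root"
            "role": role,                # supports segregation-of-duties review
            "target_system": system,
            "privileged_account": account,
            "action": action,
        })

    print(audit_event("ops1@provider.example", "cloud-operations",
                      "hypervisor-07", "root", "password checkout"))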
Because today’s cloud vendors offer little transparency, don’t be surprised if you don’t like the answers you get. Most cloud vendors will say that, for security purposes, it’s on a “need to know” basis and you don’t need to know. Others state that they’re SAS 70 compliant, but that’s really just a self-certification. And because each measure of security adds to a cloud vendor’s costs, it is appropriate for consumers of cloud services to demand to know precisely what measures are in place – and what auditing processes are supported – as part of the service agreement.
Be persistent. What kind of security does the cloud service provider have in place to protect your privileged accounts and most sensitive data? Does it have Privileged Identity Management technology in place? How does it control the privileged accounts used within the cloud infrastructure to manage sensitive systems and data? How does it manage the cloud stack, from the physical layer up through the application layers? What access will you have to audit records?
Whatever regulatory standards your organization must meet, so too must your cloud vendor. So if you think that by venturing into the cloud you’re saving yourself from regulatory headaches, think again.
Security is the greatest barrier to cloud adoption, and it’s no great surprise that cloud security was a major theme at this year’s RSA Conference. Unfortunately, improvements in cloud security won’t be treated as a priority until a major breach has a significant impact on one or more cloud service vendors and their customers. This needs to change. When it comes to cloud security, it is the end user’s duty to understand what processes and methodologies the cloud vendor uses to protect the customer’s most sensitive assets.
douglas.barbin 03/31/10 06:09:00 PM EDT
Very good article and a very comprehensive view of the assurance issues surrounding identity management in the cloud. One clarification (and I could see what you were getting at, so it’s not as if you misconstrued it): SAS 70 is not a self-certification.
First, SAS 70 is not a certification at all although I agree with you that technology marketers love to issue press releases saying that it is. Second, you are correct in that there are no prescriptive standards and that what is being tested are the control activities and objectives set by the provider.
That said, the two do have to interrelate for a CPA to render an unqualified opinion. For instance, if the (high-level) control objective provides reasonable assurance against unauthorized access and the (detailed) control activities tested by the auditor were only paper-based (policies) with no technical preventive or detective controls, the result would likely be a qualified or adverse opinion on that objective if not the broader controls.
The bottom line is that while, yes, the cloud provider dictates what the objectives and activities are, you won’t get an unqualified (some refer to it as “clean”) opinion if the controls are not suitably designed and/or fairly presented.