
Digital Forensic Challenges within Cloud Computing

Will cloud forensics be effective in managing boundaries of responsibility and access?

Proponents of the cloud ecosystem tout its "vastness, flexibility and scalability" as advantages for the implementation of cloud services. However, from a digital forensics point of view, that same scope and diversity can present a veritable challenge.

According to Dr. Stephen Wolthusen[1] "Digital forensics (also referred to at times as computer forensics) encompasses approaches and techniques for gathering and analyzing traces of human and computer-generated activity in such a way that it is suitable in a court of law."

A key challenge for a digital investigator called to pursue an investigation in which cloud resources form a subset will be to establish and map the computational and storage structures that fall within the realm of the investigation. Bear in mind that for any system (cloud or otherwise) security incidents will cross boundaries of responsibility and access. [2]

There is no one-size-fits-all solution for a digital forensic examination; nevertheless, all forensic evidence must follow the forensic process of Collection - Examination - Analysis - Reporting. In addition, no matter its environment, forensic evidence must:

  • Be relevant to the issue at hand
  • Be authentic
  • Not be unfairly prejudicial
  • Not be hearsay, or if hearsay, be able to meet the requirements for an exception
  • Be the original or a duplicate of the evidence, or be able to meet an exception to that rule (see the hashing sketch below).
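
The authenticity and "original or duplicate" requirements are, in practice, supported by cryptographic hashing of acquired evidence. The following is a minimal Python sketch of that step only; the file name and chunk size are illustrative assumptions, not a prescribed procedure.

```python
# A minimal sketch: compute a SHA-256 digest of an acquired evidence image in
# fixed-size chunks. The file name and chunk size are illustrative assumptions.
import hashlib

def hash_evidence(path, chunk_size=1024 * 1024):
    """Return the SHA-256 hex digest of the file at 'path'."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    # Record the digest at acquisition time and again before analysis;
    # matching values support the authenticity/duplicate requirements above.
    print(hash_evidence("evidence_image.dd"))
```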

Within the cloud computing ecosystem I believe there may be a dilemma in terms of time stamps. A question for cloud vendors would be: with a distributed and "vast" infrastructure, how will they ensure synchronized clocks across all their systems? Synchronized clocks across a distributed global system may not be possible, and if this supposition holds true, what other solution will a cloud vendor provide in such an instance?
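
One partial answer an examiner or vendor can offer is to record each host's clock offset against a common reference at collection time, so that timestamps can later be adjusted or at least bounded. A minimal sketch follows; it assumes the third-party ntplib package and uses pool.ntp.org purely as an example server, and it only measures the offset rather than synchronizing anything.

```python
# A minimal sketch: record the local clock's offset from an NTP server so that
# collected timestamps can later be adjusted or at least bounded.
# Assumes the third-party ntplib package (pip install ntplib) and network access.
import ntplib
from datetime import datetime, timezone

def record_clock_offset(server="pool.ntp.org"):
    client = ntplib.NTPClient()
    response = client.request(server, version=3)
    return {
        "checked_at_utc": datetime.now(timezone.utc).isoformat(),
        "ntp_server": server,
        "offset_seconds": response.offset,      # estimated local clock minus server time
        "round_trip_delay": response.delay,
    }

if __name__ == "__main__":
    print(record_clock_offset())
```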

Another challenge is that of reciprocity. Digital forensics within the cloud computing environment can have legal implications across international jurisdictions, which will require cooperation through established relationships with legal entities in foreign countries and/or the establishment of new ones where possible.

As with any live forensic examination, another challenge will be establishing snapshots of the system in operation. But in this case one can question whether this is good enough for such a "vast" and possibly globally distributed ecosystem.
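
Where the cloud resources in scope are virtual machines on a hypervisor the provider (or examiner) can reach, one building block is a per-VM snapshot taken through the management API. The sketch below uses the libvirt Python bindings as an assumed environment, with a hypothetical domain name; it illustrates only the single-host step and says nothing about coordinating snapshots across a distributed estate, which is precisely the open question.

```python
# A minimal sketch: capture a snapshot of one running VM via libvirt.
# Assumes the libvirt Python bindings (pip install libvirt-python), a reachable
# qemu/KVM host, and an existing domain named "suspect-vm" (hypothetical).
import libvirt

SNAPSHOT_XML = """
<domainsnapshot>
  <name>forensic-snapshot-001</name>
  <description>Snapshot taken for forensic examination</description>
</domainsnapshot>
"""

def snapshot_vm(domain_name="suspect-vm", uri="qemu:///system"):
    conn = libvirt.open(uri)
    try:
        dom = conn.lookupByName(domain_name)
        # flags=0 requests a default (internal) snapshot; disk-only external
        # snapshots would need additional disk elements in the XML.
        snap = dom.snapshotCreateXML(SNAPSHOT_XML, 0)
        return snap.getName()
    finally:
        conn.close()

if __name__ == "__main__":
    print("Created snapshot:", snapshot_vm())
```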

Take the instance of malware injected into the kernel space of a system; it may be programmed to modify data or functionality, or both, in a variety of ways upon detection of a probe, or simply set to shut down, obfuscate evidence, or delete pertinent data residues within a set time frame. Can a forensic examiner be notified of this change, or, more pertinently, can a cloud service provider implement protocols, tools or processes to ensure that such an event can be mitigated in real time? Most likely not, at least for now.

However, a solution of sorts to this dilemma can be gleaned from the thesis suggested in a paper by Wolthusen [1], which states: "Data may be present or available in a given configuration for a limited time or be staged through different levels of storage hierarchies; it is hence important to place bounds on events in question so as to be able to capture events of interest completely."
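
In practice, "placing bounds on events" often reduces to filtering whatever records could be collected down to a defined window around the incident. A minimal sketch, assuming records have already been normalized to dictionaries with ISO-8601 UTC timestamps (the field names are illustrative):

```python
# A minimal sketch: keep only events that fall inside a bounded window of
# interest, so capture is complete for that window without pulling everything.
from datetime import datetime, timedelta, timezone

def events_in_window(events, incident_time, before=timedelta(hours=1), after=timedelta(hours=1)):
    """Return events whose 'timestamp' (ISO-8601, UTC) falls within the bounded window."""
    start, end = incident_time - before, incident_time + after
    selected = []
    for event in events:
        ts = datetime.fromisoformat(event["timestamp"])
        if start <= ts <= end:
            selected.append(event)
    return selected

if __name__ == "__main__":
    incident = datetime(2012, 6, 1, 12, 0, tzinfo=timezone.utc)
    sample = [
        {"timestamp": "2012-06-01T11:30:00+00:00", "action": "login"},
        {"timestamp": "2012-06-01T15:00:00+00:00", "action": "delete"},
    ]
    print(events_in_window(sample, incident))
```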

In terms of the "vast" distributed environment that can comprise a cloud ecosystem under investigation, we have to be aware that any forensic investigation can cause parallel or unrelated services to be interrupted or completely halted, infringe on third-party rights, cross jurisdictional boundaries and, in the case of duplication, require infeasible storage volumes. [1]

Aspects of Control within Cloud Computing Service Models:

SaaS: Here the cloud user, depending on their contracted services with the cloud vendor, will control only certain configuration parameters, while the cloud vendor maintains control over the application(s) and the underlying infrastructure.

PaaS: Here the cloud vendor controls the cloud infrastructure and runtime environments, while the cloud user controls the application.

IaaS: Although with this offering the cloud user will have control over their servers, including the installed operating systems and applications, the cloud vendor still controls the virtualization infrastructure and at least parts of the network infrastructure.

These aspects will affect how a digital forensic examination is conducted, as every cloud computing environment will have variations. Therefore the methods, tools, protocols, etc. implemented in identifying relevant events that support the detection and analysis of attacks have to be crafted accordingly; the sketch below summarizes the control boundaries involved.
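
As a rough illustration only, the control split above can be expressed as a simple mapping an examiner might use when scoping which evidence the customer can hand over directly and which must be requested from the vendor; the layer names are my own simplification, not a standard taxonomy.

```python
# A rough, illustrative mapping of who controls which layers per service model;
# the layer names are a simplification for scoping an examination, not a standard.
CONTROL_BY_MODEL = {
    "SaaS": {"customer": ["configuration parameters"],
             "vendor":   ["application", "runtime", "virtualization", "network", "hardware"]},
    "PaaS": {"customer": ["application", "configuration parameters"],
             "vendor":   ["runtime", "virtualization", "network", "hardware"]},
    "IaaS": {"customer": ["application", "runtime", "guest operating system"],
             "vendor":   ["virtualization", "network (in part)", "hardware"]},
}

def evidence_requests(model):
    """Split evidence sources into those the customer controls and those that
    must be requested from the cloud vendor."""
    layers = CONTROL_BY_MODEL[model]
    return {"collect_directly": layers["customer"], "request_from_vendor": layers["vendor"]}

if __name__ == "__main__":
    print(evidence_requests("IaaS"))
```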

Four Forensic Challenges within the Cloud Ecosystem
Grobauer and Schreck [2] identified the following forensic challenges within the cloud computing environment:

  1. Separation of customer's data sources during evidence collection
  2. Adapting forensic analysis methods to the cloud
  3. Improving live analysis techniques
  4. Improving log generation & analysis techniques (see the sketch below)
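
Taking the fourth challenge as an example, a useful first step is simply merging logs from different cloud components into one timeline normalized to UTC. A minimal sketch, assuming the records have already been parsed into dictionaries with ISO-8601 timestamps (the source and field names are illustrative):

```python
# A minimal sketch: merge parsed log records from several cloud components into
# one timeline, normalizing all timestamps to UTC. Field names are illustrative.
from datetime import datetime, timezone

def merged_timeline(*log_sources):
    """Combine (source_name, records) pairs into a single UTC-ordered timeline."""
    timeline = []
    for source_name, records in log_sources:
        for record in records:
            ts = datetime.fromisoformat(record["timestamp"]).astimezone(timezone.utc)
            timeline.append({"utc": ts, "source": source_name, **record})
    return sorted(timeline, key=lambda r: r["utc"])

if __name__ == "__main__":
    hypervisor = [{"timestamp": "2012-06-01T08:00:05-04:00", "event": "vm_start"}]
    storage = [{"timestamp": "2012-06-01T12:00:01+00:00", "event": "volume_attach"}]
    for entry in merged_timeline(("hypervisor", hypervisor), ("storage", storage)):
        print(entry["utc"].isoformat(), entry["source"], entry["event"])
```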

Another major challenge is a need to establish a complete understanding of processes, their dependencies and distribution across different systems within the cloud ecosystem. [1]

Wolthusen[1] also states that, "if semantic dependencies must be captured, this must not only capture the immediate data required to reconstruct a view or document or to recreate and reconstruct a process, but also sufficient information to ascertain the semantics of the event at the point in time of the event."

However, would not the establishment of such a process potentially impact customers who are not involved in an investigation but who share the cloud space that is part of a cloud forensic examination?

Despite the semantics and challenges of the cloud computing environment, it is my opinion that:

Cloud computing users must open a dialogue with their vendor regarding processes and protocols for successfully handling and managing incidents. These need to be clearly established within the requirements portion when drafting their service level agreement (SLA).

References

  1. Overcast: Forensic Discovery in Cloud Environments - Stephen D. Wolthusen
  2. Towards Incident Handling in the Cloud: Challenges and Approaches - Bernd Grobauer, Thomas Schreck

More Stories By Jon Shende

Jon RG Shende is an executive with over 18 years of industry experience. He commenced his career in the medical arena, then moved into the oil and gas environment, where he was introduced to SCADA and network technologies, also becoming certified in industrial pump and valve repairs. Jon gained global experience over his career working within several verticals, including pharma, medical sales and marketing services, as well as the technology services environment, eventually becoming the youngest VP of an international enterprise. He is a graduate of the University of Oxford, holds a Masters certificate in Business Administration, as well as an MSc in IT Security, specializing in Computer Crime and Forensics with a thesis on security in the Cloud. Jon, well versed in the technology startup and mid-sized venture ecosystems, has contributed at the C and Senior Director level for former clients. As an IT Security Executive, Jon has experience with virtualization, strategy, governance, risk management, continuity and compliance. He was an early adopter of web services and web-based tools, and successfully beta tested a remote assistance and support software for a major telecom. Within the realm of sales, marketing and business development, Jon earned commendations for turnaround strategies within the services and pharma industries. For one pharma contract he was responsible for bringing low-performing districts up to number 1 rankings for consecutive quarters, as well as outperforming quotas from 125% up to 314%. Part of this was achieved by working closely with sales and marketing teams to ensure message and product placement were on point. Professionally he is a Fellow of the BCS Chartered Institute for IT, an HITRUST Certified CSF Practitioner, and holds the CITP and CRISC certifications. Jon Shende currently works as a Senior Director for a CSP. A recognised thought leader, Jon has been invited to speak for the SANS Institute, has spoken at Cloud Expo in New York, sat on a panel at Cloud Expo Santa Clara, and has been an Ernst and Young CPE conference speaker. His personal blog is located at http://jonshende.blogspot.com/view/magazine "We are what we repeatedly do. Excellence, therefore, is not an act, but a habit."
