Digital Forensic Challenges within Cloud Computing

Will cloud forensics be effective in managing boundaries of responsibility and access?

Proponents of the cloud ecosystem tout its "vastness," flexibility and scalability as advantages for the implementation of cloud services. From a digital forensics point of view, however, that same scope and diversity make the cloud a veritable forensic challenge.

According to Dr. Stephen Wolthusen[1] "Digital forensics (also referred to at times as computer forensics) encompasses approaches and techniques for gathering and analyzing traces of human and computer-generated activity in such a way that it is suitable in a court of law."

A key challenge for a digital investigator called upon to pursue an investigation that includes cloud resources will be to establish and map the computational and storage structures that fall within the realm of the investigation. Bear in mind that for any system (cloud or otherwise) security incidents will cross boundaries of responsibility and access.[2]
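
A minimal scoping sketch is shown below, assuming an IaaS deployment with a programmatic inventory API; AWS and its boto3 SDK are used purely as an example, and the "CaseID" resource tag is a hypothetical labelling convention, not a standard.

```python
# Hypothetical scoping sketch: enumerate the compute and storage resources an
# investigator might need to map, assuming an AWS-style IaaS API via boto3.
# The "CaseID" tag and region are illustrative assumptions only.
import boto3

def scope_resources(region: str, case_id: str) -> None:
    ec2 = boto3.client("ec2", region_name=region)
    case_filter = [{"Name": "tag:CaseID", "Values": [case_id]}]

    # Computational structures potentially in scope
    for reservation in ec2.describe_instances(Filters=case_filter)["Reservations"]:
        for inst in reservation["Instances"]:
            zone = inst.get("Placement", {}).get("AvailabilityZone")
            print("Instance:", inst["InstanceId"], "in", zone)

    # Storage structures potentially in scope
    for vol in ec2.describe_volumes(Filters=case_filter)["Volumes"]:
        attached = [a["InstanceId"] for a in vol.get("Attachments", [])]
        print("Volume:", vol["VolumeId"], "attached to", attached)

scope_resources("us-east-1", "CASE-001")
```

Even such a simple inventory pass makes the boundary question concrete: anything the filter cannot see (the hypervisor, shared network fabric, provider-side logs) sits on the vendor's side of the responsibility line.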

Within the digital forensic process there is no one-size-fits-all solution for an examination; nevertheless, all forensic evidence must follow the forensic process of:

Collection - Examination - Analysis - Reporting. Furthermore, regardless of its environment, forensic evidence must:

  • Be relevant to the issue at hand
  • Be authentic
  • Not be unfairly prejudicial
  • Not be hearsay, or if hearsay, be able to meet the requirements for an exception
  • Be the original or a duplicate of the evidence, or be able to meet an exception to that rule (a brief integrity-verification sketch follows this list).
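
The "authentic" and "original or duplicate" requirements are commonly supported with cryptographic hashing. The sketch below is a minimal illustration, not a prescribed procedure; the file paths are placeholders.

```python
# Minimal sketch: confirm that a working copy is a bit-for-bit duplicate of the
# acquired evidence image by comparing SHA-256 hashes. Paths are placeholders.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

acquired = sha256_of("/evidence/acquired_image.dd")
working = sha256_of("/evidence/working_copy.dd")
print("Duplicate verified" if acquired == working else "Hash mismatch: copy is not a true duplicate")
```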

Within the cloud computing ecosystem I believe there may be a dilemma in terms of time stamps. A question for cloud vendors would be: with a distributed and "vast" infrastructure, how will they ensure synchronized clocks across all their systems? Synchronized clocks across a distributed global system may not be a possibility, and if this supposition holds true, then what other solution will a cloud vendor provide in such an instance?
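
One hedged mitigation, sketched below, is to record every forensically relevant event in UTC together with the host's measured offset from a common reference clock, so that skew between distributed nodes can at least be quantified after the fact. The snippet assumes the third-party ntplib package and uses the public pool.ntp.org servers purely for illustration.

```python
# Sketch: tag each forensic log entry with UTC time plus the host's measured
# offset from an NTP reference, so cross-node clock skew can be reasoned about
# later. ntplib and the pool.ntp.org server are illustrative assumptions.
import datetime
import ntplib

def timestamped_entry(event: str, ntp_server: str = "pool.ntp.org") -> dict:
    response = ntplib.NTPClient().request(ntp_server, version=3)
    return {
        "event": event,
        "local_utc": datetime.datetime.utcnow().isoformat() + "Z",
        "offset_from_reference_seconds": response.offset,  # local clock vs. NTP reference
    }

print(timestamped_entry("volume snapshot acquired"))
```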

Another challenge is that of reciprocity. Digital forensics within the cloud computing environment can have legal implications across international jurisdictions, requiring cooperation through established relationships with legal entities in foreign countries and/or the establishment of new ones where possible.

As with any live forensic examination, another challenge will be establishing snapshots of the system in operation. In this case, however, one can question whether that is sufficient for such a "vast" and possibly globally distributed ecosystem.

Take the instance of malware injected into the kernel space of a system: it may be programmed to modify data or functionality, or both, in a variety of ways upon detection of a probe, or simply set to shut down, obfuscate evidence, or delete pertinent data residues within a set time frame. Can a forensic examiner be notified of this change, or, more pertinently, can a cloud service provider implement protocols, tools or processes to ensure that such an event can be mitigated in real time? Most likely not, at least for now.

However, a solution of sorts to this dilemma can be gleaned from the thesis suggested in a paper by Wolthusen [1], which states: "Data may be present or available in a given configuration for a limited time or be staged through different levels of storage hierarchies; it is hence important to place bounds on events in question so as to be able to capture events of interest completely."
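
A hedged, illustrative way to act on that advice is to bound the capture window explicitly: hash successive snapshots of the same artifact and record when each was taken, so that any change, whether routine churn or anti-forensic tampering, can at least be confined to a known interval. The path and interval below are placeholders.

```python
# Illustrative only: bound events of interest by hashing successive snapshots
# of an artifact and recording each acquisition time, confining any change to
# a known interval. Path and capture interval are placeholder assumptions.
import hashlib
import time

def hash_snapshot(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

timeline = []
for _ in range(3):                                   # three bounded capture points
    timeline.append((time.time(), hash_snapshot("/evidence/memory_snapshot.raw")))
    time.sleep(60)                                   # capture interval (illustrative)

for (t1, h1), (t2, h2) in zip(timeline, timeline[1:]):
    if h1 != h2:
        print(f"Artifact changed between {t1:.0f} and {t2:.0f} (epoch seconds)")
```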

In terms of the "vast" distributed environment that can comprise a cloud ecosystem under investigation, we have to be aware that within such an ecosystem any forensic investigation can cause parallel or unrelated services to be interrupted or completely halted, infringe on third-party rights, cross jurisdictional boundaries and, in the case of duplication, require infeasible storage volumes. [1]

Aspects of Control within Cloud Computing Service Models:

SaaS: Here the cloud user, depending on their contracted services with the cloud vendor, will only control certain configuration parameters, whilst the cloud vendor maintains control over the application(s) and the underlying infrastructure.

PaaS: Here the cloud vendor controls the cloud infrastructure and runtime environments, while the cloud user controls the application.

IaaS: Although a cloud user will have control over their servers, with the installed operating systems and applications, the cloud vendor will still control the virtualization infrastructure and at least parts of the network infrastructure with this offering.

These aspects will affect how a digital forensic examination is conducted, as every cloud computing environment will have variations. Therefore, the methods, tools, protocols, etc. used to identify relevant events that support the detection and analysis of attacks have to be crafted accordingly.
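
As a rough, non-authoritative illustration of those boundaries, the mapping below sketches which layers the customer versus the vendor typically controls under each model, and therefore from whom evidence for a given layer would have to be requested; the exact split is an assumption and will vary by contract and provider.

```python
# Illustrative (not vendor-specific) mapping of who typically controls each
# layer under the three service models, and hence who must supply evidence
# from that layer during an examination.
CONTROL_MATRIX = {
    "SaaS": {"application configuration": "customer", "application code": "vendor",
             "runtime/OS": "vendor", "hypervisor/network": "vendor"},
    "PaaS": {"application configuration": "customer", "application code": "customer",
             "runtime/OS": "vendor", "hypervisor/network": "vendor"},
    "IaaS": {"application configuration": "customer", "application code": "customer",
             "runtime/OS": "customer", "hypervisor/network": "vendor"},
}

def evidence_source(model: str, layer: str) -> str:
    """Return who would have to produce evidence from the given layer."""
    return CONTROL_MATRIX[model][layer]

print(evidence_source("PaaS", "runtime/OS"))   # -> "vendor"
```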

Four Forensic Challenges within the Cloud Ecosystem
Grobauer and Schreck [2] identified the following forensic challenges within the cloud computing environment:

  1. Separation of customer's data sources during evidence collection
  2. Adapting forensic analysis methods to the cloud
  3. Improving live analysis techniques
  4. Improving log generation & analysis techniques (see the hash-chain sketch below)
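
Taking the fourth challenge as an example, one hedged approach (a sketch of a general technique, not any provider's actual mechanism) is tamper-evident log generation: each entry carries the hash of the previous entry, so deleting or altering individual records later breaks the chain and is detectable.

```python
# Sketch of tamper-evident (hash-chained) log generation. Each entry embeds the
# hash of the previous entry, so altering or deleting a record breaks the chain.
# Illustrative only; not a specific cloud provider's logging mechanism.
import hashlib
import json
import time

def append_entry(log: list, message: str) -> None:
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    body = {"time": time.time(), "message": message, "prev_hash": prev_hash}
    body["entry_hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append(body)

def verify_chain(log: list) -> bool:
    prev = "0" * 64
    for entry in log:
        record = dict(entry)
        stored_hash = record.pop("entry_hash")
        recomputed = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
        if record["prev_hash"] != prev or recomputed != stored_hash:
            return False
        prev = stored_hash
    return True

audit_log = []
append_entry(audit_log, "VM instance i-0abc123 started")
append_entry(audit_log, "Volume vol-456 attached")
print("Chain intact:", verify_chain(audit_log))
```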

Another major challenge is the need to establish a complete understanding of processes, their dependencies, and their distribution across the different systems within the cloud ecosystem. [1]

Wolthusen[1] also states that, "if semantic dependencies must be captured, this must not only capture the immediate data required to reconstruct a view or document or to recreate and reconstruct a process, but also sufficient information to ascertain the semantics of the event at the point in time of the event."

However, would not the establishment of such a process potentially impact customers who are not involved in an investigation but who share the cloud space that is part of a cloud forensic examination?

Despite the semantics and challenges of the cloud computing environment, it is my opinion that:

Cloud computing users must open a dialogue with their vendor regarding processes and protocols for successfully handling and managing incidents. These need to be clearly established within the requirements portion when drafting their service level agreement (SLA).

References

  1. Overcast: Forensic Discovery in Cloud Environments - Stephen D. Wolthusen
  2. Towards Incident Handling in the Cloud: Challenges and Approaches - Bernd Grobauer, Thomas Schreck

More Stories By Jon Shende

Jon RG Shende is an executive with over 18 years of industry experience. He commenced his career in the medical arena, then moved into the Oil and Gas environment, where he was introduced to SCADA and network technologies, also becoming certified in industrial pump and valve repairs. Jon gained global experience over his career working within several verticals, including pharma, medical sales and marketing services, as well as the technology services environment, eventually becoming the youngest VP of an international enterprise. He is a graduate of the University of Oxford, holds a Masters certificate in Business Administration, as well as an MSc in IT Security, specializing in Computer Crime and Forensics with a thesis on security in the Cloud. Jon, well versed in the technology startup and mid-sized venture ecosystems, has contributed at the C and Senior Director level for former clients. As an IT Security Executive, Jon has experience with Virtualization, Strategy, Governance, Risk Management, Continuity and Compliance. He was an early adopter of web services and web-based tools, and successfully beta tested remote assistance and support software for a major telecom. Within the realm of sales, marketing and business development, Jon earned commendations for turnaround strategies within the services and pharma industries. For one pharma contract he was responsible for bringing low-performing districts up to number 1 rankings for consecutive quarters, as well as outperforming quotas from 125% up to 314%. Part of this was achieved by working closely with sales and marketing teams to ensure message and product placement were on point. Professionally he is a Fellow of the BCS Chartered Institute for IT, an HITRUST Certified CSF Practitioner, and holds the CITP and CRISC certifications. Jon Shende currently works as a Senior Director for a CSP. A recognised thought leader, Jon has been invited to speak for the SANS Institute, has spoken at Cloud Expo in New York as well as sat on a panel at Cloud Expo Santa Clara, and has been an Ernst and Young CPE conference speaker. His personal blog is located at http://jonshende.blogspot.com/view/magazine. "We are what we repeatedly do. Excellence, therefore, is not an act, but a habit."
