By Jon Shende
October 24, 2012 11:00 AM EDT
Digital forensics is not an elephant; it is a process, and not just one process but a group of tasks and processes in an investigation. Examiners now perform targeted examinations using forensic tools and databases of known files, selecting specific files and data types for review while ignoring files of irrelevant type and content. Despite the application of sophisticated tools, the forensic process still relies on the examiner's knowledge of the technical aspects of the specimen and understanding of the case and the law. - Mark Pollitt
As has been established in articles by various authors, myself included, this rebranded model of computing now called cloud computing offers benefits that can improve productivity, harness high-speed systems that can manage large data sets and systems implementations, and, through scaling and elasticity, have a net positive impact on the operational budget of some small and midsized enterprises.
Of course, there is the possibility that a private cloud for a small enterprise may not justify its cost in comparison to harnessing the benefits of a public cloud offering.
For a larger enterprise with, say, multiple and/or international locations, a private cloud infrastructure can provide an added cost benefit: while not as cheap as a public cloud offering, it would offset that cost variance through the improved risk profile of systems moved into a private cloud (e.g., critical databases, transactional and/or processing systems) and by addressing potential compliance concerns.
If, however, an enterprise chooses to utilize a public cloud offering, there will be added complications for information security from both procedural and legal standpoints. This leads us to the point that, with a public cloud system, we no longer have the traditional defined security perimeter.
This new cloud security perimeter can now be any place, on any device, where people access an enterprise-provided network, its resources, and its systems.
With regard to digital forensics and the e-discovery process, this new cloud security perimeter, stemming from the trend of data being accessed via the internet and housed and consumed on multiple systems and devices internationally, will pose serious challenges (legal and technical) with the potential to complicate a security investigation, e.g., defining incident response, access rules and policies governing access, as well as support processes.
Traditional network forensics metrics will not give a complete picture of what can occur within a cloud computing environment. For instance, the focus may be limited to data going into and out of systems to which an enterprise has access, and as we know, this generally stops at the gateway into the cloud.
In terms of network forensics, packet capture and analysis is important; with the cloud ecosystem there is the real possibility of a vast increase in the amount of data that may need to be processed. This will only increase the workload on the digital investigator, who will most likely have more than a plate full of hex patterns, network metadata, and logs to analyze, compared with a traditional system analysis.
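A minimal, self-contained sketch of this kind of packet-level triage, using only the Python standard library: it writes a tiny synthetic capture in the classic libpcap format and then walks its records to collect lightweight metadata (timestamps, sizes, truncation), the sort of bulk processing that balloons in a cloud investigation. The format constants follow the published libpcap layout; the timestamps and payloads are invented.

```python
import struct
import io

# libpcap savefile constants (assumption: classic pcap, not pcapng)
PCAP_MAGIC = 0xA1B2C3D4
GLOBAL_HDR = struct.Struct("<IHHiIII")  # magic, ver_maj, ver_min, tz, sigfigs, snaplen, linktype
RECORD_HDR = struct.Struct("<IIII")     # ts_sec, ts_usec, incl_len, orig_len

def build_sample_capture(payloads):
    """Write a minimal synthetic capture so the sketch is self-contained."""
    buf = io.BytesIO()
    buf.write(GLOBAL_HDR.pack(PCAP_MAGIC, 2, 4, 0, 0, 65535, 1))
    for ts, data in payloads:
        buf.write(RECORD_HDR.pack(ts, 0, len(data), len(data)))
        buf.write(data)
    return buf.getvalue()

def summarize_capture(raw):
    """Walk the records and collect lightweight metadata (no payload analysis)."""
    magic, = struct.unpack_from("<I", raw, 0)
    assert magic == PCAP_MAGIC, "unsupported capture format"
    offset, packets = GLOBAL_HDR.size, []
    while offset < len(raw):
        ts_sec, ts_usec, incl_len, orig_len = RECORD_HDR.unpack_from(raw, offset)
        offset += RECORD_HDR.size
        packets.append({"ts": ts_sec, "bytes": incl_len,
                        "truncated": incl_len < orig_len})
        offset += incl_len
    return packets

raw = build_sample_capture([(1351087200, b"\x00" * 60), (1351087201, b"\x00" * 1500)])
for meta in summarize_capture(raw):
    print(meta)
```

Real investigations would of course use dedicated capture and analysis tools; the point is only that per-packet metadata extraction is mechanical and the work scales linearly with capture size.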
This increased volume can severely cripple an investigation, more so if a forensic investigator does not completely understand the cloud ecosystem's architecture, the complex linkages that bridge cloud services and an enterprise's systems, and how these systems affect the enterprise in terms of potential ingress points that can lead to system compromise.
The cloud, while a boon to enterprise CapEx/OpEx, is also a gold mine for crackers, who can set up systems for attack with as little as $50. For example, with Amazon Web Services (AWS), an Amazon Machine Image (AMI), either Linux or Windows, can run as a virtual machine that can be set up to do whatever an end user wants within the confines of the virtualized world; this environment is owned by the end user (a cracker in this case) from the operating system up.
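The "$50" figure is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below computes how many hours a fixed budget buys across a small fleet; the hourly rates are hypothetical placeholders, not actual AWS pricing.

```python
# Back-of-the-envelope estimate of how much compute a small budget buys.
# The hourly rates below are hypothetical placeholders, not real AWS pricing.
def hours_of_compute(budget_usd, hourly_rate_usd, instances=1):
    """Hours a fleet of identical instances can run before the budget is spent."""
    return budget_usd / (hourly_rate_usd * instances)

budget = 50.00
for rate, instances in [(0.10, 1), (0.10, 10), (0.50, 5)]:
    print(f"${rate:.2f}/hr x {instances} instance(s): "
          f"{hours_of_compute(budget, rate, instances):.1f} hours")
```

Even at these invented rates, $50 buys hundreds of instance-hours, which is ample time to stage an attack platform.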
Of course, the IaaS layer and other hardware systems (IDS/IPS, firewalls) remain under the control of, and belong to, the cloud service provider.
With regard to conducting a forensic investigation on a virtualized server, there is the potential loss of data relevant to an investigation once an image is stopped or a virtualized server is shut down, with minimal chance of retrieving a specific image from its virtualized server.
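Where an image can be exported before the instance is destroyed, one basic preservation step is to record a cryptographic digest of the acquired image immediately, so later copies can be verified against it. A minimal sketch, assuming the investigator has already exported the VM disk to a local file; the path and contents below are stand-ins, not a real acquisition:

```python
import hashlib
import tempfile
import os

def image_digest(path, algorithm="sha256", chunk_size=1024 * 1024):
    """Hash a disk image in chunks so multi-GB files never sit fully in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical acquired image; in practice this would be the exported VM disk.
with tempfile.NamedTemporaryFile(delete=False, suffix=".img") as tmp:
    tmp.write(b"\x00" * 4096)          # stand-in for real image contents
    image_path = tmp.name

print(image_digest(image_path))        # record alongside chain-of-custody notes
os.remove(image_path)
```

The digest itself proves nothing about what was lost when the instance stopped; it only anchors the integrity of whatever was captured before shutdown.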
As mentioned, there are several merits to the case for adopting a cloud service; however, from a digital forensics point of view, the inherent limitations of such a system need to be clearly understood and properly reviewed and scoped by an enterprise's IT security team with regard to how such an implementation will fit their current security model. These metrics may vary based on the cloud provider the enterprise selects.
The gathered data can then assist enterprise security in mitigating the potential for compromise and other risks to the enterprise's operations stemming from this added environment. This in turn can potentially alleviate the pains of a digital forensics investigation with cloud computing overtones.
Digital forensics expert Nicole Beebe stated, "No research has been published on how cloud computing environments affect digital artifacts, and legal issues related to cloud computing environments."
Of note is the fact that with the top CSPs (Amazon, Rackspace, Azure), one can find common attributes from which a security manager can tweak the enterprise's security policies.
Some things of note that will impact a forensic investigation within the cloud ecosystem are:
- A network forensics investigator is limited to tools on the box rather than the entire network; however, if a proper ISO is made of the machine image, then all the standard information in that ISO should be available, as it would be with any other server in a data center.
- Lack of access to network routers, load balancers and other networked components.
- No access to large firewall installations.
- There are challenges in mapping known hops from instance to instance that will remain static across the cloud-routing schema.
- System administrators can build and tear down virtual machines (VMs) at will. This can influence an enterprise's security policy and plans, as new rules and regulations will have to be implemented when working with cloud servers and services suspected of being compromised.
- An enterprise's threat environment should be treated with the same mindset in the cloud ecosystem as it would be for any exposed service offered across the Internet.
- With the cloud ecosystem, an advantage with regard to forensics is a digital investigator's ability to store very large log files on a storage instance or in a very large database for easy retrieval and discovery.
- An enterprise has to be open to the fact that there will be a risk of data being damaged, accessed, altered, or denied by the CSP.
- Routing information that is not already on "the box" will be difficult to obtain within this ecosystem.
- For encrypted disks, wouldn't it be theoretically feasible to spin up "n" cloud instances to help crack the encryption? According to Dan Morrill, this can be an expensive process.
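Morrill's point can be made concrete with a rough cost model. The sketch below estimates worst-case time and spend for exhausting a keyspace across "n" instances; every number in it (keyspace, guess rate, hourly price) is an illustrative assumption, not a measured figure.

```python
# Rough model of brute-forcing an encrypted disk across "n" rented instances.
# All numbers (keyspace, guesses/second, hourly rate) are illustrative assumptions.
def brute_force_estimate(keyspace, guesses_per_sec, instances, usd_per_hour):
    """Worst-case hours and dollar cost to exhaust the keyspace."""
    hours = keyspace / (guesses_per_sec * instances) / 3600
    return hours, hours * instances * usd_per_hour

# Weak case: 8-char lowercase password, ~2.1e11 candidates
hours, cost = brute_force_estimate(26 ** 8, 100_000, 100, 0.50)
print(f"8-char lowercase: ~{hours:,.1f} hours, ~${cost:,.2f}")

# Stronger case: 10-char mixed alphanumeric, ~8.4e17 candidates
hours, cost = brute_force_estimate(62 ** 10, 100_000, 100, 0.50)
print(f"10-char mixed:    ~{hours:,.1f} hours, ~${cost:,.2f}")
```

The takeaway: a weak eight-character password is within reach of a modest budget, but each added character multiplies the bill, which is why the approach gets expensive fast, just as Morrill suggests.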
As those of us who are students and practitioners in the field of digital forensics know, advances in this area tend to be primarily reactionary in nature, most likely developed in response to a specific incident or subset of incidents. This can pose a major challenge with traditional systems; one can only imagine what can occur when faced with a distributed cloud ecosystem.
In terms of digital forensics, any tool that will make an examiner's job easier, improve results, reduce false positives, and generate data that is relevant, pertinent, and admissible in a court of law will be of value.
Being my firm's lead solutions researcher and consultant, I am always on the lookout for any new process, system, or tool that will make my job, as well as that of my team, easier as we work with our clients. This led me to attend a webinar, The Case for Network Forensics, from a company called Solera Networks. ...continued in Part 2.
Special thanks to Mark Pollitt for his valuable insight.
- Pollitt, M.M. "Six Blind Men from Indostan." Digital Forensics Research Workshop (DFRWS), 2004.
- Nance, Hay, Bishop. "Digital Forensics: Defining a Research Agenda." 2009; 978-0-7695-3450-3/09, IEEE.
- Dan Morrill, "10 Things to Think About with Cloud Computing and Forensics."