|By Jon Shende||
|October 24, 2012 11:00 AM EDT||
"Digital forensics is not an elephant; it is a process, and not just one process, but a group of tasks and processes in investigation. Examiners now perform targeted examinations using forensic tools and databases of known files, selecting specific files and data types for review while ignoring files of irrelevant type and content. Despite the application of sophisticated tools, the forensic process still relies on the examiner's knowledge of the technical aspects of the specimen and understanding of the case and the law." - Mark Pollitt
As various authors, myself included, have established, this re-branded model of computing now called cloud computing offers benefits that can improve productivity, harness high-speed systems capable of managing large data sets and systems implementations, and have a net positive impact on the operational budget (scaling, elasticity) of some small and midsized enterprises.
Of course, a private cloud for a small enterprise may not be worth its cost compared with harnessing the benefits of a public cloud offering.
For a larger enterprise, say one with multiple and/or international locations, a private cloud infrastructure can provide an added cost benefit: while not as cheap as a public cloud offering, it can offset that cost variance through the improved risk profile of the systems moved into the private cloud, e.g. critical databases, transactional and/or processing systems, as well as potential compliance concerns.
If, however, an enterprise chooses to utilize a public cloud offering, there will be added complications for information security from both procedural and legal standpoints. This leads us to the point that, with a public cloud system, we no longer have the traditional, well-defined security perimeter.
This new cloud security perimeter can now be any place, on any device, from which people access an enterprise-provided network, its resources and its systems.
With regard to digital forensics and the e-discovery process, this new cloud security perimeter, a consequence of data now being accessed via the Internet and housed and consumed on multiple systems and devices internationally, will pose serious challenges (legal and technical) with the potential to complicate a security investigation: for example, defining incident response, access rules and the policies governing access, as well as support processes.
Traditional network forensics metrics will not give a complete picture of what can occur within a cloud computing environment. For instance, they may focus only on data going into and out of the systems an enterprise has access to, and as we know, that visibility generally stops at the gateway into the cloud.
In terms of network forensics, packet capture and analysis is important, and with the cloud ecosystem there is a real possibility of a large increase in the amount of data that needs to be processed. This will only increase the workload on the digital investigator, who will likely have far more hex patterns, network metadata and logs to analyze than in a traditional system analysis.
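As a small illustration of the metadata-triage side of this workload, the sketch below walks a classic libpcap capture and tallies packet count and captured bytes before any deep inspection, using only the Python standard library. It is a minimal, illustrative reader (the file layout is the published libpcap format; the helper name and demo data are my own), not a substitute for a full analysis tool.

```python
import struct
from io import BytesIO

def summarize_pcap(stream):
    """Triage a classic libpcap stream: count packets and captured bytes.

    Parses the 24-byte global header, then walks the 16-byte per-record
    headers, skipping payloads. This is the kind of quick metadata pass
    an examiner might run before committing to full packet analysis.
    """
    global_hdr = stream.read(24)
    magic = struct.unpack("<I", global_hdr[:4])[0]
    if magic == 0xA1B2C3D4:
        endian = "<"   # capture written little-endian
    elif magic == 0xD4C3B2A1:
        endian = ">"   # capture written big-endian
    else:
        raise ValueError("not a classic pcap stream")
    count, total_bytes = 0, 0
    while True:
        rec = stream.read(16)
        if len(rec) < 16:
            break
        ts_sec, ts_usec, incl_len, orig_len = struct.unpack(endian + "IIII", rec)
        stream.read(incl_len)   # skip the packet payload itself
        count += 1
        total_bytes += incl_len
    return {"packets": count, "captured_bytes": total_bytes}

# Demo: a synthetic one-packet capture (little-endian header, 4-byte payload)
hdr = struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 1)
pkt = struct.pack("<IIII", 0, 0, 4, 4) + b"\xde\xad\xbe\xef"
print(summarize_pcap(BytesIO(hdr + pkt)))  # {'packets': 1, 'captured_bytes': 4}
```

At cloud scale the same pass would run over many gigabytes of captures, which is precisely where the investigator's workload balloons.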
This increased volume can severely cripple an investigation, more so if a forensic investigator does not completely understand the cloud ecosystem's architecture, the complex linkages that bridge cloud services and an enterprise's systems, and how these systems affect the enterprise in terms of potential ingress points that can lead to system compromise.
The cloud, while a boon to enterprise CapEx/OpEx, is also a gold mine for crackers, who can set up attack systems for as little as $50. With Amazon Web Services (AWS), for example, an Amazon Machine Image (AMI), either Linux or Windows, can run a virtual machine that can be set up to do whatever the end user wants within the confines of the virtualized world; this environment is owned by the end user (a cracker in this case) from the operating system up.
Of course, the IaaS and other hardware systems, IDS/IPS and firewalls remain under the control of, and belong to, the cloud service provider.
With regard to, say, conducting a forensic investigation on a virtualized server, there is the potential loss of data relevant to an investigation once an image is stopped or a virtualized server is shut down, with minimal chance of retrieving a specific image from its virtualized server.
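This makes acquisition timing critical: whatever image an examiner can pull while the instance is still live should be fingerprinted immediately, so its integrity can be proven after the instance itself is gone. A minimal sketch, assuming the examiner already has a local copy of the acquired image (the function name and demo file are illustrative, not part of any vendor workflow):

```python
import hashlib
import os
import tempfile

def fingerprint_image(path, chunk_size=1 << 20):
    """Compute MD5 and SHA-256 of an acquired disk image in one pass.

    Recording both digests at acquisition time lets an examiner later
    demonstrate the image was not altered after the (now-destroyed)
    virtual server was torn down.
    """
    md5, sha256 = hashlib.md5(), hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):  # stream in 1 MiB chunks
            md5.update(chunk)
            sha256.update(chunk)
    return md5.hexdigest(), sha256.hexdigest()

# Demo: hash a small stand-in "image" file
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b"evidence")
tmp.close()
md5_hex, sha_hex = fingerprint_image(tmp.name)
os.unlink(tmp.name)
print(md5_hex, sha_hex)
```

In practice the digests would be recorded in the chain-of-custody documentation alongside the acquisition timestamp.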
As mentioned, there are several merits to the case for adopting a cloud service. From a digital forensics point of view, however, the inherent limitations of such a system need to be clearly understood, and properly reviewed and scoped by an enterprise's IT security team, with regard to how the implementation will fit their current security model. These metrics may vary based on the cloud provider the enterprise selects.
The data gathered can then guide enterprise security on how to mitigate the potential for compromise and other risks to the enterprise's operations stemming from this added environment. This, in turn, can potentially alleviate the pains of a digital forensics investigation with cloud computing overtones.
Digital forensics expert Nicole Beebe has stated, "No research has been published on how cloud computing environments affect digital artifacts, and legal issues related to cloud computing environments."
Of note is the fact that among the top CSPs (Amazon, Rackspace, Azure) one can find common attributes from which a security manager can tune the enterprise's security policies.
Some factors that will impact a forensic investigation within the cloud ecosystem are:
- A network forensics investigator is limited to the tools on the box rather than the entire network; however, if a proper ISO is made of the machine image, then all the standard information in the machine image's ISO should be available, as it would be with any other server in a data center.
- Lack of access to network routers, load balancers and other networked components.
- No access to large firewall installations.
- Mapping known hops from instance to instance is a challenge, as routes do not remain static across the cloud-routing schema.
- System administrators can build and tear down virtual machines (VMs) at will. This can affect an enterprise's security policy and plans, as new rules and procedures will have to be implemented for working with cloud servers and services that are suspected of being compromised.
- An enterprise's threat environment in the cloud ecosystem should be treated with the same mindset as any exposed service offered across the Internet.
- An advantage of the cloud ecosystem with regard to forensics is the ability of a digital investigator to store very large log files on a storage instance or in a very large database for easy data retrieval and discovery.
- An enterprise has to be open to the fact that there is a risk of data being damaged, accessed, altered or denied by the CSP.
- Routing information that is not already on "the box" will be difficult to obtain within this ecosystem.
- For encrypted disks, wouldn't it be theoretically feasible to spin up "n" cloud instances to help crack the encryption? According to Dan Morrill, this can be an expensive process.
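A quick back-of-envelope calculation shows why that last point is "expensive" for any modern cipher. The throughput and hourly price below are illustrative assumptions, not real vendor figures:

```python
def brute_force_cost(key_bits, keys_per_sec_per_node, cost_per_node_hour):
    """Expected rental cost to brute-force a key, searching half the
    keyspace on average. Inputs are assumed, illustrative figures."""
    expected_keys = 2 ** (key_bits - 1)               # average-case search
    node_seconds = expected_keys / keys_per_sec_per_node
    node_hours = node_seconds / 3600
    return node_hours * cost_per_node_hour            # parallelism spreads time, not cost

# A weak 56-bit DES key at an assumed 1e9 keys/sec/node and $0.10/node-hour
# comes out around a thousand dollars; plausible for a motivated attacker.
des_cost = brute_force_cost(56, 1e9, 0.10)
# The same arithmetic for a 128-bit AES key yields an astronomical figure,
# so "spin up n instances" does not threaten properly encrypted disks.
aes_cost = brute_force_cost(128, 1e9, 0.10)
print(f"DES: ${des_cost:,.2f}  AES: ${aes_cost:,.2e}")
```

Note that adding instances divides the wall-clock time but not the total node-hours billed, so the cost floor stays the same no matter how wide the attack is parallelized.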
As those of us who are students and practitioners in the field of digital forensics know, advances in this area tend to be primarily reactionary in nature, most likely developed in response to a specific incident or subset of incidents. This poses a major challenge in traditional systems; one can only imagine what can occur when faced with a distributed cloud ecosystem.
In terms of digital forensics, any tool that makes an examiner's job easier, improves results, reduces false positives and generates data that is relevant, pertinent and admissible in a court of law will be of value.
As my firm's lead solutions researcher and consultant, I am always on the lookout for any new process, system or tool that will make my job, as well as that of my team, easier as we work with our clients. This led me to attend a webinar, The Case for Network Forensics, from a company called Solera Networks ...continued in Part 2.
Special thanks to Mark Pollitt for his valuable insight.
- Pollitt, M.M. Six Blind Men from Indostan. Digital Forensics Research Workshop (DFRWS); 2004.
- Nance, K., Hay, B., Bishop, M. Digital Forensics: Defining a Research Agenda; 2009; 978-0-7695-3450-3/09, IEEE.
- Morrill, D. 10 Things to Think About with Cloud Computing and Forensics.