CSF Unit-4

UNIT IV

INTRODUCTION
Computer forensics is a scientific method of investigation and analysis used to
gather evidence from digital devices, computer networks, and their components
that is suitable for presentation in a court of law or other legal body. It involves
performing a structured investigation while maintaining a documented chain of
evidence to establish exactly what happened on a computer and who was
responsible for it.
TYPES
 Disk Forensics: It deals with extracting raw data from the primary or
secondary storage of the device by searching active, modified, or deleted
files.
 Network Forensics: It is a sub-branch of Computer Forensics that involves
monitoring and analyzing the computer network traffic.
 Database Forensics: It deals with the study and examination of databases
and their related metadata.
 Malware Forensics: It deals with the identification of suspicious code and
studying viruses, worms, etc.
 Email Forensics: It deals with emails and their recovery and analysis,
including deleted emails, calendars, and contacts (a header-parsing sketch follows this list).
 Memory Forensics: Deals with collecting data from system memory (system
registers, cache, RAM) in raw form and then analyzing it for further
investigation.
 Mobile Phone Forensics: It mainly deals with the examination and analysis of
phones and smartphones and helps to retrieve contacts, call logs, incoming,
and outgoing SMS, etc., and other data present in it.
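
To make the email forensics item above concrete, here is a minimal sketch that extracts the headers investigators typically examine first. It uses only the Python standard library; the file name evidence.eml is a hypothetical placeholder.

```python
# Minimal sketch: extract key headers from a saved email message.
# "evidence.eml" is a hypothetical file name; only the stdlib is used.
from email import policy
from email.parser import BytesParser

with open("evidence.eml", "rb") as f:
    msg = BytesParser(policy=policy.default).parse(f)

# Headers of forensic interest: who sent it, to whom, when, and the message ID.
for header in ("From", "To", "Date", "Message-ID", "Subject"):
    print(f"{header}: {msg[header]}")

# "Received" headers record each mail server hop (newest first) and help
# reconstruct the path the message actually took.
for hop in msg.get_all("Received", []):
    print("Hop:", hop.replace("\n", " "))
```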

CHARACTERISTICS
 Identification: Identifying what evidence is present, where it is stored, and
how it is stored (in which format). Electronic devices can be personal
computers, Mobile phones, PDAs, etc.
 Preservation: Data is isolated, secured, and preserved. It includes
prohibiting unauthorized personnel from using the digital device so that
digital evidence, mistakenly or purposely, is not tampered with and making a
copy of the original evidence.
 Analysis: Forensic lab personnel reconstruct fragments of data and draw
conclusions based on evidence.
 Documentation: A record of all the visible data is created. It helps in
recreating and reviewing the crime scene. All the findings from the
investigations are documented.
 Presentation: All the documented findings are produced in a court of law for
further investigations.

PROCEDURE:
The procedure starts with identifying the devices involved and collecting the
preliminary evidence at the crime scene. A court warrant is then obtained so that
the evidence can be seized. The evidence is transported to the forensics lab for
further investigation; the documented record of how the evidence is handled and
transferred from the crime scene to the lab is called the chain of custody. The
evidence is then copied for analysis and the original evidence is kept safe,
because analysis is always performed on the copied evidence, never on the
original.
The copied evidence is then analyzed for suspicious activity and,
accordingly, the findings are documented in a non-technical tone. The
documented findings are then presented in a court of law for further
investigation.
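
Because analysis is always performed on the copy, examiners verify that the working copy is bit-for-bit identical to the original by comparing cryptographic hashes. A minimal sketch of that check (the file names are hypothetical placeholders):

```python
# Minimal sketch: confirm a working copy matches the original evidence image
# by comparing SHA-256 hashes. File names are hypothetical placeholders.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

original = sha256_of("original_evidence.dd")
working = sha256_of("working_copy.dd")
print("Original:", original)
print("Copy:    ", working)
print("MATCH" if original == working else "MISMATCH - copy is not forensically sound")
```
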
Some Tools used for Investigation:
Tools for Laptop or PC –
 COFEE (Computer Online Forensic Evidence Extractor) – A suite of tools for Windows developed by Microsoft.
 The Coroner’s Toolkit – A suite of programs for Unix analysis.
 The Sleuth Kit – A library of tools for both Unix and Windows.
Tools for Memory :
 Volatility
 WindowsSCOPE
Tools for Mobile Device :
 Micro Systemation XRY/XACT
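
As a quick illustration of the memory tools listed above, Volatility (in its current version 3) is driven from the command line. A hedged sketch of invoking it from Python, assuming the vol command is installed and a captured image named memdump.raw exists:

```python
# Sketch: list the processes recorded in a memory image with Volatility 3.
# Assumes the "vol" command is installed and "memdump.raw" is a captured image.
import subprocess

result = subprocess.run(
    ["vol", "-f", "memdump.raw", "windows.pslist"],
    capture_output=True, text=True, check=False,
)
print(result.stdout)   # table of PIDs, process names, start times
if result.returncode != 0:
    print("Volatility reported an error:", result.stderr)
```
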
APPLICATIONS
 Intellectual Property theft
 Industrial espionage
 Employment disputes
 Fraud investigations
 Misuse of the Internet and email in the workplace
 Forgery-related matters
 Bankruptcy investigations
 Issues concerning regulatory compliance
Advantages of Computer Forensics :
 To produce evidence in the court, which can lead to the punishment of the
culprit.
 It helps companies gather important information when their computer
systems or networks may have been compromised.
 Efficiently tracks down cyber criminals from anywhere in the world.
 Helps to protect the organization’s money and valuable time.
 Allows investigators to extract, process, and interpret factual evidence, so
that cybercriminal actions can be proved in court.
Disadvantages of Computer Forensics :
 Before digital evidence is accepted by a court, it must be proved that it has
not been tampered with.
 Producing and keeping electronic records safe is expensive.
 Legal practitioners must have extensive computer knowledge.
 Need to produce authentic and convincing evidence.
 If the tool used for digital forensics does not meet specified standards, the
evidence can be rejected by the court.
 A lack of technical knowledge on the part of the investigating officer might
mean the desired result is not achieved.
The phases in a computer forensics investigation are:

 Secure the subject system


 Take a copy of hard drive/disk
 Identify and recover all files (see the recovery sketch after this list)
 Access/view/copy hidden, protected, and temp files
 Study special areas on the drive
 Investigate the settings and any data from programs on the system
 Consider the system from various perspectives
 Create detailed report containing an assessment of the data and information
collected
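
Several of these phases, particularly identifying and recovering files, are commonly carried out with The Sleuth Kit mentioned earlier. A hedged sketch of driving its fls and icat command-line tools from Python, assuming an image named disk.dd and that the tools are installed; the inode number is a placeholder chosen from the fls output:

```python
# Sketch: list deleted files in a disk image with The Sleuth Kit's fls,
# then recover one by inode with icat. Image name and inode are placeholders.
import subprocess

# -r: recurse into directories, -d: show only deleted entries
listing = subprocess.run(
    ["fls", "-r", "-d", "disk.dd"],
    capture_output=True, text=True, check=True,
).stdout
print(listing)

# Recover the contents of a chosen inode (e.g. 1234) into a local file.
with open("recovered_file.bin", "wb") as out:
    subprocess.run(["icat", "disk.dd", "1234"], stdout=out, check=True)
```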

Things to be avoided during forensics investigation:

 Changing date/timestamps of the files


 Overwriting unallocated space

Things that must not be omitted during a forensics investigation:

 Engagement contract
 Non-Disclosure Agreement (NDA)

Elements addressed before drawing up a forensics investigation engagement contract:

 Authorization
 Confidentiality
 Payment
 Consent and acknowledgement
 Limitation of liability

General steps in solving a computer forensics case are:

 Prepare for the forensic examination


 Talk to key people about the case and what you are looking for
 Start assembling tools to collect the data and identify the target media
 Collect the data from the target media
 Use a write blocking tool while performing imaging of the disk (see the imaging sketch after this list)
 Check email records as well while collecting evidence
 Examine the collected evidence on the image that is created
 Analyze the evidence
 Report your findings to your client
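
The imaging step in these lists can be as simple as a chunked read that hashes while it copies, with a hardware write blocker (as recommended above) guarding the source. A minimal sketch, assuming a hypothetical device path and that the source is already write-blocked:

```python
# Sketch: image a (write-blocked) source device while computing its hash,
# so the image can be verified immediately. Paths are placeholders; reading
# a raw device typically requires elevated privileges.
import hashlib

SOURCE = "/dev/sdb"        # suspect media, assumed to sit behind a write blocker
IMAGE = "evidence.dd"      # destination image file

h = hashlib.sha256()
with open(SOURCE, "rb") as src, open(IMAGE, "wb") as dst:
    while True:
        chunk = src.read(1 << 20)  # 1 MiB at a time
        if not chunk:
            break
        h.update(chunk)
        dst.write(chunk)
print("Acquisition hash:", h.hexdigest())
```
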
 CURRENT COMPUTER FORENSICS TOOLS:

 Digital forensics tools have become vitally important to data breach


investigations. Experts need them to uncover, analyze and interpret
digital evidence.
 Law enforcement uses digital forensics tools when solving crimes.
Businesses also use them to conduct incident response and recover
data. For example, organizations can use digital forensics tools to
analyze how a breach occurred, whether attackers accessed or
exfiltrated data, and how the malicious actors moved through the
network.
 With this information, organizations can accurately describe an attack to
affected stakeholders and law enforcement. The tools' widespread use
provides information on the tactics, techniques and procedures of
cybercriminal groups.
 Digital forensics products range from all-encompassing suites of tools to
dedicated single products designed for specific tasks. Listed below and
arranged alphabetically are five tools used and respected by digital
forensics experts for either criminal investigations, incident response or
both.
 Many digital forensics experts use multiple tools to handle different
aspects of the forensics process, depending on the requirements of the
investigation.
 1. Cellebrite
 Cellebrite is the go-to tool provider for mobile forensics, offering broad support
of mobile devices and advanced data extraction. Cellebrite offers multiple
mobile device forensics platforms, including Cellebrite Universal Forensic
Extraction Device, Cellebrite Premium Enterprise, Cellebrite Premium as a
Service and Cellebrite Inspector. Its products can be used in concert with other
digital forensics tools. For example, a cybersecurity investigator can do
computer forensics with Magnet Axiom and then switch to Cellebrite for
mobile data extraction and analysis.
 Organizations can contact Cellebrite for information on which digital forensics
platform suits their needs and for pricing.
 For more information on the vendor's various digital forensics tools,
visit Cellebrite's page.
 2. Magnet Axiom
 Magnet Axiom is commonly used for high-level analysis. It supports
investigation and analysis of computer, mobile, cloud and vehicle data.
Beneficial features include automation and an accessible UI designed to be
simple to use. Axiom offers a less clunky display and formats investigation
results in a cleaner manner, making it a useful tool for less-technical
investigators.
 Organizations can try a free 30-day trial of Magnet Axiom. For demo and
pricing information, visit Magnet Axiom's page.
 3. Velociraptor
 Velociraptor is an open source tool designed for internal security teams to
gather evidence across all endpoints. It can rapidly gather and store event logs
from an organization's endpoints so security teams can examine them for
suspicious activity. The lightweight digital forensics tool is still relatively new
to the market but boasts consistent development and an active community on
Discord for troubleshooting and more.
 For more information, visit Velociraptor's page.
 4. Wireshark
 Wireshark is an open source tool for network analysis that has been in use for
more than 20 years. It can show every network packet sent from and received
by a device, enabling an investigator to break down the type of traffic, as well
as its source and destination. It suits analyzing a potential data breach to see
where the attacker is sending compromised data. Wireshark can examine wired
and wireless network traffic for connection information and even what a single
packet contains.
 For more information, visit Wireshark's page.
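
Wireshark itself is interactive, but the same packet-level triage it supports can be scripted. A sketch of a "top talkers" summary using the scapy library, assuming a capture saved as capture.pcap:

```python
# Sketch: summarize where traffic in a capture is going, in the spirit of
# Wireshark's conversation view. Assumes scapy and a file "capture.pcap".
from collections import Counter
from scapy.all import rdpcap, IP

packets = rdpcap("capture.pcap")
conversations = Counter()
for pkt in packets:
    if IP in pkt:
        conversations[(pkt[IP].src, pkt[IP].dst)] += 1

# Top talkers first: useful for spotting exfiltration to an unexpected host.
for (src, dst), count in conversations.most_common(10):
    print(f"{src} -> {dst}: {count} packets")
```
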
 5. X-Ways Forensics
 X-Ways Forensics is a tool for investigators who like to manually dig deep for
analysis, rather than rely on automation. It boasts advanced technical features
for disk analysis, such as capturing and detailing drive contents, slack space
and interpartition space. It can operate even on limited hardware. Forensics
experts can start their analysis with other tools, such as Magnet Axiom, and
then delve into in-depth analysis using X-Ways.
 X-Ways offers nonperpetual and perpetual licenses starting at $1,339 and
$3,189, respectively. The vendor also offers WinHex, Investigator and Imager
licenses.

1 Introduction

Digital forensics has developed in a different way to other types of forensics. In other forensic
sciences, methodologies have often been based on scientific discoveries and on ad-hoc research
[1, 2, 3]. However, due to the rapid growth of digital investigations, scientific processes are
now integrated into investigations in order to make digital forensics evidence acceptable in court [4, 3].
Work has also been undertaken to provide ways to help juries understand the value of digital evidence
[4]. Robertson describes the coming decade's evolution of the forensic sciences and illustrates
some of the current challenges: “New technologies and improved instrumentation will continue to
emerge. Forensic science usually has a lag period before these are adopted into the forensic area, and
this is to be expected, given the conservative nature of the legal arena. In the past, however, the lag
period has been too long. Forensic scientists need to be quicker to recognize the potential applications
to forensic problems and they also need to be able to carry out research aimed at helping to interpret
what analytical data means.” [5]

As digital forensics is still an immature science, legal issues remain which need to be overcome,
such as those highlighted by Meyers and Rogers [6], who identified three main issues faced within
computer forensics:

• Admissibility of evidence. In order to ensure that evidence is admissible in court, investigators
should follow rigorous procedures. These need to be standardised or, at minimum, guidelines should
be defined. Unfortunately, even when closely following recommendations, errors can be made. This
is often because each case is different, and it is not possible to create a manual which covers all
possibilities. For this reason, it is essential that digital investigators develop appropriate skills to
work around these problems.

• Standards and certifications. Certifications are a good way to develop investigators’ skills. This
has already been applied successfully in other computer fields, such as computer security [7]. A
small number of certifications – such as the Certified Forensic Analyst [8] certification, which is
provided by the Global Information Assurance Certification (GIAC) founded by the SANS Institute
– are available. Nonetheless, certifications and standards are not only applicable to investigators.

The International Organization for Standardization (ISO), in association with the International
Electrotechnical Commission (IEC), created this standard in order to provide laboratories with
general requirements for carrying out tests, calibrations and sampling. The main requirements are
the following:

• Management system
• Document control
• Subcontracting of tests and calibrations
• Purchasing services and supplies
• Service to the customer
• Complaints
• Corrective action
• Preventive action
• Test and calibration methods and method validation
• Assuring the quality of test and calibration results
• Reporting the results

Projects have been carried out by organisations in order to evaluate Digital Forensic Tools. The most
well-known project is undertaken by the National Institute of Standards and Technology (NIST) under
the Computer Forensics Tool Testing (CFTT) project [10]. Results of these tests are released to the
public. The Scientific Working Group on Digital Evidence (SWGDE) and the DoD [11] also assess
Digital Forensic Tools; however, their results are available only to U.S. law enforcement agencies [12].
It is difficult to understand the reasoning behind the choice not to release this information, because
computer forensics, like any other science, is based on information sharing. Even if all results were
available, it would not be possible for these organisations to keep up with the fast pace of tool
development [13]. In addition, many practitioners rely too heavily on vendors’ capability to validate
their own tools [3]. Furthermore, Beckett [3] argued that, in order to comply with the standard’s
requirements, laboratories should not rely on testing performed by another organisation.

The risk of failure in Digital Forensic Tools has been demonstrated by different authors. NIST [14]
showed that the well-known acquisition tool dd [15] was not able to retrieve the last sector of a hard
drive if the drive had an odd number of sectors. These results were confirmed by Kornblum [16].
However, the author explained that the behaviour did not come from the tool implementation; instead,
he argued that the issue came from the Linux Kernel 2.4 and was absent from Linux Kernel 2.6, which
demonstrates that organisations which validate DFTs can make mistakes. If results are followed blindly
by laboratories, a major issue might arise if errors have been introduced in the testing procedure. The
previous example discussed issues related to software. However, investigators might have to use other
tools which combine hardware and software, such as write blockers. NIST produced an evaluation
methodology for this type of product and evaluated multiple write blockers. Beckett [3] clearly
explained the risks that a laboratory might encounter if no additional testing is carried out. Each device
needs to be tested before it can be used in the field, as a manufacturing fault may be present, or the
device may have been damaged, for instance, during transport.

NIST Standardised Approach to Tool Evaluation

In the Computer Forensics Tool Testing (CFTT) project, NIST developed methodologies to validate a
range of forensics tools, initially focusing on data acquisition tools [21, 22] and write blockers [23, 24]
(both software and hardware based). Figure 2 illustrates the methodology used to assess the tools [10].
When a tool is to be tested, the NIST methodology starts with acquiring the tool and reviewing the tool
documentation. If this documentation is non-existent, the tool is analysed in order to generate such
documentation, which leads to a list of features along with the requirements for these features, and thus
a test strategy. This methodology is based on rigorous and scientific methods, and the results are
reviewed by both stakeholders (vendor and testing organization), ensuring a certain level of fairness.
However, this is also the major weakness of the methodology, as the time required for the evaluation
can be significant. The resources needed to carry out each test do not enable a single organisation to
test all tools along with all their versions [13]. Thus, by the time the results are publicly available, the
version of the tested tool might be deprecated. In addition, the requirements for features might evolve,
which needs to be reflected in the test strategy. Moreover, the time needed to define the requirements
of a single function must be counted in years. NIST has defined standards for string searching tools
[25], but since then no additional work has been made publicly available. The specifications for digital
data acquisition tools have remained in draft form since 2004 [21]. These examples show that it is not
viable for law enforcement agencies to rely only on organisations which evaluate DFTs. Some
categories of tools commonly used in digital investigation are simply not covered, such as file carving
tools. For these reasons, it is essential for digital investigators to validate DFTs themselves.

2.3 Validation and Verification of Digital Forensics Tools with Reference Sets

Beckett [3] explained that testing may not find all errors of a DFT, because a complete evaluation of a
product would require extensive resources. The requirements defined by ISO 17025:2005 [9] specify
that validation is a balance between cost, risk and technical possibilities. However, testing should be
able to provide information on the reliability of the tool.

Computer forensics hardware tools:

These devices are often powered from the source or from the suspect machine. However, the forensic
analyst who relies on forensic hardware must ensure that all possible connectors are available prior to
starting a job. Some of the advantages include:

1. The embedded development that has been completed, saving space and time and generally simplifying the
acquisition process.

2. The greater portability of the products.

3. The increased speed of acquiring digital data using hardware devices compared to using software.

Hard Disk Write Protection Tools

Hardware protection devices offer a simple method to acquire an image of a suspect drive with much less fiddling
with the configuration settings in software. This makes the process simpler and less prone to error.

The following section introduces a number of these hardware-based forensic tools.

NoWrite
NoWrite prevents data from being written to the hard disk. It supports hard disk drives with high capacities. It is
compatible with all kinds of devices including USB or FireWire boxes, adapters, and IDE interface cables. It
supports communication between common IDE interfaces.

NoWrite is only functional on native IDE devices. It supports all USB features such as plug-and-play. It is
compatible with most operating systems and drive formats. NoWrite is transparent to the operating system and the
application programs ...

Validating and Testing Forensics Software

1. Introduction


Originating in the late 1980s as an ad hoc practice to meet the service demand from the law
enforcement community, computer forensics has recently developed into a multi-domain
discipline crossing the corporate, academic and law enforcement fields. While the definitions of
computer forensics and its interacting elements vary and depend on the authors and their
backgrounds (Lin, 2008), the core connotation of computer forensics can be concisely described
as the process of identifying, preserving, analyzing and presenting digital evidence in a manner
that is legally acceptable (McKemmish, 1999). In this work, we use the term Electronic Evidence
(EE) which has been in common use by law enforcement and agencies in Australia and which
subsumes, and includes, terms such as computer forensics, digital forensics and forensic
computing.
Over the last decade, the world has experienced an explosive growth in IT technology and
electronic crime. On one hand, the technology field has become very dynamic and the number of
types of digital devices with processing and storage capacity in common usage, such as notebook
computers, iPods, cameras and mobile phones, has grown incredibly rapidly. On the other hand,
unfortunately, these continual advances in IT technology pose complex challenges to the
electronic evidence discipline.

One of the challenges faced by EE practitioners is how to assure the reliability (or forensic
soundness) of digital evidence acquired by EE investigation tools (NRC, 2009). As today's EE
investigations heavily rely on automated software tools, the reliability of investigation outcomes
is predominantly determined by the validity and correctness of such tools and their application
process. Therefore, an insistent demand has been raised by law enforcement and other agencies
to validate and verify EE tools to assure the reliability of digital evidence.
Another factor demanding the validation and verification of EE tools is the request to bring the
EE discipline in line with other established forensic disciplines (e.g. DNA and ballistics) (Beckett
and Slay, 2007). To achieve this goal, one main way is to gain external accreditation, such as
ISO 17025 laboratory accreditation (ISO 17025E). EE laboratories and agencies are tested
against developed criteria and have to satisfy the extensive requirements outlined within this
document to gain accreditation. As a part of the accreditation, the EE tools and their utilization
process need to be tested.
In this work, we propose a functionality orientated paradigm for EE tool validation and
verification based on Beckett's work (Beckett and Slay, 2007). Within this paradigm, we dissect
the EE discipline into several distinct functional categories, such as searching, data recovery and
so on. For each functional category, we further identify its details, e.g. sub-categories,
components, etc. We call this dissection process function mapping. Our focus in this work is
the searching function. We map the searching function, specify its requirements, and design a
reference set for testing EE tools that possess searching functions.
The rest of this paper is organized as follows. Section 2 explains the necessity for EE tool
validation and verification. It also reviews the related work of traditional EE tools testing in the
EE discipline. Section 3 discusses the previous work and identifies their limitations. In Section 4,
we present our functionality orientated testing paradigm in detail, which includes its fundamental
methodology and unique features. Section 5 presents detailed searching function mapping. The
requirements of searching function are identified in Section 6. Lastly, we develop a focused pilot
reference set for testing the searching function in Section 7. This paper is finally concluded by
Section 8.
2. Background and related work
2.1. Validation and verification of software
The methods and technologies that provide confidence in system software are commonly called
software validation and verification (VV). There are two approaches to software VV: software
inspection and software testing (Fisher, 2007). While software inspection takes place at all stages
of the software development life-cycle, inspecting requirements documents, design diagrams and
program codes, software testing runs an implementation of the target software to check if the
software is produced correctly or as intended. The VV work proposed in this paper falls into the
software testing category.
Since its introduction in the early 1990s, the concept of validation and verification has been interpreted in
a number of contexts. The following are some examples.

 1)
In IEEE standard 1012–1998, validation is described as the process of evaluating a system or
component during or at the end of the development process to determine whether it satisfies
requirements. Verification is the process of evaluating a system or component to determine
whether the products of a given development phase satisfy the conditions imposed at the start of
that phase.
 2)
ISO (17025E) describes validation as the confirmation by examination and the provision of
objective evidence that the particular requirements for a specific intended use are fulfilled.
 3)
Boehm (1997), from the software engineering point of view, succinctly defines validation and
verification as “validation: Are we building the right product?” and “verification: Are we
building the product right?”
 4)
The only available description of software validation in the EE discipline is given by the
Scientific Working Group on Digital Evidence (SWGDE, 2004) as an evaluation to determine if
a tool, technique or procedure functions correctly and as intended.
Taking into consideration all these definitions and keeping in mind the requirements of ISO
17025, we adopt the definitions of validation and verification of forensic tools (Beckett and Slay,
2007) as follows.
 •
Validation is the confirmation by examination and the provision of objective evidence that a
tool, technique or procedure functions correctly and as intended.
 •
Verification is the confirmation of a validation with a laboratory's tools, techniques and
procedures.
2.2. Demands of EE tools validation and verification
The process of using automated software has served law enforcement and the courts very well,
and experienced detectives and investigators have been able to use their well-developed policing
skills, in conjunction with the automated software, so as to provide sound evidence. However,
the growth in the field has created a demand for new software (or increased functionality to
existing software) and a means to verify that this software is truly forensic, i.e. capable of
meeting the requirements of the ‘trier of fact’. Another factor demanding EE tools validation and
verification is for the EE discipline to move in line with other established forensic disciplines.

2.2.1. Trustworthiness of digital evidence


The validity and credibility (i.e. the “trustworthiness”) of electronic evidence are of paramount
importance given the forensic (for court) context of the discipline. Nowadays, the collection,
preservation and analysis of electronic evidence in the EE process mainly rely on EE tools
(hardware or software). If the EE tools or their application procedures are incorrect or not as
intended, their results, i.e. digital evidence, will be questioned or may be inadmissible in court.
In other words, the trustworthiness of digital evidence relies on the scientific application of the
process, the analysis and the correct utilization and functioning of computer forensic tools.

However, the EE community is now facing a complex and dynamic environment with regard to
EE tools. On one hand, the technology field has become very dynamic and the types of digital
devices, such as notebook computers, iPods, cameras and mobile phones, have changed
incredibly rapidly. And thus the digital evidence acquired from those devices has also changed.
On the other hand, in such a dynamic technological environment, there is no individual tool that
is able to meet all the needs of a particular investigation (Bogen and Dampier, 2005). Therefore,
the world has been witnessing an explosive boom in EE tools in the last decade. Although these
EE tools are currently being used by law enforcement agencies and EE investigators, we must be
aware that while some of them (e.g. EnCase, FTK) were originally developed for the forensic
purpose, others were designed to meet the needs of particular interest groups (e.g. JkDefrag
(Kessels) is a disk defragmenter and optimizer for Windows 2000/2003/XP/Vista/2008/X64).
Hence, to guarantee that the digital evidence is forensically sound, EE investigators must
validate and verify the EE tools that they are using to collect, preserve and analyze digital
evidence.
2.2.2. Laboratory accreditation
The establishment of digital forensic laboratories within Australia has predominantly been
aligned with law enforcement agencies. While these laboratories or teams have worked
successfully since their establishment, the discipline is now developing to a stage where the
procedures, tools and people must be gauged against a quality and competency framework.

To achieve this goal, one main method is to comply with ISO 17025 Laboratory Accreditation
standard. ISO 17025 specifies the general requirements for the competence to carry
out tests and/or calibrations. It encompasses testing and calibration performed by the laboratory
using standard methods, non-standard methods, and laboratory-developed methods. A laboratory
complying with this standard will also meet the quality management system requirements of ISO
9001. Among these requirements (e.g. document control, internal audits, etc.), the one most
relevant to the subject of this paper is “test and calibration methods and method validation”. Due
to the lack of verification and validation of EE tools, the EE branch may not be accredited in the
way that other law enforcement branches (e.g. DNA and ballistics) already have been. As a
result, preventing the EE branch from becoming the weakest link among law enforcement
departments calls for the verification and validation of EE tools.
2.3. Existing works of EE tools validation and verification
In the last few years, although extensive research efforts have been conducted in the EE
discipline, ranging from generic frameworks and models (Reith et al., 2002, Bogen and Dampier,
2005, Brian, 2006) to practical guidelines and procedures (Beebe and Clark, 2005, Ruibin et al.,
2005, Good practice guide), there is still very little on the validation and verification of digital
evidence and EE tools.
Some efforts (Palmer, 2001, Bor-Wen Hsu and Laih, 2005) in the past have been made to
investigate the “trustworthiness” of digital evidence, that is, the product of the process. In these
works, the digital evidence (the outcome of the tools) is examined rather than the tools
themselves being validated and verified. The question can be asked as to why we should not also
validate the forensic development of such tools.
The National Institute of Standards and Technology (NIST) is one of the pioneers pursuing
the validation and verification of computer forensic tools. Within NIST, the Computer Forensics
Tool Testing (CFTT) project (NIST, 2002) was established to test the EE tools. The activities
conducted in forensic investigations are separated into discrete functions or categories, such as
write protection, disk imaging, string searching, etc. A test methodology is then developed for
each category. So far, several functionalities and tools have been tested and documented, such as
write blockers (NIST, 2003), disk imaging (NIST, 2004, NIST, 2005), string search (NIST,
2009) and mobile devices associated tools (NIST, 2008).
Developing extensive and exhaustive tests for digital investigation tools is a lengthy and
complex process, which the CFTT project at NIST has taken on. To fill the gap between
extensive tests from NIST and no public tests, Carrier (Brian, 2005) has been developing small
test cases, called Digital Forensics Tool Testing Images (DFTT). These tests include keyword
searching, data carving, extended partition and Windows memory analysis.
Another research entity that is interested in the validation and verification of EE tools is the
Scientific Working Group on Digital Evidence (SWGDE). Rather than developing specific test
cases, the SWGDE recommended general guidelines for validation testing of EE tools (SWGDE,
2004). These guidelines include purpose and scope of testing, requirements to be tested,
methodology, test scenario selection, test data and documenting test data used.
The validation and verification of EE tools can also be conducted by the vendors that produce
these tools. For example, EnCase and FTK are two widely used digital forensic investigation
tools in the world. Their developers, Guidance Software and Access Data, have conducted some
validation and verification work on EnCase and FTK. This work can be found on their bulletin
boards.

3. Discussion of existing works


3.1. Various EE tools VV approaches
As pointed out in the last section, a range of excellent work has been conducted on the validation
and verification of EE tools, such as NIST/CFTT, DFTT and the VV work of EE tool vendors. In
terms of the methods or methodologies that they utilize, this testing work can be categorized
into two classes: tool orientated VV and functionality orientated VV.

3.1.1. Tool orientated VV approach


The validation and verification work of EE tools conducted by the vendors (e.g. EnCase from
Guidance Software and FTK from Access Data) falls into this category. Traditionally, in the
digital forensic domain, the EE software tool, as an unseparated entity, is treated as the target of
validation and verification. Usually, axiomatic proofs and/or reproducible experiments (testing)
are required to perform the VV. To validate the target, the test cases need to be defined, the tests
need to be run and the measured results need to be verified.

There are a few points that need to be noted. First, vendor validation has been widely
undocumented, and not proven publicly, except through rhetoric and hearsay on bulletin boards.
Many published documents in this field discuss repeatability of process with other tools as the
main validation technique, but no documented record can be found in the discipline that expands
on the notion of two tools being wrong (Beckett and Slay, 2007).
Secondly, this validation work treats the EE software package as a single unseparated entity.
Tool orientated validation methods would usually invalidate a tool package when one of its
functions fails the validation, even though all other functions pass the test. In most cases a
forensic tool package is quite complex and provides hundreds of specific functions (Wilsdon and
Slay, 2005), of which only a few may ever be used by an examiner. Therefore, according to the
traditional tool orientated validation method, a digital forensic software suite, where most
functions are valid and can be partially utilized, will not be utilized at all, or else must wait for
the complete validation of the entire function set. Because the cost of purchasing such software is
so great it would be infeasible to discount an entire package due to a single or small group of
functions failing validation.
Following the second point is the cost and complexity issue of the tool orientated VV approach.
Currently, to keep abreast of the broad range and rapid evolution of technology, many EE tools
(and their updated versions) are constantly emerging. These tools either are designed solely for
forensic purposes or are designed to meet the needs of particular interest groups. In such a
complex and diverse environment of EE tools, even trivial testing of all functions of a forensic
tool, for every version, under all conditions would, by conservative estimates, incur significant
cost (Beckett and Slay, 2007).
3.1.2. Functionality orientated VV approach
NIST/CFTT and DFTT perform the validation and verification of EE tools from another angle:
functionality driven. Instead of targeting the EE software tool, they start the validation by
looking at the EE discipline itself. They identify various activities required in forensic
investigation procedures and separate them into functionalities or categories, such as write
protection, disk imaging, string searching, etc. Then, they specify requirements that need to be
fulfilled for each function category. Based on the requirements specification, testing cases are
then designed to test functions of candidate EE tools.

The difference between the functionality orientated VV approach and the tool orientated VV
approach is that the former does not treat an EE tool as a single entity. Instead, it parses an EE
tool (or package) into various functions and tests each function against the requirements specified
by practitioners and expert advisory groups. For example, in the case of disk imaging testing
(NIST, 2005), the EnCase LinEn 6.01 is selected as a test target and only its imaging function is
tested. Clearly, the functionality orientated VV approach outperforms the tool orientated VV
approach in terms of the effectiveness and cost.
3.2. Open issues in previous work
Despite the considerable achievements of previous EE work (including validation and
verification of digital evidence and investigation tools), we discover two potential issues
remaining unsolved, which motivate our proposed work.

The first open issue is that the operational focus in the digital forensics domain to date has been to
solve each problem as it presents itself and not to look at the process of analysis as a whole. For
example, when dealing with the issue of analyzing an image obtained from a new device (e.g. a
new iPod), researchers and practitioners may design an investigation tool specifically working
with this new device, rather than examining what impact this will have on digital forensics as a
scientific discipline.

Digital forensics is very much an emerging discipline and has developed in an ad-hoc fashion
(Beckett and Slay, 2007) without much of the scientific rigour of other scientific disciplines,
such as DNA, ballistics, and fingerprints. Although the scientific foundations of the EE field and the
functions which together make up the EE process exist, they have never been formally or
systematically mapped and specified (scientific foundations), or stated and characterized
(functions). Though there have been recent efforts to formalize a definitive theory of digital
forensics and research dissertations that focus on the process model have started to appear
(Brian, 2006), there is still no adequate description of any depth of the specific functions of the
discipline.
The second open issue regarding the validation and verification of EE tools is that methodologies
proposed by NIST/CFTT and DFTT are broad and offer no conclusive detailed identification of
what needs to be tested. In other words, there is still a lack of systematical and definitive
description of the EE field as a scientistic discipline. For example, what basic procedures are in
the EE investigation? What fundamental functionalities are needed in the EE investigation? What
are the requirements of each functionality?

4. A new functionality orientated VV paradigm


4.1. Proposed VV methodology
Our methodology starts with a scientific and systematic description of the EE field through a
model and the function mapping. Components and processes of the EE discipline are defined in
this model and fundamental functions in EE investigation process are specified (mapped), i.e.
searching, data preservation, file identification, etc. Based on the comprehensive and clear
understanding of EE discipline, we then actually perform the validation and verification of EE
tools as follows. First, for each mapped function, we specify its requirements. Then, we develop
a reference set in which each test case (or scenario) is designed corresponding to one function
requirement. With the reference set, an EE tool or its functions can be validated and verified
independently.

In this work, we use the CFSAP (computer forensic-secure, analyze, present) model (Mohay
et al., 2003) to describe the basic procedures of EE investigation. In this model, four fundamental
procedures are identified: Identification, preservation, analysis and presentation. In the context of
validation and verification, identification and presentation are skill-based concepts, while
preservation and analysis are predominantly process, function and tool driven concepts and are
therefore subject to tool validation and verification. In Beckett's previous work (Beckett and
Slay, 2007), the processes of preservation and analysis are preliminarily dissected into several
fundamental functions at an abstract level. The functions in the data preservation procedure are
forensic copy, verification, write protection and media sanitation. The data analysis procedure
involves eight functions: searching, file rendering, data recovery, decryption, file identification,
processing, temporal data and process automation. An ontology of such function mapping is
shown in Fig. 1.


Fig. 1. Validation and verification top level mapping.
In this work, we aim to complete the mapping of the functional categories of the field to a level
of abstraction that would serve the purposes of a specification for a software developer, a
technical trainer or educator, or for tool validation or verification. Specifically, we detail the
specification of functional categories (e.g. searching, data preservation, file rendering, etc.)
and their sub-categories. For example, the searching function category can be further divided into
three sub-categories, i.e. searching target, searching mode and searching domain, which have a
number of parameters that need to be specified. We focus this work on the searching function, i.e.
mapping the searching function, specifying the requirements of searching function and
developing the reference set to validate and verify EE tools that possess the searching function.
For each mapped function, we specify its requirements. The following are some examples of
searching function requirements.

 •
The tool shall find a keyword in the file slack.

 •
The tool shall find a keyword in a deleted file.
 •
The tool shall find a regular expression in a compressed file.
Based on the requirement specification, we then develop a reference set in which each test case
(or scenario) is designed corresponding to one function requirement. With the reference set, an EE
tool or its functions can be validated and verified independently.

Our proposed VV methodology can be summarized as follows. If the domain of computer
forensic functions is known and the domain of expected results (i.e. requirements of each
function) are known, that is, the range and specification of the results, then the process of
validating any tool can be as simple as providing a set of references with known results. When a
tool is tested, a set of metrics can also be derived to determine the fundamental scientific
measurements of accuracy and precision. In summary, if the discipline can be mapped in terms
of functions (and their specifications) and, for each function, the expected results are identified
and mapped as a reference set, then any tool, regardless of its original design intention, can be
validated against known elements.
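
To make the known-answer idea concrete, here is a minimal, self-contained sketch: it plants a keyword at a known offset in a synthetic image and checks that a search function under test reports exactly that offset. Real reference sets plant targets in file slack, deleted files, compressed files and so on; this only illustrates the principle.

```python
# Minimal sketch of a reference-set test: plant a known keyword at a known
# offset in a synthetic image, then verify a search function locates it.
KEYWORD = b"EVIDENCE"
OFFSET = 4096

image = bytearray(b"\x00" * 16384)          # synthetic 16 KiB "image"
image[OFFSET:OFFSET + len(KEYWORD)] = KEYWORD

def search(data, needle):
    """Return every offset at which needle occurs (the function under test)."""
    hits, start = [], 0
    while True:
        pos = data.find(needle, start)
        if pos == -1:
            return hits
        hits.append(pos)
        start = pos + 1

hits = search(bytes(image), KEYWORD)
assert hits == [OFFSET], f"expected [{OFFSET}], got {hits}"
print("Search function validated against the reference set:", hits)
```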

4.2. Features and benefits


Our functionality orientated validation is performed on individual functionality of the software
package rather than treating the package as a single entity. Validating each of the specific
functions allows for a complex collection of tools to be partially utilized for active investigations
rather than waiting for the complete validation of the complete set. Because of the function
mapping, requirements specification and reference sets (which are described in detail in
Section 5 Searching function mapping, 6 Requirements specification of searching function and
Section 7), our validation approach has the following features.
 •
Detachability: an EE tool suite may possess a number of functions. Each function can be
individually validated by our testing approach, and the tool suite can be partially utilized rather
than waiting for the complete validation of the complete function set.

 •
Extensibility: With a defined function, there exists a set of specifications for components that
must be satisfied for the result of a function to be valid. That means as new specifications are
found they can be added to a schema that defines the specification.

 •
Tool (tool version) Neutrality: If the results, or range of expected results, are known for a
particular function, then it does not matter which tool is applied, only that the results it returns
for a known reference set can be measured. As a tool naturally develops over time, new versions
are common, but the basic premise of this validation and verification paradigm means that the
properties previously described for tool neutrality remain measurable across versions.

 •
Transparency: A set of known references described as a schema is therefore auditable and
independently testable.
Besides validating and verifying the EE tools, a standard test paradigm is also useful for
proficiency testing (certification), training (competency) and the development of procedures.
According to a survey conducted by the National Institute of Justice (Appel and Pollitt, 2005),
only 57% of agencies in the US required specific training to duplicate, examine and analyze
evidence, and more than 70% of practitioners had no or minimal (less than a few hours of)
training in this discipline. The situation in Australia is no different: there are few new
investigators, and the pool of seasoned detectives with advanced IT qualifications is drying up.
Although some modern IT security certifications include aspects of forensic computing as part
of incident response, there is no formal Australian certification, or established training
standards, for Electronic Evidence.
particular test may similarly be specified. For example, if we know that a piece of software must
be able to search for a keyword from an image, then we can also specify that the investigator will
only be certified as competent if he or she can use the software to analyze the image and find the
same specified keyword.

5. Searching function mapping


For our validation and verification model to be developed, the discipline needs to be described in
sufficient detail so that the discrete functions can be applied to the model. In this section, we
complete the searching function mapping by detailing its sub-categories and various factors that
need to be considered.

Generally speaking, in the computer forensic domain searching relates to finding and
locating information of interest on digital devices. Naturally, several questions arise
when performing a search: what do we search for? How do we search? And where do we search?
To answer these questions, we divide the searching function category into three
sub-categories: searching target, searching mode and searching domain.

Fingerprint recognition and iris scanning are the most well-known forms of
biometric security. However, facial recognition and (finger and palm) vein
pattern recognition are also gaining in popularity. In this article we consider the
pros and cons of all these different techniques for biometric security.

1. Fingerprint recognition
An identification system based on fingerprint recognition looks for specific
characteristics in the line pattern on the surface of the finger. The bifurcations, ridge
endings and islands that make up this line pattern are stored in the form of an image.

The disadvantage of capturing an image of an external characteristic is that this image


can be replicated – even if it is stored in encoded form. An image is still an image,
after all, and can therefore be compared. In principle, you can then generate the same
code. Fingerprints can already be spoofed* using relatively accessible technology.
Another, by no means insignificant, point to consider is that a finger presented for
recognition does not necessarily still need to be attached to a body...

In addition, some line patterns are so similar that in practice this can result in a high
false acceptance rate.** Fingerprints can also wear away as you get older, if you do a
lot of DIY or a particular kind of work, for example. As a result, some people may
find that their fingerprints cannot be recognised (false rejection**) or even recorded.
There is even a hereditary disorder that results in people being born without
fingerprints!

On the other hand, fingerprint identification is already familiar to much of the public
and is therefore accepted by a large number of users to use as biometric security. The
technology is also relatively cheap and easy to use. It should be noted, however, that
quality can vary significantly from one fingerprint recognition system to another, with
considerable divergence between systems in terms of false acceptance and false
rejection rates.

* Biometric spoofing refers to the presentation of a falsified biometric characteristic


with the aim of being identified as another person. This may involve using a
replicated fingerprint or a contact lens with a falsified iris pattern. The risk of
spoofing mainly applies to forms of biometric security based on superficial external
characteristics.

** Find out more about false acceptance and false rejection in our article ‘FAR and
FRR: security level versus user convenience’.


2. Facial recognition

A facial recognition system analyses the shape and position of different parts of the
face to determine a match. Surface features, such as the skin, are also sometimes taken
into account.

Facial recognition for biometric security purposes is an offshoot of face detection


technology, which is used to identify faces in complex images in which a number of
faces may be present. This technology has developed rapidly in recent years and is
therefore an excellent candidate as biometric security if a system is needed for remote
recognition. Another plus is that the technology allows ‘negative identification’, or the
exclusion of faces, making it a good deal easier to scan a crowd for suspicious
individuals.

However, facial recognition also has a number of significant drawbacks. For example,
the technology focuses mainly on the face itself, i.e. from the hairline down. As a
result, a person usually has to be looking straight at the camera to make recognition
possible. And even though the technology is still developing at a rapid pace, the level
of security it currently offers does not yet rival that of iris scanning or vein pattern
recognition.
3. Iris recognition

When an iris scan is performed a scanner reads out the unique characteristics of an
iris, which are then converted into an encrypted (bar)code. Iris scanning is known to
be an excellent biometric security technique, especially if it is performed using
infrared light.

However, one problem frequently encountered when the technology is introduced is


resistance from users. Quite a few people find having their eyes scanned a rather
unpleasant experience. You also have to adopt a certain position so the scanner can
read your iris, which can cause discomfort. Hygiene is another frequently cited
drawback, as many systems require users to place their chin on a chin rest that has
been used by countless people before them.

Lastly, it is important to bear in mind that although iris scanning offers a high level of
biometric security, this may come at the expense of speed. Incidentally, systems have
recently been developed that can read a person’s iris from a (relatively short) distance.


4. Finger vein pattern recognition


In the case of vein pattern recognition the ending points and bifurcations of the veins
in the finger are captured in the form of an image, digitised and converted into an
encrypted code. This method, combined with the fact that veins are found beneath
rather than on the surface of the skin, makes this technology considerably more secure
than fingerprint-based identification, as well as faster and more convenient for the
user. It is a more expensive method, however.

Another point to bear in mind is that very cold fingers and ‘dead’ fingers (such as
those of people suffering from Raynaud’s syndrome) are impossible or difficult to
read using finger vein pattern recognition. Perhaps the greatest drawback, however, is
that this type of biometric security is still relatively unknown.

5. Palm vein pattern recognition


This technique is also based on the recognition of unique vein patterns. However, as
more reference points are used than in the case of finger vein pattern recognition, this
is an even simpler and more secure identification method.

The technology, which cannot be copied (or only with extreme difficulty), is currently
regarded as the best available method in the area of biometric security, alongside iris
scanning. Palm scanning is fast and accurate and offers a high level of user
convenience.

Access control systems based on palm vein pattern recognition are relatively
expensive. For that reason such systems are mainly used within sectors that have
exacting demands when it comes to security, such as government, the justice system
and the banking sector.

Please note that this recognition method is sometimes confused with hand geometry.
However, that is an outdated form of biometrics that is based on the shape of the hand
and involves even fewer unique characteristics than fingerprint recognition.

Forensic Audio-Video Analysis and Verification


Audio and video are digitized sources of evidence that can be found
at the scene of a crime, or with the victim or the accused, in the form of
audio-video recordings from a mobile device or CCTV footage. Such
digital evidence is of utmost importance in civil and criminal cases.
Therefore, audio and video forensics is a leading branch of forensic
science in the digital era.
In forensic science, audio-video forensics rests on three basic principles:
the acquisition, analysis, and evaluation of audio and video
recordings that are admissible in a court of law. One of the main
tasks of audio and video forensic experts is to establish the authenticity
and credibility of digital evidence. The forensic examination of audio and
video is done in order to enhance the recordings to improve speech
intelligibility and audibility of the sounds.
How is the analysis of audio-video evidence performed?

One of the primary tasks of forensic digital investigators is to assist
crime scene investigators in finding conclusive proof using a
number of scientific tools and equipment. After following the
standardized procedure of crime scene investigation, at the time of
evidence collection the investigators must thoroughly search the
suspected area and recover the evidence carefully. Such digital
evidence must be protected from physical harm, environmental
conditions, and heat.
Once the evidence is collected in a safe and secure manner, proper
documentation of the evidence must be done in the form of notes or
photography/videography. The documentation must include the
condition in which the evidence was found at the crime scene, along
with the name of the evidence collector and the date and time of
evidence collection. All examination protocols are carefully reviewed
and constructed so that appropriate enhancement techniques can be
applied to the recovered evidence.
A variety of enhancement techniques can be employed on audio and
video analysis. The techniques which are employed for video
enhancement are as follows:
a. Sharpening: This step makes the edges of the images clearer and more
distinct.
b. Video Stabilization: This step reduces the amount of movement in the
video and produces smoother playback.
c. Masking: This step covers faces or areas of the video in order to
protect the victim, a witness, or a law enforcement official.
d. De-interlacing: Analog systems use interlaced scanning to record
images; de-interlacing such recordings can be necessary to retrieve
the full information they contain.
e. Demultiplexing: A CCTV device called a multiplexer combines multiple
video signals into a single signal; demultiplexing separates them again
so each camera's stream can be analyzed individually.
The techniques which are employed for audio enhancement are as follows:

a. Frequency Equalization: Highly precise equalizers
are used to cut or boost specific bands of frequencies. The frequency
bands that contain most of the speech content are amplified or isolated.
Large noises or interruptions can be analyzed with a spectrum analyzer
and the corresponding frequencies attenuated in order to reduce the
noise level.
b. Compression: Faint or low sounds can be raised by compressing,
or leveling, the signal so that the loud range is reduced.
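Leveling decisions of this kind start from simple measurements such as duration
and peak amplitude. Below is a minimal sketch using Python's standard wave
module, assuming the evidence audio has been exported as 16-bit PCM WAV (the
file name is an illustrative placeholder):

import array
import wave

def wav_summary(path):
    """Report duration (seconds) and peak amplitude of a 16-bit PCM WAV."""
    with wave.open(path, 'rb') as w:
        rate = w.getframerate()
        channels = w.getnchannels()
        assert w.getsampwidth() == 2, 'sketch handles 16-bit PCM only'
        samples = array.array('h', w.readframes(w.getnframes()))
    duration = len(samples) / (rate * channels)
    peak = max(abs(s) for s in samples)   # out of 32767 full scale
    return duration, peak

print(wav_summary('interview_recording.wav'))

A low peak relative to full scale suggests there is headroom to raise faint
speech before compression or equalization is applied.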
Such evidence is critically analyzed, listened to carefully, and properly
documented. Following standard forensic protocol, the
forensic report is prepared by a forensic cyber expert and presented in
the court of law.
The forensic audio-video report must include the following details:
1. Results obtained from the analysis of the audio-video files
2. Waveform charts of the audio recordings and comparison waveforms
with formants
3. Identification of the format and type of recording
4. Type of processing used in the analysis
5. Date and time of analysis of the audio-video files
6. Description of the evidence and the circumstances and
conditions in which it was collected
7. Description of the enhanced audio-video and the software that
was used
8. Qualifications of the audio-video analyst
9. Authorized signatory with name and stamp
10. Hash values of the audio-video files
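Item 10 exists so that anyone can later verify that the files were not altered
after analysis. A minimal sketch of computing such values with Python's
standard hashlib module (the file name is an illustrative placeholder):

import hashlib

def evidence_hashes(path, chunk_size=65536):
    """Compute MD5 and SHA-256 of a file without loading it into memory."""
    md5, sha256 = hashlib.md5(), hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            md5.update(chunk)
            sha256.update(chunk)
    return md5.hexdigest(), sha256.hexdigest()

# Record both digests in the forensic report at the time of acquisition.
print(evidence_hashes('cctv_clip_001.avi'))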

What is Windows Forensic Analysis?

Windows Forensic Analysis focuses on two things:

1. In-depth analysis of the Windows Operating System.
2. Analysis of Windows System Artifacts.
Windows artifacts are the objects which hold information about the activities
performed by the Windows user. The type of information and the location of an
artifact vary from one operating system to another. Windows artifacts contain
sensitive information that is collected and analyzed at the time of forensic
analysis.

What are Forensic Artifacts?

Forensic artifacts are objects that have some forensic value: any
object that contains data or evidence of something that has occurred, such as
logs, registry hives, and many more. In this section, we will go through
some of the forensic artifacts that a forensic investigator looks for while
performing a forensic analysis on Windows.
1. Recycle Bin: The Windows Recycle Bin contains some great artifacts like:
 The $I file containing the metadata. You can find this file under the
path C:\$Recycle.Bin\SID*\$Ixxxxxx
 The $R file containing the contents of the deleted file. This file can be
located under the path C:\$Recycle.Bin\SID*\$Rxxxxxx
 $I files can be parsed using a tool such as $I Parse, or with a few lines
of Python, as sketched below.
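The sketch below decodes a $I file using only the Python standard library. It
assumes the Windows 10 (version 2) layout, an 8-byte version field, an 8-byte
original file size, an 8-byte FILETIME deletion timestamp, a 4-byte path length
and then a UTF-16LE path, so treat it as a sketch rather than a complete parser:

import struct
from datetime import datetime, timedelta

def parse_i_file(path):
    """Decode a Windows 10 (version 2) $I recycle bin metadata file."""
    with open(path, 'rb') as f:
        data = f.read()
    version, size, filetime = struct.unpack_from('<QQQ', data, 0)
    # FILETIME counts 100-nanosecond intervals since 1601-01-01.
    deleted = datetime(1601, 1, 1) + timedelta(microseconds=filetime // 10)
    name_len = struct.unpack_from('<I', data, 24)[0]        # in characters
    original = data[28:28 + 2 * name_len].decode('utf-16-le').rstrip('\x00')
    return version, size, deleted, original

# Hypothetical $I file copied out of a suspect's Recycle Bin.
print(parse_i_file(r'evidence\$IABCDEF.txt'))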
2. Browsers: Web browsers contain a lot of information like:
 Cookies.
<="" li="" style="box-sizing: border-box;">

 Cached website data.


 Downloaded files.
3. Windows Error Reporting: This feature enables users to inform Microsoft
about application faults, kernel faults, unresponsive applications, and other
application-specific problems. This feature provides us with various artifacts
like:
 Evidence of program execution, e.g. if a malicious program crashed during
its execution.
 You can locate these artifacts at the following locations:
 C:\ProgramData\Microsoft\Windows\WER\ReportArchive
 C:\Users\XXX\AppData\Local\Microsoft\Windows\WER\ReportArchive
 C:\ProgramData\Microsoft\Windows\WER\ReportQueue
C:\Users\XXX\AppData\Local\Microsoft\Windows\WER\ReportQueue
4. Remote Desktop Protocol Cache: When the “mstsc” client provided
by Windows is used, RDP allows moving laterally through the
network. Cache files are created containing sections of the screen of the
machine to which we are connected that rarely change. These cache
files can be located in the directory:
C:\Users\XXX\AppData\Local\Microsoft\Terminal Server Client\Cache
Tools like BMC-Tools can be used to extract images stored in these cache files.
5. LNK Files: .lnk files are Windows shortcut files. LNK files link or point to
other files or executables for ease of access. You can find the following
information in these files:
 The original path of the target file.
 Timestamp of both the target files and the .lnk files.
 File Attributes like System, Hidden, etc.
 Details about the disk.
 Remote or local execution.
 MAC address of the machines.
You can use tools like Windows LNK Parsing Library or LECmd to parse the
contents of these files.
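Alternatively, the fixed 76-byte ShellLinkHeader at the start of every .lnk
file can be read directly. The sketch below extracts the three FILETIME
timestamps, decoded the same way as in the Recycle Bin sketch above (the file
name is a placeholder):

import struct
from datetime import datetime, timedelta

def lnk_times(path):
    """Read creation/access/write FILETIMEs from a .lnk ShellLinkHeader."""
    with open(path, 'rb') as f:
        header = f.read(76)                    # fixed-size header
    assert struct.unpack_from('<I', header, 0)[0] == 0x4C, 'not a LNK file'
    def filetime(offset):
        value = struct.unpack_from('<Q', header, offset)[0]
        return datetime(1601, 1, 1) + timedelta(microseconds=value // 10)
    return {'created': filetime(28), 'accessed': filetime(36),
            'written': filetime(44)}

print(lnk_times(r'evidence\document.lnk'))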
6. Jump Lists: They contain information about the recently accessed
applications and files. This feature was introduced with Windows 7. Two types
of Jump Lists can be created in Windows:
 AUTOMATICDESTINATIONS-MS: These jump lists are created
automatically when a user opens a file or an application. They are located
under the path:
C:\Users\xxx\AppData\Roaming\Microsoft\Windows\Recent\AutomaticDestinations
 CUSTOMDESTINATIONS-MS: These jump lists are custom-made and are
created when a user pins a file or an application. They are located under the
directory:
C:\Users\xxx\AppData\Roaming\Microsoft\Windows\Recent\CustomDestinations
You can use tools like JumpList Explorer, JLECmd, or Windows JumpList
Parser to parse Jump lists.
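For context on what these tools parse: automaticDestinations-ms files are OLE
compound files whose numbered streams are embedded LNK structures, alongside
a DestList stream holding the usage order. A minimal sketch that lists the
streams, assuming the third-party olefile package is installed (pip install
olefile) and using a hypothetical file name:

import olefile

path = (r'C:\Users\xxx\AppData\Roaming\Microsoft\Windows\Recent'
        r'\AutomaticDestinations\5f7b5f1e01b83767.automaticDestinations-ms')

ole = olefile.OleFileIO(path)
for stream in ole.listdir():
    # Numbered streams are embedded LNKs; 'DestList' holds the MRU order.
    print('/'.join(stream), ole.get_size('/'.join(stream)))
ole.close()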
7. Prefetch Files: These files contain a wealth of information like:
 Application Name.
 Application path.
 Last execution timestamp.
 Creation timestamp.
These files can be located under the directory: C:\Windows\Prefetch\. You can
use tools like Windows Prefetch Parser, WinPrefetchView, or PECmd.
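Before running a dedicated parser, the Prefetch directory itself is worth
triaging: the file name embeds the executable name, and on Windows the file's
creation and modification timestamps approximate the first and last run. A
small standard-library sketch (assuming a live system; adjust the path for a
mounted image):

import os
from datetime import datetime

PREFETCH_DIR = r'C:\Windows\Prefetch'

for entry in sorted(os.scandir(PREFETCH_DIR), key=lambda e: e.stat().st_mtime):
    if entry.name.lower().endswith('.pf'):
        st = entry.stat()
        # Name format is EXENAME-PATHHASH.pf; mtime tracks the last run.
        print(entry.name,
              'created:', datetime.fromtimestamp(st.st_ctime),
              'last run:', datetime.fromtimestamp(st.st_mtime))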

Top Open-Source Tools for Windows Forensic Analysis

In this section, we will be discussing some of the open-source tools that are
available for conducting Forensic Analysis in the Windows Operating System.
1. Magnet Encrypted Disk Detector: This tool checks physical drives for
encryption. It supports PGP, SafeBoot encrypted volumes, BitLocker, etc.
2. Magnet RAM Capture: This tool captures the physical memory of the
system so that it can be analyzed.
3. Wireshark: This is a network capture and analyzer tool used to see what
traffic is going over your network.
4. RAM Capture: As the name suggests, this is a free tool used to extract the
entire contents of the volatile memory, i.e. RAM.
5. NMAP: This is the most popular tool for finding open ports on a target
machine; the exposed services it reveals help in assessing how a target
could be attacked.
6. Network Miner: This tool is used as a passive network sniffer to capture or
detect operating systems, ports, sessions, hostnames, etc.
7. Autopsy: This is a GUI-based tool used to analyze hard disks and
smartphones.
8. Forensic Investigator: This is a Splunk toolkit that provides HEX
conversion, Base64 conversion, metascan lookups, and many other features
that are essential in forensic analysis.
9. HashMyFiles: This tool calculates the SHA1 and MD5 hashes of files and
works on all the latest versions of Windows.
10. Crowd Response: This tool gathers system information for incident
response.
11. ExifTool: This tool is used to read, write, and edit meta information in a
number of file types.
12. FAW (Forensic Acquisition of Websites): This tool is used to acquire web
pages (images, HTML, and source code) and can be integrated with
Wireshark.

LINUX FORENSICS

While Windows forensics is widely covered via a number of courses
and articles, there are fewer resources introducing the world of Linux
forensics. I have recently had an opportunity to handle a Linux-based
case. Hence, this article aims to share some useful artifacts which
can be used as a checklist to assist a Linux forensics case and as leads
for further investigation.

OS forensics is the art of finding evidence/artifacts left by systems,
apps and users’ activities to answer a specific question. Windows
forensics is well researched, with multiple de facto standard places to
look for evidence (some of them even hard to wipe completely, like
registry hives): registry hives, event logs, prefetch files, shell items
(e.g. shortcuts, jump lists), UserAssist, SRUM, ShellBags, Amcache.hve,
Shimcache, etc. Linux forensics, on the other hand, is less popular and
has little in common with Windows forensics. This article is organized
in four sections that provide more insight into Linux forensics:
 Image capture and mounting

 System configuration

 User activities (e.g. opened files, commands etc.)

 Logfile analysis

Note:

 Since there are a number of Linux distributions, the article can’t
cover all of them. All artifacts below are presented for Debian.
Fortunately, it is usually trivial to find the equivalent artifacts in
another distribution.

 The article assumes a dead-box situation, which means that you
only have the hard disk(s) from the targeted machine.


Image capture and mounting


There are multiple ways/tools for image capture. FTK Imager (a GUI
tool, freeware from AccessData) is probably one of the most famous
tools for creating digital forensics images (FTK Imager 4.2.1 was the
latest version at the time of writing). There is also a good user guide
on creating a forensic image, Forensics 101: Acquiring an Image with
FTK Imager. However, in this article, I present a command-line utility,
namely dd, which is available in most Linux distributions.
Note: for a sound image capture process, connect the investigated
hard drive to a write blocker so that no change can be made to the
device.

dd (a command-line tool available in most Unix and Linux systems)
copies data at the bit level. Below is the command in action, in
which input is the hard drive of the given Linux box (e.g. /dev/sdb)
and output is where the image is stored (e.g.
/home/forensics/linux_disk.img).
dd if=input of=output

For the forensic investigation, you may want to mount a copy of the
original image in another Linux machine. The steps below illustrate
how to mount a raw image in a Debian Linux machine:

 Step 1: attach the image to a loop device:


sudo losetup /dev/loop0 <raw_image_to_mount> (if
/dev/loop0 is already occupied, /dev/loopX can be used instead)
Then verify that the image is attached using losetup -a

 Step 2: Use kpartx (available on most Linux systems) to map the
image partitions; see also the sketch after these steps. Each partition
will be mapped to /dev/mapper/loop0pX (X is a number)
sudo kpartx -a /dev/loop0
 Step 3: Mount the mapped loopback device read-only on a directory of
your choice
sudo mount -o ro /dev/mapper/loop0pX <mount_point>
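kpartx derives each partition's starting offset from the image's partition
table, and the same offsets can be read directly. The sketch below decodes a
classic MBR (an assumption; it does not handle GPT): the table sits at byte
446 and holds four 16-byte entries, and the starting LBA multiplied by the
512-byte sector size gives an offset that can also be passed to
mount -o ro,loop,offset=N.

import struct

SECTOR = 512

def mbr_partitions(image_path):
    """Yield (type, start offset, length), in bytes, for each MBR entry."""
    with open(image_path, 'rb') as f:
        mbr = f.read(SECTOR)
    assert mbr[510:512] == b'\x55\xaa', 'not an MBR boot sector'
    for i in range(4):
        entry = mbr[446 + 16 * i: 446 + 16 * (i + 1)]
        ptype = entry[4]                     # partition type; 0x00 = unused
        lba_start, num_sectors = struct.unpack_from('<II', entry, 8)
        if ptype:
            yield ptype, lba_start * SECTOR, num_sectors * SECTOR

for ptype, offset, size in mbr_partitions('/home/forensics/linux_disk.img'):
    print('type 0x{:02x} offset {} size {}'.format(ptype, offset, size))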

System Configuration

 Host Name is useful to identify the computer name that the hard
disk belongs to. Furthermore, it can be used to correlate with other
logs and network traffic based on the hostname.

 Time Zone is important to build an event timeline (usually


converted to UTC).

 Network configuration:

/etc/network/interfaces is the configuration file for network setup


(dynamic or static IP assignment as well as scripts running when the
interface is “up” or “down”).
A /etc/network/interfaces sample configuration

/etc/hosts is the configuration file for local DNS name assignment.


/etc/resolv.conf is the configuration file for DNS. However, if
the resolvconf program is used, the configuration for DNS
is /etc/resolvconf/run/resolv.conf.
/etc/dnsmasq.conf is the configuration file for DNS forwarder
server and DHCP server if it is implemented in the investigated host.
/etc/wpa_supplicant/*.conf contains SSID configuration to which
the Linux machine will automatically connect when the wifi signal is in
the vicinity.

 OS information determines the OS release (on Debian, /etc/os-release records the distribution name and version).

 Login information:
There are three places to find this information:
(1) /var/log/auth.log records connections/authentication to the
Linux host. The command “grep -v cron auth.log*|grep -v
sudo|grep -i user” filters out most of the unnecessary data and
leaves only information regarding connection/disconnection.
(2) /var/log/wtmp maintains the status of the system, system
reboot time and user logins (providing time, username and IP
address if available). For more information, please refer to the
utmp article on Wikipedia.
(3) /var/log/btmp records failed login attempts.

Use “last -f” to examine the content of wtmp

 Account and group: may provide more insight about the permissions
of a user of interest, or reveal whether any suspicious account
was created. This information is stored in /etc/passwd (user
accounts) and /etc/group (group information). Furthermore, it is
recommended to check the /etc/sudoers file as well, since it
describes which commands a user can run with elevated privileges.

 Mounted Disk: provides more insight into how the Linux box is set up.
Noticeably, attackers may mount a particular path to RAM; hence,
its contents will not survive a reboot.
 Persistence mechanisms:
- Cron jobs are often used for persistence. Cron jobs can be
examined in /etc/crontab (system-wide crontab)
and /var/spool/cron/crontabs/<username> (user-wide
crontab)
- Bash Shell initialization: when starting a shell, it will first
execute ~/.bashrc and ~/.bash_profile for each user.
/etc/bash.bashrc and /etc/profile are the system-wide versions
of ~/.bashrc and ~/.bash_profile (if another shell is used, check
the documentation of that shell for similar configuration files).
- Service start-up: System V (configuration files are in
/etc/init.d/* and /etc/rc[0–6].d/*), Upstart (configuration files
are in /etc/init/*) and Systemd (configuration files are
in /lib/systemd/system/* and /etc/systemd/system/*). For more
information regarding service start-up, please refer to How To
Configure a Linux Service to Start Automatically After a Crash or
Reboot — Part 2: Reference
- RC (run-control) is a traditional way, used with init, to start
services/programs when the run level changes. Its configuration can be
found at /etc/rc.local.
NETWORK FORENSICS

The word “forensics” means the use of science and technology to investigate
and establish facts in criminal or civil courts of law. Forensics is the
procedure of applying scientific knowledge to analyze evidence and present it
in court.
Network forensics is a subcategory of digital forensics that essentially deals
with the examination of a network and the traffic going across it when the
network is suspected to be involved in malicious activities, for example a
network that is spreading malware for stealing credentials, and with its
investigation for the purpose of analyzing cyber-attacks. As the internet grew,
cybercrimes also grew along with it, and so did the significance of network
forensics, with the development and acceptance of network-based services such
as the World Wide Web, e-mails, and others.
With the help of network forensics, the entire data exchange can be retrieved,
including messages, file transfers, e-mails, and web browsing history, and
reconstructed to expose the original transaction. It is also possible that only
the payload in the uppermost layer packet winds up on the disc, while the
envelopes used for delivering it are captured only in network traffic. Hence,
the network protocol data that encloses each dialog is often very valuable.
For identifying attacks, investigators must understand network protocols and
applications such as web protocols, e-mail protocols, file transfer protocols,
etc.
Investigators use network forensics to examine network traffic data gathered
from the networks that are involved or suspected of being involved in cyber-
crime or any type of cyber-attack. After that, the experts look for data that
points in the direction of file manipulation, human communication, etc. With
the help of network forensics, investigators and cybercrime experts can
generally track down all the communications and establish timelines based on
the network event logs logged by the NCS.

Processes Involved in Network Forensics:


Some processes involved in network forensics are given below:
 Identification: In this process, investigators identify and evaluate the
incident based on network pointers.
 Safeguarding: In this process, the investigators preserve and secure the
data so that tampering can be prevented.
 Accumulation: In this step, a detailed report of the crime scene is
documented and all the collected digital evidence is duplicated (see the
sketch after this list).
 Observation: In this process, all the visible data is tracked along with the
metadata.
 Investigation: In this process, a final conclusion is drawn from the
collected evidence.
 Documentation: In this process, all the evidence, reports, and conclusions
are documented and presented in court.
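As a small illustration of the accumulation and observation steps, the sketch
below tallies the conversations in recorded traffic so that unusual endpoints
stand out. It assumes the third-party scapy package is installed (pip install
scapy) and that a capture file named evidence.pcap exists:

from collections import Counter

from scapy.all import IP, rdpcap

packets = rdpcap('evidence.pcap')            # hypothetical capture file
conversations = Counter(
    (pkt[IP].src, pkt[IP].dst) for pkt in packets if IP in pkt
)

# The busiest source/destination pairs often merit the first look.
for (src, dst), count in conversations.most_common(10):
    print('{} -> {}: {} packets'.format(src, dst, count))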
Challenges in Network Forensics:
 The biggest challenge is to manage the data generated during the process.
 Intrinsic anonymity of the IP.
 Address Spoofing.

Advantages:
 Network forensics helps in identifying security threats and vulnerabilities.
 It analyzes and monitors network performance demands.
 Network forensics helps in reducing downtime.
 Network resources can be used in a better way by reporting and better
planning.
 It helps in a detailed network search for any trace of evidence left on the
network.
Disadvantage:
 The only disadvantage of network forensics is that it is difficult to implement.

E-Mail Investigation:

Role of Email in Investigation


Emails play a very important role in business communications and
have emerged as one of the most important applications on the
internet. They are a convenient mode for sending messages as well
as documents, not only from computers but also from other
electronic gadgets such as mobile phones and tablets.

The negative side of emails is that criminals may use them to leak
important information about their company. Hence, the role of emails in
digital forensics has increased in recent years. In digital forensics,
emails are considered crucial evidence, and email header
analysis has become important for collecting evidence during the
forensic process.

An investigator has the following goals while performing email


forensics −

 To identify the main criminal
 To collect the necessary evidence
 To present the findings
 To build the case

Challenges in Email Forensics


Email forensics plays a very important role in investigations, as most
communication in the present era relies on emails. However, an
email forensic investigator may face the following challenges during
an investigation −
Fake Emails

The biggest challenge in email forensics is the use of fake e-mails,
which are created by manipulating and scripting headers, etc. In this
category criminals also use temporary email, a service that
allows a registered user to receive email at a temporary address
that expires after a certain time period.

Spoofing

Another challenge in email forensics is spoofing, in which criminals
present an email as someone else’s. In this case the
machine receives both the fake as well as the original IP address.

Anonymous Re-emailing

Here, the Email server strips identifying information from the email
message before forwarding it further. This leads to another big
challenge for email investigations.

Techniques Used in Email Forensic Investigation


Email forensics is the study of the source and content of email as
evidence to identify the actual sender and recipient of a message,
along with other information such as the date/time of
transmission and the intention of the sender. It involves investigating
metadata, port scanning, as well as keyword searching.

Some of the common techniques which can be used for email


forensic investigation are

 Header Analysis (see the sketch after this list)
 Server investigation
 Network Device Investigation
 Sender Mailer Fingerprints
 Software Embedded Identifiers
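Header analysis, the first technique above, often starts from the Received
headers: each relay prepends one, so reading them bottom-up approximates the
path a message took. A minimal sketch with Python's standard email module
(the file name is a placeholder):

from email import message_from_file

with open('suspect_message.eml') as f:       # hypothetical EML file
    msg = message_from_file(f)

# Relays prepend Received headers, so reverse them to read oldest first.
for hop, received in enumerate(reversed(msg.get_all('Received', [])), 1):
    print('hop {}: {}'.format(hop, ' '.join(received.split())))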

In the following sections, we are going to learn how to fetch


information using Python for the purpose of email investigation.
Extraction of Information from EML files
EML files are basically emails in file format which are widely used for
storing email messages. They are structured text files that are
compatible across multiple email clients such as Microsoft Outlook,
Outlook Express, and Windows Live Mail.

An EML file stores email headers, body content, and attachment data as
plain text. It uses base64 to encode binary data and Quoted-
Printable (QP) encoding to store content information. A Python
script that can be used to extract information from an EML file is given
below −

First, import the following Python libraries as shown below −

from __future__ import print_function


from argparse import ArgumentParser, FileType
from email import message_from_file

import os
import quopri
import base64

Of the above libraries, quopri is used to decode the QP-encoded
values from EML files, while any base64-encoded data can be decoded
with the help of the base64 library.

Next, let us provide an argument for the command-line handler. Note
that it will accept only one argument, which is the path to the
EML file, as shown below −

if __name__ == '__main__':
    parser = ArgumentParser('Extracting information from EML file')
    parser.add_argument("EML_FILE", help="Path to EML File",
                        type=FileType('r'))
    args = parser.parse_args()
    main(args.EML_FILE)
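The excerpt ends before main() is defined. Below is a minimal sketch of what
such a function might look like, assuming the goal is simply to print the key
headers and any plain-text body; in the actual script it would have to be
defined above the entry-point block so that the name exists when called:

def main(eml_file):
    # message_from_file parses the already-open EML file into a Message.
    message = message_from_file(eml_file)
    for header in ('From', 'To', 'Subject', 'Date'):
        print('{}: {}'.format(header, message.get(header)))
    for part in message.walk():
        if part.get_content_type() == 'text/plain':
            # decode=True reverses the base64/quoted-printable encoding.
            body = part.get_payload(decode=True)
            if body:
                print(body.decode(part.get_content_charset() or 'utf-8',
                                  errors='replace'))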
MOBILE FORENSICS

Mobile forensics, a subtype of digital forensics, is concerned with retrieving data
from an electronic source. The recovery of evidence from mobile devices such
as smartphones and tablets is the focus of mobile forensics. Because
individuals rely on mobile devices for so much of their data sending, receiving,
and searching, it is reasonable to assume that these devices hold a significant
quantity of evidence that investigators may utilize.
Mobile devices may store a wide range of information, including phone records
and text messages, as well as online search history and location data. We
frequently associate mobile forensics with law enforcement, but they are not the
only ones who may depend on evidence obtained from a mobile device.
Uses of Mobile Forensics:
The military uses mobile devices to gather intelligence when planning military
operations or investigating terrorist attacks. A corporation may use mobile evidence if it fears
its intellectual property is being stolen or an employee is committing fraud.
Businesses have been known to track employees’ personal usage of business
devices in order to uncover evidence of illegal activity. Law enforcement, on the
other hand, may be able to take advantage of mobile forensics by using
electronic discovery to gather evidence in cases ranging from identity theft to
homicide.
Process of Mobile Device Forensics:

 Seizure and Isolation: According to digital forensics principles, evidence
should always be adequately preserved, analyzed, and admissible in a court of
law. Mobile device seizures are accompanied by a slew of legal difficulties.
The two main risks linked with this step of the mobile forensic method are
lock activation and network / cellular connectivity.
 Identification: The purpose of identification is to retrieve information from
the mobile device. With the appropriate PIN, password, pattern, or biometrics,
a locked screen may be opened. Passcodes are protected, but fingerprints are
not. Apps, photos, SMSs, and messengers may all have comparable lock
features. Encryption, on the other hand, provides security that is difficult to
defeat at the software and/or hardware level.
 Acquisition: Controlling data on mobile devices is difficult since the data
itself is movable. Once messages or data are transmitted from a
smartphone, control is gone. Despite the fact that various devices are
capable of storing vast amounts of data, the data itself may be stored
elsewhere. For example, data synchronization across devices and apps may
be done either directly or via the cloud. Users of mobile devices commonly
utilize services such as Apple’s iCloud and Microsoft’s OneDrive, which
exposes the possibility of data harvesting. As a result, investigators should
be on the lookout for any signs that data may be able to transcend the
mobile device from a physical object, as this might have an impact on the
data collecting and even preservation process.
 Examination and analysis: Because data on mobile devices is
transportable, it is tough to keep track of; once messages or data from a
smartphone are moved, control is lost. The extracted data must therefore be
carefully examined and analyzed to identify evidence relevant to the case,
keeping in mind that the data itself may also be stored elsewhere.
 Reporting: The document or paper trail that shows the seizure, custody,
control, transfer, analysis, and disposition of physical and electronic
evidence is referred to as forensic reporting. It is the process of verifying
how any type of evidence was collected, tracked, and safeguarded.
Principles of Mobile Forensics:
The purpose of mobile forensics is to extract digital evidence or relevant data
from a mobile device while maintaining forensic integrity. To accomplish this,
the mobile forensic technique must establish precise standards for securely
seizing, isolating, transferring, and preserving digital evidence originating
from mobile devices, and for certifying it for investigation.
The process of mobile forensics is usually comparable to that of other fields of
digital forensics. However, it is important to note that the mobile forensics
process has its own unique characteristics that must be taken into account. The
use of proper methods and guidelines is a must if the investigation of mobile
devices is to give positive findings.
