DIGITAL FORENSICS SEMINAR REPORT
ACKNOWLEDGMENT
It is a great pleasure and privilege for me to present this seminar report on Feb. 05, 2013. This seminar is an essential requirement for completing the four-year B.Tech. degree course.
INDEX
01. ACKNOWLEDGMENT
02. INDEX
03. SOFTWARE & HARDWARE USED
04. INTRODUCTION TO DIGITAL FORENSICS
05. BIBLIOGRAPHY
SOFTWARE & HARDWARE USED
Software used:
➢ AccessData Forensic Toolkit (FTK)
➢ EnCase
➢ PRTK (Password Recovery Toolkit)
➢ Registry Viewer
Hardware Used:
➢ Hard Drive copier
➢ Dossier
INTRODUCTION TO DIGITAL FORENSICS
The field of digital evidence, also known as forensic computing, is unlike most other forensic sciences
because the nature of the material under examination is determined, largely, by human ingenuity.
Rather than looking for traces of material deposited by physical or biological entities, which tend
to develop and evolve slowly, we deal with technology which is updated, enhanced and even
created at an alarming rate.
Since the 1960s, the rate of development of digital technology has held true to Moore’s law,
which originally proposed that the density of transistors on a given area of silicon would double
approximately every 18 months. Since the start of the 21st century, the rate has slowed slightly,
but we still see a doubling in density every two years.
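Put as a simple formula (our own illustration of the claim above, not from the source), a doubling every two years means the transistor density after t years is roughly

    N(t) = N_0 * 2^(t/2)

so, for example, a decade of progress gives 2^5 = 32 times the starting density N_0.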
This means that a modern mobile phone can contain more processing power and storage capacity than the computers which NASA used to send man to the moon, in a device a fraction of the size, at a much lower price, easier to use, more reliable and with a smaller power supply.
Key developments
The time when only nerds or geeks were interested in computers is long gone. Advances in
computer usability have led to the development of digital devices which are no longer the sole
preserve of the white-coated “high priests” of computing (once known as the programmers and
operators), but have become accessible to everyone capable of holding a mouse or using a
keyboard. Increasing dependence on computers can, arguably, be traced back to the late 1970s
and early 1980s with the development of machines such as the Apple ][, Lisa and Macintosh;
Sinclair ZX81 and Spectrum; Commodore Vic20, 64 and Amiga and, finally, the IBM Personal
Computer.
The IBM PC, with its standardized low-cost hardware, simple Microsoft Disc Operating System (PC-DOS or MS-DOS) and the backing of the world's largest computer manufacturer, resulted in a host of imitators and compatible machines targeted mainly at business. It seems that Ponselle's law was perceived to be true in business. The creation of low-cost machines that allowed users to perform common computing tasks on their desktops, without having to wait for time on the company mainframe or mini-computer, led to the first steps towards pervasive computing:
“computing anywhere and everywhere”.
The success of these IBM-compatible PCs with Microsoft operating systems and applications
created a de-facto standard, never before seen, which allowed free exchange of data and
information between systems, people and organizations, thus eliminating one of the biggest
barriers to information exchange.
Standardization of software and data created opportunities for “paperless offices”, where every
member of staff had access to computing resources on the desktop – often linked to a local area
network – connecting machines within an office or building for even greater resource sharing
and efficiency. Meanwhile, since the 1960s, work had been progressing on what we know today
as the Internet. This wide area network began life as an academic project designed to allow data
sharing between distant sites, but in a way which allowed the network to be scaled up to include
millions of machines.
Again, this created a de-facto international standard for networking through the creation of an
easy to use system which allowed developers to add new features without compromising the
existing network. In effect, the Internet provides a global “road network” which is capable of
carrying any type of traffic which can be devised.
Prior to 1989, however, the Internet was largely the preserve of the technically minded, mostly
because of the huge number of incompatible applications which existed on it. Tim Berners-Lee, a
British physicist working at CERN, proposed a new information management system for CERN
to counter the problems of information loss, damage and confusion which the organization was
suffering at the time.
The proposal defined an information sharing system which allowed disparate information
systems to be linked together via a common interface based around the concept of Hypertext. In
a Hypertext system, the user can navigate around the text by activating links, which jump to
other pieces of text. Berners-Lee’s innovation was to allow these links to reference documents
and even applications external to the current document. In this way, the World Wide Web as we
know it was born, with a single consistent interface to a range of different applications.
Arguably, this is the single most important innovation in information systems in the 20th
century. It has certainly led to the widespread adoption of Internet services as a part of everyday
life.
Alongside, and slightly behind, the developments in desktop computing and internetworking, the
continual shrinking of components created opportunities for smaller devices to be created. In the
1980s we saw the creation of the first analogue mobile telephony networks, with a proper
launch in the UK in 1982. Although the devices in use were bulky, with very limited battery life (typically a few hours), poor network coverage and susceptibility to interference and eavesdropping, they were well received and became essential tools for modern business. 1982 also saw the launch of the compact disc (CD) by Philips, setting a new standard for audio and
data storage. The thirst for increased capacity in this convenient disc format led to the later
creation of the Digital Versatile Disc (DVD) and the current battles over High-Definition disc standards.
By the 1990s, the GSM [18] standard had been developed for digital mobile telephone networks,
providing better quality and better use of the available networks where it was implemented.
Continued improvements in technology meant that digital handsets had shrunk in size to become
devices which could fit into a briefcase or pocket, with longer battery life and lower cost.
The Internet and the existence of a workable digital telecommunications network, combined with
increasingly powerful low-cost devices, also created a desire to distribute more complex data in
the form of music, photographs and video. Unfortunately, although the communications
technologies were effective, they had not been designed with real-time high quality audio and
video in mind. As a result, it became necessary to develop compression methods such as MPEG [33] which would allow acceptable quality content to be delivered over low-bandwidth
connections. The same technology is currently being used for broadcast digital television and is
used to allow multiple digital channels to occupy the same bandwidth as a single analogue
channel (although the analogue and digital signals cannot be present at the same time).
Inadvertently, the MPEG-2 standard for digital video had a major impact on the music industry through the creation of a new standard for digital audio – MP3 (MPEG-2 Layer 3). The 1990s also
saw some major changes in operating systems. Microsoft finally released its “Chicago” software,
better known as Windows 95, setting a new baseline for the IBM-compatible world. This
included networking in a format which was relatively easy to set up and considerably easier than
the previous Windows 3.11 and Windows for Workgroups systems, which had relied on support
being provided by their underlying DOS. Developments of Windows 95 strengthened network
and hardware support through
Windows 98 and ME until the “home” platform converged with Microsoft’s professional
operating system (Windows NT) to create Windows XP and, at the start of the 21st century,
Windows Vista.
Most recently, a new development in wired telecommunications has driven down the cost of
high-performance internetworking to the point where it has become affordable for domestic
users. Broadband technology, mainly in the form of ADSL (Asymmetric Digital Subscriber Line), offers a high-speed digital connection
using existing telephone wiring. It offers an always-on connection, for those who want it, and
allows consumers to receive more complex, “richer” content, in the form of video and other
media, than was previously possible using slow dial-up connections. The increased speed also means that it has become genuinely possible for someone to work at home as efficiently as they could in an office.
The network connection to their home computer is not as fast as the one they would have in the
corporate network, but the speed is sufficient for them to access core corporate services such as
e-mail.
Digital forensics (sometimes known as digital forensic science) is a branch of forensic science
encompassing the recovery and investigation of material found in digital devices, often in
relation to computer crime.[1][2] The term digital forensics was originally used as a synonym for
computer forensics but has expanded to cover investigation of all devices capable of storing
digital data.[1] With roots in the personal computing revolution of the late 1970s and early '80s,
the discipline evolved in a haphazard manner during the 1990s, and it was not until the early 21st
century that national policies emerged.
Digital forensics investigations have a variety of applications. The most common is to support or
refute a hypothesis before criminal or civil (as part of the electronic discovery process) courts.
Forensics may also feature in the private sector, such as during internal corporate investigations or intrusion investigations (where a specialist probes the nature and extent of an unauthorized network intrusion).
The technical aspect of an investigation is divided into several sub-branches, relating to the type
of digital devices involved: computer forensics, network forensics, forensic data analysis and
mobile device forensics. The typical forensic process encompasses the seizure, forensic imaging
(acquisition) and analysis of digital media and the production of a report into collected evidence.
As well as identifying direct evidence of a crime, digital forensics can be used to attribute
evidence to specific suspects, confirm alibis or statements, determine intent, identify sources (for
example, in copyright cases), or authenticate documents.[3] Investigations are much broader in
scope than other areas of forensic analysis (where the usual aim is to provide answers to a series
of simpler questions), often involving complex time-lines or hypotheses.[4]
History
Prior to the 1980s, crimes involving computers were dealt with using existing laws. The first computer crimes were recognized in the 1978 Florida Computer Crimes Act, which included legislation against the unauthorized modification or deletion of data on a computer system.[5][6]
Over the next few years the range of computer crimes being committed increased, and laws were
passed to deal with issues of copyright, privacy/harassment (e.g., cyber bullying, cyber stalking,
and online predators) and child pornography.[7][8] It was not until the 1980s that federal laws
began to incorporate computer offences. Canada was the first country to pass legislation, in 1983.[6] This was followed by the US Federal Computer Fraud and Abuse Act in 1986, Australian amendments to their crimes acts in 1989 and the British Computer Misuse Act in 1990.[6][8]
The growth in computer crime during the 1980s and 1990s caused law enforcement agencies to
begin establishing specialized groups, usually at the national level, to handle the technical
aspects of investigations. For example, in 1984 the FBI launched a Computer Analysis and
Response Team and the following year a computer crime department was set up within the
British Metropolitan Police fraud squad. As well as being law enforcement professionals, many
of the early members of these groups were also computer hobbyists and became responsible for
the field's initial research and direction.
One of the first practical (or at least publicized) examples of digital forensics was Cliff Stoll's
pursuit of hacker Markus Hess in 1986. Stoll, whose investigation made use of computer and
network forensic techniques, was not a specialized examiner. Many of the earliest forensic
examinations followed the same profile.
Throughout the 1990s there was high demand for the new, and basic, investigative resources.
The strain on central units led to the creation of regional, and even local, level groups to help handle the load. For example, the British National Hi-Tech Crime Unit was set up in 2001 to provide a national infrastructure for computer crime, with
personnel located both centrally in London and with the various regional police forces (the unit
was folded into the Serious Organized Crime Agency (SOCA) in 2006).
During this period the science of digital forensics grew from the ad-hoc tools and techniques
developed by these hobbyist practitioners. This is in contrast to other forensics disciplines which
developed from work by the scientific community.[1][13] It was not until 1992 that the term
"computer forensics" was used in academic literature (although prior to this it had been in
informal use); a paper by Collier and Spaul attempted to justify this new discipline to the forensic science world.[14][15] This swift development resulted in a lack of standardization and training. In his 1995 book, "High-Technology Crime: Investigating Cases Involving Computers", K. Rosenblatt wrote:[6]
"Seizing, preserving, and analyzing evidence stored on a computer is the greatest forensic challenge facing law enforcement in the 1990s. Although most forensic tests, such as fingerprinting and DNA testing, are performed by specially trained experts, the task of collecting and analyzing computer evidence is often assigned to patrol officers and detectives."[16]
Since 2000, in response to the need for standardization, various bodies and agencies have
published guidelines for digital forensics. The Scientific Working Group on Digital Evidence
(SWGDE) produced a 2002 paper, "Best practices for Computer Forensics"; this was followed, in 2005, by the publication of an ISO standard (ISO 17025, General requirements for the competence of testing and calibration laboratories). A European-led international treaty, the
Convention on Cybercrime, came into force in 2004 with the aim of reconciling national
computer crime laws, investigative techniques and international co-operation. The treaty has
been signed by 43 nations (including the US, Canada, Japan, South Africa, UK and other
European nations) and ratified by 16.
The issue of training also received attention. Commercial companies (often forensic software
developers) began to offer certification programs and digital forensic analysis was included as a
topic at the UK specialist investigator training facility, Centrex.[6][10]
Since the late 1990s mobile devices have become more widely available, advancing beyond
simple communication devices, and have been found to be rich sources of information, even for crimes not traditionally associated with digital forensics.[19] Despite this, digital analysis of phones has lagged behind traditional computer media, largely due to problems over the proprietary nature of devices.[20]
Focus has also shifted onto internet crime, particularly the risk of cyber warfare and cyber
terrorism. A February 2010 report by the United States Joint Forces Command concluded:
Through cyberspace, enemies will target industry, academia, government, as well as the military
in the air, land, maritime, and space domains. In much the same way that airpower transformed the battlefield of World War II, cyberspace has fractured the physical barriers that shield a nation from attacks on its commerce and communication.[21]
The field of digital forensics still faces unresolved issues. A 2009 paper, "Digital Forensic
Research: The Good, the Bad and the Unaddressed", by Peterson and Shenoi identified a bias towards Windows operating systems in digital forensics research.[22] In 2010 Simson Garfinkel
identified issues facing digital investigations in the future, including the increasing size of digital
media, the wide availability of encryption to consumers, a growing variety of operating systems
and file formats, an increasing number of individuals owning multiple devices, and legal
limitations on investigators. The paper also identified continued training issues, as well as the
prohibitively high cost of entering the field.[11]
During the 1980s very few specialized digital forensic tools existed, and consequently
investigators often performed live analysis on media, examining computers from within the
operating system using existing sysadmin tools to extract evidence. This practice carried the risk
of modifying data on the disk, either inadvertently or otherwise, which led to claims of evidence
tampering. A number of tools were created during the early 1990s to address the problem.
The need for such software was first recognized in 1989 at the Federal Law Enforcement
Training Center, resulting in the creation of IMDUMP (by Michael White) and, in 1990, SafeBack (developed by Sydex). Similar software was developed in other countries; DIBS (a hardware
and software solution) was released commercially in the UK in 1991, and Rob McKemmish
released Fixed Disk Image free to Australian law enforcement.[9] These tools allowed examiners
to create an exact copy of a piece of digital media to work on, leaving the original disk intact for
verification. By the end of the '90s, as demand for digital evidence grew, more advanced commercial tools such as EnCase and FTK were developed, allowing analysts to examine copies of media without using any live forensics.[6] More recently, a trend towards "live memory forensics" has grown, resulting in the availability of tools such as WindowsSCOPE.
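To illustrate what such imaging tools do, the following is a minimal Python sketch of bit-for-bit acquisition, assuming a Linux system where the suspect drive is attached behind a hardware write-blocker; the device path, image name and block size are illustrative, and real tools add error handling, bad-sector recovery and case metadata.

    import hashlib

    def acquire_image(source="/dev/sdb", dest="evidence.dd", block_size=1024 * 1024):
        """Bit-for-bit copy of a source device into an image file,
        hashing the data during acquisition for later verification."""
        sha256 = hashlib.sha256()
        with open(source, "rb") as src, open(dest, "wb") as out:
            while True:
                block = src.read(block_size)
                if not block:
                    break
                out.write(block)
                sha256.update(block)
        return sha256.hexdigest()

    if __name__ == "__main__":
        print("Acquisition hash (SHA-256):", acquire_image())

The hash is recorded at acquisition time so that both the working copy and the original can be re-verified at any later point.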
More recently the same progression of tool development has occurred for mobile devices;
initially investigators accessed data directly on the device, but soon specialist tools such as XRY or Radio Tactics' Aceso appeared.[6]
Forensic process
During the analysis phase an investigator recovers evidence material using a number of different
methodologies and tools. In 2002, an article in the International Journal of Digital Evidence
referred to this step as "an in-depth systematic search of evidence related to the suspected
crime."1] In 2006, forensics researcher Brian Carrie described an "intuitive procedure" in which
obvious evidence is first identified and then "exhaustive searches are conducted to start filling in
the holes."4]
The actual process of analysis can vary between investigations, but common methodologies
include conducting keyword searches across the digital media (within files as well as unallocated
and slack space), recovering deleted files and extraction of registry information (for example to
list user accounts, or attached USB devices).
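As a rough illustration of a keyword search at the raw-media level, the Python sketch below scans an image file for byte-string keywords and reports their offsets; because it reads the raw image rather than mounted files, unallocated and slack space are covered automatically. The image name and search terms are placeholders.

    def keyword_search(image_path, keywords, chunk_size=1024 * 1024):
        """Scan a raw disk image for byte-string keywords, reporting offsets.

        Chunks overlap by the longest keyword so matches spanning a
        boundary are not missed; the set removes the duplicate hits
        that the overlap can produce.
        """
        overlap = max(len(k) for k in keywords)
        hits = set()
        base = 0  # absolute offset of `data` within the image
        with open(image_path, "rb") as img:
            data = img.read(chunk_size)
            while data:
                for kw in keywords:
                    pos = data.find(kw)
                    while pos != -1:
                        hits.add((base + pos, kw))
                        pos = data.find(kw, pos + 1)
                nxt = img.read(chunk_size)
                if not nxt:
                    break
                base += len(data) - overlap
                data = data[-overlap:] + nxt
        return sorted(hits)

    # Example (hypothetical image and terms):
    # for offset, term in keyword_search("evidence.dd", [b"password", b"invoice"]):
    #     print(hex(offset), term)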
The evidence recovered is analyzed to reconstruct events or actions and to reach conclusions,
work that can often be performed by less specialized staff.[1] When an investigation is complete the data is presented, usually in the form of a written report, in lay persons' terms.[1]
Application
[Figure: an example of an image's Exif metadata that might be used to prove its origin]
Digital forensics is commonly used in both criminal law and private investigation. Traditionally it has been
associated with criminal law, where evidence is collected to support or oppose a hypothesis
before the courts. As with other areas of forensics this is often as part of a wider investigation
spanning a number of disciplines. In some cases the collected evidence is used as a form of
intelligence gathering, used for other purposes than court proceedings (for example to locate,
identify or halt other crimes). As a result, intelligence gathering is sometimes held to a less strict
forensic standard.
In civil litigation or corporate matters digital forensics forms part of the electronic discovery (or
discovery) process. Forensic procedures are similar to those used in criminal investigations, often
with different legal requirements and limitations. Outside of the courts digital forensics can form
a part of internal corporate investigations.
The main focus of digital forensics investigations is to recover objective evidence of a criminal activity (termed actus reus in legal parlance). However, the diverse range of data held in digital devices can help with other areas of inquiry.[3]
Attribution
Metadata and other logs can be used to attribute actions to an individual. For example, personal
documents on a computer drive might identify its owner.
Alibis and statements
Information provided by those involved can be cross-checked with digital evidence. For example, during the investigation into the Soham murders the offender's alibi was disproved when mobile phone records of the person he claimed to be with showed she was out of town at the time.
Intent
As well as finding objective evidence of a crime being committed, investigations can also be used to prove the intent (known by the legal term mens rea). For example, the Internet history of convicted killer Neil Entwistle included references to a site discussing "How to kill people".
Evaluation of source
File artifacts and metadata can be used to identify the origin of a particular piece of data; for example, older versions of Microsoft Word embedded a Globally Unique Identifier (GUID) into files which identified the computer it had been created on. Proving whether a file was produced on the digital device being examined or obtained from elsewhere (e.g., the Internet) can be very important.[3]
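Modern Word files no longer embed that GUID, but they still carry origin-related metadata in a standard location. As a rough illustration (the file name is hypothetical), a .docx file is simply a ZIP archive containing docProps/core.xml, so its author and timestamp fields can be read with the Python standard library alone:

    import zipfile
    import xml.etree.ElementTree as ET

    def docx_core_properties(path):
        """Read author/timestamp metadata from a .docx file's core.xml."""
        with zipfile.ZipFile(path) as zf:
            root = ET.fromstring(zf.read("docProps/core.xml"))
        # Strip XML namespaces for readability; tags include
        # creator, lastModifiedBy, created and modified.
        return {el.tag.split("}")[-1]: el.text for el in root}

    # Example (hypothetical file):
    # print(docx_core_properties("suspect_letter.docx"))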
Document authentication
Related to "Evaluation of source," Meta data associated with digital documents can be easily
modified (for example, by changing the computer clock you can affect the creation date of a
file). Document authentication relates to detecting and identifying falsification of such details.
Limitations
One major limitation to a forensic investigation is the use of encryption; this disrupts initial
examination where pertinent evidence might be located using keywords. Laws to compel
individuals to disclose encryption keys are still relatively new and controversial.[11]
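One common way examiners at least flag such material is a byte-entropy test: encrypted (and compressed) data looks almost uniformly random, so near-maximal entropy suggests that keyword searching will be fruitless. The threshold and file name in this sketch are illustrative, and high entropy alone does not distinguish encryption from compression.

    import math
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        """Shannon entropy in bits per byte (0.0 to 8.0)."""
        if not data:
            return 0.0
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

    def looks_encrypted(path, threshold=7.9):
        """Heuristic: near-maximal entropy in the first megabyte."""
        with open(path, "rb") as f:
            sample = f.read(1024 * 1024)
        return shannon_entropy(sample) > threshold

    # Example (hypothetical file):
    # print(looks_encrypted("container.bin"))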
Legal considerations
The examination of digital media is covered by national and international legislation. For civil
investigations, in particular, laws may restrict the abilities of analysts to undertake examinations.
Restrictions against network monitoring, or reading of personal communications often exist.
During criminal investigation, national laws restrict how much information can be seized. For
example, in the United Kingdom seizure of evidence by law enforcement is governed by the Police and Criminal Evidence Act 1984 (PACE). The "International Organization on Computer Evidence" (IOCE) is one agency that
works to establish compatible international standards for the seizure of evidence.
In the UK the same laws covering computer crime can also affect forensic investigators. The Computer Misuse Act 1990 legislates against unauthorized access to computer material; this is a
particular concern for civil investigators who have more limitations than law enforcement.
An individual’s right to privacy is one area of digital forensics which is still largely undecided by
courts. The US Electronic Communications Privacy Act places limitations on the ability of law
enforcement or civil investigators to intercept and access evidence. The act makes a distinction
between stored communication (e.g. email archives) and transmitted communication (such as
VOIP). The latter, being considered more of a privacy invasion, is harder to obtain a warrant
for.[6][16] The ECPA also affects the ability of companies to investigate the computers and communications of their employees, an aspect that is still under debate as to the extent to which a company can perform such monitoring.[6]
Article 5 of the European Convention on Human Rights asserts similar privacy limitations to the
ECPA and limits the processing and sharing of personal data both within the EU and with
external countries. The ability of UK law enforcement to conduct digital forensics investigations
is legislated by the Regulation of Investigatory Powers Act.[6]
Digital evidence
When used in a court of law digital evidence falls under the same legal guidelines as other forms
of evidence; courts do not usually require more stringent guidelines.[6][28] In the United States the Federal Rules of Evidence are used to evaluate the admissibility of digital evidence; the United Kingdom PACE and Civil Evidence acts have similar guidelines and many other countries have their own laws. US federal laws restrict seizures to items with only obvious evidential value. This is acknowledged as not always being possible to establish with digital media prior to an examination.[26]
Laws dealing with digital evidence are concerned with two issues: integrity and authenticity.
Integrity is ensuring that the act of seizing and acquiring digital media does not modify the
evidence (either the original or the copy). Authenticity refers to the ability to confirm the
integrity of information; for example, that the imaged media matches the original evidence.[26] The ease with which digital media can be modified means that documenting the chain of custody from the crime scene, through analysis and, ultimately, to the court (a form of audit trail) is important to establish the authenticity of evidence.[6]
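As a minimal sketch of what such an audit trail can look like in practice (the field names are illustrative, not any official format), each handling step can be recorded with a timestamp, the examiner's name and the current hash of the evidence image, so that any later modification becomes detectable:

    import hashlib
    import json
    from datetime import datetime, timezone

    def file_sha256(path):
        """SHA-256 of an evidence image, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(1024 * 1024), b""):
                h.update(block)
        return h.hexdigest()

    def log_custody_event(log_path, image_path, examiner, action):
        """Append one chain-of-custody entry to a JSON-lines log."""
        entry = {
            "time": datetime.now(timezone.utc).isoformat(),
            "examiner": examiner,
            "action": action,
            "image": image_path,
            "sha256": file_sha256(image_path),
        }
        with open(log_path, "a") as log:
            log.write(json.dumps(entry) + "\n")

    # Example (hypothetical names):
    # log_custody_event("custody.jsonl", "evidence.dd", "A. Examiner", "acquired at scene")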
Attorneys have argued that because digital evidence can theoretically be altered it undermines
the reliability of the evidence. US judges are beginning to reject this theory; in the case US v. Bonallo the court ruled that "the fact that it is possible to alter data contained in a computer is plainly insufficient to establish untrustworthiness."[6][29] In the United Kingdom guidelines such as those issued by ACPO are followed to help document the authenticity and integrity of evidence.
Digital investigators, particularly in criminal investigations, have to ensure that conclusions are
based upon factual evidence and their own expert knowledge.[6] In the US, for example, the Federal Rules of Evidence state that a qualified expert may testify "in the form of an opinion or otherwise" so long as: (1) the testimony is based upon sufficient facts or data, (2) the testimony is the product of reliable principles and methods, and (3) the witness has applied the principles and methods reliably to the facts of the case.
The sub-branches of digital forensics may each have their own specific guidelines for the
conduct of investigations and the handling of evidence. For example, mobile phones may be
required to be placed in a Faraday shield during seizure or acquisition to prevent further radio
traffic to the device. In the UK forensic examination of computers in criminal matters is subject
to ACPO guidelines.[6]
Investigative tools
The admissibility of digital evidence relies on the tools used to extract it. In the US, forensic tools are subjected to the Daubert standard, where the judge is responsible for ensuring that the processes and software used were acceptable. In a 2003 paper Brian Carrier argued that the Daubert guidelines required the code of forensic tools to be published and peer reviewed. He concluded that "open source tools may more clearly and comprehensively meet the guideline requirements than would closed source tools."[31]
Branches
Digital forensics includes several sub-branches relating to the investigation of various types of
devices, media or artifacts.
Computer forensics
The goal of computer forensics is to explain the current state of a digital artifact, such as a computer system, storage medium or electronic document.[32] The discipline usually covers
computers, embedded systems (digital devices with rudimentary computing power and onboard
memory) and static memory (such as USB pen drives).
Computer forensics can deal with a broad range of information; from logs (such as internet
history) through to the actual files on the drive. In 2007 prosecutors used a spreadsheet recovered
from the computer of Joseph E. Duncan III to show premeditation and secure the death penalty.[3] Sharon Lopatka's killer was identified in 2006 after email messages from him detailing torture and death fantasies were found on her computer.[6]
Network forensics
Network forensics is concerned with the monitoring and analysis of computer network traffic,
both local and WAN/internet, for the purposes of information gathering, evidence collection, or
intrusion detection.[34] Traffic is usually intercepted at the packet level, and either stored for later analysis or filtered in real-time. Unlike other areas of digital forensics, network data is often volatile and rarely logged, making the discipline often reactionary.
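For illustration only, packet-level interception of this kind can be sketched in Python with the third-party scapy library, assuming it is installed and the script runs with capture privileges; the interface name, filter and output file are placeholders.

    # Third-party library: pip install scapy; capture requires root/admin.
    from scapy.all import sniff, wrpcap

    def log_packet(pkt):
        """Print a one-line summary of each intercepted packet."""
        print(pkt.summary())

    # Capture 100 TCP packets on a (hypothetical) interface, printing
    # each in real time, then store them all for later offline analysis.
    packets = sniff(iface="eth0", filter="tcp", count=100, prn=log_packet)
    wrpcap("capture.pcap", packets)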
In 2000 the FBI lured computer hackers Aleksey Ivanov and Gorshkov to the United States for a fake job interview. By monitoring network traffic from the pair's computers, the FBI identified passwords allowing them to collect evidence directly from Russian-based computers.[6][35]
Forensic data analysis
Forensic data analysis is a branch of digital forensics. It examines structured data with the aim of discovering and analysing patterns of fraudulent activity resulting from financial crime.
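The source does not name specific techniques, but one classic pattern test applied to structured financial data is Benford's law: in many naturally occurring ledgers the leading digit d appears with probability log10(1 + 1/d), and fabricated figures often deviate from that curve. A hedged sketch (amounts are assumed positive and non-empty):

    import math
    from collections import Counter

    def first_digit_frequencies(amounts):
        """Observed leading-digit frequencies for positive amounts."""
        digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
        counts = Counter(digits)
        return {d: counts.get(d, 0) / len(digits) for d in range(1, 10)}

    def benford_deviation(amounts):
        """Total absolute deviation from Benford's expected curve;
        larger values can flag figures worth closer inspection."""
        observed = first_digit_frequencies(amounts)
        return sum(abs(observed[d] - math.log10(1 + 1 / d)) for d in range(1, 10))

    # Example with made-up transaction amounts:
    # print(benford_deviation([1200.50, 304.99, 18.20, 990.00, 145.75]))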
Database forensics
Database forensics is a branch of digital forensics relating to the forensic study of databases and
their metadata.[36] Investigations use database contents, log files and in-RAM data to build a
timeline or recover relevant information.
Device Handling
This is particularly true of digital devices as, unlike some other forensic sciences;
we cannot “split” them into separate samples for testing using different processes
by independent parties. The act of cutting a digital device into pieces tends to stop
it working at all. The Association of Chief Police Officers for England and Wales
produces the “Good Practice Guide for Computer Based Electronic Evidence”
[2], which lays down four key principles applicable to the handling and
processing of digital devices.
Principle 1:
No action taken by law enforcement agencies or their agents should change data
held on a computer or storage media which may subsequently be relied upon in
court.
Principle 2:
In circumstances where a person finds it necessary to access original data held on a computer or on storage media, that person must be competent to do so and be able to give evidence explaining the relevance and the implications of their actions.
Principle 3:
An audit trail or other record of all processes applied to computer based electronic
evidence should be created and preserved. An independent third party should be
able to examine those processes and achieve the same result.
Principle 4:
The person in charge of the investigation (the case officer) has overall
responsibility for ensuring that the law and these principles are adhered to.
Examination Principles
As mentioned previously, the ACPO Good Practice Guide for Computer Based
Electronic Evidence 2] requires that any examination of a digital device is done in
such a way as to minimize the possibility of digital contamination. By implication,
this means that all work should be carried out on a copy of the suspect device
rather than the original and, in a perfect world, this would be the case.
There are circumstances, however, where it is not possible to seize equipment and
produce a complete “forensically sound” copy. We must ensure that any action
taken is justified, appropriate and proportionate. In the case of a shared computer
or network this can be difficult to achieve without performing some level of
analysis at the time when the equipment is first discovered. This approach, often called previewing, is also useful in the laboratory environment to assist in
determining which devices should be given priority for examination due to the
nature of their contents.
Evidence Creation
Any piece of data or software in a digital system may be evidence, if it has some
meaning relevant to the investigation. In order to determine which software or data
we need to consider, it is useful to think in terms of how these came to be present
on the system we are examining. We have already considered where the examiner
may find material on a system, but now we are starting to consider how it arrived
on the system in the first place, and what it might mean as a result.
Since these devices are concerned with information storage and processing, it is often said that, where a crime involves them, information security has failed in some way and, although this viewpoint may be valid, it does not always tell the whole story. Some crimes can be committed without breaching any information security, but simply by using devices in the correct way for apparently proper purposes. The fault lies outside the system itself and relates to how it can be used to facilitate activities involving other systems and people.
One approach that can help to understand where evidence may be entering the system uses a seven-element model of information security [47]. Although the model does not directly show how anything may have arrived on a system, it does
allow us to consider the elements, both digital and non-digital, which comprise the
whole system, and thus allows us to consider how it has been used and what it has
been processing. If we understand something about how it may have been used in
committing the crime, we can direct our investigative resources most
appropriately.
Evidence Interpretation
Once the means by which unwanted data (which includes software) have arrived has been identified, we can start to consider their meaning. To achieve a thorough consideration of this we need to think about what the data actually are (Data Content) and where they are (Data Context). It
is possible for illicit material to arrive as a result of innocent (Unknowing) actions and, although
the material is clearly evidence of a crime, the person responsible for it arriving on a system may
not be deliberately or directly involved in the criminal act itself.
BIBLIOGRAPHY
In preparing this seminar on digital forensics, I required information regarding the various tools used. I have therefore consulted the following books and websites.
Books:-
1. Digital Forensics
2. Computer Forensics
Websites:-
➢ www.digitalforensics.com
➢ www.msdn.com