IT 23 044
Beneath the Surface: Exploring the Dark Web and its Societal
Impacts
Hasan Saleh
Abstract
The Dark Web is a hidden part of the Internet that has gained attention due to its illegal
activities and potential impact on society. This thesis aims to explore the structure of the
Dark Web and its actors. Moreover, this thesis covers the effects the Dark Web has had on
individuals and society. A comprehensive literature review, interviews with experts, and
explorations of the Dark Web were used to gather information. The findings reveal that the
Dark Web consists of hidden services that are only accessible using specialised software
and tools which help individuals remain anonymous. Different actors operating on the Dark
Web are identified and categorised into two categories, lawful and unlawful, based on the
activities they carry out. The thesis aims to categorise and analyse the motives and
behaviours of these actors. The anonymity provided by the Dark Web serves different kinds
of purposes: it can facilitate illegal activities such as drug trafficking and cybercrime while
also providing a platform for individuals to express their thoughts freely. The
study concludes that the Dark Web influences various aspects of society, such as privacy,
security and criminal justice. The research seeks to unveil both the potential benefits and
risks associated with the Dark Web and the challenges it poses for law enforcement
agencies. Moreover, the study calls for methods that can be used to combat the negative
impact the Dark Web has on society.
Faculty of Science and Technology, Uppsala University.
Acknowledgments
I would like to express my deepest appreciation to my reviewers, Karl Marklund
and Aletta Nylén, who made this work possible. Their advice, their willingness to
invest time and effort into reviewing my work, their engagement in thoughtful
discussions, and their guidance have been truly remarkable and helped me
through all the stages of writing my thesis. The combined contributions and
efforts of my reviewers have greatly enhanced my research experience and
enabled me to successfully reach this significant milestone in my academic
journey.
Table of Contents
1 Introduction 6
1.1 Purpose . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2 Methodology 7
2.1 Literature Study . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.2 Expert Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.3 Accessing the Dark Web . . . . . . . . . . . . . . . . . . . . . 9
10 Discussion 49
11 Conclusion 50
List of Tables
1 Purpose of usage for different entities on the Dark Web . . . . 29
2 Services and products offered on different Dark Web websites. 35
List of Figures
1 Literature Review Methodology . . . . . . . . . . . . . . . . . 8
2 The OSI model. . . . . . . . . . . . . . . . . . . . . . . . . . . 11
3 A three-way handshake. Source: [1]. . . . . . . . . . . . . . . . 15
4 Code showing an HTTP GET request. . . . . . . . . . . . . . 15
5 Hijacking an HTTP connection. . . . . . . . . . . . . . . . . . 21
6 The different layers of encryption applied on a message using
onion routing. Source: [2]. . . . . . . . . . . . . . . . . . . . . 22
7 Onion routing. Source: [3]. . . . . . . . . . . . . . . . . . . . 23
8 Data visible to eavesdroppers when combining Tor with HTTPS.
Source: [4]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
9 First step of setting up a hidden service (Dark Web website). . 25
10 Results of TorBot crawling on the HiddenWiki . . . . . . . . . 28
11 A Dark Web website which offers free media and news. . . . . 31
12 The New York Times official SecureDrop Dark Web website. . 32
13 A Dark Web website which sells stolen PayPal accounts, eBay
accounts, and credit cards. . . . . . . . . . . . . . . . . . . . . 33
14 A Dark Web website which allows users to hire a hacker. . . . 34
15 A Dark Web drug marketplace. . . . . . . . . . . . . . . . . . 36
16 Top markets on the Dark Web sorted by revenue in 2022.
Adapted from: [5]. . . . . . . . . . . . . . . . . . . . . . . . . 37
17 Monthly sales of drugs through different Dark Web markets.
Source: [6]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
18 Proportion of surveyed Internet users using drugs in the past
year who purchased drugs over the Dark Web. Source: [6]. . . 42
19 The impact that market closures have had on individuals.
Source: [6]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
20 Dark Web’s impact on different entities. . . . . . . . . . . . . 44
21 The results of Operation Pacifier. Source: [7]. . . . . . . . . . 48
1 Introduction
The Internet has had a huge impact on the world and society by enabling
communication between people all over the globe. The evolution of
the Internet has made accessing information and communicating easier than
ever before. As of January 2023, there were about 5.16 billion Internet users
worldwide [8].
The Internet continues to evolve and shape our world in ways that we could
never have imagined. Connecting people across the globe, making it easier
to research, learn, and stay informed about a wide range of topics, and
providing the convenience of shopping from anywhere in the world are just
a few of the possibilities the Internet offers society. The Internet
is actively monitored and protected from activities that could harm
its users [9]. The use of IP addresses makes it possible to identify and
monitor activities occurring on the Internet. For example, websites, servers,
and online services often log the IP addresses of visitors and users. Internet
Service Providers (ISPs) and network administrators can monitor and log
the IP addresses associated with devices on their networks. This is important
for managing network traffic, identifying issues, and maintaining security.
However, in reality, security is only ensured for a limited portion of the
publicly accessible Internet.
The Internet, in general, has different layers where privacy and accountability
vary for each of them. In this thesis, we consider the Internet as divided
into the Surface Web and the Deep Web. The Surface Web, also
known as the visible web, is the part of the Internet that is easily accessible
to anyone [10]. The Surface Web is publicly accessible using standard
search engines [10]. This part of the Internet is actively monitored and
protected; accountability can thus be enforced, since its users are identifiable
and their activities are governed by a combination of national
and international legal frameworks, regulations, and agreements that cover
various aspects of online activities.
Another layer of the Internet is called the Deep Web. Accessing the Deep Web
with search engines alone is not possible, since websites located on the Deep
Web are not indexed. The Deep Web contains a huge amount of data and
information that is not easily accessible to the public. Take, for example, an
academic research paper published on a website where a subscription
fee is required to access its contents. Only subscribers are able to access this
research paper; the fact that the paper is not publicly available makes it
part of the Deep Web. Websites that require any authentication credentials
such as email addresses or passwords are also a part of the Deep Web.
The Dark Web, on the other hand, is a specific part of the Deep Web that is
hidden and only accessible through specialized software and tools. Activities
on the Dark Web are conducted anonymously, making it a shelter for criminal
activities such as cybercrime and drug sales. The anonymity provided
on the Dark Web makes it difficult for law enforcement to detect illegal
activities and creates significant challenges for providing
safety and security on the Dark Web [10]. Since users are not identifiable on
this platform, the Dark Web lacks accountability.
The aim of this thesis is to provide an in-depth understanding of the Dark
Web, discussing its characteristics and functionalities. The thesis also finds
that the Dark Web is an under-researched subject, given the limited
availability of articles focusing on the Dark Web and its impacts.
1.1 Purpose
This thesis mainly focuses on informing the reader about the structure of the
Dark Web and what it comprises. Moreover, the thesis explores the
feasibility and potential implications of mitigating the impact of the Dark
Web. In summary, the three main questions that the research covers are:
RQ1. What is the Dark Web?
RQ2. Which are the primary actors that operate on the Dark Web?
RQ3. What role does the Dark Web play in society?
2 Methodology
This thesis is mainly based on a literature study and an expert analysis
in which experts from different fields were interviewed to supply information
about the Dark Web. To explore the different actors that
operate on the Dark Web and provide information about them, the thesis
also relied on accessing and browsing the Dark Web and on a
Dark Web crawler called TorBot, whose output was used for analysis.
One resource that was used for this review was Google Scholar, which provided a
fair amount of research articles surrounding the Dark Web.
A keyword-based search approach was used to gather comprehensive
information about the Dark Web. The first step of the literature
search was the identification of keywords. A broad set of relevant
keywords was chosen, such as "dark web", "deep web", "Tor", and "hidden
services", aimed at covering different aspects of the Dark Web. Furthermore,
to understand the different activities that take place on the Dark
Web, keywords such as "cybercrime", "markets", "paedophilia", "messaging",
and "anonymity" were used as well.
After identifying relevant keywords, a selection of databases was made.
These consisted of academic databases such as Scopus and Google
Scholar. Both databases provided a fair amount of academic literature
and helped capture studies from various disciplines.
Searches were then executed on the selected databases using the identified
keywords. The searches returned various scholarly articles reviewing
the Dark Web and its functionalities. Not all articles found were
included in the research, since a vast majority of them did not have the Dark
Web as their main focus. Thus, articles which mostly focused on the Surface
Web and general Internet usage were excluded.
Data extracted from the relevant articles was analyzed to identify
common patterns and trends.
3.2.2 IP addresses
IP addresses are used to assign unique identifiers to devices within a
network, enabling routing and communication across the Internet. There
are currently two versions of IP addresses in use today, IPv4 and
IPv6. IPv4 stands for Internet Protocol version 4, while IPv6 stands for
Internet Protocol version 6 [13]. More specifically, an IPv4 address is a
32-bit number split into four groups of 8 bits each. Each 8-bit value is
written in decimal form to produce the familiar dotted notation; an example
of an IPv4 address is 192.158.1.54. IPv6 addresses, on the other hand,
consist of 128-bit numbers which are expressed in hexadecimal form [13].
One might wonder why there are two versions of IP addresses. This is
because IPv4 addresses are 32-bit binary numbers, which can express about
4.3 billion (2^32) unique addresses in total. This was enough when the
protocol was first introduced in the 1980s; however, the exponential growth
of the Internet was unexpected and resulted in a shortage of IPv4 addresses,
which is why IPv6 was introduced as a solution.
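As an aside, the relationship between the dotted notation and the underlying binary number can be made concrete with Python's standard ipaddress module; the IPv4 address is the example from the text, while the IPv6 address is a reserved documentation address chosen for illustration:

```python
import ipaddress

# An IPv4 address is a 32-bit number; the ipaddress module exposes
# the integer behind the dotted notation directly.
v4 = ipaddress.ip_address("192.158.1.54")
print(v4.version)   # 4
print(int(v4))      # 3231580470, the 32-bit integer behind the dots

# An IPv6 address is a 128-bit number written in hexadecimal groups.
v6 = ipaddress.ip_address("2001:db8::1")
print(v6.version)   # 6

# The total IPv4 address space: 2**32, roughly 4.3 billion addresses.
print(2 ** 32)      # 4294967296
```

Converting back with `ipaddress.ip_address(3231580470)` yields the same dotted form, which illustrates that the notation is purely a human-readable rendering of the number.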
As described in Section 3.2.3, TCP ensures that data is transferred and resends data
that has been lost. To initiate a reliable TCP connection between
the server and the client, a three-way handshake is first performed [13]. Figure
3 demonstrates how a three-way handshake is established.
Theoretically speaking, data can now be exchanged between the browser and
the server once a TCP connection has been established. This is done by following
the HTTP protocol, which consists of two types of messages: requests and
responses. HTTP requests are generated by the browser itself and are used to
retrieve information and content from the server. When accessing Wikipedia,
for example, the browser sends an HTTP request message to the
server to retrieve the HTML code and load the website for the
user. HTTP responses are answers to HTTP requests.
An HTTP GET request is generated and sent to the server when trying
to load Wikipedia in the web browser. Figure 4 displays an example
of an HTTP GET request generated when trying to access Wikipedia. The
GET method is used to retrieve the resources of the specified web page,
which in this case is Wikipedia. Other fields specify different things; for
example, the Host field states the domain name, while the User-Agent
field specifies the web browser being used by the client, which in this case
is Mozilla Firefox.
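Since Figure 4 itself is an image, the plain-text nature of such a request can also be shown by constructing one by hand; the path and header values below are illustrative stand-ins, not the exact values from the figure:

```python
# Build the raw, plain-text HTTP GET request a browser would send.
# Header lines are separated by CRLF, and a blank line ends the headers.
request = (
    "GET /wiki/Main_Page HTTP/1.1\r\n"
    "Host: en.wikipedia.org\r\n"
    "User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Firefox/115.0\r\n"
    "Accept: text/html\r\n"
    "Connection: close\r\n"
    "\r\n"
)

# The request line names the method, the resource path, and the version.
method, path, version = request.split("\r\n")[0].split(" ")
print(method, path, version)   # GET /wiki/Main_Page HTTP/1.1
```

Because nothing here is encrypted, anyone observing the connection sees exactly these bytes, which motivates the security discussion in the next section.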
4.2 Security
The HTTP protocol itself does not provide any type of security or protection
for the data being transmitted. Sensitive data such as usernames and passwords
is not encrypted and can thus be retrieved by attackers while being sent to the
web server; HTTP requests and responses are simply sent in plain text.
An extension of HTTP called HTTPS was created to solve this
problem. Notice how the URL of Wikipedia (https://en.wikipedia.org/)
starts with https instead of http.
HTTPS comes with encryption and verification methods, using
TLS (Transport Layer Security) to provide these features [13]. TLS
is a cryptographic protocol designed to provide communications security over
a computer network, including email, web browsing and file transfers [13].
TLS mainly provides three major features:
• The client can be sure that the data exchanged between it and the
server is not being read by anyone else.
• The client can be sure that the data exchanged between it and the
server is not being changed by anyone else before arriving at the server
or back at the client.
• The client can be sure that it is communicating with the intended
server.
The main goals here are security, authentication and integrity, which are
achieved by encrypting data and signing it. The whole TLS process is built
on public key cryptography and digital signatures.
Public Key Cryptography
Public key cryptography uses two cryptographic keys, a public key and
a private key. The public key, which is available to anyone, is used for
encryption of data, while the private key, which is kept secret and never
shared, is used for decryption. Data encrypted with the public key can only
be decrypted using the corresponding private key. The keys can be generated
using different mathematical techniques which will not be covered in this thesis.
Digital Signatures
A digital signature is used to validate the authenticity and integrity of a
digital document. Signatures are typically created by hashing the data
itself. The person who creates the digital signature uses a private key to
encrypt the hash, and the corresponding public key is then used by the other
end to verify the signature.
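The sign-and-verify flow can be sketched with textbook RSA on deliberately tiny numbers; the key pair below (n = 3233, e = 17, d = 2753) is a classic classroom example and offers no real security, whereas real systems use 2048-bit keys and padding schemes:

```python
import hashlib

# Textbook RSA toy key pair: n = 61 * 53 = 3233, with e * d = 1 (mod 3120).
# Illustration only; these numbers are trivially breakable.
n, e, d = 3233, 17, 2753

def sign(message: bytes) -> int:
    # Hash the document, reduce the digest modulo n, and "encrypt"
    # it with the private exponent d.
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Anyone holding the public exponent e can undo the signature
    # and compare against a freshly computed digest.
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

sig = sign(b"certificate contents")
print(verify(b"certificate contents", sig))   # True
print(verify(b"tampered contents", sig))      # almost certainly False (digest differs mod n)
```

This mirrors how a CA signs a certificate with its private key and how a browser later verifies that signature with the CA's public key.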
After establishing a TCP connection between the client and the server, the
client issues a TLS handshake to authenticate and secure the
connection. A web server must have a TLS certificate to be able to establish
secure connections with its clients. Certificates allow the server to
prove its identity when communicating with its clients. These certificates are
mainly sold by certificate authorities (CAs), organizations which validate the
domain and the owner details before issuing the certificate for the website. An
example of a certificate authority is Amazon Trust Services. TLS certificates
have a maximum validity period of 13 months.
A certificate typically contains the name of the CA, the name of the domain,
the server's public key, and the CA's digital signature. The CA signs the
certificate using its own private key that no one else knows. Anyone with
access to the CA's public key is able to verify that the digital signature was
created by the CA itself [14]. Every modern browser comes preinstalled with
the public keys of trusted certificate authorities, which makes it possible for
the client to use one of the public keys it already has, depending on which CA
issued the certificate. The browser computes the hash of the certificate
and decrypts the digital signature using the public key it already has; if
both hashes match, the certificate was really issued by
the CA, and the client can be sure that the public key the server sent really
belongs to the server and not to somebody else [14]. The following steps provide a very
brief explanation of the basics of TLS handshakes; do note that these steps
differ depending on which version of TLS is being used:
1. The handshake starts with the client sending a hello message to the
server, asking to initiate a TLS connection.
2. The web server receives the message and sends its certificate, which
contains its public key, back to the client.
3. Before using the server's public key, the client first needs to verify
that this message was indeed sent by the server. This is done by
decrypting the signature in the certificate.
4. After the verification process, the client generates a random key which
will be used by both the server and the client as a symmetric key to
encrypt and decrypt the messages they will exchange with each other.
The symmetric key is then encrypted with the server's public key and sent
back to the server. Since only the server has the private key, it is the only
one that can decrypt the message and thus recover the key generated by the client.
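The outcome of the handshake, both sides holding the same symmetric key, can be illustrated with a toy stream cipher; the SHA-256-based XOR keystream below stands in for the real ciphers (such as AES-GCM) that TLS actually negotiates:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Derive a pseudo-random keystream from the key with SHA-256 and
    # XOR it with the data. XOR is self-inverse, so the same function
    # both encrypts and decrypts. A toy construction, not real TLS.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Step 4 of the handshake: the client picks a random symmetric key.
session_key = secrets.token_bytes(32)

# After the key exchange, both ends encrypt and decrypt with that key.
ciphertext = keystream_xor(session_key, b"GET / HTTP/1.1")
plaintext = keystream_xor(session_key, ciphertext)
print(plaintext)   # b'GET / HTTP/1.1'
```

The point of the asymmetric steps earlier in the handshake is only to deliver this shared key safely; all bulk traffic afterwards uses fast symmetric encryption.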
5.1.1 Crawling
Every search engine uses different search algorithms and favours web pages
based on content quality and user experience. Crawling is the process of
sending out a team of robots, known as spiders, to find new and
updated content [15]. The goal of a spider is to learn what every web page
on the web consists of and to retrieve its information. When a user submits a
search query, the search engine first uses web crawlers
to scour the Internet and build a database of relevant websites that
are related to the query the user supplied. Web crawlers mainly start
searching from a specified seed or a known URL list. Crawling these
websites leads them to other websites, for example via hyperlinks. Some
websites are excluded because the hosting web server itself denies
access to crawlers; such websites are not crawled and are not
shown to the public.
In summary, crawling is about discovering and finding relevant URLs on the
web. Web scraping, on the other hand, is the process of extracting data from
websites and can be done using different tools that provide the user with
the contents of specific websites.
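The link-discovery step a spider performs can be sketched with Python's standard html.parser module; the page content below is hard-coded so the sketch needs no network access, and the example.org URLs are placeholders:

```python
from html.parser import HTMLParser

class LinkSpider(HTMLParser):
    """Collects the href targets of anchor tags, the way a crawler
    discovers new URLs to visit from a page it has fetched."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A fetched page, hard-coded here instead of downloaded.
page = ('<html><body><a href="https://example.org/a">A</a>'
        '<a href="https://example.org/b">B</a></body></html>')

spider = LinkSpider()
spider.feed(page)
print(spider.links)   # ['https://example.org/a', 'https://example.org/b']
```

A real crawler would fetch each discovered URL in turn, respect robots.txt, and feed the page text onward to the indexing step described next.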
5.1.2 Indexing
The information found by crawlers is arranged and categorised
in this step. Indexing is about processing and analyzing the contents of the
web page. Indexing stores relevant websites in a huge database called the
index. These databases are kept fresh, since crawling and indexing are done
continuously to provide users with the best results. If a website is not
indexed, it is simply not stored in the search engine's database and can thus
not be found by others.
According to worldwidewebsize.com, the Indexed Web contains at least
7.26 billion pages [16].
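The core idea of an index can be sketched as a small inverted index, mapping each term to the set of pages that contain it; the three toy pages below are made up for illustration:

```python
from collections import defaultdict

# Each "page" is just an id and its text; a real index stores far more
# (positions, ranking signals, freshness timestamps, and so on).
pages = {
    "page1": "hidden services on the dark web",
    "page2": "the surface web is indexed by search engines",
    "page3": "search engines crawl the web",
}

# Invert the mapping: term -> set of page ids containing that term.
index = defaultdict(set)
for page_id, text in pages.items():
    for term in text.split():
        index[term].add(page_id)

# A query is answered by intersecting the posting sets of its terms.
query = ["search", "engines"]
result = set.intersection(*(index[t] for t in query))
print(sorted(result))   # ['page2', 'page3']
```

An unindexed page simply never enters this mapping, which is exactly why it cannot surface in search results; this is the situation of Deep Web and Dark Web content.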
be identified, mainly because different encryption services are being used. The
Dark Web itself exists on so-called Darknets, which are overlay networks
on top of the Internet [18]. A Darknet refers to a portion
of the Internet that is intentionally hidden and inaccessible through standard
web browsers and search engines. It operates on encrypted and anonymized
communication protocols, allowing users to access online content, services,
and resources with a high degree of privacy and confidentiality. Before diving
into how Dark Web websites can be accessed, one should first understand
how the networks that enable access to them work.
There are a couple of different services that offer access to Darknets, such as
I2P, Freenet, Zeronet, and GNUnet; however, this thesis focuses entirely on
the most popular service, called Tor. This section aims to answer the research
question RQ1: What is the Dark Web?
its contents. This is because an interceptor would need to possess the
symmetric key established between the client and the server in order to decrypt
the data.
When a packet of data is sent on the Internet, the packet also contains
the source IP address and the destination IP address. Someone sniffing
an HTTPS connection cannot see the contents of the data in the packet;
however, the client's IP address and the server's IP address are still visible.
Hiding these IP addresses would violate the Internet Protocol, since the source
and destination IP addresses must be known for packets to be forwarded
on the Internet, or else the packet is discarded.
Onion routing is a clever technique that solves this issue and anonymises connections
to servers, making it nearly impossible to know which site a client is trying to
communicate with. The website itself does not know who is communicating
with it either. The Tor network consists of a group of volunteer-operated
servers, often called relays. Tor relays are routers, or nodes, that
receive your traffic on the Internet and pass it along. A client initiates a
connection on this network by connecting through a series of relays rather
than making a direct connection. The default number of relays used by a
connection is three; they are called the Entry Node, the Middle Node and the Exit Node.
When a client establishes a TCP connection using Tor, Tor selects three
different nodes out of its operated servers. The main idea of the algorithm is
that no single node knows the entire path. The Entry Node knows who
you are but not who you are communicating with, the Middle Node knows
neither, and the Exit Node knows who you are communicating with but not
who you are. Having three nodes in the circuit makes it difficult to correlate
incoming and outgoing traffic. This also helps prevent a single point of failure
in case one node is compromised or malicious. Once three distinct nodes are
identified, three unique symmetric shared keys are established between the
client and these three relays. As a result, the client possesses all three keys,
while each relay holds only one of them. Now
assume that the client has a packet of data that should be sent to the server.
The client first encrypts the packet with the Exit Node's key, then
the Middle Node's key, and lastly the Entry Node's key. The packet is then
sent by the client to the Entry Node, which is the only node able to see your
IP address. The Entry Node uses its key to decrypt the first layer of the
packet and then knows that the packet should be forwarded to the Middle
Node. The Middle Node cannot see your IP address; it only sees where
the packet came from and where it is headed when it decrypts the
second layer with its key. The Middle Node then forwards the packet to the
Exit Node, which finally decrypts the last layer with its key and sends
the data to the server. The destination server will thus never know your IP
address. Figure 6 illustrates how the different layers of encryption are applied
to a message before it is forwarded by the source.
The same process is applied in reverse order when receiving data from
the server: the packet is encrypted layer by layer on the way back and then
decrypted by the client, since the client has access to all the keys. Further
documentation and explanation of the Tor design can be found on Tor's
official documentation website [19].
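The layering and peeling of encryption described above can be sketched with a toy self-inverse XOR cipher (real Tor relays use AES in counter mode; the node keys and message here are made up):

```python
import hashlib

def layer(key: bytes, data: bytes) -> bytes:
    # XOR with a SHA-256-derived keystream; self-inverse, so the same
    # call adds or peels one layer of "encryption". Illustration only.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

entry_key, middle_key, exit_key = b"entry", b"middle", b"exit"
message = b"request for the destination server"

# The client wraps the message: Exit layer first, Entry layer last.
packet = layer(entry_key, layer(middle_key, layer(exit_key, message)))

# Each relay peels exactly one layer with the one key it holds.
packet = layer(entry_key, packet)    # at the Entry Node
packet = layer(middle_key, packet)   # at the Middle Node
packet = layer(exit_key, packet)     # at the Exit Node
print(packet)   # b'request for the destination server'
```

Because each relay can remove only its own layer, no single relay ever sees both the client's identity and the plaintext destination, which is exactly the property the three-node circuit is built to provide.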
6.1.2 Vulnerabilities
Two main vulnerabilities can be noted in the implementation of the Tor
network. The first is that the actual data is decrypted by the Exit
Node and sent directly to the server with no protection. This, of course,
is not secure, since an attacker can easily listen to the last channel and
read the data. The Tor browser, which utilizes the Tor network,
addresses this problem by employing HTTPS within its implementation.
Consequently, alongside onion routing, HTTPS is used to guarantee that the
Exit Node remains unaware of the data travelling between the client and the
server. Additionally, the data transferred from the Exit Node to the server
is secured using TLS encryption, ensuring that the message remains encrypted.
Another vulnerability arises if sniffing is done both on the channel between
the client and the Entry Node and on the channel between the Exit Node and
the server. Sniffing the channel between the client and the Entry Node can
provide eavesdroppers with the IP address of the client. On its own this is
not a concern, since eavesdroppers will only learn that the client is using
Tor, not what the client is trying to access. However, by combining this
information with the information that can be sniffed on the channel from the
Exit Node to the server (mainly the server's IP address), it might be possible
to figure out what the client is trying to access. This is difficult to execute,
since these relay nodes do not act as relays for a single client but serve other
clients as well. Packets are constantly travelling through and out of
them, so trying to identify which packets belong to a given
client is quite difficult and requires special algorithms.
When clients access normal websites over the Tor network, the IP address
of the server is publicly known. Tor hidden services, however, aim to hide
the identity of the server, making it impossible to trace or learn
the actual location of the server; in other words, neither the client nor
the server knows anything about the other.
The main difference between onion routing and Tor hidden services is that
packets do not leave the Tor network when being forwarded to
the hidden service, making exit node attacks impossible. Tor hidden services
provide three main benefits to their users: location hiding, which hides the
location of the server and allows it to offer TCP connections to clients
without disclosing its IP address; end-to-end authentication, which means
that a person visiting an onion website knows for sure that the content they
are seeing can only come from that website itself; and end-to-end encryption,
which offers encrypted traffic from the client to the server without using
HTTPS [19].
For this section, assume that Bob is the hidden service and Alice
is the client trying to access it. The process starts when Bob generates
his key pair, consisting of a public key and a private key. Bob then picks
three random introduction points in the Tor network, creates Tor circuits
informing them to act as introduction points for him, and supplies them
with his public key. Introduction points are normal onion routers on the
Tor network; they act as contact points and introduce clients to the hidden
service without involving the server in the process. This provides location
hiding for Bob, since Bob's location is not revealed to any of these
introduction points (this is achieved by the use of Tor circuits). Access
to the hidden service will only be allowed through these introduction points.
After this step, Bob creates a hidden service descriptor, which contains Bob's
public key and the introduction points Bob has chosen. The descriptor is
then signed with Bob's private key and uploaded to an onion directory server,
which is part of the Tor network and is a distributed hash table [19].
Once the hidden service has been set up, it publishes its onion service
address, a 56-character name that ends with .onion and is derived from
the public key of the hidden service. Onion addresses are not publicised
over the whole Tor network; they can be found through private communities
on the Internet or simply obtained from someone who has the onion address
of a website. There are public websites on the Internet that provide users
with different onion addresses (Dark Web websites). An onion address can
look like the following: hashvalueofpublickeyofhiddenservice.onion.
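The derivation of such an address from the service's public key can be sketched following the published version 3 onion address construction; the 32 random bytes below stand in for a real ed25519 public key:

```python
import base64
import hashlib
import secrets

def onion_address(pubkey: bytes) -> str:
    # Version 3 onion addresses, per the Tor rendezvous specification:
    # base32(pubkey || checksum || version) + ".onion", where the
    # checksum is the first two bytes of
    # SHA3-256(".onion checksum" || pubkey || version).
    version = b"\x03"
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    address = base64.b32encode(pubkey + checksum + version).decode().lower()
    return address + ".onion"

# A stand-in for a hidden service's 32-byte ed25519 public key.
pubkey = secrets.token_bytes(32)
addr = onion_address(pubkey)
print(addr)                        # 56 base32 characters followed by ".onion"
print(len(addr) - len(".onion"))   # 56
```

The 35 input bytes (32 key bytes, 2 checksum bytes, 1 version byte) encode to exactly 56 base32 characters, which is where the 56-character length mentioned above comes from; anyone holding the address therefore also holds the service's public key.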
Once Alice has been supplied with an onion address, a connection is made to
the distributed hash table (the directory), and the hidden service descriptor is
handed to Alice. Once Alice has the hidden service descriptor, its signature
is verified using the public key encoded in the onion address. This provides
end-to-end authentication for the protocol. The client (Alice) now has the
server's public key and information about the introduction points the hidden
service is using. Before Alice contacts one of the introduction
points, she first establishes a connection to a random onion router on the
Tor network to act as a rendezvous point (RP). Alice supplies the rendezvous
point with a one-time secret called a rendezvous cookie. The
cookie is used to let Alice recognize Bob when a connection is established.
After reaching the rendezvous point, Alice sends her one-time
secret again, together with the RP address, to one of Bob's introduction points
over a Tor circuit. The message is encrypted with the hidden service's public
key and passed to the introduction point, which forwards it to Bob.
Bob decrypts the message and decides whether to allow the connection. Bob
then establishes a Tor circuit to the rendezvous point and supplies it with the
one-time cookie. The rendezvous point compares the two cookies; if
they match, the client is informed that a connection has been successfully
established.
Do note that a Diffie-Hellman handshake also takes place between Alice and
Bob in the process above, so that end-to-end encryption is provided.
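The rendezvous point's role in the exchange above reduces to a cookie comparison, sketched below (the 20-byte cookie size follows the Tor rendezvous specification; the class itself is a made-up simplification that omits circuits and encryption):

```python
import hmac
import secrets

class RendezvousPoint:
    """Toy model of the cookie check a rendezvous point performs;
    everything except the cookie-matching step is omitted."""
    def __init__(self):
        self.expected = None

    def register(self, cookie: bytes):
        # Alice builds a circuit to this relay and leaves her one-time cookie.
        self.expected = cookie

    def rendezvous(self, cookie: bytes) -> bool:
        # Bob arrives over his own circuit and presents the cookie he was
        # given; a constant-time comparison decides whether the two
        # circuits get spliced together.
        return self.expected is not None and hmac.compare_digest(self.expected, cookie)

alice_cookie = secrets.token_bytes(20)   # rendezvous cookies are 20 bytes
rp = RendezvousPoint()
rp.register(alice_cookie)

# Flip one bit to simulate a wrong cookie; guaranteed to differ.
wrong_cookie = bytes([alice_cookie[0] ^ 1]) + alice_cookie[1:]
print(rp.rendezvous(alice_cookie))   # True: the circuits are connected
print(rp.rendezvous(wrong_cookie))   # False: the request is rejected
```

Because the relay only ever compares opaque cookies arriving over two separate circuits, it learns nothing about who Alice or Bob are, which preserves the anonymity of both ends.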
This section aims to investigate the different actors on the Dark Web and
analyze their activities by exploring the Dark Web and crawling different
websites with the TorBot tool, providing an answer to the research question
RQ2: Which are the primary actors that operate on the Dark Web? Furthermore,
the actors are divided into two groups: lawful and unlawful. To differentiate
between lawful and unlawful entities on the Dark Web, we establish specific
criteria that guide our categorization process. The primary factor considered
is adherence to existing legal frameworks. Entities operating within
the boundaries of the law are categorized as lawful, while those actively
violating established laws fall under the classification of unlawful actors. The
categories of actors are summarized in Table 1 and then described in
detail further on in this section.
Civilians
The Dark Web offers protection of personal privacy, which helps conceal users'
identities and makes it hard to identify them. Being able to express different
ideas and opinions without being identified can provide many benefits
to online users [21]. Users around the world may harbor apprehensions
about potential consequences such as political or economic retaliation,
harassment, or threats to their lives, and may therefore turn to the Dark
Web to overcome these concerns. The Dark Web is primarily sought after by
regular members of the public who wish to browse the Internet anonymously,
cybercrime and terrorism. The investigation carried out on different Dark
Web websites revealed a vast number of sites offering resources for
journalists and researchers, which can be used to explore subjects and court
cases relevant to their work. A website named Judicial Review, found on the
Hidden Wiki, offered a large database of court cases. The Dark Web provides
platforms, including news and media forums, that can be utilized in nations
lacking unrestricted media access, allowing journalists and individuals to
share political information within the confines of the Dark Web. Media
organizations such as The Guardian and The Washington Post use services
on the Dark Web that allow them to exchange information securely, without
fear of their identities being revealed or their communication being
intercepted. SecureDrop, one of the websites found through TorBot, provides
a secure service for media organizations to accept documents from anonymous
sources. Figure 11 shows a Dark Web news site called ProPublica, which can
be used to read news about different political topics.
Figure 11: A Dark Web website which offers free media and news.
Figure 12: The New York Times official SecureDrop Dark Web website.
Organisations
Organisations and businesses explore the Dark Web to protect themselves
from threats their companies face or have faced. Corporations nowadays face
many cyberthreats, including DDoS and hacking attacks that can leak their
customers' private information and data. Several websites selling leaked
data were found on Dark Web markets; TorBot, for example, surfaced a website
selling stolen credit cards, PayPal accounts, and eBay accounts. Companies
might sometimes not notice that they have been exposed to such attacks,
which creates further risk. To protect themselves, businesses typically
monitor the Dark Web and react to threats and stolen data found on
marketplaces [21]. Figure 13 shows a Dark Web website offering stolen
PayPal and eBay accounts that can be purchased using cryptocurrency.
Figure 13: A Dark Web website which sells stolen PayPal accounts, eBay
accounts, and credit cards.
Figure 14: A Dark Web website which allows users to hire a hacker.
Market Operators
Market operators create and manage underground marketplaces on the Dark
Web. Much like marketplaces on the Surface Web such as Amazon, these
operators host similar websites on the Dark Web which sell illegal goods
and resources. These goods range from drugs, including narcotics, opioids
and steroids, to weapons and firearms. Illegal goods and services are in
high demand, which drives market operators to seek new clients. Almost any
type of illegal product or service can be found on the Dark Web. Silk Road
was a widely used Dark Web market back in 2013. Silk Road had the same web
design as any common shopping website: each product had a detailed
description, a photograph and a price. Silk Road offered drugs, fake IDs,
passports, and stolen credit cards. Credit cards are typically not used to
purchase from such markets, as that would be too easy to trace; instead,
payments are made via cryptocurrency. Silk Road was shut down by the U.S.
Federal Bureau of Investigation (FBI) in 2013 [24]; however, there is still
a vast number of marketplaces available on the Dark Web [5]. Different
kinds of services were found while investigating the Dark Web; Table 2
provides an insight into the different products and services that can be
purchased on the Dark Web's marketplaces.
Figure 16: Top markets on the Dark Web, sorted by revenue in 2022.
Adapted from [5].
Hydra Market, which was the leading marketplace for online narcotics and
had the most revenue back in 2022, was shut down by a U.S.-German operation
in April 2022 [5]. The OMG!OMG! marketplace, on the other hand, appears to
be taking up the mantle from Hydra Market.
Terrorists
Terrorists typically use the Dark Web to discuss and plan illicit
activities. Terrorist organizations utilize the Dark Web to disseminate
propaganda and spread their ideologies and instruction materials, and they
maintain websites used to recruit new members. Terrorists mainly pursue two
goals on the Dark Web. The first is establishing an online presence,
enabling them to spread propaganda without being detected by law
enforcement; attack plans and other terrorist activities are also discussed
on the Dark Web under the cover of anonymity. The second is survival: money
is needed to conduct attack operations and buy equipment, and it is
obtained through donations from supporters on their Dark Web websites and
through the different goods terrorists sell on the Dark Web, such as human
organs and stolen items [21]. The terrorist attack in Paris in November
2015 was carried out with weapons and explosives that terrorist groups had
purchased through the Dark Web [21].
Table 1 provides a brief summary of this section. The lawful and unlawful
categories indicate whether the entity involved uses the Dark Web for
lawful or unlawful purposes. Ordinary users, for example, who only use the
Dark Web for the benefits it provides, are not breaking any laws. One could
certainly argue that journalists, researchers, activists and whistle-blowers
are not breaking any laws by using the Dark Web; however, using the Dark
Web to disclose classified or confidential information might be illegal in
some countries, and unauthorized publication of sensitive materials can
violate laws. Journalists may also publish false or damaging information
without proper verification, which could harm an organisation's reputation
and lead to defamation lawsuits. The following activities carried out by
journalists on the Dark Web can be counted as unlawful, depending on the
laws that vary between countries:
• Promoting illegal activities.
• Violating data protection and privacy laws.
• Publishing classified government information.
Drug dealers have increasingly moved their trade online, which lowers the
risk of them being caught by law enforcement, lowers the chance of them
being exposed to violence, and increases their financial gains [26].
Figure 17: Monthly sales of drugs through different Dark Web markets.
Source: [6].
The Dark Web's marketplaces face a substantial risk of shutdown. These
shutdowns can happen for several different reasons:
• Voluntary shutdowns: The marketplace is voluntarily shut down by its
administrators, mainly because the market is unprofitable or because the
administrators fear a seizure by law enforcement.
• Exit scams: The marketplace shuts down in order to scam its users,
keeping all the money held in its escrow system.
• Hacked or raided: Sometimes marketplaces get hacked by other users, with
the hackers trying to steal money and shut the market down. Alternatively,
markets can be raided and seized by law enforcement agencies.
Shutdowns of such markets can arguably have an impact on the users who use
them. According to the Global Drug Survey 2018 [6], after the shutdown of
the AlphaBay market, 15 percent of Dark Web users reported using such
markets less frequently, and 9 percent had stopped using the Dark Web for
drug purchases.
Figure 18: Proportion of surveyed Internet users using drugs in the past
year who purchased drugs over the Dark Web. Source: [6].
Figure 19: The impact that market closures have had on individuals. Source:
[6].
The Dark Web is certainly known to have had an impact on street crime too.
Individuals seeking to purchase drugs use the marketplaces offered by the
Dark Web, which leads to a decrease in street-level drug dealing, as drug
users seek convenience and anonymity when purchasing drugs. Being able to
purchase weapons and firearms via the Dark Web's marketplaces can likewise
contribute to a decrease in street crimes involving weapons.
9.2.2 Tools
Law enforcement authorities use different tools and techniques to track and
monitor activities on the Dark Web. These tools can be used to identify
threats and stop them. Even though Tor provides a secure protocol to
protect the identities of its users, it is not invulnerable to attacks. In
Section 6.1.2, a vulnerability of the onion routing technique was presented
where an attacker sniffing both the channel between the client and the
entry node and the channel between the exit node and the server could
deanonymize Tor users. This specific attack is called an end-to-end
confirmation attack, which attempts to correlate the traffic entering and
exiting the Tor network in order to deanonymize users [21]. In practice,
end-to-end confirmation attacks performed by law enforcement agencies
involve taking control of both entry and exit relays of the Tor network. By
doing so, agencies can monitor the traffic entering and leaving the network
and correlate the timing of packets. This yields the IP address of the
client and the IP address of the server the client is trying to reach,
which deanonymizes the user.
Law enforcement agencies also commonly employ attacks targeting the hidden
service directories within the Tor network [21]. Hidden service directories
in the Tor network are used to retrieve the list of introduction points
used by a server. Law enforcement agencies may compromise a directory,
which helps them monitor the activities of the hidden service.
Open Source Intelligence (OSINT) tools, which allow the collection and
analysis of publicly available information, are also used to support
investigations of Dark Web activity.
10 Discussion
The thesis provides valuable insights into the Dark Web, its actors and its
societal impact. However, the Dark Web is anonymous by nature, which made
it hard to collect data about its actors and to cover all of the activities
that take place on it; thus, this work might not cover all aspects of the
Dark Web comprehensively. One significant limitation of this thesis is the
absence of available data that could directly illustrate the precise impact
of the Dark Web. As an example, no direct research was conducted focusing
on the relationship between the Dark Web and street crime. The
decentralized nature of the Dark Web made it challenging to obtain data
regarding its societal impacts. This limitation restricted the ability to
provide information about the Dark Web's influence on various aspects of
society, such as crime rates and economic factors.
Secondly, a challenge encountered during the research was the difficulty in
finding and engaging with experts in the field. Additionally, some experts
who were approached for interviews were hesitant or unwilling to disclose
information. Since the Dark Web is notorious for facilitating illegal
activities, experts, particularly those working in law enforcement, may be
bound by legal restrictions that prevent them from openly discussing
specific details or sharing classified information. Fear for personal
safety and security may also be one of their concerns. Moreover, since the
Dark Web operates within a highly specialized and technically complex
environment, some of the experts who declined to disclose information may
not have extensive knowledge of the Dark Web, making them hesitant to
discuss a topic they feel ill-equipped to address accurately.
Despite these limitations, the study reveals that the encryption and
anonymity provided by the Dark Web have yielded both positive and negative
impacts on society. Utilizing the TorBot crawling tool made it possible to
gather a wide range of data from the Dark Web, which offered insight into
the various activities and communities that exist in this hidden
environment. Accessing the Dark Web itself presented an opportunity to
observe and analyze the operations and behaviours of its actors firsthand,
and made it possible to identify different actors on the Dark Web. These
actors were divided into two groups, lawful and unlawful, based on whether
the activities they carry out violate the law of the countries they live in.
The distinction between lawful and unlawful actors on the Dark Web aimed
to challenge the common perception that the entire Dark Web is a hub of
criminal activity. By recognizing the presence of lawful actors, we acknowledge
that not all activities conducted on the Dark Web are illegal or malicious.
This recognition prompted us to consider the potential benefits and positive
impacts that can arise from these lawful activities. Some actors may belong
to both of these groups, since what counts as lawful or unlawful differs
between countries.
When it comes to unlawful individuals, the anonymity offered by the Dark
Web creates an advantageous environment for the illicit trade of prohibited
items and services. These include drugs, weapons, stolen data, and various
illegal services. Criminal networks that develop on the Dark Web pose a
challenge to law enforcement agencies, which must employ different tactics
and methods to combat these activities. The proliferation of illegal activities
on the Dark Web leads to harmful consequences for society such as drug
addiction, violence and financial losses. On the other hand, the levels of
privacy and anonymity provided by the Dark Web help protect users who
wish to hide their identities and enable freedom of speech.
When conducting the investigation on the Dark Web, the author carefully
considered the ethics of all actions taken. In particular, the investigation did
not contribute or perpetuate criminal behaviour.
11 Conclusion
The Dark Web provides a platform with strengthened anonymity offering
both opportunities and challenges to society. Lawful actors utilize this space
for legitimate purposes such as anonymous communication, research and
privacy protection. On the other hand, it is undeniable that the Dark Web
also serves as a breeding ground for unlawful activities. Illicit actors take
advantage of the anonymity provided, engaging in the sale of illegal goods
and services, including drugs, weapons, stolen data, and other illicit offerings.
The sale of drugs, weapons, and stolen data not only fuels criminal activities
but also contributes to social harm and addiction. The societal impact of
the Dark Web’s unlawful actors extends beyond the virtual realm, influencing
communities and individuals both online and offline.
By acknowledging the presence of both positive and negative actors, we
move beyond the stereotype that the Dark Web is solely a breeding ground
for illegal activities. The categorization of these actors was based on
adherence to existing legal frameworks: actors who operate within the
boundaries of the law were categorized as lawful, while those violating
established laws fall under the classification of unlawful actors.
This categorization allows for a better understanding of the motivations,
behaviors, and implications associated with different actors operating within
this field.
Addressing the societal impact of the Dark Web requires a multi-faceted
approach. Collaboration between law enforcement, technology companies,
policymakers, and civil society organizations is important in developing effective
strategies to tackle the illegal activities conducted on the Dark Web. Balancing
the need for privacy and security with the prevention of criminal activities is
a complex challenge that demands continuous adaptation.
References
[1] E. Conrad, S. Misenar, and J. Feldman, "Chapter 5 - Domain 4:
Communication and network security (designing and protecting network
security)," in CISSP Study Guide, 3rd ed., E. Conrad, S. Misenar, and
J. Feldman, Eds. Boston: Syngress, 2016, pp. 219-291,
DOI: 10.1016/B978-0-12-802437-9.00005-9.
[2] F. Murtaza, "How Do TOR Onion Addresses Actually Work?"
https://www.makeuseof.com/how-tor-addresses-work/, accessed 2023-05-30.
[3] Privacy Guides, "TOR Overview,"
https://www.privacyguides.org/en/advanced/tor-overview/#encryption,
accessed 2023-05-30.
[4] Tor development team, "TOR (network)," https://www.torproject.org/,
accessed 2023-05-30.
[5] Chainalysis Team, "How darknet markets and fraud shops fought for users
in the wake of Hydra's collapse,"
https://blog.chainalysis.com/reports/how-darknet-markets-fought-for-users-in-wake-of-hydra-collapse-2022/,
accessed 2023-06-17.
[6] The United Nations Office on Drugs and Crime, "In focus: Trafficking
over the darknet - World Drug Report 2020,"
https://www.unodc.org/documents/Focus/WDR20_Booklet_4_Darknet_web.pdf,
accessed 2023-05-31.
[7] N. Nearchou, Combating Crime on the Dark Web: Learn How to Access the
Dark Web Safely and Not Fall Victim to Cybercrime, 1st ed. Birmingham:
Packt Publishing, 2023.
[8] S. Kemp, "Digital 2023 April global statshot report,"
https://datareportal.com/reports/digital-2023-april-global-statshot,
accessed 2023-08-22.
[9] A. Narayan, "Are you being tracked on the internet? Know how to find
out,"
https://economictimes.indiatimes.com/tech/internet/are-you-being-tracked-on-internet-know-how-to-find-out/articleshow/60890696.cms?from=mdr,
accessed 2023-06-03.
[10] K. Taylor, "Detailed introduction about the surface, deep dark web
levels explored,"
https://www.hitechnectar.com/blogs/introduction-surface-web-deep-dark-web/,
accessed 2023-06-03.
[11] Tor development team, "TOR," https://www.torproject.org/, accessed
2023-05-19.
[12] P. Narayanan and KingAkeem, "TorBot,"
https://github.com/DedSecInside/TorBot, accessed 2023-05-01.
[13] J. F. Kurose, Computer Networking: A Top-Down Approach, 8th ed.
Harlow: Pearson Education Limited, 2022, accessed 2023-06-17.
[14] Amazon, "What is an SSL/TLS certificate?"
https://aws.amazon.com/what-is/ssl-certificate/, accessed 2023-05-25.
[15] B. Muller and the Moz Staff, "How do search engines work?"
https://moz.com/beginners-guide-to-seo/how-search-engines-operate,
accessed 2023-05-29.
[16] M. de Kunder, "The size of the world wide web (the Internet),"
https://worldwidewebsize.com/, accessed 2023-05-29.