OSINT in Dark Web
Priyal Walpita
UCSC
The Internet
What is Surface Web
Search engines such as Google, Yahoo, and Bing index sites by crawling them and feeding the crawled information into their index servers. These search engines then organize the data by context and store it in the databases and ranking algorithms that make up a search engine. Data indexed this way is accessed through the Surface Web, also known as the World Wide Web (WWW).
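To make the idea concrete, the Python sketch below shows the core data structure behind this process: an inverted index mapping each term to the pages that contain it. The pages and URLs here are made-up placeholders, not real crawl output.

from collections import defaultdict

def build_index(pages):
    # pages: dict mapping URL -> page text
    index = defaultdict(set)
    for url, text in pages.items():
        for term in text.lower().split():
            index[term].add(url)
    return index

pages = {
    "https://example.com/a": "open source intelligence tools",
    "https://example.com/b": "dark web search tools",
}
index = build_index(pages)
print(sorted(index["tools"]))  # both pages contain the term "tools"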
Deep Web
The Deep Web is the part of the internet that cannot be indexed: if the Surface Web is the indexable part of the internet, the Deep Web is everything else. Any website or system that requires login credentials is part of the Deep Web; academic institutions, organizational information, business intranets, governmental departments, and similar systems commonly fall into this category. Such sites deliberately prevent search engines from indexing parts of their content, as Google Scholar or Amazon do for account-only pages. The Deep Web is accessible with any standard browser, but because it is not indexed by search engines, accessing the content in a specific network typically requires entering a username and password.
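A quick way to see whether a page belongs to the Deep Web is to check how it answers anonymous requests. In the sketch below, a 401/403 response, or a redirect to a login page, is taken as a sign that a crawler could not index the content without credentials. The intranet URL is a made-up placeholder, and the requests library is assumed to be installed.

import requests

# Placeholder URL: any resource that sits behind a corporate login.
resp = requests.get("https://intranet.example.com/reports",
                    allow_redirects=False, timeout=10)

# 401/403 responses, or a redirect to a login page, mean a crawler
# could not index this content without credentials.
needs_login = resp.status_code in (401, 403) or (
    resp.is_redirect and "login" in resp.headers.get("Location", "").lower()
)
print("Deep Web content" if needs_login else "Potentially indexable")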
Dark Web
The Dark Web exists only on darknets, just as the Surface Web exists on the World Wide Web. Put simply, the Dark Web allows people to communicate, buy, connect, and work privately and anonymously. To preserve and maintain that online privacy and anonymity, users rely on measures such as VPNs and the Tor Browser.
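For example, a script can route its traffic through a locally running Tor client's SOCKS proxy (port 9050 by default). The snippet below assumes requests is installed with SOCKS support (pip install requests[socks]); the socks5h scheme makes DNS resolution happen inside Tor, which is required for .onion addresses.

import requests

# Tor's SOCKS proxy, as exposed by a default local Tor client.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

# check.torproject.org reports whether a request arrived via Tor.
resp = requests.get("https://check.torproject.org/",
                    proxies=TOR_PROXIES, timeout=30)
print("Using Tor:", "Congratulations" in resp.text)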
OSINT Tools in Dark Web
Hunchly Dark Web
This service can be used as a discovery tool: a low-tech solution for building a database of sources for Dark Web research. There are two ways to use it, as mentioned below:
● Subscribe via email on the Hunchly website.
● Follow the daily posts on the Hunchly Twitter page.
Hunchly states clearly that it does not investigate the hidden services for content. The links you receive may therefore lead to drug markets, malware, and other sensitive content; Hunchly explicitly notes that it is not responsible for that content and that users should be careful.
https://www.hunch.ly/resources/Hunchly-Dark-Web-Setup.pdf
Dark Search
This tool is another low-tech entry point to the Dark Web. The search engine itself can be viewed in any web browser, while the links found in its index must be followed using Tor or a similar tool. It also supports search operators, such as the 'boost' operator.
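As an illustration, Dark Search historically offered a simple JSON API; the endpoint, parameters, and response fields below are assumptions based on how it was commonly documented, and the service may have changed or been retired since, so treat this purely as a sketch.

import requests

# Assumed historical endpoint; verify before relying on it.
resp = requests.get(
    "https://darksearch.io/api/search",
    params={"query": "bitcoin market", "page": 1},
    timeout=30,
)
for hit in resp.json().get("data", []):    # field names are assumptions
    print(hit.get("title"), "->", hit.get("link"))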
TorBot
TorBot is an open-source intelligence tool developed in Python. Its main aim is to accumulate open data from the deep web and, with the assistance of data-mining algorithms, collect as much information as possible and produce an interactive tree graph. Some of TorBot's features are listed below; a minimal sketch of the same workflow follows the list.
● Onion Crawler (.onion).
● Returns page title and address with a short description of the site.
● Save links to database.
● Get emails from site.
● Save crawl information to JSON file.
● Crawl custom domains.
● Check if the link is live.
● Built-in Updater.
● Visualizer module.
● Social media integration.
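The snippet below is not TorBot's own code, but a minimal Python sketch of the same workflow: fetch a hidden service through a local Tor proxy, check whether it is live, record its title, harvest email addresses and onion links, and save the result to a JSON file. The onion address is a placeholder, and requests[socks] and beautifulsoup4 are assumed to be installed.

import json
import re
import requests
from bs4 import BeautifulSoup

PROXIES = {"http": "socks5h://127.0.0.1:9050",
           "https": "socks5h://127.0.0.1:9050"}

def crawl(url):
    # Fetch one hidden service through Tor and summarize it.
    resp = requests.get(url, proxies=PROXIES, timeout=60)
    soup = BeautifulSoup(resp.text, "html.parser")
    return {
        "url": url,
        "alive": resp.ok,                  # check if the link is live
        "title": soup.title.string if soup.title else None,
        "emails": re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", resp.text),
        "onion_links": sorted({a["href"]
                               for a in soup.find_all("a", href=True)
                               if ".onion" in a["href"]}),
    }

result = crawl("http://exampleonionaddressxyz.onion/")  # placeholder
with open("crawl.json", "w") as f:
    json.dump(result, f, indent=2)         # save crawl info to JSON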
Fresh Onions
Fresh Onions is both a directory and a strong search engine: it indexes .onion and .clos domains and shows the most recent deep web and dark web links. Although the tool has not been updated in a while, it includes the features below; a rough sketch of its extraction features follows the list.
● Crawls the Dark Web looking for new hidden services
● Finds hidden services from a number of clearnet sources
● Optional full-text Elasticsearch support
● Marks clone sites of the /r/darknet superlist
● Finds SSH fingerprints across hidden services
● Finds email addresses across hidden services
● Finds bitcoin addresses across hidden services
● Shows incoming / outgoing links to onion domains
● Up-to-date alive / dead hidden service status
● Port scanner
● Searches for “interesting” URL paths, useful for 404 detection
● Automatic language detection
● Fuzzy clone detection (requires Elasticsearch; more advanced than superlist clone detection)
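Fresh Onions' own implementation is more involved, but its email and bitcoin address extraction can be approximated with simple regular expressions over crawled page text, as this rough sketch shows (the sample text is made up):

import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
# Legacy Base58 bitcoin addresses: start with 1 or 3, 26-35 characters.
BTC_RE = re.compile(r"\b[13][a-km-zA-HJ-NP-Z1-9]{25,34}\b")

page_text = ("Contact admin@example.onion or donate to "
             "1BvBMSEYstWetqTFn5Au4m4GFg7xJaNVN2")
print(EMAIL_RE.findall(page_text))  # ['admin@example.onion']
print(BTC_RE.findall(page_text))    # ['1BvBMSEYstWetqTFn5Au4m4GFg7xJaNVN2']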
Onioff
Once you have finished building a database of hidden services and onion domains on Tor, the next step is to examine them without exposing yourself to malicious material. Onioff is an onion URL inspector used to check deep web links: it takes specified onion links and returns their current status along with each site's title. It is written in pure Python.
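The sketch below is not Onioff itself, but it performs the same check: it takes a list of onion links and, through a local Tor proxy, returns each link's HTTP status and page title. The onion address is a placeholder, and requests[socks] and beautifulsoup4 are assumed to be installed.

import requests
from bs4 import BeautifulSoup

PROXIES = {"http": "socks5h://127.0.0.1:9050",
           "https": "socks5h://127.0.0.1:9050"}

def inspect(url):
    # Return the link's HTTP status and page title, Onioff-style.
    try:
        resp = requests.get(url, proxies=PROXIES, timeout=60)
        title = BeautifulSoup(resp.text, "html.parser").title
        name = title.string.strip() if title and title.string else "(no title)"
        return resp.status_code, name
    except requests.RequestException as exc:
        return "dead", str(exc)

for link in ["http://exampleonionaddressxyz.onion/"]:  # placeholder list
    print(link, *inspect(link))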
TorCrawl
Tor is well-known software that enables anonymous communication, and it is becoming more popular due to increasing media coverage of dark web sites. Dark Web sites are usually not crawled by generic crawlers because their web servers are hidden inside the Tor network and require specific protocols to be accessed.
TorCrawl is a powerful, robust tool that not only crawls hidden services on Tor but also extracts the page source of those services. Crawling, inspecting, and investigating can all be done with this tool. You can retrieve a webpage's markup and review the content without opening the page itself, and you can also save a static copy of the page as an .html file.
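As a rough sketch of that last capability (not TorCrawl's actual code), the following fetches a hidden service's markup through a local Tor proxy and saves it as a static .html file for offline review; the onion address is a placeholder and requests[socks] is assumed to be installed.

import requests

PROXIES = {"http": "socks5h://127.0.0.1:9050",
           "https": "socks5h://127.0.0.1:9050"}

url = "http://exampleonionaddressxyz.onion/"  # placeholder address
resp = requests.get(url, proxies=PROXIES, timeout=60)

# Save the raw markup so the content can be reviewed offline.
with open("page.html", "w", encoding="utf-8") as f:
    f.write(resp.text)
print(f"Saved {len(resp.text)} characters of markup to page.html")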