Lab Assignment 1
Note the target domain's IP address in the result above (here, 162.241.216.11). You also
obtain information on Ping Statistics such as packets sent, packets received, packets lost,
and approximate round-trip time.
The response Packet needs to be fragmented but DF set means that the frame is too large
for the network and must be fragmented. Because the -f switch sets the Don't Fragment flag,
the packet was not sent, and the ping command returned this error.
Try different values until you find the maximum frame size. For instance, ping
www.certifiedhacker.com -f -l 1473 replies with Packet needs to be fragmented but DF set,
whereas ping www.certifiedhacker.com -f -l 1472 replies with a successful ping. This
indicates that 1472 bytes is the maximum frame size on this machine's network.
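The 1472-byte figure follows directly from header overhead: the default Ethernet MTU of 1500 bytes minus the 20-byte IPv4 header and the 8-byte ICMP echo header. A quick shell check of that arithmetic (assuming the standard 1500-byte MTU):

```shell
# Maximum ping payload = MTU - IP header - ICMP header
mtu=1500        # default Ethernet MTU
ip_header=20    # IPv4 header without options
icmp_header=8   # ICMP echo request header
payload=$((mtu - ip_header - icmp_header))
echo "$payload" # largest -l value that avoids fragmentation
```

On networks with a different MTU (e.g. PPPoE links at 1492), the same subtraction gives a correspondingly smaller maximum payload.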
Discover what happens when the TTL (Time to Live) expires. Every packet on the network has a
TTL value defined. When the TTL reaches 0, the router discards the packet; this mechanism
prevents packets from circulating on the network indefinitely.
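Raising the TTL one hop at a time is exactly how traceroute maps a path: each router that decrements the TTL to 0 replies with an ICMP "TTL expired in transit" message, revealing its address. A dry-run sketch of that probe loop, assuming Windows ping syntax (-i sets the TTL, -n the probe count); the commands are echoed rather than executed:

```shell
# Probe successive hops by raising the TTL one step at a time (dry run).
for ttl in 1 2 3; do
  echo "ping www.certifiedhacker.com -i $ttl -n 1"
done
```

Running the printed commands in order would report each intermediate router until one probe finally reaches the target host.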
Centralops (centralops.net) is a free online network scanning service that investigates domains
and IP addresses, performing DNS record lookups, traceroute, nslookup, whois searches, and more.
To extract information associated with the target organization's website, type the target
website's URL (here, www.certifiedhacker.com) in the enter a domain or IP address field,
and then click on the go button.
Scroll down to view information such as the Network Whois record and DNS records.
Web data extraction is the process of extracting data from web pages available on the company's
website. A company's data such as contact details (email, phone, and fax), URLs, and meta tags (title,
description, keyword) for website promotion, directories, web research, etc. are important sources
of information for an ethical hacker. Web spiders (also known as web crawlers or web robots) such as
Web Data Extractors perform automated searches on the target website and extract specified
information from the target website.
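The kind of extraction these tools automate can be sketched with standard shell text tools. The HTML snippet below is a hypothetical stand-in for a fetched page (not actual content from the target site); sed pulls out the page title and grep harvests email addresses:

```shell
# Sample page content standing in for a fetched web page (hypothetical).
html='<html><head><title>Certified Hacker</title>
<meta name="description" content="Security training site">
</head><body>Contact: info@certifiedhacker.com</body></html>'

# Extract the <title> text.
title=$(printf '%s\n' "$html" | sed -n 's/.*<title>\(.*\)<\/title>.*/\1/p')

# Harvest any email addresses from the page body.
email=$(printf '%s\n' "$html" | grep -oE '[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+')

echo "$title"
echo "$email"
```

Dedicated web spiders do the same thing at scale: they follow links recursively and apply patterns like these to every page they retrieve.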
You can duplicate websites by using website mirroring tools such as HTTrack Web Site Copier.
HTTrack is an offline browser utility that downloads a website from the Internet to a local directory,
builds all directories recursively, and transfers HTML, images, and other files from the webserver to
another computer.
Here, we will use the HTTrack Web Site Copier tool to mirror the entire website of the target
organization, store it in the local system drive, and browse the local website to identify possible
exploits and vulnerabilities.
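HTTrack also ships a command-line interface, where -O selects the local output directory. The sketch below builds a typical invocation and echoes it rather than running it (the output path is an assumption; any writable directory works):

```shell
# Typical HTTrack mirroring command (dry run: echoed, not executed).
cmd='httrack "http://www.certifiedhacker.com" -O /root/mirror'
echo "$cmd"
```

Once the mirror completes, the copied site can be opened from the output directory in any browser, exactly as described above.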
The words available on the target website may reveal critical information that can assist in
performing further exploitation. CeWL is a Ruby app that spiders a given target URL to a
specified depth, optionally following external links, and returns a list of unique words that
can be used for password cracking.
cewl -d 2 -m 5 www.certifiedhacker.com
-d represents the depth to spider the website (here, 2) and -m represents minimum word length
(here, 5).
A unique wordlist from the target website is gathered, as shown in the screenshot.
The minimum word length is 5, and the depth to spider the target website is 2.
Alternatively, this unique wordlist can be written directly to a text file. To do so, type
cewl -w wordlist.txt -d 2 -m 5 www.certifiedhacker.com.
The wordlist file is saved in the root directory. Type pluma wordlist.txt and press Enter to
view the extracted wordlist.
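The filtering CeWL applies (unique words of at least the minimum length) can be sketched locally with standard tools. The sample text below is a hypothetical stand-in for spidered page content:

```shell
# Sample page text standing in for spidered content (hypothetical).
text='hacker security hacker web lab security training'

# Split into one word per line, keep words of length >= 5 (mirroring
# cewl's -m 5 option), then de-duplicate and sort.
wordlist=$(printf '%s\n' $text | grep -E '^.{5,}$' | sort -u)
echo "$wordlist"
```

Short words like "web" and "lab" are dropped, and the repeated words appear only once, just as in CeWL's output.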