
Computer Networks Lab Manual: Lab-II

(Advanced UNIX Network Commands)

Sanjay K. Sahay
Dept. of CSIS, BITS Pilani, K.K. Birla Goa Campus

January 14, 2025


Advanced UNIX Network Commands

0.1 Theory
As networks grow larger and more complex, administrators and power users require more advanced tools for secure remote administration, file transfers, network scanning, automated requests, and firewall configuration. This chapter explores a variety of powerful commands that address these needs. However, some of these commands (especially those that modify firewall rules or alter system connectivity) can disrupt your network or introduce security vulnerabilities if misused. Therefore, while we encourage experimentation for learning, always proceed with caution, and only on systems or networks where you have proper authorization.
Below are the advanced commands we will cover in detail:

• scp and sftp

• wget

• curl

• nc (netcat)

• ssh

• nmap

• iptables

We will highlight benign commands that only gather information and pose minimal risk, as
well as risky commands that can change configurations or break connectivity.

0.1.1 scp and sftp


Purpose: scp (Secure Copy) and sftp (SSH File Transfer Protocol) facilitate encrypted
file transfers between hosts over the SSH protocol. These commands replace older, insecure
methods like ftp.

Safe vs. Risky Usage:

• Safe / Benign: Downloading or uploading files when you already have SSH access is
typically safe, as long as you have the correct permissions on the remote server.

• Risky: Overwriting critical configuration files on the server or transferring sensitive data
without verifying the remote host’s authenticity can lead to security or stability issues.

Usage and Examples:
• scp localfile.txt user@host:/home/user/ Copies localfile.txt to /home/user/ on the remote host. You are prompted for the user's password or private-key passphrase if necessary.
• scp -r mydir user@host:/home/user/ Recursively copies all files and subdirectories within mydir to the remote host.
• sftp user@host Opens an interactive SFTP session. You can navigate the remote filesystem using cd or ls, then upload (put) or download (get) files.
• sftp -i ~/.ssh/mykey_rsa user@host Uses a specified private key (mykey_rsa) instead of a password. Keep private keys secure and never share them.

Additional Tips:
• scp and sftp rely on SSH, so any issues with SSH configuration (e.g., firewall rules or missing keys) will affect these commands.
• For automated transfers (like cron jobs), use SSH key-based authentication to avoid storing passwords in scripts.
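A common follow-up to any file transfer is an integrity check. The sketch below demonstrates only the verification step, locally: the cp stands in for a hypothetical scp to a remote host, and the filenames are placeholders. On a real transfer you would run sha256sum on each machine and compare the digests.

```shell
# Simulated transfer: cp stands in for a hypothetical
#   scp localfile.txt user@host:/home/user/
echo "important payload" > localfile.txt
cp localfile.txt transferred.txt

# Compute a digest on each "side" and compare them.
local_sum=$(sha256sum localfile.txt | awk '{print $1}')
remote_sum=$(sha256sum transferred.txt | awk '{print $1}')

if [ "$local_sum" = "$remote_sum" ]; then
    echo "transfer verified"
else
    echo "checksum mismatch" >&2
fi
```

On a real pair of machines, the remote digest can be fetched in one step, e.g. ssh user@host sha256sum /home/user/localfile.txt.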

0.1.2 wget
Purpose: wget is a non-interactive command-line utility for retrieving files over HTTP, HTTPS, and FTP. It's ideal for scripting, mirroring websites, and performing unattended downloads.

Safe vs. Risky Usage:


• Safe / Benign: Downloading publicly available files or websites is generally harmless.
• Risky: Mass-downloading entire sites without permission or using wget for web scraping against site policies can cause legal or ethical issues. Also be wary of executing unknown downloaded files.

Key Features:
• Non-interactive: Continues even if you log out.
• Recursive Download: Mirrors entire directories or websites.
• Resume Option: Automatically resumes partial downloads if the server supports it.

Common Options:
• -O <filename>: Save the downloaded file under a specific name instead of the default.
• -r: Recursive downloading of linked pages or files.
• -k: Convert links in downloaded HTML pages for local offline viewing.
• -c: Resume an interrupted download (if the server supports partial content requests).

Examples:

• wget https://www.example.com/file.zip Downloads file.zip to the current directory.

• wget -r -k -p https://www.example.com/ Recursively fetches a website, grabs all page requisites (-p), and converts links (-k) for offline browsing.

• wget -c https://www.example.com/big.iso Resumes an interrupted download of big.iso.
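For unattended downloads these options are usually combined. The snippet below only assembles such a command as a string so it can be inspected without touching the network; the URL, retry count, and rate limit are placeholder values, not recommendations.

```shell
# Assemble (but do not execute) a resilient download command.
# URL, --tries, and --limit-rate values are placeholders.
URL="https://www.example.com/big.iso"
CMD="wget -c --tries=3 --limit-rate=500k $URL"
echo "$CMD"
# Running the command above would resume big.iso if partially
# downloaded, retry up to 3 times, and cap bandwidth at 500 KB/s.
```

Capping bandwidth with --limit-rate is good etiquette when mirroring a site you do not control.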

0.1.3 curl
Purpose: curl is an extremely versatile data transfer tool supporting HTTP(S), FTP(S),
SCP, SFTP, and more. It’s commonly used for REST API testing, retrieving files, or debugging
web servers.

Safe vs. Risky Usage:

• Safe / Benign: Basic GET requests, retrieving public URLs, or performing read-only
API requests are generally safe.

• Risky: Posting sensitive data to the wrong endpoint or ignoring SSL certificate checks
(-k) can lead to security risks.

Common Options:

• -O: Save the output to a file with the same name as on the remote server.

• -L: Follow HTTP 3xx redirects automatically.

• -X <method>: Specify the HTTP method (GET, POST, PUT, DELETE, etc.).

• -d <data>: Send form data or raw JSON/XML in a request body.

• -H <header>: Add custom headers (e.g., Content-Type: application/json).

• -u <user:pass>: Provide basic authentication credentials.

Examples:

• curl https://www.example.com Performs a GET request and prints the response HTML to stdout.

• curl -I https://www.example.com Fetches only the HTTP headers, revealing status codes, server type, and content length.

• curl -X POST -d "name=Alice" https://api.example.com/users Sends form data (name=Alice) in a POST request.

• curl -k https://self-signed.example.com Ignores SSL certificate validation (risky in production).
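The header lines that curl -I prints are plain text and can be post-processed with standard shell tools. The response below is canned sample data (no live fetch is performed), used only to show the extraction step.

```shell
# Canned sample of what `curl -I` might print; no network access needed.
headers='HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Content-Length: 1256'

# The status line is always the first line of the response headers.
status=$(printf '%s\n' "$headers" | head -n 1)
# Pull out the Content-Type value (case-insensitive header match).
ctype=$(printf '%s\n' "$headers" | grep -i '^Content-Type:' | cut -d' ' -f2-)

echo "$status"
echo "$ctype"
```

In a live session you would replace the canned string with $(curl -sI https://www.example.com).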

0.1.4 nc (netcat)
Purpose: nc, also known as netcat, is called the “Swiss army knife of networking.” It can
open TCP/UDP connections, listen on ports, transfer files, and even perform simple port scans.

Safe vs. Risky Usage:

• Safe / Benign: Listening on a port for debugging, or testing connectivity by connecting to known open ports.

• Risky: Port scanning hosts without permission, creating backdoors, or transferring critical files in unsecured environments.

Common Usage:

• nc -l -p 1234 Listen on TCP port 1234 for incoming connections.

• nc <host> <port> Connect to <host>:<port> as a client. Useful for banner grabbing (e.g., HTTP server headers).

• nc -u Enable UDP mode.

• nc -v -n -z <target> 1-1000 Verbose, numeric (no DNS), zero-I/O mode to quickly scan ports 1–1000 on <target>.

File Transfer Example:

1. On the receiving machine:

nc -l -p 5000 > received.txt

2. On the sending machine:

cat data.txt | nc <receiver_ip> 5000

Upon completion, received.txt should match data.txt.

0.1.5 ssh
Purpose: ssh (Secure Shell) enables secure remote logins, command execution, and tunneling. It replaces older, insecure protocols like telnet.

Safe vs. Risky Usage:

• Safe / Benign: Read-only activities on a remote system you are authorized to access.

• Risky: Forwarding ports (-L, -R) or changing critical configurations on production servers if you do not fully understand the implications.

Key Features:
• Encrypted Connections: Prevents eavesdropping.

• Key-Based Authentication: More secure and script-friendly than passwords.

• Tunneling/Port Forwarding: Forward local or remote ports through the SSH tunnel
securely.

Examples:
• ssh user@host Prompts for a password (or key passphrase) to log into host.

• ssh -i ~/.ssh/mykey_rsa user@host Uses a specific private key instead of a password.

• ssh -L 8080:localhost:80 user@host Forwards local port 8080 to port 80 on host, effectively tunneling HTTP traffic.
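Options like -i and -L can also live in ~/.ssh/config so they do not have to be retyped. The entry below is a hypothetical sketch: the alias labserver, the address 10.0.0.5, the user, and the key path are all assumptions, not values from this manual.

```
# Hypothetical entry in ~/.ssh/config; all values are placeholders.
Host labserver
    HostName 10.0.0.5
    User student
    IdentityFile ~/.ssh/mykey_rsa
    # Equivalent of: ssh -L 8080:localhost:80 student@10.0.0.5
    LocalForward 8080 localhost:80
```

With this in place, a single command (ssh labserver) performs the login and sets up the tunnel in one step.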

0.1.6 nmap
Purpose: nmap is a powerful network scanner used to discover hosts, open ports, running
services, operating system details, and potential vulnerabilities.

Safe vs. Risky Usage:


• Safe / Benign: Scanning a lab machine or a host you own for educational purposes.

• Risky: Scanning networks without permission is often illegal or against usage policies.
High-intensity scans can appear hostile to intrusion detection systems.

Common Scanning Techniques:


• -sS (SYN Scan): Half-open scan, less likely to be logged.

• -sT (Connect Scan): Uses the OS’s connect call. Easier to detect in logs.

• -sV (Version Detection): Attempts to identify service versions (e.g., Apache 2.4.29).

• -O (OS Fingerprinting): Tries to guess the remote OS.

• -A: Enables several advanced features including OS detection, version detection, and
default script scanning.

Examples:
• nmap -sS <target_host> Performs a stealthy SYN scan of <target_host>.

• nmap -sV -O <target_host> Performs version detection (-sV) and OS fingerprinting (-O).

• nmap -A -T4 <target_host> Comprehensive scan (-A) with faster timing (-T4) on a reliable network.
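Scan results are often filtered in scripts. The snippet below uses a canned sample in the style of nmap's port table (no live scan is performed, and the port list is invented for illustration) to show how open ports can be extracted with awk.

```shell
# Canned sample resembling nmap's port table; not a real scan result.
scan='PORT    STATE  SERVICE
22/tcp  open   ssh
80/tcp  open   http
443/tcp closed https'

# Keep only rows whose STATE column (field 2) is "open",
# printing the PORT column (field 1).
open_ports=$(printf '%s\n' "$scan" | awk '$2 == "open" {print $1}')
echo "$open_ports"
```

In a live session you could feed real output through the same filter, e.g. nmap -sS <target_host> | awk '$2 == "open" {print $1}'.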

0.1.7 iptables
Purpose: iptables configures the Linux netfilter firewall, allowing granular control of
inbound and outbound packets. It is extremely powerful but can also break network access if
misconfigured.

Safe vs. Risky Usage:

• Safe / Benign: Viewing current rules (sudo iptables -L -v) is harmless.

• Risky: Changing default policies to DROP, removing essential ACCEPT rules, or incorrectly forwarding ports can immediately lock you out of the system.

Key Concepts:

• Tables:

– filter – main firewall logic for ACCEPT/DROP.
– nat – network address translation and port forwarding.
– mangle, raw – advanced packet modifications.

• Chains (in the filter table):

– INPUT – for packets destined for the local system.
– FORWARD – for routed packets going through the system.
– OUTPUT – for packets originating from the local system.

• Targets: ACCEPT, DROP, REJECT, etc.

Safe Viewing:

sudo iptables -L -v
sudo iptables -t nat -L -v
sudo iptables -S

These commands list the existing rules without modifying them.

Example Administration:

sudo iptables -A INPUT -p tcp --dport 22 -j ACCEPT


sudo iptables -P INPUT DROP

By first adding an ACCEPT rule for SSH (dport 22) and then setting the default INPUT
policy to DROP, only explicitly allowed ports remain open. Use caution: a single error can
block remote administration.
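The same allow-SSH-then-drop policy can also be expressed declaratively for iptables-restore, which applies the whole ruleset atomically. This is a minimal sketch assuming SSH on port 22; the loopback and connection-tracking rules are common additions, and loading any such ruleset on a real machine carries the lockout risk described above.

```
# Minimal sketch for `iptables-restore` (filter table only).
*filter
:INPUT DROP [0:0]
:FORWARD DROP [0:0]
:OUTPUT ACCEPT [0:0]
# Always allow loopback and already-established connections.
-A INPUT -i lo -j ACCEPT
-A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
# Allow inbound SSH so the DROP policy cannot lock you out.
-A INPUT -p tcp --dport 22 -j ACCEPT
COMMIT
```

Loading rules this way (sudo iptables-restore < rules.v4) avoids the window of inconsistency that can occur while running individual iptables commands.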

0.2 Lab Experiments: Advanced Commands
Below are a series of advanced experiments to deepen your understanding. Each experiment
specifies its Objective, the Expected Outcome, and step-by-step Instructions. Always verify
you have permission before scanning, transferring files, or modifying firewall rules on any
network or system.

1. Experiment: Secure File Transfers

• Tools: scp, sftp


• Objective: Practice encrypted file transfers using SSH, understand how to upload
and download files, and compare scp vs. sftp usage.
• Instructions:
(a) Basic Copy with scp:
scp localfile.txt user@<remote_host>:/home/user/

Verify that localfile.txt now appears in /home/user/ on the remote machine.
(b) Interactive sftp:
sftp user@<remote_host>

After logging in, use ls to list remote files, put anotherfile.txt to upload a file, and get remotefile.txt to download. Then exit (bye or quit).
(c) (Optional) Key-Based Authentication: If you have SSH keys set up, try:
scp -i ~/.ssh/mykey_rsa localfile.txt \
user@<remote_host>:/home/user/

Confirm you do not need a password unless your key has a passphrase.
• Expected Outcome:
– You successfully transfer files to/from the remote server.
– scp and sftp both use SSH, but sftp offers an interactive interface while scp is non-interactive (better for scripts).
– Key-based authentication simplifies automated transfers, avoiding stored passwords.

2. Experiment: Automated Downloads and API Calls

• Tools: wget, curl


• Objective: Learn how to automate file downloads and interact with web APIs.
• Instructions:
(a) Website/File Download:
wget https://www.example.com/testfile.txt

Check that testfile.txt is saved to your current directory.

(b) Viewing HTTP Headers with curl:
curl -I https://www.example.com

Observe the HTTP status code (200 OK, etc.), server type, and any other head-
ers (e.g., Date, Content-Length).
(c) REST API Request: If you have a local or test API endpoint, try:
curl -X POST \
-d '{"name":"Alice"}' \
-H "Content-Type: application/json" \
https://api.example.com/users

Check the JSON response or status code returned by the API.


• Expected Outcome:
– You retrieve public files with wget and see them in your local filesystem.
– curl -I displays the response headers from a website.
– A successful POST request returns a valid response from the test API (e.g., {"id":123,"name":"Alice"}).

3. Experiment: Netcat Listeners and Transfers

• Tools: nc (netcat)
• Objective: Use nc to create an ad-hoc server (listener) on one machine and connect
from another, transferring files or messages in real time. This demonstrates basic
TCP usage and nc’s flexibility.
• Instructions:
(a) Set Up a Listener (Machine A):
nc -l -p 5000 > received.txt

Now Machine A is listening on TCP port 5000 and redirects all incoming data
into received.txt.
(b) Send Data (Machine B):
cat data.txt | nc <MachineA_IP> 5000

Replace <MachineA_IP> with the IP of Machine A. nc connects to port 5000 and streams the contents of data.txt.
(c) Verify Transfer (Machine A): Once the transfer completes, open received.txt
and confirm it matches data.txt.
• Expected Outcome:
– You see real-time data transfer from Machine B to Machine A over TCP.
– received.txt is identical to data.txt.
– You gain insight into how nc can read from stdin and write to stdout, making
it a flexible tool for quick debugging or file transfer.

4. Experiment: Nmap Scanning

• Tools: nmap
• Objective: Investigate open ports and running services on a test machine or lab
environment. Understand how to interpret scan results.
• Instructions:
(a) Basic SYN Scan:
nmap -sS <target_host>

Lists which ports are open/filtered/closed.


(b) Service Version and OS Detection:
nmap -sV -O <target_host>

Attempts to identify the service versions (e.g., SSH 7.9p1, Apache 2.4) and guess the OS (e.g., Linux kernel 5.x).
(c) Nmap Scripting Engine (Optional):
nmap --script=http-title -p 80 <target_host>

Fetches the HTTP title on port 80, if a web server is running there.

• Expected Outcome:
– A list of open ports on the target, like 22/tcp open ssh, 80/tcp open
http, etc.
– If version detection succeeds, nmap might show “Apache httpd 2.4.46” or
“OpenSSH 8.2p1.”
– With -O, it might guess “Linux 4.15 - 5.0” or similar OS.
– The http-title script displays the title tag of the website’s homepage.
• Warning: Scanning a host without explicit permission can violate acceptable use
policies or even local laws. Only scan machines that you own or are explicitly
allowed to test.

5. Experiment: Basic Firewall Rule Testing

• Tools: iptables
• Objective: Carefully add a rule to allow SSH or HTTP inbound, then remove it.
Observe how changes affect connectivity. Perform this only on a test machine or
virtual environment to avoid lockouts.
• Instructions:
(a) View Existing Rules (Safe):
sudo iptables -L -v

Check which ports or services are currently allowed or blocked. Note the de-
fault policy (ACCEPT/DROP).
(b) Add an Allow Rule (Optional):
sudo iptables -A INPUT -p tcp --dport 22 -j ACCEPT

This explicitly allows inbound SSH connections on port 22.
(c) Verify SSH Connectivity: From another machine, attempt:
ssh user@<test_machine_ip>

Confirm that you can connect successfully.


(d) Remove the Rule:
sudo iptables -D INPUT 1

This deletes rule number 1 in the INPUT chain, assuming the rule you added is first; adjust the index if other rules precede it. Verify it is gone by listing the rules again.
(e) Optional Default Policy Change:
sudo iptables -P INPUT DROP

This sets all inbound traffic to DROP by default. You should ensure critical
ports (like SSH) have ACCEPT rules first, or you risk locking yourself out.
• Expected Outcome:
– You see how a new rule appears in the INPUT chain using sudo iptables
-L.
– SSH remains accessible due to the ACCEPT rule. Removing that rule or setting
the default policy to DROP could block SSH unless properly configured.
– You gain practical experience managing basic firewall rules, understanding
how changes can immediately impact connectivity.
• Warning: Improper iptables configuration can break network access. Always
have a direct console or backup method to revert changes if you become locked out.

Note: Remember that many of these advanced commands can significantly impact your system
or network if misused. Always test in a safe, controlled environment, maintain proper backups
of critical configurations, and only scan or modify settings on networks you are authorized to
access. Practicing good documentation and change management will help you avoid accidental
outages or security breaches.

