ATS-15-16 Security Testing Part 3

The document discusses security testing and provides guidance on evaluating the quality of security testing solutions. It makes the following key points: 1. It is difficult to measure the quality and effectiveness of security testing. Factors like false positives, coverage, and business impact must be considered. 2. Different security testing approaches like vulnerability scanning, penetration testing, and code review each have their own pros and cons in terms of coverage, quality, and validity. 3. To evaluate security testing solutions, organizations should perform comparative analysis on their own systems, check references, and validate findings to determine accuracy rates and other quality aspects. Testing the tester is crucial but challenging.


Software Testing

&
Security Testing
Security Testing
• Security Testing is a Critical Element
– Part of the system’s security lifecycle
– Is in fact the QA of the security in the system
– Provides the only way to assess the quality of security
• Nonetheless, Quality is Uncertain
– No way to measure quality of security testing
– No certification or other type of quality ranking
– And even worse – We only know our testing failed
when it’s too late.
Security Testing
• Creates a Huge Challenge for Organizations
– How to choose the right security testing solution
– How can we guarantee that security testing is
sufficient
• Makes Budget Decisions Harder…
– How can we determine cost effectiveness
– How to deal with purchasing constraints
Security Testing
1. Quality of Security Testing
– False Positives / False Negatives
– Coverage / Validity
– Business Impact / Threat
2. Security Testing Approaches (Pros & Cons)
– Black/Grey/White Box
– Penetration Test vs. Code Review
– Automatic vs. Manual
3. Testing the Tester
– Determining the Right Approach
– Evaluating Tools Quality
– Evaluating Services Quality
1. Quality of Security Testing

Quality of Security Testing
• Quality of Testing is Essentially Measured by Two
Elements:
• False Negatives (Something was missed…)
– Most obvious problem
– Exposes the system to attacks
• False Positives (Something was made up…)
– Surprisingly – equal problem for enterprises
– Generates redundant work and effort
– Creates distrust
– Not necessarily technological (the flaw is there – but
poses no real threat)
Coverage Problems
• Application Data Coverage
– Automatic crawling problems
• Practically infinite links
• Client-side created links (JS/AJAX)
• Proper flow and context data
– Availability
• 3rd Party components can not be tested
• Code unavailable
– Size
• Too many URLs/parameters/code to cover
• Insufficient time
Coverage Problems
• Test Coverage
– Vulnerability not tested
• Impossible to test (logical flaws for automated tools, brute force for manual testing, etc.)
• Newly Discovered Vulnerability (Not up to date yet…)
• Seemingly Insignificant Vulnerability
• Never-before seen vulnerability (Mostly logical…)
• Test may impair availability or reliability
– Variant not tested
• Too many possible variants (common with injection
problems) – may require very specific tweaking
• Logical vulnerability extremely dependent on actual
application
Business Impact
• How Dangerous is This Vulnerability?
• Still Controversial – Do we really want to fix just
vulnerabilities posing immediate threats?
• Can we call it a vulnerability if it does nothing?
• Associating Risk
– Organizations usually prioritize effort by risk
– Do we really give each vulnerability the same risk level
in every system?
• Requires Contextual understanding of system

2. Security Testing Approaches (Pros & Cons)

Security Testing Approaches
(Pros & Cons)
• Black/Grey Box
– Application vulnerability scanners
– Manual penetration test

• White Box
– Static code analyzers
– Manual code review

Application Vulnerability Scanners
• Application Data Coverage
– Good in terms of volume (Large applications)
– Problematic in contextual aspects
• Complex Flows
• Multiple User Privileges
• Specific data influences code executed
• Coverage of Tests
– Generally good with Technical Vulnerabilities
– Very limited with Logical Vulnerabilities
– Variant Coverage – Wide, but not adaptive
Application Vulnerability Scanners
• Test Quality / Proficiency
– Generally Good (Depends on product…)
– However, fails to adapt to changes and non-standard environments
• Validity
– Limited – Generally high rate of False Positives
• No (or very limited) Exploits
• Validation differentiators suffer from test quality
• Business Impact
– None
Manual Penetration Testing
• Application Data Coverage
– Good in contextual aspects
• Allows properly utilizing the application
• Differences between users are usually clear
– May be problematic in volume aspects
• Nonetheless, proper categorization can solve volume
issues
• Coverage of Tests
– Very good – if working methodologically
– Variant coverage
• Not necessarily wide
• However, proper testing allows finding the right variants
Manual Penetration Testing
• Test Quality / Proficiency
– Potentially good – but depends greatly on the person
– Main advantage – allows creativity and adaptation to identify non-standard vulnerabilities
• Validity
– Usually good – Easier for person to identify false
positives
– Easier to perform exploits
• Business Impact
– Can be considered by tester

White box testing:
Static Code Analyzers
• Application Data Coverage
– Generally good (no crawling setbacks)
– Problematic when not all code available
• Coverage of Tests
– Generally good with technical vulnerabilities
– Very limited with logical vulnerabilities
– Variant coverage – wide, but not adaptive

Manual Code Review (Static)
• Application Data Coverage
– Can be problematic – Usually impossible to go
over every line of code
– Requires smart analysis of what to review and
what not to review
• Coverage of Tests
– Generally good with technical vulnerabilities
– Somewhat limited with logical vulnerabilities (often hard to determine the full logic of non-running code)
Choosing the Right Approach
• Determining the Types of Threats
• Determining the Required Frequency
• Weighing Pros, Cons and Costs
• Usually – A combination of approaches applies:
– Manual Penetration Test + Partial Code Review
– Manual Penetration Test + Scanner (Free/Commercial)
– Scanner + Partial Manual Penetration Test (Validation)
– Scanner + Static Code Analyzer (Correlation)
– Static Code Analyzer + Manual Code Review (Validation)
– Etc…

Testing the Tester: Evaluating Quality of Security Testing

Testing the Tester
• The Hardest Part – Determining the quality
of security testing solution:
• This Consists Of:
– Identify % of false negatives
– Identify % of false positives
– Other aspects (Not discussed here)
• Performance
• Management
• Reporting
• Etc.
Testing the Tester
• How NOT to Determine Quality
– Marketing material
– Sales pitches
– Magazine articles (Usually not professional enough)
– Benchmarking on known applications (WebGoat,
Hackme Bank, etc.)
• So What Should We Do?
– References (that we trust…)
– Comparative analysis (on our systems)
– Ideally – Compare with a “perfect” report

Product Assessment
• Comparative Analysis
– Run several products on a few systems in the enterprise
– False Negatives
• Ideally – compare against a report containing all findings –
identify percent of false negative in each product.
• Alternatively – unite the real findings from all reports, and
compare against that
– False Positives
• Perform validation of each finding to eliminate all false
positives
• Note the amount of false positives in each product
– Assess other aspects (if needed) – Details of report, speed of execution, etc.
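The comparative analysis above can be sketched numerically. A minimal Python sketch, assuming findings are identified by string labels (the labels here are hypothetical) and that the "perfect" report, or the union of validated findings from all reports, is available as a set:

```python
def assess_product(real_findings, reported):
    """Compare a product's reported findings against the set of real findings."""
    false_positives = reported - real_findings   # made-up findings
    false_negatives = real_findings - reported   # missed vulnerabilities
    return {
        "fp_rate": len(false_positives) / len(reported) if reported else 0.0,
        "fn_rate": len(false_negatives) / len(real_findings) if real_findings else 0.0,
    }

# Hypothetical example: union of validated findings from all reports
real = {"sqli-login", "xss-search", "csrf-profile", "idor-orders"}
product_a = {"sqli-login", "xss-search", "xss-footer"}  # xss-footer is a false positive
print(assess_product(real, product_a))
```

Running the same computation for each product gives comparable false positive and false negative rates per system.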
Service Assessment
• Much Trickier
– Hiring a consultant is like hiring an employee
• First of All - References
– Find references of customers with similar
environments and needs
– Ask around yourself – the vendor will always provide
you with their best references!
• Check the Specific Consultant
– It’s not just about the company – it’s about the people
involved in the project
– Check the resume, perform an interview

Service Assessment
• Comparative Analysis
– Similar to product – great way of comparing services
– Main problem – Usually expensive
– Important note – The benchmarking should be done
without prior knowledge of the testers!
– False positive & negative assessment:
• Mostly similar
• Business impact, however, now plays a role – a good tester should eliminate (or downgrade) non-hazardous findings
• See if testing includes strong validation (exploitation)
– Quality of report and information gathered in it should also
be examined
More on Security Testing
Test suites & test oracles
To test a SUT (System Under Test) we need two things:

1. a test suite, i.e. a collection of input data

2. a test oracle, which decides if a test passed or revealed an error

– i.e. some way to decide if the SUT behaves as we want

Both defining test suites and test oracles can be a lot of work!
• In the worst case, a test oracle is a long list of specification cases
which, for each individual test case, specifies exactly what should
happen
• In the best case, as test oracle we simply look if an application
crashes

– Moral of the story: crashes are good! (for testing)
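The crash-as-oracle idea above can be sketched in a few lines of Python: run the SUT on each input and record only crashes. The parse_record SUT and its inputs are hypothetical stand-ins:

```python
def crash_oracle(sut, test_suite):
    """Simplest possible test oracle: a test fails only if the SUT crashes."""
    failures = []
    for case in test_suite:
        try:
            sut(case)             # the result itself is ignored entirely...
        except Exception as exc:  # ...only crashes are recorded
            failures.append((case, exc))
    return failures

# Hypothetical SUT that crashes on input without a comma-separated second field
def parse_record(line):
    return line.split(",")[1]

print(crash_oracle(parse_record, ["a,b", "x,y,z", ""]))  # the empty string crashes
```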


Abuse cases
&
Negative testing

Security testing is HARD
• Normal testing looks at the right, wanted behaviour for sensible
inputs (aka the happy flow), and at some inputs on borderline
conditions

• Security testing also requires looking for the wrong, unwanted
behaviour for really strange inputs

• Similarly, normal use of a system is more likely to reveal
functional problems than security problems:

– users will complain about functional problems;
hackers won't complain about security problems

Security testing is HARD
[Diagram: the space of all possible inputs is vast; normal inputs cover only a small region of it, and the input that triggers a security bug may lie far outside that region.]
abuse cases & negative test cases
• Thinking about abuse cases is a useful way to come up with
security tests

– what would an attacker try to do?

– where could an implementation slip up?

• This gives rise to negative test cases:
tests which are supposed to fail
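A minimal sketch of positive vs. negative test cases, against a hypothetical login function (the credential store and the abuse-case payloads are illustrative):

```python
USERS = {"alice": "s3cret"}  # hypothetical credential store

def login(username, password):
    """Hypothetical SUT: succeeds only on an exact credential match."""
    return USERS.get(username) == password

# Positive test: the happy flow works
assert login("alice", "s3cret")

# Negative tests: abuse-case inputs are supposed to FAIL
assert not login("alice", "' OR '1'='1")   # injection-style password
assert not login("alice' --", "anything")  # injection-style username
assert not login("alice", "")              # blank password
print("all negative tests passed")
```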

Fuzzing
• Fuzzing aka fuzz testing is a highly effective, largely automated,
security testing technique

• Basic idea: (semi-)automatically generate random inputs and see if
an application crashes

• The original form of fuzzing: generate very long inputs and see if
the system crashes with a segmentation fault.

Simple fuzzing ideas
What inputs would you use for fuzzing?

• very long or completely blank strings

• max. or min. values of integers, or simply zero and negative values

• depending on what you are fuzzing, include special values,
characters or keywords likely to trigger bugs, e.g.
– nulls, newlines, or end-of-file characters
– format string characters
– semi-colons, slashes and backslashes, quotes
– application-specific keywords: halt, DROP TABLES, ...
– ....
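The ideas above can be collected into a small fuzz-input generator; a Python sketch in which the special values and keywords are an illustrative selection:

```python
import random
import string

def simple_fuzz_inputs(rng=random.Random(0)):
    """Yield fuzz inputs based on the classic ideas: extreme lengths,
    boundary integers, and special characters/keywords."""
    yield ""                                  # completely blank string
    yield "A" * 100_000                      # very long string
    for n in (0, -1, 2**31 - 1, -2**31):     # zero, negative, boundary integers
        yield str(n)
    specials = ["\0", "\n", "%s%s%s", ";", "/", "\\", "'", '"', "DROP TABLES"]
    yield from specials                      # nulls, format strings, quotes, keywords
    for _ in range(3):                       # a few random printable mixtures
        yield "".join(rng.choice(string.printable) for _ in range(20))

inputs = list(simple_fuzz_inputs())
print(len(inputs), "fuzz inputs generated")
```

Feeding each of these to the SUT under a crash oracle is already a rudimentary fuzzer.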

Fuzzing web-applications
• How could a fuzzer detect SQL injections or XSS weaknesses?
– For SQL injection: monitor database for error messages
– For XSS, see if the website echoes HTML tags in user input
• There are various tools to fuzz web-applications: Spike proxy, HP
WebInspect, AppScan, WebScarab, Wapiti, w3af, RFuzz, WSFuzzer,
SPI Fuzzer, Burp, Mutillidae, ...
• Some fuzzers crawl a website, generating traffic themselves;
other fuzzers modify traffic generated by some other means.
• Can we expect false positives/negatives?
– false negatives due to test cases not hitting the vulnerable cases
– false positives & negatives due to an incorrect test oracle, e.g.
• for SQL injection: not recognizing some SQL database errors
(false neg)
• for XSS: signaling a quoted echoed response as XSS (false pos)
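The two oracles described above (database error messages for SQL injection, echoed tags for XSS) can be sketched as simple response checks. The error signatures and response strings below are illustrative; as noted above, an incomplete signature list is exactly what causes false negatives:

```python
# Illustrative (incomplete) list of database error signatures
SQL_ERROR_SIGNATURES = [
    "you have an error in your sql syntax",   # MySQL-style message
    "unclosed quotation mark",                # MSSQL-style message
    "sqlite3.operationalerror",
]

def looks_like_sqli(response_body):
    """Oracle: a known database error message appears in the response."""
    body = response_body.lower()
    return any(sig in body for sig in SQL_ERROR_SIGNATURES)

def looks_like_xss(payload, response_body):
    """Crude oracle: the injected HTML tag comes back unescaped."""
    return "<script>" in payload and payload in response_body

payload = "<script>alert(1)</script>"
print(looks_like_sqli("Warning: You have an error in your SQL syntax near ''"))
print(looks_like_xss(payload, "You searched for: " + payload))
print(looks_like_xss(payload, "You searched for: &lt;script&gt;alert(1)&lt;/script&gt;"))
```

The third check returns False because the echoed response is HTML-encoded, which is the quoted-response case the slide flags.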

Why Security Testing
Ø For Finding Loopholes
Ø For Zeroing in on Vulnerabilities
Ø For identifying Design Insecurities
Ø For identifying Implementation Insecurities
Ø For identifying Dependency Insecurities and Failures
Ø For Information Security
Ø For Process Security
Ø For Internet Technology Security
Ø For Communication Security
Ø For Improving the System
Ø For confirming Security Policies
Ø For Organization wide Software Security
Ø For Physical Security
Security Testing Techniques
Ø OS Hardening
v Configure and Apply Patches
v Updating the Operating System
v Disable or Restrict unwanted Services and Ports
v Manage the Log Files
v Install Root Certificate
v Protect from Internet Misuse and be Cyber Safe
v Protect from Malware
Ø Vulnerability Scanning
v Identify Known Vulnerabilities
v Scan Intrusively for Unknown Vulnerabilities
Security Testing Techniques (continued…)
Ø Penetration Testing
v Simulating Attack from a Malicious Source
v Includes Network Scanning and Vulnerability Scanning
v Simulates Attack from someone Unfamiliar with the System
v Simulates Attack by having access to Source Code, Network,
Passwords
Ø Port Scanning and Service Mapping
v Identification and locating of Open Ports
v Identification of Running Services
Ø Firewall Rule Testing
v Identify Inappropriate or Conflicting Rules
v Appropriate Placement of Vulnerable Systems behind Firewall
v Discovering Administrative Backdoors or Tunnels
Ø SQL Injection
v Exploits Database Layer Security Vulnerability
v Unexpected Execution of User Inputs
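The "unexpected execution of user inputs" above is easiest to see in code. A sketch using Python's sqlite3: the vulnerable query pastes user input into the SQL text, while the parameterized query treats the same input as a literal value:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")
conn.execute("INSERT INTO users VALUES ('bob', 'hunter2')")

malicious = "nobody' OR '1'='1"

# Vulnerable: user input becomes part of the SQL statement itself
rows = conn.execute(
    "SELECT * FROM users WHERE name = '" + malicious + "'"
).fetchall()
print("vulnerable query returned", len(rows), "rows")   # returns every row

# Safe: a parameterized query keeps the input out of the SQL text
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (malicious,)
).fetchall()
print("parameterized query returned", len(rows), "rows")  # returns no rows
```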
Security Testing Techniques (continued…)
Ø Cross-Site Scripting
v Injecting Malicious Client Side Script into Web Pages
v Persistent, Non-Persistent and DOM based Vulnerabilities
Ø Parameter Manipulation
v Cookie Manipulation
v Form Field Manipulation
v URL Manipulation
v HTTP Header Manipulation
Ø Denial of Service Testing
v Flooding a target machine with enough traffic to render it
incapable of serving legitimate requests
Ø Command Injection
v Inject and execute commands specified by the attacker
v Execute System level commands through a Vulnerable
Application
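The cross-site scripting item above comes down to echoing user input into HTML without encoding. A minimal sketch of the defect and the output-encoding fix, using Python's html.escape (the page template is hypothetical):

```python
import html

def render_search_unsafe(term):
    # Vulnerable: user input is echoed verbatim into the page
    return "<p>Results for: " + term + "</p>"

def render_search_safe(term):
    # Fixed: HTML metacharacters are encoded before echoing
    return "<p>Results for: " + html.escape(term) + "</p>"

payload = "<script>alert('xss')</script>"
print(render_search_unsafe(payload))  # script tag survives -> XSS
print(render_search_safe(payload))    # &lt;script&gt;... -> inert text
```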
Security Testing Techniques (continued…)
Ø Network Scanning
v Identifying Active Hosts on a network
v Collecting IP addresses that can be accessed over the Internet
v Collecting OS Details, System Architecture and Running
Services
v Collecting Network User and Group names
v Collecting Routing Tables and SNMP data
Ø Password Cracking
v Collecting Passwords from the Stored or Transmitted Data
v Using Brute Force and Dictionary Attacks
v Identifying Weak Passwords
Ø Ethical Hacking
v Penetration Testing, Intrusion Testing and Red Teaming
Ø File Integrity Testing
v Verifying File Integrity against corruption using Checksum
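The checksum-based file integrity test above can be sketched with Python's hashlib: record a known-good digest, then re-hash to detect corruption or tampering:

```python
import hashlib
import os
import tempfile

def file_digest(path):
    """SHA-256 of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demonstration on a temporary file
fd, path = tempfile.mkstemp()
os.write(fd, b"original contents")
os.close(fd)

baseline = file_digest(path)          # record the known-good checksum
with open(path, "ab") as f:           # simulate tampering/corruption
    f.write(b"!")
print("intact:", file_digest(path) == baseline)  # prints "intact: False"
os.remove(path)
```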
Security Testing Techniques (continued…)
Ø War Dialing
v Using a Modem to dial a list of Telephone Numbers
v Searching for Computers, Bulletin Board System and Fax
Machines
Ø Wireless LAN Testing
v Searching for existing WLAN and logging Wireless Access
Points
Ø Buffer Overflow Testing
v Overwriting memory fragments of the process, e.g. fixed-size
char buffers
Ø Random Data Testing
v Random Data Inputs by a Program
v Encoded Random Data included as Parameters
v Crashing built-in code Assertions
Security Testing Techniques (continued…)
Ø Random Mutation Testing
v Bit Flipping of known Legitimate Data
v Byte stream Sliding within known Legitimate Data
Ø Session Hijacking
v Exploitation of Valid Computer Session
v Exploitation of the Web Session control mechanism
v Gain unauthorized access to the Web Server
Ø Phishing
v Masquerading as a trustworthy entity in an electronic
communication
v Acquiring usernames, passwords and credit card details
Ø URL Manipulation
v Make a web server Deliver inaccessible web pages
v URL Rewriting
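The random mutation testing item above (bit flipping over known legitimate data) can be sketched as follows; the parse function with its 4-byte magic header is a hypothetical SUT:

```python
import random

def flip_random_bit(data, rng):
    """Flip one random bit in a copy of known-legitimate data."""
    out = bytearray(data)
    pos = rng.randrange(len(out))
    out[pos] ^= 1 << rng.randrange(8)
    return bytes(out)

def mutation_fuzz(parse, seed_input, rounds=100, rng=random.Random(0)):
    """Feed bit-flipped mutants of a valid input to the parser; collect crashes."""
    crashes = []
    for _ in range(rounds):
        mutant = flip_random_bit(seed_input, rng)
        try:
            parse(mutant)
        except Exception as exc:
            crashes.append((mutant, exc))
    return crashes

# Hypothetical parser that insists on a fixed 4-byte magic header
def parse(blob):
    if blob[:4] != b"FMT1":
        raise ValueError("bad magic")
    return blob[4:]

print(len(mutation_fuzz(parse, b"FMT1hello-world")), "crashing mutants found")
```

Starting from valid data means most mutants get deep into the parser before failing, which is what makes mutation fuzzing effective.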
Security Testing Techniques (continued…)
Ø IP Spoofing
v Creating Internet Protocol (IP) packets with a forged source IP
address
Ø Packet Sniffing
v Capture and Analyze all of the Network traffic
Ø Virtual Private Network Testing
v Penetration Testing
Ø Social Engineering
v Psychological Manipulation of People
v Leaking confidential information
Additional Info
Complexity Vs Security
• As functionality, and hence complexity, increases, security decreases.
• Integrating security into functionality at design time is easier and
cheaper; it also costs less in the long term (maintenance cost).

"100 Times More Expensive to Fix Security Bug at Production Than
Design"
– IBM Systems Sciences Institute
A Few Facts and figures:
How Many Vulnerabilities Are Application Security Related?
How do we do it?
• Security Analyst:
– Get involved early in SDLC. Security is a function of
the asset we want to secure, what's it worth?
– Understanding the information held in the application
and the types of users is half the battle.
– Involve an analyst in the design phase and thereafter.
• Developer:
– Embrace secure application development. (Educate)
– Quality is not just "Does it work?"; security is a measure
of quality too.
How do we do it? (contd)
• QA:
– Security vulnerabilities are to be considered
bugs, the same way as a functional bug, and
tracked in the same manner.
• Managers:
– Factor some time into the project plan for
security.
– Consider security as added value in an
application.
– $1 spent up front saves $10 during development and $100 after release
Software security tollgates in the SDLC
[Diagram, partially recoverable: Risk = Threat x Vulnerability x Cost; what do we need – the ability to test, code review tools.]
Iterative approach

Security activities are mapped onto the artifacts of each lifecycle phase:
– Requirements and use cases → Security requirements
– Design → Risk analysis, Design review
– Test plans → Risk-based security tests
– Code → Static analysis (tools), Code review
– Test results → Penetration testing
– Field feedback → Risk analysis
Application Security Risk
Categorization
• Goal
– More security for riskier applications
– Ensures that you work the most critical issues first
– Scales to hundreds or thousands of applications

• Tools and Methodology


– Security profiling tools can gather facts
• Size, complexity, security mechanisms, dangerous calls
– Questionnaire to gather risk information
• Asset value, available functions, users, environment, threats
– Risk-based approach
• Evaluates likelihood and consequences of successful attack
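The risk-based approach above (likelihood and consequences of a successful attack) can be sketched as a simple scoring scheme; the 1-5 scales, thresholds, and application portfolio are illustrative assumptions, not a prescribed methodology:

```python
def risk_category(likelihood, consequence):
    """Score = likelihood x consequence, each on an illustrative 1-5 scale."""
    score = likelihood * consequence
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Hypothetical application portfolio: (likelihood, consequence)
apps = {
    "internet-banking": (4, 5),   # internet-facing, high asset value
    "intranet-wiki": (3, 2),
    "static-brochure": (2, 1),
}

# Rank so the most critical application is worked first
ranked = sorted(apps, key=lambda a: apps[a][0] * apps[a][1], reverse=True)
print(ranked)
print({a: risk_category(*apps[a]) for a in apps})
```

A scheme like this is what lets the categorization scale to hundreds of applications: the questionnaire and profiling tools feed the scores, and the ranking drives the work order.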
Application Security Project
Plan
• Define the plan to ensure security at the end
–Ideally done at start of project
–Can also be started before or after development is
complete

• Based on the risk category


–Identify activities at each phase
–Necessary people and expertise required
–Who has responsibility for risks
–Ensure time and budget for security activities
–Establish framework for establishing the “line of sight”
Design Reviews
• Better to find flaws early
• Security design reviews
– Check to ensure design meets requirements
– Also check to make sure you didn’t miss a requirement
• Assemble a team
– Experts in the technology
– Security-minded team members
– Do a high-level penetration test against the design
– Be sure to do root cause analysis on any flaws identified
Software Vulnerability Analysis
• Find flaws in the code early
• Many different techniques
– Static (against source or compiled code)
• Security focused static analysis tools
• Peer review process
• Formal security code review
– Dynamic (against running code)
• Scanning
• Penetration testing
• Goal
– Ensure completeness (across all vulnerability areas)
– Ensure accuracy (minimize false alarms)
Application Security Testing
• Identify security flaws during testing

• Develop security test cases


– Based on requirements
– Be sure to include “negative” tests
– Test all security mechanisms and common vulnerabilities

• Flaws feed into defect tracking and root cause analysis


Application Security Defect
Tracking and Metrics
• “Every security flaw is a process problem”
• Tracking security defects
– Find the source of the problem
• Bad or missed requirement, design flaw, poor implementation, etc…
– ISSUE: can you track security defects the same way as other defects
• Metrics
– What lifecycle stage are most flaws originating in?
– What security mechanisms are we having trouble implementing?
– What security vulnerabilities are we having trouble avoiding?
Configuration Management and
Deployment
• Ensure the application configuration is secure

• Security is increasingly “data-driven”


–XML files, property files, scripts, databases, directories

• How do you control and audit this data?


–Design configuration data for audit
–Put all configuration data in CM
–Audit configuration data regularly
–Don’t allow configuration changes in the field
What now?
"So now, when we face a choice between adding
features and resolving security issues, we
need to choose security."
-Bill Gates

"If you think technology can solve your security
problems, then you don't understand the problems
and you don't understand the technology."
– Bruce Schneier
Security Testing--Summary
• Quality of Security Testing is Hard to Measure or
Quantify
• Nonetheless – It is Important to Maintain Adequate
Quality to Address the Threat
• Quality of Security (and Security Testing) Must be
Guaranteed In Advance
• Maintaining Quality has an Associated Cost:
– Testing for quality
– Best products, best tools, best consultants
• Finding the Balance is Crucial

