ATS-15-16 Security Testing Part 3

Security Testing
• Security Testing is a Critical Element
– Part of the system’s security lifecycle
– It is, in fact, the QA of the system’s security
– Provides the only way to assess the quality of security
• Nonetheless, Quality is Uncertain
– No way to measure quality of security testing
– No certification or other type of quality ranking
– Worst of all – we only learn that our testing failed when it’s too late
Security Testing
• Creates a Huge Challenge for Organizations
– How to choose the right security testing solution
– How can we guarantee that security testing is
sufficient
• Makes Budget Decisions Harder…
– How can we determine cost effectiveness
– How to deal with purchasing constraints
Security Testing
1. Quality of Security Testing
– False Positives / False Negatives
– Coverage / Validity
– Business Impact / Threat
2. Security Testing Approaches (Pros & Cons)
– Black/Grey/White Box
– Penetration Test vs. Code Review
– Automatic vs. Manual
3. Testing the Tester
– Determining the Right Approach
– Evaluating Tools Quality
– Evaluating Services Quality
1. Quality of Security Testing
Quality of Security Testing
• Quality of Testing is Essentially Measured by Two
Elements:
• False Negatives (Something was missed…)
– Most obvious problem
– Exposes the system to attacks
• False Positives (Something was made up…)
– Surprisingly – an equal problem for enterprises
– Generates redundant work and effort
– Creates distrust
– Not necessarily technological (the flaw is there – but
poses no real threat)
Coverage Problems
• Application Data Coverage
– Automatic crawling problems
• Practically infinite links
• Client-side created links (JS/AJAX)
• Proper flow and context data
– Availability
• 3rd-party components cannot be tested
• Code unavailable
– Size
• Too many URLs/parameters/code to cover
• Insufficient time
Coverage Problems
• Test Coverage
– Vulnerability not tested
• Impossible to test (e.g. logical flaws by automated tools, brute force by manual testing)
• Newly Discovered Vulnerability (Not up to date yet…)
• Seemingly Insignificant Vulnerability
• Never-before seen vulnerability (Mostly logical…)
• Test may impair availability or reliability
– Variant not tested
• Too many possible variants (common with injection
problems) – may require very specific tweaking
• Logical vulnerability extremely dependent on actual
application
Business Impact
• How Dangerous is This Vulnerability?
• Still Controversial – Do we really want to fix just
vulnerabilities posing immediate threats?
• Can we call it a vulnerability if it does nothing?
• Associating Risk
– Organizations usually prioritize effort by risk
– Do we really give each vulnerability the same risk level
in every system?
• Requires Contextual understanding of system
2. Security Testing Approaches (Pros & Cons)
Security Testing Approaches
(Pros & Cons)
• Black/Grey Box
– Application vulnerability scanners
– Manual penetration test
• White Box
– Static code analyzers
– Manual code review
Application Vulnerability Scanners
• Application Data Coverage
– Good in terms of volume (Large applications)
– Problematic in contextual aspects
• Complex Flows
• Multiple User Privileges
• Specific data influences code executed
• Coverage of Tests
– Generally good with Technical Vulnerabilities
– Very limited with Logical Vulnerabilities
– Variant Coverage – Wide, but not adaptive
Application Vulnerability Scanners
• Test Quality / Proficiency
– Generally Good (Depends on product…)
– However, it fails to adapt to changes and non-standard environments
• Validity
– Limited – Generally high rate of False Positives
• No (or very limited) Exploits
• Validation differentiators suffer from test quality
• Business Impact
– None
Manual Penetration Testing
• Application Data Coverage
– Good in contextual aspects
• Allows properly utilizing the application
• Differences between users are usually clear
– May be problematic in volume aspects
• Nonetheless, proper categorization can solve volume
issues
• Coverage of Tests
– Very good – if working methodologically
– Variant coverage
• Not necessarily wide
• However, proper testing allows finding the right variants
Manual Penetration Testing
• Test Quality / Proficiency
– Potentially good – but depends greatly on the person
– Main advantage – allows creativity and adaption to
identify non standard vulnerabilities
• Validity
– Usually good – Easier for person to identify false
positives
– Easier to perform exploits
• Business Impact
– Can be considered by tester
White Box Testing: Static Code Analyzers
• Application Data Coverage
– Generally good (no crawling setbacks)
– Problematic when not all code available
• Coverage of Tests
– Generally good with technical vulnerabilities
– Very limited with logical vulnerabilities
– Variant coverage – wide, but not adaptive
Manual Code Review (Static)
• Application Data Coverage
– Can be problematic – Usually impossible to go
over every line of code
– Requires smart analysis of what to review and
what not to review
• Coverage of Tests
– Generally good with technical vulnerabilities
– Somewhat limited with logical vulnerabilities (often hard to determine the full logic of non-running code)
Choosing the Right Approach
• Determining the Types of Threats
• Determining the Required Frequency
• Weighing Pros, Cons and Costs
• Usually – A combination of approaches applies:
– Manual Penetration Test + Partial Code Review
– Manual Penetration Test + Scanner (Free/Commercial)
– Scanner + Partial Manual Penetration Test (Validation)
– Scanner + Static Code Analyzer (Correlation)
– Static Code Analyzer + Manual Code Review (Validation)
– Etc…
Testing the Tester: Evaluating Quality of Security Testing
Testing the Tester
• The Hardest Part – Determining the quality of a security testing solution:
• This Consists Of:
– Identify % of false negatives
– Identify % of false positives
– Other aspects (Not discussed here)
• Performance
• Management
• Reporting
• Etc.
Testing the Tester
• How NOT to Determine Quality
– Marketing material
– Sales pitches
– Magazine articles (Usually not professional enough)
– Benchmarking on known applications (WebGoat,
Hackme Bank, etc.)
• So What Should We Do?
– References (that we trust…)
– Comparative analysis (on our systems)
– Ideally – Compare with a “perfect” report
Product Assessment
• Comparative Analysis
– Run several products on a few systems in the enterprise
– False Negatives
• Ideally – compare against a report containing all findings, and identify the percentage of false negatives for each product
• Alternatively – combine the real findings from all reports, and compare against that union
– False Positives
• Perform validation of each finding to eliminate all false
positives
• Note the amount of false positives in each product
– Assess other aspects (if needed) – Details of report,
speed of execution, etc.
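The comparative-analysis steps above can be sketched in a few lines. This is a minimal illustration, not a real tool: the finding identifiers and the product's output are hypothetical, and the "perfect" report is assumed to list every real flaw.

```python
def assess_product(reported, perfect):
    """Compare a product's findings with a 'perfect' report.

    Returns (false_negative_rate, false_positive_count):
    - false negatives: real flaws the product missed
    - false positives: reported findings with no real flaw behind them
    """
    reported, perfect = set(reported), set(perfect)
    missed = perfect - reported
    made_up = reported - perfect
    return len(missed) / len(perfect), len(made_up)

# Hypothetical data: each string identifies one finding.
perfect = {"sqli:/login", "xss:/search", "csrf:/transfer", "idor:/account"}
product_a = {"sqli:/login", "xss:/search", "xss:/help"}

fn_rate, fp_count = assess_product(product_a, perfect)
```

Here the product missed two of four real flaws (50% false negatives) and invented one finding (one false positive); in practice the hard part is building the "perfect" report, which is why the slides suggest uniting the validated findings of all products as a substitute.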
Service Assessment
• Much Trickier
– Hiring a consultant is like hiring an employee
• First of All - References
– Find references of customers with similar
environments and needs
– Ask around yourself – the vendor will always provide
you with their best references!
• Check the Specific Consultant
– It’s not just about the company – it’s about the people
involved in the project
– Check the resume, perform an interview
Service Assessment
• Comparative Analysis
– Similar to product – great way of comparing services
– Main problem – Usually expensive
– Important note – The benchmarking should be done
without prior knowledge of the testers!
– False positive & negative assessment:
• Mostly similar
• Business impact, however, now plays a role – a good tester should eliminate (or downgrade) non-hazardous findings
• See if testing includes strong validation (exploitation)
– Quality of report and information gathered in it should also
be examined
More on Security Testing
Test suites & test oracles
To test a SUT (System Under Test) we need two things:
1. a test suite (a set of test cases)
2. a test oracle, which decides whether a test passed or revealed an error
Both defining test suites and test oracles can be a lot of work!
• In the worst case, a test oracle is a long list of specification cases
which, for each individual test case, specifies exactly what should
happen
• In the best case, the test oracle simply checks whether the application crashes
Security testing is HARD
• Normal testing looks at correct, intended behaviour for sensible inputs (aka the happy flow), plus some inputs on boundary conditions
Security testing is HARD
[Figure: the space of all possible inputs; normal inputs cover only a small region of it, and the input that triggers a security bug lies outside that region]
abuse cases & negative test cases
• Thinking about abuse cases is a useful way to come up with
security tests
Fuzzing
• Fuzzing aka fuzz testing is a highly effective, largely automated,
security testing technique
• The original form of fuzzing: generate very long inputs and see if
the system crashes with a segmentation fault.
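The original fuzzing recipe can be sketched as a loop that feeds ever-longer inputs to a target and records which lengths make it crash. This is only an illustration: `toy_parser` is a made-up buggy function standing in for a real program, and a Python exception stands in for a segmentation fault.

```python
def fuzz_long_inputs(target, max_len=1 << 16):
    """Feed inputs of doubling length to `target`; record crashes."""
    crashes = []
    length = 1
    while length <= max_len:
        data = "A" * length
        try:
            target(data)
        except Exception as exc:  # a crash stands in for a segfault
            crashes.append((length, type(exc).__name__))
        length *= 2
    return crashes

# Hypothetical buggy parser: assumes input fits a fixed-size buffer.
def toy_parser(s):
    if len(s) > 4096:
        raise MemoryError("buffer too small")

crashes = fuzz_long_inputs(toy_parser)
```

The loop finds the first crashing length (8192 here), which is exactly the kind of signal the original fuzzers looked for; modern fuzzers add smarter input generation, as the next slides discuss.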
Simple fuzzing ideas
What inputs would you use for fuzzing?
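One possible answer set, as a sketch: inputs that historically expose parsing, memory, and injection bugs. The generator below is illustrative; any real fuzzer would draw from much larger dictionaries.

```python
def classic_fuzz_inputs():
    """Yield inputs that commonly expose parsing and memory bugs."""
    yield "A" * 100000           # very long input (buffer overflows)
    yield "%s%s%s%n"             # format-string specifiers
    yield "'\" <script>--;"      # SQL/HTML metacharacters
    yield "\x00\xff\xfe"         # NUL and non-ASCII bytes
    yield str(2 ** 31)           # integer boundary value
    yield "../../etc/passwd"     # path traversal

inputs = list(classic_fuzz_inputs())
```

Each category targets a different failure mode: long inputs and boundary integers probe memory handling, metacharacters probe injection, and traversal strings probe file access.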
Fuzzing web-applications
• How could a fuzzer detect SQL injections or XSS weaknesses?
– For SQL injection: monitor database for error messages
– For XSS, see if the website echoes HTML tags in user input
• There are various tools to fuzz web applications: Spike Proxy, HP WebInspect, AppScan, WebScarab, Wapiti, w3af, RFuzz, WSFuzzer, SPI Fuzzer, Burp, Mutillidae, ...
• Some fuzzers crawl a website, generating traffic themselves,
other fuzzers modify traffic generated by some other means.
• Can we expect false positives/negatives?
– false negatives due to test cases not hitting the vulnerable cases
– false positives & negatives due to an incorrect test oracle, e.g.
• for SQL injection: not recognizing some SQL database errors
(false neg)
• for XSS: signaling quoted echoed response as XSS (false pos)
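The two oracles described above can be sketched directly, and the sketch makes their failure modes concrete. The error-string list and function names are illustrative assumptions, not any particular tool's implementation.

```python
SQL_ERROR_SIGNS = ("syntax error", "mysql_fetch", "ORA-01756", "SQLSTATE")

def sqli_oracle(response_body: str) -> bool:
    """Flag a response that leaks a database error message.

    Misses error strings not in the list -> false negatives.
    """
    body = response_body.lower()
    return any(sign.lower() in body for sign in SQL_ERROR_SIGNS)

def xss_oracle(payload: str, response_body: str) -> bool:
    """Flag a response that echoes the payload verbatim.

    Flags echoes that are actually safely quoted -> false positives.
    """
    return payload in response_body
```

A response containing `ERROR: syntax error at or near "'"` trips the first oracle, while an HTML-encoded echo (`&lt;script&gt;...`) correctly fails the second; the weaknesses are exactly the false positives and negatives listed above.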
Why Security Testing
Ø For Finding Loopholes
Ø For Zeroing in on Vulnerabilities
Ø For identifying Design Insecurities
Ø For identifying Implementation Insecurities
Ø For identifying Dependency Insecurities and Failures
Ø For Information Security
Ø For Process Security
Ø For Internet Technology Security
Ø For Communication Security
Ø For Improving the System
Ø For confirming Security Policies
Ø For Organization wide Software Security
Ø For Physical Security
Security Testing Techniques
Ø OS Hardening
v Configure and Apply Patches
v Updating the Operating System
v Disable or Restrict unwanted Services and Ports
v Manage the Log Files
v Install Root Certificate
v Protect from Internet Misuse and be Cyber Safe
v Protect from Malware
Ø Vulnerability Scanning
v Identify Known Vulnerabilities
v Scan Intrusively for Unknown Vulnerabilities
Security Testing Techniques (continued…)
Ø Penetration Testing
v Simulating Attack from a Malicious Source
v Includes Network Scanning and Vulnerability Scanning
v Simulates Attack from someone Unfamiliar with the System
v Simulates Attack by having access to Source Code, Network,
Passwords
Ø Port Scanning and Service Mapping
v Identification and locating of Open Ports
v Identification of Running Services
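Port identification, as listed above, is commonly done with a TCP connect scan: a completed handshake means the port is open. A minimal sketch, demonstrated against a local listener so it touches no external host:

```python
import socket

def scan_port(host: str, port: int, timeout: float = 0.5) -> bool:
    """TCP connect scan: a completed handshake means the port is open."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Demonstrate against a local listener on an OS-assigned port.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
open_port = listener.getsockname()[1]
found_open = scan_port("127.0.0.1", open_port)
listener.close()
```

Real scanners such as Nmap add service mapping (banner grabbing, probe/response matching) on top of this basic open/closed check.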
Ø Firewall Rule Testing
v Identify Inappropriate or Conflicting Rules
v Appropriate Placement of Vulnerable Systems behind Firewall
v Discovering Administrative Backdoors or Tunnels
Ø SQL Injection
v Exploits Database Layer Security Vulnerability
v Unexpected Execution of User Inputs
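"Unexpected execution of user inputs" is easiest to see side by side with its fix. A minimal sketch using an in-memory SQLite database (table and data are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def find_user_unsafe(name):
    # String concatenation lets the input rewrite the query.
    return conn.execute(
        "SELECT name FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(name):
    # Parameter binding treats the input strictly as data.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
```

With the classic `' OR '1'='1` payload, the unsafe query matches every row while the parameterized query matches none, which is the behavior a security test probes for.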
Security Testing Techniques (continued…)
Ø Cross-Site Scripting (XSS)
v Injecting Malicious Client Side Script into Web Pages
v Persistent, Non-Persistent and DOM based Vulnerabilities
Ø Parameter Manipulation
v Cookie Manipulation
v Form Field Manipulation
v URL Manipulation
v HTTP Header Manipulation
Ø Denial of Service Testing
v Flooding a target machine with enough traffic to render it unresponsive
Ø Command Injection
v Inject and execute commands specified by the attacker
v Execute System level commands through a Vulnerable
Application
Security Testing Techniques (continued…)
Ø Network Scanning
v Identifying Active Hosts on a network
v Collecting IP addresses that can be accessed over the Internet
v Collecting OS Details, System Architecture and Running
Services
v Collecting Network User and Group names
v Collecting Routing Tables and SNMP data
Ø Password Cracking
v Collecting Passwords from the Stored or Transmitted Data
v Using Brute Force and Dictionary Attacks
v Identifying Weak Passwords
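The dictionary-attack step can be sketched as hashing each candidate word and comparing it with a stolen hash. This is a toy illustration (unsalted SHA-256, a three-word list); real password audits use salted hashes and tools like large wordlists with rule-based mutation.

```python
import hashlib

def dictionary_attack(target_hash: str, wordlist):
    """Hash each candidate word and compare against the stolen hash."""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

# Hypothetical stolen hash of a weak password.
stolen = hashlib.sha256(b"letmein").hexdigest()
cracked = dictionary_attack(stolen, ["password", "123456", "letmein"])
```

A hit identifies a weak password; brute force is the same loop over generated candidates instead of a wordlist.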
Ø Ethical Hacking
v Penetration Testing, Intrusion Testing and Red Teaming
Ø File Integrity Testing
v Verifying File Integrity against corruption using Checksum
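Checksum-based integrity verification, as listed above, amounts to recording a baseline digest and re-hashing later. A minimal sketch using SHA-256 and a temporary file:

```python
import hashlib
import os
import tempfile

def file_checksum(path: str) -> str:
    """SHA-256 digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Record a baseline, then detect a modification.
fd, path = tempfile.mkstemp()
os.write(fd, b"original contents")
os.close(fd)
baseline = file_checksum(path)
with open(path, "ab") as f:
    f.write(b" tampered")
tampered = file_checksum(path) != baseline
os.remove(path)
```

Tools like Tripwire apply the same idea across whole file trees, storing the baseline digests in a protected database.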
Security Testing Techniques (continued…)
Ø War Dialing
v Using a Modem to dial a list of Telephone Numbers
v Searching for Computers, Bulletin Board System and Fax
Machines
Ø Wireless LAN Testing
v Searching for existing WLAN and logging Wireless Access
Points
Ø Buffer Overflow Testing
v Overwriting memory fragments of the process, e.g. char-type buffers
Ø Random Data Testing
v Random Data Inputs by a Program
v Encoded Random Data included as Parameters
v Crashing built-in code Assertions
Security Testing Techniques (continued…)
Ø Random Mutation Testing
v Bit Flipping of known Legitimate Data
v Byte stream Sliding within known Legitimate Data
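Bit flipping of known legitimate data can be sketched in a few lines: pick distinct bit positions in a valid input and invert them, producing a mutant that is almost, but not quite, well-formed. The sample input is an arbitrary HTTP request line.

```python
import random

def flip_random_bits(data: bytes, flips: int, seed: int = 0) -> bytes:
    """Flip `flips` distinct randomly chosen bits in known-good input."""
    rng = random.Random(seed)
    buf = bytearray(data)
    for bit in rng.sample(range(len(buf) * 8), flips):
        buf[bit // 8] ^= 1 << (bit % 8)
    return bytes(buf)

legit = b"GET /index.html HTTP/1.1\r\n\r\n"
mutant = flip_random_bits(legit, flips=3)
```

Because the mutant stays close to valid input, it tends to get past early parsing stages and exercise deeper code paths than fully random data would.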
Ø Session Hijacking
v Exploitation of Valid Computer Session
v Exploitation of the Web Session control mechanism
v Gain unauthorized access to the Web Server
Ø Phishing
v Masquerading as a trustworthy entity in an electronic
communication
v Acquiring usernames, passwords and credit card details
Ø URL Manipulation
v Making a web server deliver pages that should be inaccessible
v URL Rewriting
Security Testing Techniques (continued…)
Ø IP Spoofing
v Creating Internet Protocol (IP) packets with a forged source IP
address
Ø Packet Sniffing
v Capture and Analyze all of the Network traffic
Ø Virtual Private Network Testing
v Penetration Testing
Ø Social Engineering
v Psychological Manipulation of People
v Leaking confidential information
Additional Info
Complexity vs. Security
As functionality, and hence complexity, increases, security decreases.
[Figure: software lifecycle artifacts (requirements and use cases, design, test plans, code, test results, field feedback), with code review applied at the code stage]
Application Security Risk Categorization
• Goal
– More security for riskier applications
– Ensures that you work the most critical issues first
– Scales to hundreds or thousands of applications