Software Testing Dictionary
Verification and Validation (V & V)
• Verification refers to the set of activities that ensure that
software correctly implements a specific function
• Validation refers to a different set of activities that ensure
that the software that has been built is traceable to customer
requirements
• Boehm states that
– Verification: “Are we building the product right?”
– Validation: “Are we building the right product?”
Software Testing Steps
• Requirements ==> high-order tests
• Design ==> integration test
• Code ==> unit test
Unit Testing
• each module (or unit) is exercised with test cases that cover its:
– interface
– local data structure
– boundary conditions
– basis paths
– error handling paths
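The focus areas above can be illustrated with a small sketch. The function `clamp` below is a hypothetical unit chosen for illustration; the test cases target its boundary conditions and its error handling path:

```python
def clamp(value, low, high):
    """Clamp value into the inclusive range [low, high]."""
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

# boundary conditions: values exactly at and just beyond each limit
def test_boundaries():
    assert clamp(0, 0, 10) == 0
    assert clamp(10, 0, 10) == 10
    assert clamp(-1, 0, 10) == 0
    assert clamp(11, 0, 10) == 10

# error handling path: an invalid combination of interface arguments
def test_error_handling():
    try:
        clamp(5, 10, 0)
        assert False, "expected ValueError"
    except ValueError:
        pass

test_boundaries()
test_error_handling()
print("unit tests passed")
```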
Unit Testing Environment
• a driver calls the module to be tested and feeds it test-case data
• stubs stand in for the subordinate modules it calls; stub complexity should be kept low
• the tests again cover the interface, local data structure, boundary conditions, basis paths, and error handling paths
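A minimal sketch of this environment, with hypothetical names: `compute_discount` is the module under test, `stub_tier_gold` stands in for a customer-lookup module that is not yet available, and `driver` plays the role of the not-yet-written caller:

```python
# Module under test: depends on a customer-tier lookup not yet implemented.
def compute_discount(order_total, tier_lookup):
    """Return the discount amount; tier_lookup is the (possibly stubbed) dependency."""
    tier = tier_lookup()
    rates = {"gold": 0.10, "silver": 0.05}
    return order_total * rates.get(tier, 0.0)

# Stub: stands in for the real lower-level module.
def stub_tier_gold():
    return "gold"

# Driver: feeds test-case inputs to the module and checks its outputs.
def driver():
    assert compute_discount(100.0, stub_tier_gold) == 10.0
    assert compute_discount(100.0, lambda: "unknown") == 0.0
    print("all driver checks passed")

driver()
```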
Integration Testing
• Incremental approaches:
– top-down
– bottom-up
– sandwich
• Non-Incremental
– big-bang
Integration Testing: Top Down Approach
Module hierarchy: A calls B, C, and D; B calls E; C calls F; D calls G; E calls H; F calls I; G calls J and K; J calls L and M
Breadth First: A B C D E F G H I J K L M
Depth First: A B E H C F I D G J L M K
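The two orderings correspond to a breadth-first and a depth-first walk of the module call tree. A small sketch (the tree is encoded from the hierarchy on this slide):

```python
from collections import deque

# Module call tree from the slide (parent -> subordinate modules).
TREE = {"A": ["B", "C", "D"], "B": ["E"], "C": ["F"], "D": ["G"],
        "E": ["H"], "F": ["I"], "G": ["J", "K"], "J": ["L", "M"]}

def breadth_first(root):
    """Integrate one level of the hierarchy at a time."""
    order, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        order.append(node)
        queue.extend(TREE.get(node, []))
    return order

def depth_first(root):
    """Integrate one complete control path before moving sideways."""
    order = [root]
    for child in TREE.get(root, []):
        order.extend(depth_first(child))
    return order

print(breadth_first("A"))  # ['A','B','C','D','E','F','G','H','I','J','K','L','M']
print(depth_first("A"))    # ['A','B','E','H','C','F','I','D','G','J','L','M','K']
```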
2110423: Software Engineering lecture 12 11
Integration Testing: Bottom-Up Approach
Module hierarchy (same as above): A calls B, C, and D; B calls E; C calls F; D calls G; E calls H; F calls I; G calls J and K; J calls L and M
• Build 1: H, E, B
• Build 2: I, F, C, D
• Build 3: L, M, J, K, G
• after that, integrate Builds 1, 2, and 3 with module A
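The builds above group bottom-up clusters of the hierarchy. One way to sketch the underlying rule, assuming the same module tree as on the previous slides: a post-order traversal integrates every module only after all of its subordinates, which yields one valid bottom-up order:

```python
# Module call tree from the slides (parent -> subordinate modules).
TREE = {"A": ["B", "C", "D"], "B": ["E"], "C": ["F"], "D": ["G"],
        "E": ["H"], "F": ["I"], "G": ["J", "K"], "J": ["L", "M"]}

def bottom_up(tree, root):
    """Post-order walk: a module is integrated after all its subordinates."""
    order = []
    for child in tree.get(root, []):
        order.extend(bottom_up(tree, child))
    order.append(root)  # the parent comes last
    return order

print(bottom_up(TREE, "A"))
# ['H','E','B','I','F','C','L','M','J','K','G','D','A'] -- leaves first, A last
```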
Sandwich Approach
• combines top-down and bottom-up
• consider:
– A, B, C, D, G, and J are logic modules ==> top down
– E, F, H, I, K, and L are functional modules ==> bottom up
• when all modules have been appropriately integrated, the interfaces between the two groups are tested one by one
Steps for Integration Testing
• All modules should be unit tested
• Choose integration testing strategy
• Apply white-box/black-box (WB/BB) techniques; test input/output parameters
• Exercise all modules and all calls
• Keep records (test results, test activities, faults)
System Testing
• starts after integration testing
• ends when
– we have successfully determined system capabilities
– we have identified and corrected known problems
– we are confident that the system is ready for acceptance
Components of System Testing
• Requirement-based functional tests
• Performance Capabilities
• Stress or Volume tests
• Security Testing
• Recovery Testing
• Quality attribute - reliability, maintainability, integrity
Requirement-Based System Test
• look for systematic coverage: use a functional coverage
matrix
• this coverage matrix differs from the one used in unit testing: we are now planning the testing for a group of programs instead of a single one
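A functional coverage matrix can be sketched as a mapping from requirements to the test cases that exercise them; the requirement and test-case identifiers below are hypothetical. A quick scan then reveals any requirement with no planned coverage:

```python
# Hypothetical functional coverage matrix: requirement -> covering test cases.
coverage = {
    "REQ-1 login":        ["TC-01", "TC-02"],
    "REQ-2 transfer":     ["TC-03"],
    "REQ-3 audit report": [],          # no test case planned yet
}

# Systematic coverage check: flag requirements no test case exercises.
uncovered = [req for req, cases in coverage.items() if not cases]
print(uncovered)  # ['REQ-3 audit report']
```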
Stress or Volume Testing
• is designed to confront the system with abnormal situations
• drive the system to its limit and determine whether it breaks
down
• first test to specification, then break and analyze
• determine how many transactions or records can be
operationally supported
• demand resources in abnormal quantity, frequency, and volume
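The "drive the system to its limit" idea can be sketched with a toy system under stress; the capacity-limited `TransactionQueue` is an assumption made purely for illustration. The driver ramps the load in fixed steps until the system breaks down, then reports the volume it sustained:

```python
# Toy system under stress: a queue with a hard capacity (assumed for illustration).
class TransactionQueue:
    CAPACITY = 5000

    def __init__(self):
        self.items = []

    def submit(self, txn):
        if len(self.items) >= self.CAPACITY:
            raise OverflowError("queue capacity exceeded")
        self.items.append(txn)

def find_breaking_point(step=1000):
    """Drive the system with increasing volume until it breaks down."""
    q = TransactionQueue()
    load = 0
    while True:
        try:
            for _ in range(step):
                q.submit(object())
            load += step          # this whole step was handled
        except OverflowError:
            return load           # volume sustained before breakdown

print(find_breaking_point())      # 5000
```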
Acceptance Testing
• to provide clients/users with confidence and ensure that the software is ready to use
• begins when system test is complete
• test cases are subset of the system test
• acceptance tests are based on functionality and
performance requirements
• take a typical day's transactions, or a month or year of operation
Requirements for Acceptance Testing
• tests must be run on operational hardware and software
• tests must stress the system significantly
• all interfaced systems must be in operation throughout the
test
• the test duration should run a full cycle
• tests should exercise the system over a full range of inputs
• all major functions and interfaces should be exercised
• if the entire test run cannot be completed, a new run should start again from the beginning
Alpha Test
• conducted at the developer's site
• users are invited
• developers interact with users
• errors and usage problems are recorded
Beta Test
• conducted at the customer's site by end users
• the developer is usually not present
• users record and report problems; developers fix them and then release
Regression Test
• involves retesting the software after changes have been made, to ensure that its basic functionality has not been affected by the changes
• ensures that no new errors have been introduced
• involves rerunning old test cases
• automation is necessary since this process can be very time
consuming
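The "rerun old test cases" idea can be automated with a very small sketch; the recorded cases and the `add` function below are hypothetical stand-ins for a real suite and a real changed module:

```python
# Hypothetical regression suite: (input, expected) pairs recorded from earlier releases.
REGRESSION_CASES = [((2, 3), 5), ((0, 0), 0), ((-1, 1), 0)]

def add(a, b):
    """The function after the latest change."""
    return a + b

def run_regression(fn, cases):
    """Rerun every old test case; report any the change has broken."""
    return [(args, expected, fn(*args))
            for args, expected in cases if fn(*args) != expected]

print(run_regression(add, REGRESSION_CASES))  # [] -> no regressions
```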
Debugging
• begins when tests find faults or defects
• is the programmer's responsibility
• purpose of debugging:
– locate the fault (find its cause and how to prevent it)
– correct it
– retest to ensure that you have removed the bug and not introduced others
Bug Consequences
• Mild - misspelled output, lack of white space
• Moderate - output may be misleading or redundant
• Annoying - users need tricks to get the system to work
• Disturbing - the system refuses to handle a legitimate transaction
• Serious - the system loses track of a transaction and its occurrence
• Very serious - the bug causes the system to perform an incorrect transaction
Bug Consequences (cont.)
• Extreme - problems are not limited to a few users or a few transaction types; they are frequent and arbitrary
• Intolerable - long-term, unrecoverable corruption of the database; a system shutdown may be needed
• Catastrophic - the system fails
• Infectious - corrupts other systems even when it does not fail itself; may cause loss of life