Software Testing - Important Notes
1. Verification vs. Validation
• Verification: Ensures the software is being built right (conformance to previous artifacts).
  o Done during development stages, by developers.
  o Techniques: Reviews, analysis, simulation, unit testing (static & dynamic activities).
• Validation: Ensures the right software is being built (meets the requirements).
  o Done at the end, during system testing, by testers.
  o Technique: Executing the software and checking it against the specifications (dynamic activity only).
2. Levels of Testing

Unit Testing
• Testing of individual units (functions, modules, components) independently.
• Done by developers.
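A minimal sketch of a unit test: a hypothetical `apply_discount` function (not from the original notes) is exercised in complete isolation, checking both normal behavior and input validation.

```python
# Hypothetical unit under test: a standalone function tested in isolation.
def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Unit tests exercise the function independently of any other module.
assert apply_discount(100.0, 10) == 90.0
assert apply_discount(80.0, 0) == 80.0
try:
    apply_discount(50.0, 150)
except ValueError:
    pass  # invalid input rejected, as the unit's contract requires
else:
    raise AssertionError("expected ValueError for out-of-range percent")
```

Because the unit has no dependencies on other modules, the developer can run these checks immediately after writing the function.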
Integration Testing
• Tests whether individual units work together correctly.
• Detects interfacing errors (e.g., mismatched parameters, incorrect data exchange).
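An interfacing error of this kind can be sketched with two hypothetical units (invented for illustration) that must agree that amounts cross their interface in cents; each unit is correct in isolation, so only an integration test would catch a dollars/cents mismatch.

```python
# Two hypothetical units that pass their own unit tests individually,
# but must agree on the interface: amounts are exchanged in *cents*.
def compute_total_cents(prices_cents):
    return sum(prices_cents)

def format_receipt(total_cents):
    return f"Total: ${total_cents / 100:.2f}"

# Integration test: feed one unit's output into the other and check the
# combined behavior; passing dollars where cents are expected would
# surface here even though each unit is internally correct.
def test_checkout_integration():
    total = compute_total_cents([199, 250, 51])  # 500 cents
    assert format_receipt(total) == "Total: $5.00"

test_checkout_integration()
```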
System Testing
• Tests the complete integrated system against specifications.
• Done by a separate testing team.
• Includes:
  o Functionality testing (correctness of features).
  o Performance testing (response time, stress, usability, etc.).
Regression Testing
• Conducted after software updates to ensure that:
  1. Changes work correctly.
  2. Unchanged parts still function properly.
• Prevents regression bugs (previously working features breaking due to new changes).
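Both goals of regression testing can be sketched together, assuming a hypothetical `slugify` function that was just updated to collapse repeated separators: one assertion checks the change itself, while the others re-run old inputs to confirm nothing regressed.

```python
import re

# Hypothetical module after an update: slugify() was extended to
# collapse runs of separators; older behavior must not regress.
def slugify(text: str) -> str:
    slug = re.sub(r"[^a-z0-9]+", "-", text.lower())
    return slug.strip("-")

# 1. The change works correctly:
assert slugify("Hello   World!!") == "hello-world"

# 2. Previously working inputs still behave as before (regression checks):
assert slugify("Software Testing") == "software-testing"
assert slugify("v2.0 release") == "v2-0-release"
```

In practice, the regression checks are the accumulated test suite from earlier releases, re-run automatically after every change.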
Acceptance Testing
• Performed by the customer to check if the software meets their requirements.
• Two types:
  1. Alpha Testing: Done by the developing team before release.
  2. Beta Testing: Done by friendly customers before full deployment.
3. Testing Activities & Responsibilities
Who?         Testing Responsibilities
Developers   Unit Testing, Debugging
Testers      Integration & System Testing, Test Plan Development
Users        Acceptance Testing, Usability Testing
4. Importance of Smoke Testing
• Smoke testing ensures basic functionalities work before full testing.
• Done frequently (daily, or several times a day).
• Prevents severe integration issues later.
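A toy smoke-check sketch (the `app` dictionary and check names are invented stand-ins for a real deployment): a handful of fast checks confirm the build's core paths work at all before the slow, full test suite is run.

```python
# Hypothetical smoke suite: quick pass/fail checks on core functionality.
def smoke_test(app):
    checks = {
        "service is up": app.get("status") == "up",
        "request handler wired": callable(app.get("handler")),
        "config loaded": "db_url" in app,
    }
    failed = [name for name, ok in checks.items() if not ok]
    return (len(failed) == 0, failed)

# A stubbed "application" standing in for a freshly built deployment.
app = {"status": "up", "handler": lambda req: "ok", "db_url": "sqlite://"}
ok, failed = smoke_test(app)
assert ok and failed == []
```

If any check fails, the build is rejected immediately, before integration problems compound.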
5. Cost of Testing & Debugging Complexity
• Testing all modules together is expensive.
• Debugging is easier when errors are caught early (in unit testing).
• System testing alone is not cost-effective; a structured approach (unit → integration → system testing) is necessary.
6. Pesticide Effect in Testing
• Bugs that escaped detection by one test technique are unlikely to be found by reapplying that same technique.
• Multiple testing techniques are needed (e.g., equivalence partitioning, decision table testing, white-box testing).
• Repeated use of the same test cases leads to ineffective testing.
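Equivalence partitioning, one of the techniques named above, can be sketched with a hypothetical `grade` function: rather than re-running the same inputs, the input domain is split into partitions and one representative (plus boundary values) is tested from each.

```python
# Hypothetical unit: classify an exam score.
def grade(score: int) -> str:
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# Partitions: invalid-low, fail, pass, invalid-high.
# One representative per partition, plus boundaries where partitions meet.
cases = {-1: ValueError, 30: "fail", 49: "fail",
         50: "pass", 100: "pass", 101: ValueError}

for score, expected in cases.items():
    if expected is ValueError:
        try:
            grade(score)
        except ValueError:
            continue  # invalid partition correctly rejected
        raise AssertionError(f"{score} should have been rejected")
    assert grade(score) == expected
```

Varying the technique this way probes parts of the input space that a fixed, repeated test set would never reach.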
7. System Testing Types (Based on Who Tests)
• Alpha Testing – Done by the developer’s organization.
• Beta Testing – Done by friendly customers before full release.
• Acceptance Testing – Done by the customer before accepting the product.

8. Performance Testing Types

• Response Time – How quickly the system reacts.
• Throughput – Number of transactions processed in a given time.
• Usability – User experience, ease of use.
• Stress Testing – System performance under extreme conditions.
• Recovery Testing – How well the system recovers from failures.
• Configuration Testing – Checking software behavior in different environments.
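Response time and throughput, the first two measures above, can be sketched with a simple timing loop (the `handle_transaction` function is an invented stand-in for a real transaction):

```python
import time

# Hypothetical operation standing in for a real transaction.
def handle_transaction(n: int) -> int:
    return sum(i * i for i in range(n))

# Response time: how long a single request takes.
start = time.perf_counter()
handle_transaction(10_000)
response_time = time.perf_counter() - start

# Throughput: how many transactions complete in a fixed time window.
window = 0.1  # 100 ms measurement window
deadline = time.perf_counter() + window
count = 0
while time.perf_counter() < deadline:
    handle_transaction(10_000)
    count += 1
throughput = count / window  # transactions per second

print(f"response time: {response_time * 1e3:.2f} ms, "
      f"throughput: {throughput:.0f} tx/s")
```

Real performance tests would use a dedicated load-testing tool and production-like hardware, but the two metrics are defined exactly as computed here.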