
Software Testing Summary Notes

Program Testing

 Purpose:
o Show that the program works as intended.
o Identify and fix defects before use.
 Methods:
o Execute the program with artificial data to detect errors.
o Part of the Verification and Validation (V&V) process.

Testing Goals

1. Validation Testing:
o Confirm the system meets requirements.
o Ensure all system features and combinations are tested.
2. Defect Testing:
o Find incorrect or undesirable behaviors (e.g., crashes, incorrect computations,
data corruption).

Validation vs. Verification

 Verification:
o "Are we building the product right?"
o Ensures conformance to specifications.
 Validation:
o "Are we building the right product?"
o Ensures the product meets user needs.
 V&V Confidence:
o Aim is to establish confidence that the system is 'fit for purpose'.
o Confidence depends on:
 Software Purpose: Criticality to the organization.
 User Expectations: Users may tolerate failures if benefits outweigh
recovery costs.
 Marketing Environment: Early market release may take precedence over
defect resolution.
Stages of Testing

1. Development Testing:
o Unit Testing: Testing individual program units (e.g., functions, methods).
o Component Testing: Testing integrated units with a focus on interfaces.
o System Testing: Testing the system as a whole to validate component
interactions.
2. Release Testing:
o Conducted by a separate team.
o Validates system readiness for external use.
o Includes requirements-based and defect testing.
3. User Testing:
o Conducted by actual users in real environments.
o Types:
 Alpha Testing: Collaborative testing at the developer's site.
 Beta Testing: Users provide feedback on a system release.
 Acceptance Testing: Customers test to approve system delivery.

Object Class Testing

 Test operations associated with the object.
 Verify and set all attributes of the object.
 Evaluate the object in all possible states.
 Example:
o For a weather station object (a test sketch follows this list):
 Test transitions: Shutdown → Running → Shutdown.
 Ensure proper reporting of status and weather conditions.

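A minimal sketch of how these state transitions might be exercised with PHPUnit. The WeatherStation class, its method names, and its state strings are assumptions made only for illustration; they are not part of the notes above.

<?php
use PHPUnit\Framework\TestCase;

class WeatherStationTest extends TestCase
{
    public function testShutdownRunningShutdownTransitions(): void
    {
        // Assumed interface: the station starts in the 'shutdown' state.
        $station = new WeatherStation();
        $this->assertSame('shutdown', $station->getState());

        // Shutdown -> Running: the station should report status and weather data.
        $station->restart();
        $this->assertSame('running', $station->getState());
        $this->assertNotEmpty($station->reportStatus());
        $this->assertNotEmpty($station->reportWeather());

        // Running -> Shutdown.
        $station->shutdown();
        $this->assertSame('shutdown', $station->getState());
    }
}
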
Unit Testing Using PHPUnit

 Automated Testing:
o Automate testing to ensure repeatability and speed.
 Components of a Test:

1. Setup: Define inputs and expected outputs.
2. Call: Execute the function or method.
3. Assertion: Compare results with expectations.

 Running PHPUnit Tests:
o Execute all tests in a directory or a specific test file.
o Outputs include the number of tests run, assertions made, and any failures (an example test follows).
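A minimal example of the setup / call / assertion structure. The TemperatureConverter class is made up for this sketch; only the PHPUnit pieces (TestCase, assertSame) are real.

<?php
use PHPUnit\Framework\TestCase;

// A made-up class under test, kept in the same file for brevity.
class TemperatureConverter
{
    public function celsiusToFahrenheit(float $celsius): float
    {
        return $celsius * 9 / 5 + 32;
    }
}

class TemperatureConverterTest extends TestCase
{
    public function testFreezingPointOfWater(): void
    {
        // 1. Setup: define the input and the expected output.
        $converter = new TemperatureConverter();
        $expected = 32.0;

        // 2. Call: execute the method under test.
        $actual = $converter->celsiusToFahrenheit(0.0);

        // 3. Assertion: compare the result with the expectation.
        $this->assertSame($expected, $actual);
    }
}

Assuming PHPUnit is installed via Composer, such tests are typically run with ./vendor/bin/phpunit tests (every test in a directory) or ./vendor/bin/phpunit tests/TemperatureConverterTest.php (a single file); the summary line reports the number of tests, assertions, and failures.
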
Component Testing

 Focuses on testing interfaces of composite components.
 Types of Interfaces:
o Parameter Interfaces: Data or function references are passed from one component to another as parameters.
o Shared Memory Interfaces: Components share a block of memory; one writes data that others read.
o Procedural Interfaces: One component encapsulates a set of procedures that other components call.
o Message-Passing Interfaces: Components request services from other components by passing messages.
 Common Errors:
o Interface misuse: A calling component uses the interface incorrectly, e.g., parameters in the wrong order or of the wrong type (a test for this is sketched below).
o Interface misunderstanding: The caller misinterprets the interface specification and makes invalid assumptions about the called component.
o Timing errors: In real-time systems, the producer and consumer of data operate at different speeds, so a component may access out-of-date information.
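As an illustration only, the sketch below checks for an interface-misuse error: a hypothetical PaymentGateway component with the interface charge(string $currency, float $amount) is called with its parameters swapped, and the test expects the component to reject the call rather than silently compute a wrong result. The class and method are invented for this example.

<?php
use PHPUnit\Framework\TestCase;

class PaymentGatewayInterfaceTest extends TestCase
{
    public function testChargeRejectsSwappedParameters(): void
    {
        // Hypothetical component with the interface charge(string $currency, float $amount).
        $gateway = new PaymentGateway();

        // Interface misuse: the amount is passed where the currency code is expected.
        // A defensive interface should raise an error instead of guessing what was meant.
        $this->expectException(InvalidArgumentException::class);
        $gateway->charge('19.99', 0.0);
    }
}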

System Testing

 Validates system functionality as a whole.
 Key Differences from Component Testing:
o Integrates off-the-shelf components and newly developed modules.
o Tests interactions between components developed by different teams.
 Use-Case Testing:
o Simulates specific user scenarios to validate system behavior.
o Example (a test sketch follows this list):
 A nurse uses a patient management system to access and update records
securely during home visits.
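A purely hypothetical sketch of how one step of that scenario could be turned into an automated use-case test. PatientRecordSystem, its methods, and the data are all invented for illustration; a real system test would exercise the deployed system through its actual interfaces.

<?php
use PHPUnit\Framework\TestCase;

class HomeVisitUseCaseTest extends TestCase
{
    public function testNurseCanReadAndUpdateARecordDuringAHomeVisit(): void
    {
        // Setup: an authenticated session for a user with the nurse role.
        $system  = new PatientRecordSystem();
        $session = $system->login('nurse.jones', 'correct-password');

        // The nurse retrieves a patient record; transport must be secure.
        $record = $system->fetchRecord($session, 'patient-42');
        $this->assertTrue($record->isEncryptedInTransit());

        // The nurse adds a note during the visit and saves the record.
        $record->addNote('Blood pressure normal.');
        $system->saveRecord($session, $record);

        // The update is visible when the record is read back.
        $updated = $system->fetchRecord($session, 'patient-42');
        $this->assertStringContainsString('Blood pressure normal.', $updated->getNotes());
    }
}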

Inspections and Testing

 Software Inspections:
o Analysis of static system representation to discover problems (static verification).
o No execution required.
o Effective at finding defects early in the development process.
 Software Testing:
o Dynamic verification by observing system behavior during execution.
o Requires test data to assess behavior.
 Comparison:
o Inspections cannot evaluate non-functional characteristics like performance.
o Testing and inspections are complementary, not opposing, methods.
Advantages of Software Inspection

 No Error Interactions: Errors do not mask each other.
 Cost-Efficiency: Inspects incomplete systems without additional development tools.
 Broader Scope: Evaluates attributes like portability, maintainability, and compliance.

Automated Testing

 Benefits:
o Speeds up regression testing.
o Ensures consistent test execution.
 Structure:

1. Setup inputs and expected outputs.
2. Execute the function or system.
3. Assert results match expectations.

Test-Driven Development (TDD)

Test-driven development was introduced as part of agile methods such as Extreme Programming; however, it can also be used in plan-driven development processes.

 Process:
o Write tests before coding.
o Develop code incrementally.
o Move to the next task only after passing tests.
 Advantages:
o Ensures code coverage: every code segment has at least one associated test.
o A regression test suite is built up incrementally as the program is developed.
o Simplifies debugging: a failing test points to the most recently written code.
o The tests act as a form of documentation of what the code should do (a small example follows this section).
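A small sketch of a single TDD increment using a made-up ShoppingCart class: the test is written first and fails, and only then is just enough code written to make it pass before moving to the next task.

<?php
use PHPUnit\Framework\TestCase;

// Step 1: write the test first. It fails until ShoppingCart::total() exists
// and returns the right value.
class ShoppingCartTest extends TestCase
{
    public function testTotalIsTheSumOfItemPrices(): void
    {
        $cart = new ShoppingCart();
        $cart->addItem('book', 12.50);
        $cart->addItem('pen', 2.50);

        $this->assertSame(15.0, $cart->total());
    }
}

// Step 2: write just enough code to make the test pass, then move on.
class ShoppingCart
{
    private array $prices = [];

    public function addItem(string $name, float $price): void
    {
        $this->prices[] = $price;
    }

    public function total(): float
    {
        return array_sum($this->prices);
    }
}

Every such test is kept and re-run, so the growing suite doubles as the regression tests mentioned below.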

Specific Testing Techniques

1. Requirements-Based Testing:
o Develop tests based on individual system requirements.
2. Scenario Testing:
o Simulate real-world usage scenarios to validate combined requirements.
3. Performance Testing:
o Gradually increase load to evaluate system behavior.
o Stress Testing: Overload the system to observe failure points.
4. Regression Testing:
o Ensure new changes do not disrupt existing functionality.
o In a manual testing process, regression testing is expensive; with automated testing, it is simple and straightforward, as the example below shows.
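For example, assuming PHPUnit is installed via Composer and the automated tests live under tests/, the whole suite can be re-run after every change with ./vendor/bin/phpunit tests; any previously passing test that now fails signals a regression.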

Performance Testing and System Reliability

 Load Testing: Ensures performance under anticipated conditions.
 Stress Testing: Evaluates behavior under extreme conditions.
 Emergent Properties:
o Focus on reliability, performance, and usability.
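A toy sketch of the "increase the load step by step" idea in plain PHP. handleRequest() is only a stand-in for whatever operation the system performs; a realistic load or stress test would drive the deployed system with a dedicated tool rather than an in-process function.

<?php
// Stand-in for the operation whose behaviour under load is being measured.
function handleRequest(int $i): void
{
    usleep(100); // pretend each request costs a small amount of work
}

// Step the load up and record how response time degrades as it grows.
foreach ([10, 100, 1000, 10000] as $load) {
    $start = microtime(true);
    for ($i = 0; $i < $load; $i++) {
        handleRequest($i);
    }
    $elapsed = microtime(true) - $start;
    printf("load=%5d requests  total=%.3fs  avg=%.5fs/request\n",
        $load, $elapsed, $elapsed / $load);
}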

Common Errors in Testing

1. Interface Misuse: Errors in parameter usage.
2. Interface Misunderstanding: Misinterpreting interface specifications.
3. Timing Errors: Issues in real-time systems with shared memory or message-passing
interfaces.

Release Testing Process

 Validation: Confirms software readiness for external use.
 Distinction from System Testing:
o Performed by a separate team.
o Focused on demonstrating system requirements are met.

User Testing Categories

 Alpha Testing: Collaborative developer-user testing.
 Beta Testing: Feedback on pre-release versions.
 Acceptance Testing: Final customer approval for delivery.

Exercise Questions
Multiple-Choice Questions (MCQs)

1. What is the main goal of defect testing?
   A) Ensure the system meets requirements. B) Identify incorrect behaviors. C) Validate user expectations. D) Measure performance under stress.
2. Which stage tests integrated system components as a whole?
   A) Unit Testing B) Component Testing C) System Testing D) Release Testing
3. Which process ensures high-quality code during development?
   A) Stress Testing B) Regression Testing C) Test-Driven Development (TDD) D) Use-Case Testing

True/False Questions

1. Stress testing evaluates the system's behavior under normal conditions. (True/False)
2. Inspections can validate system performance and usability. (True/False)
3. Release testing is performed by the same team responsible for system development.
(True/False)

Short Answer Questions

1. Explain the differences between validation and verification.
2. Why are inspections considered cost-efficient for defect identification?
3. Describe the benefits of scenario-based testing for complex systems.
4. How does stress testing ensure system reliability in extreme cases?

Answers

MCQs:

1. B) Identify incorrect behaviors.
2. C) System Testing.
3. C) Test-Driven Development (TDD).

True/False:

1. False
2. False
3. False

Short Answers:

1. Validation ensures the product meets user needs; verification ensures compliance with
specifications.
2. Inspections uncover issues early without requiring system execution, saving time and
resources.
3. Scenario testing reflects real-world use cases, helping validate combined functionalities.
4. Stress testing reveals performance limits, ensuring robustness under peak loads.
