Sco203 PP Answers
1. Regression Testing – When new features are added, automated tests ensure existing
functionality remains intact.
2. Performance Testing – Automated tools simulate multiple users to test system performance
under load.
3. Repetitive Tasks – Running the same tests frequently (e.g., nightly builds) to ensure stability.
4. Large-Scale Data Validation – Manually testing databases or APIs with thousands of records is
inefficient; automation handles the volume reliably.
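A minimal sketch of scenario 1 (regression testing), assuming a hypothetical `calculate_discount` function; rerunning these checks after every change confirms existing behaviour still holds:

```python
# Regression-test sketch for a hypothetical calculate_discount function.
# These tests are rerun automatically (e.g., in nightly builds) so that new
# features cannot silently break existing behaviour.

def calculate_discount(price: float, is_member: bool) -> float:
    """Existing feature: members get 10% off; prices never go negative."""
    discount = 0.10 if is_member else 0.0
    return max(price * (1 - discount), 0.0)

def test_member_discount():
    assert calculate_discount(100.0, is_member=True) == 90.0

def test_non_member_pays_full_price():
    assert calculate_discount(100.0, is_member=False) == 100.0

if __name__ == "__main__":
    test_member_discount()
    test_non_member_pays_full_price()
    print("regression suite passed")
```

In practice a runner such as pytest would discover and execute these tests on every build.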
4. Actual Result & Status – Records whether the test passed or failed.
- Definition: Final testing phase where stakeholders verify if the software meets business needs.
- Expected Outcomes:
1. High Initial Cost – Tool setup and staff training require upfront investment (commercial tools also need licensing; even open-source tools like Selenium have a learning curve).
3. Limited to Scriptable Tests – Creativity-based testing (e.g., UX) still needs manual effort.
- Requirements:
- Test Cases:
1. Root Cause Analysis (RCA) – Identify and fix underlying issues (e.g., logging errors to trace
bugs).
4. User Training – Reduce human error (e.g., guiding users on input formats).
- Performance Testing:
- Simulate 1,000 users logging in simultaneously to measure response time (<2 seconds).
- Stress Testing:
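The load scenario above can be sketched with Python's thread pool, using a stubbed `login` function as a stand-in for the system under test (a real test would call the actual login endpoint and typically use a dedicated tool such as JMeter or Locust):

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Load-test sketch: fire many concurrent "logins" and verify every
# response time stays under the 2-second target. `login` is a stub;
# the sleep simulates server processing time.

def login(user_id: int) -> float:
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server work
    return time.perf_counter() - start

def run_load_test(users: int = 100, max_seconds: float = 2.0) -> bool:
    with ThreadPoolExecutor(max_workers=50) as pool:
        response_times = list(pool.map(login, range(users)))
    return max(response_times) < max_seconds

if __name__ == "__main__":
    print("within SLA:", run_load_test())
```

Scaling `users` upward until the check fails turns the same harness into a crude stress test.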
- Test Case: Details how to test (e.g., "Enter valid credentials → Verify dashboard loads").
- Ensures alignment with project goals (e.g., "Complete 95% test coverage by Sprint 3").
- Alpha Testing: Internal team tests in a lab (e.g., developers validate core features).
- Beta Testing: Real users test in production (e.g., customers report UX issues before launch).
- Verification ≠ Validation – Verification checks the process (building the product right); validation checks the product (building the right product).
- SQA Pitfalls – Late testing and poor documentation are major risks.
4. Integration issues - Problems that arise when different modules or components interact.
```
[Write Test] → [Run Test (Fail)] → [Write Code] → [Run Test (Pass)] → [Refactor Code]
↑_________________________________________________________|
```
1. Write Test: Create a test for a small piece of functionality before writing the code.
2. Run Test (Fail): Verify the test fails (since the code doesn't exist yet).
3. Write Code: Implement just enough code to make the test pass.
4. Run Test (Pass): Verify the code now passes the test.
5. Refactor Code: Clean up the implementation while keeping the test green, then repeat the cycle.
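The red→green cycle above can be sketched for a hypothetical `add` function; the test is written first, and the implementation exists only to make it pass:

```python
# TDD sketch: the test is authored before the code it exercises.
# Running the file before `add` existed would raise NameError (the "fail" step);
# after the minimal implementation below, the same test passes.

def test_add():
    assert add(2, 3) == 5
    assert add(-1, 1) == 0

# Just enough code to make the test pass (the "green" step).
def add(a: int, b: int) -> int:
    return a + b

if __name__ == "__main__":
    test_add()  # now passes; refactor next, keeping this test green
    print("test passed")
```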
1. Verify correct credentials - Test that valid username/password combinations grant access.
4. Validate error messages - Confirm appropriate error messages display for failed attempts.
5. Test empty submissions - Verify the system handles blank username or password fields
properly.
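The login checks above can be sketched against a stubbed `authenticate` function (the credential store and messages here are illustrative; a real suite would exercise the actual login endpoint):

```python
# Login-test sketch: authenticate() is a stand-in for the real login logic.

VALID_USERS = {"alice": "s3cret"}  # hypothetical credential store

def authenticate(username: str, password: str) -> tuple:
    """Return (success, message) for a login attempt."""
    if not username or not password:
        return False, "Username and password are required."
    if VALID_USERS.get(username) == password:
        return True, "Welcome"
    return False, "Invalid username or password."

def test_valid_credentials_grant_access():
    ok, _ = authenticate("alice", "s3cret")
    assert ok

def test_error_message_on_failure():
    ok, msg = authenticate("alice", "wrong")
    assert not ok and msg == "Invalid username or password."

def test_empty_submission_rejected():
    ok, msg = authenticate("", "")
    assert not ok and "required" in msg

if __name__ == "__main__":
    test_valid_credentials_grant_access()
    test_error_message_on_failure()
    test_empty_submission_rejected()
    print("login checks passed")
```

Each test maps to one numbered case: valid credentials, error messages, and empty submissions.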
1. Early defect detection - Finds errors before they become costly to fix.
3. Standard adherence - Ensures compliance with coding standards and best practices.
2. Portability - The ease with which software can be transferred between environments.
1. Requirement analysis and validation - Thoroughly review and validate all requirements.
2. Comprehensive test planning - Develop detailed test plans covering all aspects.
3. Continuous integration and testing - Implement automated builds and frequent testing.
4. Process audits and improvement - Regularly review and improve development processes.
Quality standards in software development ensure consistency, reliability, and efficiency. Their
importance includes:
3. Compliance – Helps meet regulatory and industry requirements (e.g., ISO 9001, IEEE
standards).
4. Customer Satisfaction – Delivers software that meets user expectations and reduces failures.
5. Cost Efficiency – Minimizes rework, debugging, and maintenance costs by catching issues
early.
b) Five Attributes That Low-Level Specification Tests Focus On in Static Black Box Testing (10 marks)
Static black box testing examines requirements and design documents without executing code.
Key attributes include:
a) Steps in Software Test Design Specifications (IEEE 829 Standard) (10 marks)
3. Test Approach – Describes testing techniques (e.g., unit, integration, system testing).
4. Test Case Design – Specifies test inputs, procedures, and expected results.
10. Approval & Review – Ensures stakeholders validate the test design.
b) Five Hardware Configuration Elements to Test When Buying a New Computer (10 marks)
Configuration testing ensures compatibility across different hardware setups. Key elements
include:
a) 10 Quality Indicators for Addressing Management Concerns in Software Projects (10 marks)
Integration testing ensures modules work together correctly. The three main strategies are:
2. Incremental Integration
- Types: