Quality Assurance Notes QB
L5SOD and L5CSA
TABLE OF CONTENTS:
Elements of Competence and Performance Criteria
1. Perform Requirements Analysis
1.1 Terms of reference are properly analyzed according to industry quality assurance standards.
1.2 Effective examination of requirement specifications based on industry standards, project objectives, and stakeholder needs.
1.3 Proper analysis of the inception report based on established criteria and project requirements.
2. Test the System
2.1 Proper preparation of the test plan, based on specified criteria and industry best practices.
2.2 Proper preparation of the testing environment, based on specified criteria and industry best practices.
2.3 Effective performance of tests according to the specified parameters and guidelines.
3. Generate Test Documentation
3.1 Proper consolidation of test results based on testing outcomes.
3.2 Clear provision of the Final User Acceptance Testing (UAT) Report based on acceptance criteria.
3.3 Proper generation of the Recommendation Report based on comprehensive test cases.
Unit 1: Perform Requirements Analysis
1.1 Terms of reference are properly analyzed according to industry quality assurance standards.
Definition: Terms of Reference (ToR) is a foundational document that outlines the purpose,
structure, and scope of a project. It sets the framework for what is to be achieved, ensuring all
stakeholders have a clear understanding of project goals.
• Common Understanding: ToR ensures that all stakeholders have a shared understanding
of the project goals.
• Scope Management: Clearly defines what is included and excluded from the project,
preventing scope creep.
Steps to analyze the Terms of Reference:
1. Identify Objectives: Confirm the goals the project is intended to achieve.
2. Review Scope: Verify what is included in and excluded from the project.
3. Identify Stakeholders: Determine who is involved in or affected by the project.
o Example: Conduct interviews with bank staff to understand their requirements.
4. Assess Quality Standards: Check compliance with industry standards.
o Example: Ensure that the app meets security standards such as PCI-DSS for handling payment information.
Daily Example: A bakery plans to introduce a new line of gluten-free products. Their ToR
would outline:
• S for Scope
• O for Objectives
• S for Stakeholders
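To make the review concrete, here is a minimal Python sketch of checking a ToR document for required sections; the section names and bakery details are illustrative assumptions, not a prescribed format:

required_sections = ["scope", "objectives", "stakeholders", "quality standards"]

# A ToR captured as a simple dictionary (illustrative content).
tor_document = {
    "scope": "Gluten-free product line for the bakery",
    "objectives": "Launch the new line within six months",
    "stakeholders": "Owner, bakers, suppliers, customers",
}

# Flag any required section the document is missing.
missing = [s for s in required_sections if s not in tor_document]
print("ToR complete" if not missing else f"Missing sections: {missing}")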
Assessment Questions:
Diagram: ToR Analysis Process (Start → Identify Objectives → Review Scope → Identify Stakeholders → Assess Quality Standards → End)
1.2 Effective examination of requirement specifications based on industry standards, project objectives, and stakeholder needs.
3. Prioritize Requirements: Determine which are critical to project success.
o Example: Use MoSCoW method (Must, Should, Could, Won’t) to prioritize
features.
4. Validate Requirements: Ensure they meet the project objectives and stakeholder needs.
o Example: Review requirements with stakeholders to confirm their accuracy and
relevance.
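As a rough illustration of the MoSCoW method from step 3, the following Python sketch groups requirements by priority; the requirement names are hypothetical:

from collections import defaultdict

# (requirement, MoSCoW priority) pairs; contents are illustrative.
requirements = [
    ("User login", "Must"),
    ("Transaction history export", "Should"),
    ("Dark mode", "Could"),
    ("Cryptocurrency support", "Won't"),
]

def group_by_priority(reqs):
    groups = defaultdict(list)
    for name, priority in reqs:
        groups[priority].append(name)
    return groups

groups = group_by_priority(requirements)
for priority in ("Must", "Should", "Could", "Won't"):
    print(priority + ":", ", ".join(groups[priority]))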
The main categories of requirements can be remembered as:
• F for Functional
• N for Non-Functional
• C for Compliance
Assessment Questions:
Diagram: Requirement Categories (Functional, Non-Functional, Compliance)
1.3 Proper analysis of the inception report based on established criteria and project requirements.
Definition: An inception report outlines the initial findings and proposed plan of a project after
its commencement. It acts as a guide for the subsequent phases of the project.
1. Review Project Overview: Ensure alignment with the ToR and objectives.
o Example: Check if the overview reflects the ToR goals.
2. Evaluate Initial Findings: Check for completeness and accuracy.
o Example: Ensure user feedback on initial app design is included.
3. Assess Proposed Approach: Determine feasibility and alignment with requirements.
o Example: Validate Agile methodology with the team.
4. Identify Risks: List potential risks and mitigation strategies.
o Example: Create a risk management plan that includes regular security audits.
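One way to make step 4 tangible is a small risk register; this Python sketch ranks risks by a simple likelihood-times-impact score, with all entries and scores assumed for illustration:

# Each risk scored 1-5 for likelihood and impact (illustrative values).
risks = [
    {"risk": "Budget overrun", "likelihood": 3, "impact": 4,
     "mitigation": "Monthly cost reviews"},
    {"risk": "Security breach", "likelihood": 2, "impact": 5,
     "mitigation": "Regular security audits"},
]

# Rank risks by likelihood x impact, highest first.
for r in sorted(risks, key=lambda r: r["likelihood"] * r["impact"], reverse=True):
    score = r["likelihood"] * r["impact"]
    print(f"{r['risk']} (score {score}): mitigate via {r['mitigation']}")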
Daily Example: An inception report for a community parks improvement project might include:
• Initial Findings: "Community members want more walking paths."
• Proposed Approach: "Use community engagement to gather input."
• Risks: "Budget overruns could delay project completion."
Assessment Questions:
Diagram: Inception Report Structure (Project Overview, Initial Findings, Proposed Approach, Risks and Challenges)
Glossary
• Terms of Reference (ToR): A document that outlines the purpose, structure, and scope
of a project.
• Requirement Specification: A detailed description of the features and functionalities of
a system or project.
• Functional Requirement: Specific behaviors or functions that the system must perform.
• Non-Functional Requirement: Attributes of the system that describe how it should
behave (e.g., performance, usability).
• Compliance Requirements: Legal and regulatory standards that the project must adhere
to.
• Inception Report: A document that outlines the initial findings and proposed plan of a
project.
Conclusion
Performing requirements analysis is essential for ensuring quality assurance in any project. By properly analyzing the Terms of Reference, examining requirement specifications, and reviewing the inception report, stakeholders can ensure that projects meet industry standards and fulfill user needs. Through clear communication, structured documentation, and thorough examination, project success can be achieved.
Unit 2: Test the System
2.1 Proper preparation of the test plan, based on specified criteria and industry best practices.
Definition: A test plan is a document that outlines the strategy, scope, resources, and schedule
for testing activities. It serves as a blueprint for testing and ensures all aspects are covered.
• Structured Approach: Provides a clear and organized way to approach testing, reducing
confusion.
• Resource Management: Helps in identifying resources needed for testing, including
personnel and tools.
• Risk Management: Identifies potential risks in the testing process and outlines
mitigation strategies.
1. Define Test Objectives: Clearly outline what the testing is intended to achieve.
o Example: "Verify that the app meets user requirements and functions correctly."
2. Determine Scope: Identify the features and functionalities to be tested.
o Example: "Focus on user registration and transaction processes."
3. Choose Test Strategy: Decide on the testing methods to be used.
o Example: "Utilize manual testing for usability and automated testing for
regression."
4. Identify Resources: List all tools, personnel, and equipment needed for testing.
o Example: "Selenium for automated tests, JIRA for tracking bugs."
5. Create Test Schedule: Develop a timeline for testing activities.
o Example: "Complete integration testing by the end of Week 4."
Daily Example: A company developing an online retail platform prepares a test plan that
includes:
• O for Objectives
• S for Scope
• S for Strategy
• T for Tools
• R for Resources
Assessment Questions:
Diagram: Test Plan Components (Test Objectives, Scope of Testing, Test Strategy, Resources Required, Test Schedule, Risk Management)
2.2 Proper preparation of the testing environment, based on specified criteria and industry best practices.
Definition: The testing environment is the setup in which testing will be conducted, including
hardware, software, and network configurations. A well-prepared testing environment is crucial
for accurate testing results.
Key preparation activities include provisioning hardware that mirrors production, installing the required software and dependencies, configuring the network, and loading representative test data.
Daily Example: A team preparing to test a new web application sets up:
• H for Hardware
• S for Software
• N for Network Configuration
• D for Data
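A quick way to verify such a setup is to compare the actual environment against its specification. The Python sketch below does this for a few assumed components; the expected and actual versions are hypothetical:

# Specification of the intended testing environment (illustrative).
expected_env = {
    "os": "Ubuntu 22.04",
    "browser": "Chrome 120",
    "database": "PostgreSQL 15",
}

# What was actually provisioned (illustrative; note the browser mismatch).
actual_env = {
    "os": "Ubuntu 22.04",
    "browser": "Chrome 119",
    "database": "PostgreSQL 15",
}

for key, expected in expected_env.items():
    actual = actual_env.get(key)
    status = "OK" if actual == expected else "MISMATCH"
    print(f"{key}: expected {expected}, found {actual} -> {status}")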
Assessment Questions:
Diagram: Testing Environment Components (Hardware, Software, Network, Test Data)
2.3 Effective performance of tests according to the specified parameters and guidelines.
Definition: Testing involves executing a system to identify any bugs or issues before it goes live.
Effective testing ensures that the software meets its requirements and functions correctly.
• Quality Assurance: Ensures that the software is reliable and meets user expectations.
• Cost-Effective: Identifies issues early in the development process, reducing costs
associated with fixing bugs later.
• User Satisfaction: Delivers a product that performs as expected, leading to increased
customer satisfaction.
Types of Testing:
• Unit Testing: Testing individual components of a system in isolation.
• Integration Testing: Testing the interaction between integrated components.
• System Testing: Testing the complete system against specified requirements.
• Acceptance Testing: Testing to confirm the system meets business needs.
Steps for Performing Tests:
1. Execute Test Cases: Run the predefined test cases based on the test plan.
o Example: "Execute the login functionality test case."
2. Record Results: Document the outcomes of each test case.
o Example: "Login test case passed; user can access account."
3. Report Defects: Log any issues found during testing in a tracking system.
o Example: "Found an error where the app crashes upon entering invalid login
credentials."
4. Retest and Validate: After defects are fixed, retest to ensure they have been resolved.
o Example: "Re-run the login test to confirm the fix works."
Daily Example: In a testing phase for a financial application:
• E for Execute
• R for Record
• R for Report
• R for Retest
Assessment Questions:
1. What are the different types of testing, and when are they performed?
2. Why is it essential to record test results?
Diagram: Testing Process (Execute Test Cases → Record Results → Report Defects → Retest and Validate)
Glossary
• Test Plan: A document that outlines the strategy and approach for testing.
• Testing Environment: The setup in which testing is conducted, including hardware and
software.
• Unit Testing: Testing individual components of a system.
• Integration Testing: Testing the interaction between integrated components.
• System Testing: Testing the complete system against specified requirements.
• Acceptance Testing: Testing to confirm the system meets business needs.
Conclusion
Testing the system is a critical phase in the software development life cycle. By preparing a
comprehensive test plan, establishing a realistic testing environment, and executing tests
effectively, organizations can ensure high-quality products that meet user expectations.
Thorough testing leads to improved user satisfaction and reduces the risk of costly errors in
production.
Unit 3: Generate Test Documentation
Introduction
Definition: Test result consolidation is the systematic gathering and summarization of test
outcomes from various testing phases. This documentation is essential for assessing the
software’s quality and readiness for release.
3.1 Proper consolidation of test results based on testing outcomes.
Daily Example: A software development team tests an online banking application:
• Collect Data: "The team collects results from functional tests, performance tests, and
security tests."
• Organize Findings: "They compile the results into a spreadsheet detailing each test case
and its outcome."
• Summarize Outcomes: "The report shows that 92% of tests passed, with two high-priority defects that need addressing."
The consolidation steps can be remembered as:
• C for Collect
• O for Organize
• R for Review
• S for Summarize
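The collect-and-summarize steps can be sketched in a few lines of Python; the phase names and counts below are illustrative, not real results:

# Test outcomes gathered from several phases (illustrative numbers).
results = {
    "functional": {"passed": 45, "failed": 3},
    "performance": {"passed": 20, "failed": 1},
    "security": {"passed": 27, "failed": 0},
}

# Consolidate into an overall pass rate plus a per-phase breakdown.
total_passed = sum(r["passed"] for r in results.values())
total = sum(r["passed"] + r["failed"] for r in results.values())
print(f"Overall pass rate: {total_passed / total:.0%}")

for phase, r in results.items():
    print(f"{phase}: {r['passed']} passed, {r['failed']} failed")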
Assessment Questions:
Diagram: Test Result Consolidation (Collect Data → Organize Findings → Summarize Outcomes → Review with Stakeholders)
3.2 Clear provision of the Final User Acceptance Testing (UAT) Report based on acceptance criteria.
Definition: The User Acceptance Testing (UAT) report is a document that summarizes the
results of UAT, which is the final testing phase before a software product is deployed. It ensures
that the software meets all specified acceptance criteria.
• Validation: Confirms that the software functions as intended and meets user
requirements.
• Sign-off: Serves as formal approval for the software to proceed to production.
• Documentation: Provides a detailed account of user feedback and any issues that were
addressed during testing.
1. Define Acceptance Criteria: Clearly outline what the software must achieve for
approval.
o Example: "The software must support 100 simultaneous users without
performance degradation."
2. Conduct UAT: Involve end-users in testing to validate software functionality.
o Example: "Real users perform tasks such as logging in, making transfers, and
generating reports."
3. Compile Results: Summarize the outcomes of the UAT in a structured format.
o Example: "Document the number of passed and failed test cases, along with user
feedback."
4. Present to Stakeholders: Share the UAT report with stakeholders for review and
approval.
o Example: "Hold a presentation with project sponsors to discuss the findings and
obtain sign-off."
Daily Example: During UAT of an enterprise application:
• Define Criteria: "The team lists criteria such as user role permissions and report generation."
• Conduct UAT: "End-users navigate the software to validate functionalities and
usability."
• Compile Results: "The UAT report indicates that 95% of features worked as expected,
with some suggestions for UI improvements."
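Checking measured results against acceptance criteria can be automated. The Python sketch below compares two assumed metrics against assumed thresholds; both sets of numbers are illustrative:

# Acceptance criteria agreed with stakeholders (illustrative thresholds).
acceptance_criteria = {"max_response_time_ms": 2000, "min_concurrent_users": 100}

# Measurements taken during UAT (illustrative values).
measured = {"max_response_time_ms": 1450, "min_concurrent_users": 120}

passed = (
    measured["max_response_time_ms"] <= acceptance_criteria["max_response_time_ms"]
    and measured["min_concurrent_users"] >= acceptance_criteria["min_concurrent_users"]
)
print("UAT sign-off recommended" if passed else "UAT criteria not met")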
Assessment Questions:
3.3 Proper generation of the Recommendation Report based on comprehensive test cases.
1. Analyze Test Cases: Review the results of all executed test cases to identify patterns and
issues.
o Example: "Determine which functionalities experienced the most failures during
testing."
2. Formulate Recommendations: Based on the analysis, develop actionable
recommendations for improvement.
o Example: "Suggest increasing server capacity to handle peak loads more
efficiently."
3. Prioritize Actions: Classify recommendations by urgency and potential impact on the
project.
o Example: "High priority: Address any critical defects; Low priority: Consider
adding new features in future releases."
4. Draft and Review the Report: Write the recommendation report and seek input from
team members before finalization.
o Example: "Share the draft report with the QA team for feedback before presenting
it to management."
Daily Example: After a round of performance-focused testing:
• Analyze Cases: "The testing team discovers that 25% of test cases failed due to performance issues."
• Formulate Recommendations: "They recommend implementing caching to improve
load times."
• Prioritize Actions: "The team categorizes performance fixes as high priority to ensure
user satisfaction."
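Prioritizing actions for the report can be as simple as sorting by an assigned priority. In this Python sketch the recommendation texts and priority numbers are illustrative:

# Recommendations with 1 = highest priority (illustrative entries).
recommendations = [
    {"action": "Fix crash on invalid login", "priority": 1},
    {"action": "Add caching to improve load times", "priority": 2},
    {"action": "Consider new features for future releases", "priority": 3},
]

# List actions from most to least urgent for the report.
for item in sorted(recommendations, key=lambda r: r["priority"]):
    print(f"P{item['priority']}: {item['action']}")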
Assessment Questions:
Diagram: Recommendation Report Generation Process (Analyze Test Cases → Formulate Recommendations → Prioritize Actions → Draft and Review the Report)
Conclusion
Generating test documentation turns raw test outcomes into evidence that stakeholders can act on. By consolidating test results, providing a clear final UAT report, and producing a well-prioritized recommendation report, teams document the software's quality and give decision-makers a sound basis for release approval and future improvements.
Glossary
• Test Results: Outcomes of executed test cases indicating whether they passed or failed.
• User Acceptance Testing (UAT): The final testing phase where end-users validate the
software before release.
• Recommendation Report: A document that outlines suggestions for improvements
based on testing results.
REFERENCES:
1. IEEE Standard for Software Test Documentation. IEEE. [Link to IEEE Standards]
2. Agile Testing: A Practical Guide for Testers and Agile Teams. Lisa Crispin & Janet
Gregory. [Link to Agile Testing]
3. The Art of Software Testing. Glenford Myers. [Link to Art of Testing]
4. ISO 9001:2015. International Organization for Standardization. [Link to ISO 9001]
5. GDPR Compliance Guidelines. European Commission. [Link to GDPR]
6. Quality Assurance in Project Management. Project Management Institute. [Link to
PMI]
------------------------------------------------------END-------------------------------------------------