
QUALITY

ASSURANCE
L5SOD and L5CSA

Prepared by Daniel D. 2024-25


TABLE OF CONTENTS:
Elements of Competence and Performance Criteria
1. Perform Requirements Analysis
1.1. Terms of reference are properly analyzed according to industry quality
assurance standards.
1.2 Effective examination of requirement specifications based on industry
standards, project objectives, and stakeholder needs.
1.3 Proper analysis of the inception report based on established criteria and
project requirements.
2. Test the System
2.1 Proper preparation of the test plan, based on specified criteria and industry
best practices.
2.2 Proper preparation of the testing environment, based on specified criteria and
industry best practices.
2.3 Effective performance of tests according to the specified parameters and
guidelines.

3. Generate Test Documentation
3.1 Proper consolidation of test results based on test findings.
3.2 Clear provision of the Final User Acceptance Testing (UAT) Report based on
acceptance criteria.
3.3 Proper generation of the Recommendation Report based on comprehensive test
cases.

Unit 1: Perform Requirements Analysis

1.1 Terms of Reference are Properly Analyzed According to Industry Quality Assurance Standards

Definition: Terms of Reference (ToR) is a foundational document that outlines the purpose,
structure, and scope of a project. It sets the framework for what is to be achieved, ensuring all
stakeholders have a clear understanding of project goals.

Importance of Terms of Reference:

• Common Understanding: ToR ensures that all stakeholders have a shared understanding
of the project goals.
• Scope Management: Clearly defines what is included and excluded from the project,
preventing scope creep.

Key Components of Terms of Reference:

• Project Objectives: Clear goals that the project aims to achieve. Example: "Develop a mobile app for online banking with a user-friendly interface."
• Scope: Defines what is included in and excluded from the project. Example: "Includes UI design and testing, but excludes marketing."
• Stakeholders: Individuals or groups involved in or affected by the project. Example: "Bank management, IT team, end-users."
• Quality Standards: Industry standards to which the project adheres. Example: "ISO 9001 for quality management."

Steps for Analyzing ToR:

1. Identify Objectives: Ensure they are SMART (Specific, Measurable, Achievable, Relevant, Time-bound).
o Example: "Launch the app within 6 months with a 90% user satisfaction rate."
2. Review Scope: Confirm that the scope aligns with objectives.
o Example: If the objective is to improve customer engagement, the scope should
include features that enhance user interaction.
3. Identify Stakeholders: Determine their needs and expectations.

o Example: Conduct interviews with bank staff to understand their requirements.
4. Assess Quality Standards: Check compliance with industry standards.
o Example: Ensure that the app meets security standards such as PCI-DSS for
handling payment information.
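
Illustrative Code Sketch (Python): A minimal sketch of how the ToR review steps above could be made repeatable as a completeness checklist. The field names ("objectives", "scope", and so on) and the rough time-bound check are illustrative assumptions, not a prescribed QA tool.

REQUIRED_SECTIONS = ["objectives", "scope", "stakeholders", "quality_standards"]

def check_tor(tor):
    """Return a list of issues found in a ToR dictionary (illustrative only)."""
    issues = []
    for section in REQUIRED_SECTIONS:
        if not tor.get(section):                 # section missing or empty
            issues.append("Missing or empty section: " + section)
    for objective in tor.get("objectives", []):
        # Very rough SMART check: flag objectives with no obvious time-bound wording.
        if not any(word in objective.lower() for word in ("within", "by", "month", "week")):
            issues.append("Objective may not be time-bound: " + objective)
    return issues

if __name__ == "__main__":
    bakery_tor = {
        "objectives": ["Launch gluten-free products within 3 months"],
        "scope": "Includes product development and testing; excludes marketing",
        "stakeholders": ["Bakers", "Suppliers", "Customers with gluten intolerance"],
        "quality_standards": [],   # deliberately empty to trigger a finding
    }
    for issue in check_tor(bakery_tor):
        print("-", issue)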

Daily Example: A bakery plans to introduce a new line of gluten-free products. Their ToR
would outline:

• Objective: "Launch gluten-free products within 3 months."


• Scope: "Includes product development and testing, excludes marketing."
• Stakeholders: "Bakers, suppliers, customers with gluten intolerance."
• Quality Standards: "Adhere to food safety regulations."

Tip for Memorization: Use the acronym SOS:

• S for Scope
• O for Objectives
• S for Stakeholders

Assessment Questions:

1. What is the purpose of the Terms of Reference in a project?


2. List the key components of a Terms of Reference document.

Diagram: Flowchart of Analyzing Terms of Reference

[Start]
|
V
[Identify Objectives]
|
V
[Review Scope]
|
V
[Identify Stakeholders]
|
V
[Assess Quality Standards]
|
V

[End]

1.2 Effective examination of requirement specifications based on industry standards, project objectives, and stakeholder needs.

Definition: Requirement specification is a detailed description of the features and functionalities of a system or project. It serves as a blueprint for development.

Importance of Requirement Specification:

• Clear Expectations: Provides a comprehensive understanding of what is needed from the system.
• Reduces Risks: Minimizes misunderstandings that could lead to project failure.

Key Aspects of Requirement Specification:

• Functional Requirements: What the system should do (features). Example: "Users must be able to transfer money between accounts."
• Non-Functional Requirements: Quality attributes (performance, usability). Example: "The app should be intuitive and responsive, loading within 2 seconds."
• Compliance Requirements: Legal and regulatory standards to meet. Example: "Must comply with GDPR for data protection."

Steps for Examining Requirement Specification:

1. Gather Requirements: Collect from stakeholders through interviews, surveys, and workshops.
o Example: Conduct a survey with bank customers about their preferred app features.
2. Categorize Requirements: Separate into functional and non-functional.
o Table: Functional vs. Non-Functional Requirements

• Functional: Describes specific behaviors or functions. Example: "The app allows users to view transaction history."
• Non-Functional: Describes system attributes or quality. Example: "The app must handle 1000 simultaneous users."

3. Prioritize Requirements: Determine which are critical to project success.
o Example: Use MoSCoW method (Must, Should, Could, Won’t) to prioritize
features.
4. Validate Requirements: Ensure they meet the project objectives and stakeholder needs.
o Example: Review requirements with stakeholders to confirm their accuracy and
relevance.
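
Illustrative Code Sketch (Python): A minimal sketch of the MoSCoW prioritization mentioned in step 3 above, grouping requirements by category and printing them in order of importance. The requirements themselves are illustrative assumptions.

from collections import defaultdict

# (priority, requirement) pairs -- the requirements are illustrative.
requirements = [
    ("Must",   "Users must be able to transfer money between accounts."),
    ("Must",   "Must comply with GDPR for data protection."),
    ("Should", "The app should load any screen within 2 seconds."),
    ("Could",  "Offer a dark-mode theme."),
    ("Won't",  "Integrate third-party budgeting tools (not in this release)."),
]

MOSCOW_ORDER = ["Must", "Should", "Could", "Won't"]

def group_by_priority(reqs):
    """Group requirements under their MoSCoW category."""
    grouped = defaultdict(list)
    for priority, text in reqs:
        grouped[priority].append(text)
    return grouped

if __name__ == "__main__":
    grouped = group_by_priority(requirements)
    for priority in MOSCOW_ORDER:
        print(priority + ":")
        for text in grouped.get(priority, []):
            print("  -", text)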

Daily Example: For a restaurant ordering system:

• Functional Requirement: "Customers must be able to place orders online."


• Non-Functional Requirement: "The system should be able to process orders within 5
seconds."
• Compliance Requirement: "Must comply with food safety regulations."

Tip for Memorization: Remember the phrase "FNC":

• F for Functional
• N for Non-Functional
• C for Compliance

Assessment Questions:

1. Differentiate between functional and non-functional requirements.


2. Why is it important to validate requirements?

Diagram: Venn Diagram of Requirements Categories

(Three overlapping circles: Functional, Non-Functional, and Compliance requirements.)

1.3 Proper analysis of the inception report based on established criteria and project
requirements.

Definition: An inception report outlines the initial findings and proposed plan of a project after
its commencement. It acts as a guide for the subsequent phases of the project.

Importance of Inception Report:

• Clear Direction: Provides a roadmap for project execution.


• Adjustments: Allows for changes based on initial analysis before full-scale
implementation.

Key Components of an Inception Report:

• Project Overview: Summary of the project’s purpose and objectives. Example: "The project aims to enhance customer satisfaction through an intuitive mobile app."
• Initial Findings: Insights gathered from preliminary analysis. Example: "Users prefer quick access to account balance."
• Proposed Approach: Strategies for achieving project goals. Example: "Adopt Agile methodology for iterative development."
• Risks and Challenges: Potential issues and their impact. Example: "Data breaches could lead to loss of customer trust."

Steps for Analyzing Inception Report:

1. Review Project Overview: Ensure alignment with the ToR and objectives.
o Example: Check if the overview reflects the ToR goals.
2. Evaluate Initial Findings: Check for completeness and accuracy.
o Example: Ensure user feedback on initial app design is included.
3. Assess Proposed Approach: Determine feasibility and alignment with requirements.
o Example: Validate Agile methodology with the team.
4. Identify Risks: List potential risks and mitigation strategies.
o Example: Create a risk management plan that includes regular security audits.
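
Illustrative Code Sketch (Python): A minimal sketch of the risk identification in step 4 above as a small risk register, ranked by a simple likelihood × impact score. The risks, ratings (1–5 scales), and mitigations are illustrative assumptions.

# Each risk carries illustrative likelihood and impact ratings on a 1-5 scale.
risks = [
    {"risk": "Data breach erodes customer trust", "likelihood": 2, "impact": 5,
     "mitigation": "Schedule regular security audits"},
    {"risk": "Budget overrun delays completion", "likelihood": 3, "impact": 4,
     "mitigation": "Review spend against the plan every iteration"},
    {"risk": "Key stakeholder unavailable for reviews", "likelihood": 2, "impact": 3,
     "mitigation": "Nominate a deputy reviewer up front"},
]

# Rank risks by a simple likelihood x impact score, highest first.
for entry in sorted(risks, key=lambda r: r["likelihood"] * r["impact"], reverse=True):
    score = entry["likelihood"] * entry["impact"]
    print(f"[score {score:>2}] {entry['risk']} -> mitigation: {entry['mitigation']}")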

Daily Example: For a community park renovation project:

• Project Overview: "Improve accessibility and recreational facilities."

• Initial Findings: "Community members want more walking paths."
• Proposed Approach: "Use community engagement to gather input."
• Risks: "Budget overruns could delay project completion."

Tip for Memorization: Use the acronym "PIR":

• P for Project Overview


• I for Initial Findings
• R for Risks

Assessment Questions:

1. What are the key components of an inception report?


2. Why is it important to assess proposed approaches in the inception report?

Diagram: Components of Inception Report

[Inception Report]
        |
   --------------------------------------
   |                                    |
[Project Overview]             [Initial Findings]
   |                                    |
[Proposed Approach]            [Risks and Challenges]

Glossary

• Terms of Reference (ToR): A document that outlines the purpose, structure, and scope
of a project.
• Requirement Specification: A detailed description of the features and functionalities of
a system or project.
• Functional Requirement: Specific behaviors or functions that the system must perform.
• Non-Functional Requirement: Attributes of the system that describe how it should
behave (e.g., performance, usability).
• Compliance Requirements: Legal and regulatory standards that the project must adhere
to.
• Inception Report: A document that outlines the initial findings and proposed plan of a
project.

Conclusion

Performing requirements analysis is essential for ensuring quality assurance in any project. By
properly analyzing the Terms of Reference, examining requirement specifications, and analyzing
the inception report, stakeholders can ensure that projects meet industry standards and fulfill user
needs. Through clear communication, structured documentation, and thorough examination,
project success can be achieved.

Unit 2: Test the System

2.1 Proper preparation of the test plan, based on specified criteria and industry best
practices.

Definition: A test plan is a document that outlines the strategy, scope, resources, and schedule
for testing activities. It serves as a blueprint for testing and ensures all aspects are covered.

Importance of a Test Plan:

• Structured Approach: Provides a clear and organized way to approach testing, reducing
confusion.
• Resource Management: Helps in identifying resources needed for testing, including
personnel and tools.
• Risk Management: Identifies potential risks in the testing process and outlines
mitigation strategies.

Key Components of a Test Plan:

• Test Objectives: Goals to be achieved through testing. Example: "Ensure that the mobile app functions as intended."
• Scope of Testing: Defines what will and will not be tested. Example: "Testing will include user login, but not marketing features."
• Test Strategy: The overall approach to testing (e.g., manual, automated). Example: "Adopt a hybrid approach using both manual and automated testing."
• Resources Required: Tools, personnel, and time needed for testing. Example: "Use Selenium for automated testing; require 3 testers."
• Test Schedule: Timeline for each testing phase. Example: "Unit testing to be completed by the end of Week 3."

Steps for Preparing a Test Plan:

1. Define Test Objectives: Clearly outline what the testing is intended to achieve.
o Example: "Verify that the app meets user requirements and functions correctly."
2. Determine Scope: Identify the features and functionalities to be tested.
o Example: "Focus on user registration and transaction processes."

3. Choose Test Strategy: Decide on the testing methods to be used.
o Example: "Utilize manual testing for usability and automated testing for
regression."
4. Identify Resources: List all tools, personnel, and equipment needed for testing.
o Example: "Selenium for automated tests, JIRA for tracking bugs."
5. Create Test Schedule: Develop a timeline for testing activities.
o Example: "Complete integration testing by the end of Week 4."

Daily Example: A company developing an online retail platform prepares a test plan that
includes:

• Test Objectives: "Verify the checkout process works without errors."


• Scope: "Include product search, user registration, and checkout; exclude payment
gateway integration."
• Test Strategy: "Manual testing for user interface, automated testing for functional
checks."

Tip for Memorization: Use the acronym OSSTR:

• O for Objectives
• S for Scope
• S for Strategy
• T for Tools
• R for Resources

Assessment Questions:

1. What are the key components of a test plan?


2. Why is it important to define test objectives?

Diagram: Components of a Test Plan

[Test Plan]
        |
   --------------------------------------
   |                                    |
[Test Objectives]              [Scope of Testing]
   |                                    |
[Test Strategy]                [Resources Required]
   |                                    |
[Test Schedule]                [Risk Management]

2.2 Proper preparation of the testing environment, based on specified criteria and
industry best practices.

Definition: The testing environment is the setup in which testing will be conducted, including
hardware, software, and network configurations. A well-prepared testing environment is crucial
for accurate testing results.

Importance of a Proper Testing Environment:

• Realistic Conditions: Mimics the production environment, leading to more reliable results.
• Issue Identification: Helps identify issues that may only occur in specific configurations.
• Efficiency: Ensures that testing can be conducted without unnecessary interruptions.

Key Elements of a Testing Environment:

• Hardware: Physical devices and machines used for testing. Example: "Servers, workstations, and mobile devices."
• Software: Applications and tools needed for testing. Example: "Operating systems, browsers, and testing tools."
• Network Configuration: Setup of networks to simulate real-world scenarios. Example: "Include firewalls, routers, and load balancers."
• Test Data: Sample data used during testing. Example: "User accounts, transactions, and product details."

Steps for Preparing the Testing Environment:

1. Identify Hardware Requirements: Determine the necessary physical devices for testing.
o Example: "Two servers for load testing and three workstations for functional testing."
2. Select Software Tools: Choose the appropriate software needed for testing.
o Example: "Use JMeter for performance testing and Postman for API testing."
3. Set Up Network Configuration: Establish network settings that reflect real user
conditions.
o Example: "Configure a staging server to replicate the production environment."
4. Prepare Test Data: Create and populate the data needed for testing scenarios.
o Example: "Generate user data for testing registration and login processes."

Daily Example: A team preparing to test a new web application sets up:

• Hardware: "Two servers and four laptops."


• Software: "Chrome, Firefox, and automated testing tools like Selenium."
• Network Configuration: "Simulate a slow network connection to test performance
under load."

Tip for Memorization: Remember the acronym HSND:

• H for Hardware
• S for Software
• N for Network Configuration
• D for Data

Assessment Questions:

1. Why is it important to have a realistic testing environment?


2. What elements should be included in a testing environment?

Diagram: Setting Up a Testing Environment

[Testing Environment]
        |
   ---------------------------
   |                         |
[Hardware]              [Software]
   |                         |
[Network]               [Test Data]

2.3 Effective performance of tests according to the specified parameters and
guidelines.

Definition: Testing involves executing a system to identify any bugs or issues before it goes live.
Effective testing ensures that the software meets its requirements and functions correctly.

Importance of Effective Testing:

• Quality Assurance: Ensures that the software is reliable and meets user expectations.
• Cost-Effective: Identifies issues early in the development process, reducing costs
associated with fixing bugs later.
• User Satisfaction: Delivers a product that performs as expected, leading to increased
customer satisfaction.

Types of Testing:

• Unit Testing: Testing individual components for proper behavior. Example: "Test a single function in the app."
• Integration Testing: Testing combined components to ensure they work together. Example: "Check if the login module works with the database."
• System Testing: Testing the complete system for compliance with requirements. Example: "Test the entire application for user interactions."
• Acceptance Testing: Final testing to ensure the system meets business needs. Example: "User acceptance testing by a group of end-users."

Steps for Performing Tests:

1. Execute Test Cases: Run the predefined test cases based on the test plan.
o Example: "Execute the login functionality test case."
2. Record Results: Document the outcomes of each test case.
o Example: "Login test case passed; user can access account."
3. Report Defects: Log any issues found during testing in a tracking system.
o Example: "Found an error where the app crashes upon entering invalid login
credentials."
4. Retest and Validate: After defects are fixed, retest to ensure they have been resolved.
o Example: "Re-run the login test to confirm the fix works."

Daily Example: In a testing phase for a financial application:

• Unit Testing: "Test the interest calculation function."


• Integration Testing: "Verify that the payment processing module interacts correctly with
the database."
• System Testing: "Conduct full application tests to ensure all features work together."
• Acceptance Testing: "Gather feedback from users to validate the application meets their
needs."

Tip for Memorization: Use the acronym ERRR:

• E for Execute
• R for Record
• R for Report
• R for Retest

Assessment Questions:

1. What are the different types of testing, and when are they performed?
2. Why is it essential to record test results?

Diagram: Testing Process Flow

[Testing Process]
        |
   ---------------------------------
   |                               |
[Execute Test Cases]        [Record Results]
   |                               |
[Report Defects]            [Retest and Validate]

Glossary

• Test Plan: A document that outlines the strategy and approach for testing.
• Testing Environment: The setup in which testing is conducted, including hardware and
software.
• Unit Testing: Testing individual components of a system.
• Integration Testing: Testing the interaction between integrated components.
• System Testing: Testing the complete system against specified requirements.
• Acceptance Testing: Testing to confirm the system meets business needs.

Conclusion

Testing the system is a critical phase in the software development life cycle. By preparing a
comprehensive test plan, establishing a realistic testing environment, and executing tests
effectively, organizations can ensure high-quality products that meet user expectations.
Thorough testing leads to improved user satisfaction and reduces the risk of costly errors in
production.

Unit 3: Generate Test Documentation

Introduction

Generating effective test documentation is a critical component of the software development lifecycle. It ensures that the testing process is thorough, transparent, and understandable for all stakeholders. Proper documentation facilitates communication, supports decision-making, and enhances the overall quality of the software.

3.1 Proper consolidation of test results based on test findings.

Definition: Test result consolidation is the systematic gathering and summarization of test
outcomes from various testing phases. This documentation is essential for assessing the
software’s quality and readiness for release.

Importance of Consolidating Test Results:

• Clarity: Provides a clear overview of testing outcomes, helping stakeholders understand software performance.
• Decision-Making: Aids project managers and stakeholders in making informed decisions
about software deployment.
• Traceability: Ensures that all test findings are traceable to specific requirements,
facilitating accountability.

Steps for Proper Consolidation of Test Results:

1. Collect Test Data: Gather data from all testing activities.
o Example: "After running unit tests, integration tests, and user acceptance tests, compile the results."
2. Organize Findings: Structure the test results for easy review and interpretation.
o Example: "Use a table format to summarize pass/fail rates and defect counts."
3. Summarize Outcomes: Highlight key findings, such as defect counts and overall
performance metrics.
o Example: "The final summary indicates that 90% of tests passed, with five critical
defects found."
4. Review with Stakeholders: Share consolidated results with relevant team members and
stakeholders to gather feedback.
o Example: "Present the consolidated report in a project review meeting."

Daily Example: A software development team tests an online banking application:

• Collect Data: "The team collects results from functional tests, performance tests, and
security tests."
• Organize Findings: "They compile the results into a spreadsheet detailing each test case
and its outcome."
• Summarize Outcomes: "The report shows that 92% of tests passed, with two high-
priority defects that need addressing."

Tip for Memorization: Remember the acronym CORS:

• C for Collect
• O for Organize
• R for Review
• S for Summarize

Assessment Questions:

1. Why is it important to consolidate test results?


2. What steps should be taken to organize test findings?

Diagram: Test Result Consolidation Process

[Test Result Consolidation]
        |
   --------------------------------------
   |                                    |
[Collect Data]               [Organize Findings]
   |                                    |
[Summarize Outcomes]         [Review with Stakeholders]

3.2 Clear provision of the Final User Acceptance Testing (UAT) Report based on
acceptance criteria.

Definition: The User Acceptance Testing (UAT) report is a document that summarizes the
results of UAT, which is the final testing phase before a software product is deployed. It ensures
that the software meets all specified acceptance criteria.

Importance of UAT Report:

• Validation: Confirms that the software functions as intended and meets user
requirements.
• Sign-off: Serves as formal approval for the software to proceed to production.
• Documentation: Provides a detailed account of user feedback and any issues that were
addressed during testing.

Key Components of a UAT Report:

• Introduction: Overview of the UAT process and objectives. Example: "This report summarizes the UAT conducted for the online banking platform."
• Acceptance Criteria: List of criteria that the software must meet. Example: "The application must allow users to transfer funds and view transaction history."
• Test Results: Summary of test outcomes, including pass/fail rates. Example: "Out of 50 test cases, 47 passed, 3 failed."
• User Feedback: Comments and suggestions from users who conducted UAT. Example: "Users reported that the login process was confusing."
• Conclusion: Summary of findings and recommendations for next steps. Example: "The software is ready for deployment, pending fixes to minor usability issues."

Steps for Creating a UAT Report:

1. Define Acceptance Criteria: Clearly outline what the software must achieve for
approval.
o Example: "The software must support 100 simultaneous users without
performance degradation."
2. Conduct UAT: Involve end-users in testing to validate software functionality.
o Example: "Real users perform tasks such as logging in, making transfers, and
generating reports."

3. Compile Results: Summarize the outcomes of the UAT in a structured format.
o Example: "Document the number of passed and failed test cases, along with user
feedback."
4. Present to Stakeholders: Share the UAT report with stakeholders for review and
approval.
o Example: "Hold a presentation with project sponsors to discuss the findings and
obtain sign-off."
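
Illustrative Code Sketch (Python): A minimal sketch of compiling a UAT summary from the steps above, comparing outcomes against the acceptance criteria and suggesting a sign-off decision. The criteria and the all-criteria-must-pass rule are illustrative assumptions.

# UAT outcomes keyed by acceptance criterion; the criteria are illustrative.
uat_results = {
    "Transfer funds between accounts": "pass",
    "View transaction history":        "pass",
    "Generate monthly statement":      "fail",
    "Log in with two-factor auth":     "pass",
}

passed = sum(1 for outcome in uat_results.values() if outcome == "pass")
total = len(uat_results)

print(f"UAT summary: {passed}/{total} acceptance criteria met ({passed / total:.0%})")
if passed == total:
    print("Recommendation: approve the release for deployment.")
else:
    print("Recommendation: withhold sign-off until the following are resolved:")
    for criterion, outcome in uat_results.items():
        if outcome == "fail":
            print("  -", criterion)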

Daily Example: For a new human resources management software:

• Define Criteria: "The team lists criteria such as user role permissions and report
generation."
• Conduct UAT: "End-users navigate the software to validate functionalities and
usability."
• Compile Results: "The UAT report indicates that 95% of features worked as expected,
with some suggestions for UI improvements."

Tip for Memorization: Use the acronym DCPR:

• D for Define Acceptance Criteria


• C for Conduct UAT
• P for Compile Results
• R for Present to Stakeholders

Assessment Questions:

1. What are the key components of a UAT report?


2. Why is user feedback important in the UAT process?

Diagram: UAT Report Creation Process

[UAT Report Creation]
        |
   ---------------------------------------
   |                                     |
[Define Acceptance Criteria]      [Conduct UAT]
   |                                     |
[Compile Results]                 [Present to Stakeholders]

3.3 Proper generation of the Recommendation Report based on comprehensive test
cases.

Definition: A recommendation report provides suggestions for improvements or actions based on the results of testing. This report is created after evaluating all test cases and their outcomes.

Importance of Recommendation Reports:

• Guidance: Offers actionable insights for future development and enhancements.


• Continuous Improvement: Identifies areas for improvement in both the software and
the testing process.
• Stakeholder Communication: Facilitates clear communication of issues and suggested
solutions to stakeholders.

Key Components of a Recommendation Report:

• Executive Summary: Brief overview of the report and its purpose. Example: "This report provides recommendations based on testing results for the CRM software."
• Test Case Analysis: Summary of executed test cases and their results. Example: "Out of 120 test cases, 15 need further attention due to critical issues."
• Recommendations: Suggested actions based on the test findings. Example: "Enhance load performance by optimizing database queries."
• Prioritization: Classification of recommendations based on urgency and impact. Example: "High priority: Fix security vulnerabilities; Medium priority: Improve UI responsiveness."
• Conclusion: Final thoughts and encouragement for implementation. Example: "Implementing these recommendations will significantly improve software quality."

Steps for Generating a Recommendation Report:

1. Analyze Test Cases: Review the results of all executed test cases to identify patterns and
issues.
o Example: "Determine which functionalities experienced the most failures during
testing."
2. Formulate Recommendations: Based on the analysis, develop actionable
recommendations for improvement.

o Example: "Suggest increasing server capacity to handle peak loads more
efficiently."
3. Prioritize Actions: Classify recommendations by urgency and potential impact on the
project.
o Example: "High priority: Address any critical defects; Low priority: Consider
adding new features in future releases."
4. Draft and Review the Report: Write the recommendation report and seek input from
team members before finalization.
o Example: "Share the draft report with the QA team for feedback before presenting
it to management."
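
Illustrative Code Sketch (Python): A minimal sketch of the prioritization in step 3 above, ordering recommendations so the most urgent actions appear first in the draft report. The recommendations and priority labels are illustrative assumptions.

# Recommendations and priority labels are illustrative assumptions.
PRIORITY_RANK = {"high": 0, "medium": 1, "low": 2}

recommendations = [
    {"action": "Add caching to improve page load times",             "priority": "high"},
    {"action": "Fix security vulnerability in session handling",     "priority": "high"},
    {"action": "Improve UI responsiveness on tablets",               "priority": "medium"},
    {"action": "Consider new reporting features in a later release", "priority": "low"},
]

# Order the report so the most urgent actions are listed first.
for item in sorted(recommendations, key=lambda r: PRIORITY_RANK[r["priority"]]):
    print(f"[{item['priority'].upper():>6}] {item['action']}")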

Daily Example: After testing a new project management application:

• Analyze Cases: "The testing team discovers that 25% of test cases failed due to
performance issues."
• Formulate Recommendations: "They recommend implementing caching to improve
load times."
• Prioritize Actions: "The team categorizes performance fixes as high priority to ensure
user satisfaction."

Tip for Memorization: Use the acronym APRDC:

• A for Analyze Test Cases


• P for Prioritize Actions
• R for Recommendations
• D for Draft and Review Report
• C for Conclusion

Assessment Questions:

1. What should be included in a recommendation report?


2. How do prioritization and analysis assist in generating effective recommendations?

Diagram: Recommendation Report Generation Process

[Recommendation Report Generation]
        |
   ----------------------------------------------
   |                                            |
[Analyze Test Cases]             [Formulate Recommendations]
   |                                            |
[Prioritize Actions]             [Draft and Review Report]

Conclusion

Generating comprehensive test documentation is a vital practice in software quality assurance. It ensures that test results are effectively consolidated, user acceptance is validated through detailed
reporting, and actionable recommendations are provided for continuous improvement. By
following structured approaches for documentation, teams can enhance communication, facilitate
better decision-making, and ultimately deliver higher-quality software products. Remember,
effective test documentation is not just about compliance; it’s about ensuring that software meets
user needs and performs reliably in real-world scenarios.

Glossary

• Test Results: Outcomes of executed test cases indicating whether they passed or failed.
• User Acceptance Testing (UAT): The final testing phase where end-users validate the
software before release.
• Recommendation Report: A document that outlines suggestions for improvements
based on testing results.

REFERENCES:

1. IEEE Standard for Software Test Documentation. IEEE.
2. Crispin, L. & Gregory, J. Agile Testing: A Practical Guide for Testers and Agile Teams.
3. Myers, G. The Art of Software Testing.
4. ISO 9001:2015. International Organization for Standardization.
5. GDPR Compliance Guidelines. European Commission.
6. Quality Assurance in Project Management. Project Management Institute.

------------------------------------------------------END-------------------------------------------------
