SOP-Sample Software QA Testing
SDLC Process Software (QA) Testing
SOP-00XXXX
Revision A
6/19/2016
QRS.
Confidential and Proprietary, All rights reserved
Documentation History
Approvals
Authors:
Name(s):

Approved by:
Name:    Date:
Name:    Date:
Name:    Date:
Name:    Date:
Name:    Date:
Name:    Date:
Name:    Date:
Table of Contents
DOCUMENTATION HISTORY .................................................................................................................................... 2
APPROVALS .............................................................................................................................................................. 2
GLOSSARY ................................................................................................................................................................ 4
1. PROCESS SUMMARY ........................................................................................................................................ 5
1.1 Test Approaches.......................................................................................................................................... 5
1.2 Best Practices.............................................................................................................................................. 5
1.3 Parent Process ............................................................................................................................................ 5
1.4 Test Activity Procedures .............................................................................................................................. 5
1.4.1 QA Testing Role ...................................................................................................................................... 5
1.4.2 Test Stages ............................................................................................................................................. 6
1.5 Test Case Designs ...................................................................................................................................... 7
1.5.1 Design Procedures .................................................................................................................................. 7
1.5.2 Test Data ................................................................................................................................................. 7
2. INPUT.................................................................................................................................................................. 8
2.1.1 Test Plans................................................................................................................................................ 8
2.1.2 Test Cases .............................................................................................................................................. 8
3. TEST PROCESS ................................................................................................................................................. 8
3.1.1 Scrum Model ........................................................................................................................................... 8
3.2 Unit Testing ................................................................................................................................................. 9
3.3 Functional Testing ....................................................................................................................................... 9
3.4 Integration Testing ....................................................................................................................................... 9
3.5 System Testing ............................................................................................................................................ 9
4. ANALYZE .......................................................................................................................................................... 10
4.1.1 Test Completion Criteria......................................................................................................................... 10
4.1.2 Defects Classification ............................................................................................................................. 10
4.1.3 Testing Measurements........................................................................................................................... 10
4.1.4 Defect Measurements ............................................................................................................................ 10
5. OUTPUTS ......................................................................................................................................................... 10
5.1 Unit Testing ............................................................................................................................................... 10
5.2 Functional Testing ..................................................................................................................................... 10
5.3 Integration Testing ..................................................................................................................................... 10
5.4 System Testing .......................................................................................................................................... 11
6. USER ACCEPTANCE TESTING ........................................................................................................................ 11
6.1 Criteria....................................................................................................................................................... 11
6.2 Output ....................................................................................................................................................... 11
7. REFERENCES .................................................................................................................................................. 12
8. APPENDIX A: DEFECTS CLASSIFICATION...................................................................................................... 12
Glossary
Term: Description

Acceptance Testing: Verifies that a completed system meets the original business objectives as described by the system requirements.
Black-box testing: Focuses on software’s external attributes and behavior. Such testing looks at an application’s expected behavior from the user’s point of view.
Boundary testing: Test cases are generated using the extremes of the input domain, e.g. maximum, minimum, just inside/outside boundaries, typical values, and error values. It is similar to Equivalence Partitioning but focuses on "corner cases" (illustrated in the sketch following this glossary).
Sprint: In Agile software development, a set period of time during which specific work has to be completed and made ready for review.
SW: Software
UAT: User Acceptance Testing
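To make the boundary testing entry concrete, below is a minimal sketch using Python's standard unittest library. The validate_age function and its 18-65 range are illustrative assumptions for this example, not part of this SOP.

    import unittest

    # Hypothetical function under test: accepts ages in the inclusive range 18-65.
    # The function name and the valid range are illustrative assumptions only.
    def validate_age(age: int) -> bool:
        return 18 <= age <= 65

    class BoundaryValueTests(unittest.TestCase):
        def test_age_boundaries(self):
            # Boundary cases: just outside, on, and just inside each edge, plus a typical value.
            cases = [
                (17, False),  # just below the lower boundary
                (18, True),   # lower boundary
                (19, True),   # just above the lower boundary
                (40, True),   # typical value
                (64, True),   # just below the upper boundary
                (65, True),   # upper boundary
                (66, False),  # just above the upper boundary
            ]
            for age, expected in cases:
                with self.subTest(age=age):
                    self.assertEqual(validate_age(age), expected)

    if __name__ == "__main__":
        unittest.main()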
1. PROCESS SUMMARY
Link to QRS SDLC Flow chart/process diagram – http://
This document describes the key testing activities that are tracked and managed during the SDLC. They are:
• QA Standard Operating Procedures, a set of procedures that includes:
  o QA processes
  o Roles
  o Responsibilities
• Agile/QA best practices
Before testing of any kind can begin, it is necessary to explain the roles of QA.
Now that the role of QA has been defined, we’ll discuss the five test stages that are integrated into the Agile Scrum system. These stages are: Unit, Functional, Integration, Performance (which includes Stress, Volume and Resource), and System.
Another part of the QA process is to create test cases, which include test data or use cases, to see how well the software performs during each step of the process.
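As an illustration only, a test case and its associated test data might be captured in a structured record like the Python sketch below; the field names and values are assumptions for this example, not a mandated template.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class QATestCase:
        """Illustrative test case record; the field names are examples, not a mandated template."""
        case_id: str
        title: str
        preconditions: List[str] = field(default_factory=list)
        steps: List[str] = field(default_factory=list)
        test_data: Dict[str, str] = field(default_factory=dict)
        expected_result: str = ""

    # Example: a functional test case with its associated test data (placeholder values).
    login_case = QATestCase(
        case_id="TC-001",
        title="Valid user can log in",
        preconditions=["User account exists and is active"],
        steps=["Open the login page", "Enter the credentials from test_data", "Submit the form"],
        test_data={"username": "qa_user", "password": "example-password"},
        expected_result="User is redirected to the dashboard",
    )

    if __name__ == "__main__":
        print(login_case)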
2. INPUT
A test plan lets you specify what you want to test and how to run those tests. A test plan can be applied to a
specific iteration of your project. You can have just one default test suite for your test cases, or you can create
a test suite hierarchy.
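A minimal sketch of such a test suite hierarchy, assuming Python's standard unittest library (the suite and test names below are illustrative only):

    import unittest

    # Illustrative test classes; in practice these would exercise real application behavior.
    class LoginTests(unittest.TestCase):
        def test_valid_login(self):
            self.assertTrue(True)  # placeholder assertion

    class CheckoutTests(unittest.TestCase):
        def test_empty_cart_rejected(self):
            self.assertTrue(True)  # placeholder assertion

    def build_suite() -> unittest.TestSuite:
        # Child suites grouped under one top-level suite, mirroring a test suite
        # hierarchy that can be scoped to a specific project iteration.
        loader = unittest.TestLoader()
        functional = unittest.TestSuite(loader.loadTestsFromTestCase(LoginTests))
        regression = unittest.TestSuite(loader.loadTestsFromTestCase(CheckoutTests))
        return unittest.TestSuite([functional, regression])

    if __name__ == "__main__":
        unittest.TextTestRunner(verbosity=2).run(build_suite())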
3. TEST PROCESS
QRS uses two lifecycle processes: Agile and Waterfall. For the remainder of this document, we will focus on Agile only. This section covers the five test stages listed in Section 1.4.2 above. User Acceptance Testing will be discussed in Section 6 as its own entity.
• Functional
• Performance/Load/Stress
• Usability
4. ANALYZE
5. OUTPUTS
The following is a list of different outputs expected from the different tests performed throughout the QA
process. Again, User Acceptance testing can be found in Section 6 below.
6. USER ACCEPTANCE TESTING
6.1 Criteria
Once all the criteria for testing have been met, User Acceptance Testing can begin. As with any type of testing, User Acceptance Testing has its own criteria and is complete when the following are met:
• All critical and major incidents are resolved or postponed
• Test results and deliverables are provided
Criteria for entering and exiting UAT are below:
Entrance Requirements
Entrance Criteria (mark each as Compliant / Not Compliant):
• UAT plan approved
• All UAT team members are identified and roles defined
• Delivered target hardware/software installed and configured, including network connectivity and all backup/restore functions
Exit Requirements
Exit Criteria (mark each as Compliant / Not Compliant):
• All UAT test cases included in the acceptance test plan have been executed
• Joint evaluation of the defect issues discovered during UAT is complete
6.2 Output
The purpose of acceptance testing is to validate that:
o All elements of the system are fully and properly integrated
o Overall end-to-end system functionality and performance is achieved
These tests are conducted on a platform as close to the real production environment as possible.
Below is a list of the various testing done during User Acceptance Testing:
• Data and Database Integrity Testing, including testing of converted data and all associated ID documents and parameter files (using production values); a minimal conversion check is sketched after this list
• Functional Testing of the delivered System software and hardware
• User Interface/Usability Testing
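As a hedged sketch of one such check on converted data, the snippet below compares row counts and key sets between a legacy database and its converted copy. The database file names, table name, and key column are assumptions for this example; real data-conversion testing would cover far more than this.

    import sqlite3

    # Illustrative data-conversion integrity check. The file names, table name and
    # key column are placeholders for this sketch, not values prescribed by this SOP.
    def converted_rows_match(legacy_db: str, converted_db: str, table: str, key: str) -> bool:
        with sqlite3.connect(legacy_db) as old, sqlite3.connect(converted_db) as new:
            old_count = old.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
            new_count = new.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
            if old_count != new_count:
                return False  # records lost or duplicated during conversion
            # Verify that every legacy key is still present after conversion.
            old_keys = {row[0] for row in old.execute(f"SELECT {key} FROM {table}")}
            new_keys = {row[0] for row in new.execute(f"SELECT {key} FROM {table}")}
            return old_keys == new_keys

    # Example usage (placeholder names):
    # converted_rows_match("legacy.db", "converted.db", "customers", "customer_id")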
7. REFERENCES
8. APPENDIX A: DEFECTS CLASSIFICATION

Issue Severity
S1 Critical: Unrecoverable data loss; major functionality is not working
S2 Major: Major functionality is not working with an identified work-around; secondary functionality is not working with no work-around
S3 Minor: Secondary functionality is not working with an identified work-around; seldom-used functionality issues with an identified work-around
S4 Cosmetic: Inconveniences; minor annoyances

Issue Type
Change Request (CR): Customer’s request to change an existing signed-off requirement, or an additional requirement in addition to the signed-off system requirements document and functional specification; may have a significant impact on the program scope
Enhancement: Usually institutes an improvement to existing functionality and in most cases has a smaller scope and program impact than a change request; an enhancement may not affect the signed-off requirements and specs and is subject to review by the Program multifunctional team
Clarification: Elaboration of, or a request for elaboration of, an existing requirement and/or functionality; may be initiated by the Customer
Defect: Any deviation from signed-off requirements and/or system malfunctioning
Action item: Defines a particular, concrete action to be taken by the assignee