INTERNATIONAL-KIDS.COM
Test Plan by Amit Rathi
Revision History

Version/Revision Number | Author | Description | Approver | Effective Date
Table of Contents

1 INTRODUCTION
  1.1 Purpose of this Document
  1.2 Overview
  1.3 Scope
    1.3.1 Testing Phases
    1.3.2 Testing Types
  1.4 Not in Scope
  1.5 Reference Documents
  1.6 Definitions and Acronyms
  1.7 Assumptions and Dependencies
2 TEST REQUIREMENT
  2.1 Features to be Tested
  2.2 Milestones (Schedule)
3 TESTING ENVIRONMENT
  3.1 Browsers
  3.2 Hardware and Software Requirements
  3.3 Human Resources
4 ROLES AND RESPONSIBILITIES
5 TEST STRATEGY
  5.1 Test Process Workflow
  5.2 Test Organize/Review Project Documentation
  5.3 Develop System Test Plan
  5.4 Test Design/Development
  5.5 Unit Test Execution
  5.6 Integration/System Test Execution
    5.6.1 Integration Testing
    5.6.2 System Testing
    5.6.3 Testing Types
    5.6.4 Test Execution Workflow
  5.7 Defect Tracking and Management
  5.8 Update Documents and Results
  5.9 Test Reports
  5.10 UAT and Closure
6 CONFIGURATION MANAGEMENT
7 DELIVERABLES
1 Introduction
1.1 Purpose of this Document
The purpose of this document is to outline the test strategy/approach and the quality assurance process for the International-kids.com application. This document establishes the system test plan for the International-kids.com application. It will allow the development team, business analysts, and project management to coordinate their efforts and efficiently manage the testing of the site. The QA process outlined in this system test plan will ensure that a quality International-kids.com application is deployed successfully and on schedule.
The intended audience for this document is all stakeholders of the International-kids.com project.
1.2 Overview
The current International-kids.com is Windows XP based, compatible with Office 2002, and written in PHP, using a MySQL 5.0 Server database. International-kids.com's expectation for the new application is twofold:
1. Front Office functionalities, and
2. Back Office functionalities.
The focus is primarily on successful migration and implementation of the application.
The main objective of this test plan is to define the methodology for testing the International-kids.com application, to check and ensure that:
- the new system preserves all of its current business functionalities;
- the enhancements have been implemented in the new system;
- the new enhancements do not adversely affect the current business functionalities; and
- the system has the flexibility/capacity to deal with the complex International-kids.com structure and programs as they continue to change.
1.3 Scope
The International-kids.com application will undergo the following types of testing. All types of testing are explained in detail under the Test Strategy section.

Activity | Team responsible

Functionality Testing | Performed by the testing team during the Integration/System testing phase to meet the agreed-upon functional requirements of the International-kids.com application. The following functional areas will be put to test: (1) Application Submission, (2) Peer Review. All features under test are summarized in the "Features to be Tested" section of this plan, and will be described in detail in the Test Scenario and Test Case documents. On completion of every functional area, the test scenario and test case documents will be delivered; please refer to the Deliverables section below.

Database Testing | Performed by the testing team during the Integration/System testing phase to qualify the database that houses the content the International-kids.com application manages, runs queries against, and uses to fulfill user requests for data storage. Database migration testing will be taken care of by the DBAs.

Security Testing | Performed by the testing team during the Integration/System testing phase to meet the agreed-upon security requirements of the International-kids.com application.

GUI and Usability Testing | Performed by the testing team during the Integration/System testing phase.

Performance and Load/Volume Testing | Performed by the testing team during the System testing phase. Automation will be used to carry out these types of testing; the <Tool name to be decided/updated> tool will be used to perform these tests. The various reports that are part of the International-kids.com application will be one of the main focus areas of load/volume testing (see the performance test methodology under Test Strategy).

Code Testing | Performed by the development team during the Unit testing phase, at every method level.

Smoke Testing | Performed by the development team during the Unit testing phase to qualify the build for release to the testing team, and by the testing team during the Integration/System phase to qualify the build for further tests.

Regression Testing | Performed by the testing team during the Integration/System testing phase to re-test an entire or partial system after a modification has been made, ensuring that no unwanted changes were introduced to the system.

Defect Fix Verification Testing (Defect Validation Testing) | Performed by the testing team during the Integration/System testing phase to verify defect fixes.

Compatibility Testing | Performed by the testing team during the Integration/System testing phase to test compatibility with respect to the base configuration: (a) Browser IE 6.0, OS Win XP; (b) Mozilla Firefox ( ), OS Win XP; (c) Opera ( ), OS Win XP.

Adhoc Testing | Performed by the testing team during the Integration/System testing phase to test (1) navigations that are unusual and (2) negative scenarios within and across the components.
1.4 Not in Scope
1. Stress Testing
2. Crash/Recovery Testing
3. Once the scope of the new application has been agreed and signed off, no further inclusions will be considered for this release, except:
   - where there is the express permission and agreement of the Business Analyst, Project Manager, and the Client; and
   - where the changes/inclusions will not require significant effort from the test team (i.e., extra preparation such as new test conditions) and will not adversely affect the test schedule.
1.6 Definitions and Acronyms

Acronym | Description
QA | Quality Assurance
SRD | Software Requirement Document
PM | Project Manager
PL | Project Lead
TL | Technical Lead
PA | Professional Access
2 Test Requirement

2.1 Features to be Tested
All the features to be tested will be detailed in the respective Test Scenario and Test Case documents, based on the test types mentioned in the Scope section.
All the Test Scenario documents will be delivered for review during the Pre-construction phase, and the Test Case documents will be delivered in the middle of the Construction phase, just before integration testing begins. Please refer to the "Deliverables" section below for delivery dates.
2.2 Milestones (Schedule)
NOTE: The following dates are projected on the assumption that the Construction phase begins on 10th Oct 2007. Actual dates will be modified as per the project plan once the Construction phase begins.
3 Testing Environment

3.1 Browsers
In the browser compatibility matrix, the "√" symbol indicates that the entire set of test cases will be executed on that browser. "Certification" indicates that selected test cases will be executed to verify the capability of the application on that browser.

3.2 Hardware and Software Requirements
This section describes the offshore environment setup used in the development and testing of the application.

Development
The offshore Development Environment corresponds to the environment used by the developers during construction. Unit testing of the International-kids.com application is performed in this environment. Each developer machine will have International-kids.com running on an Apache web server. There will be a common MySQL development database server, and all developers will use the same database server.
SOFTWARE

Type | Name | Version | OS
Web Server | Apache | 2.0 | Windows XP
Front-End Designing Tool | PHP (PHP: Hypertext Preprocessor) | 4.0/5.0 | Windows XP
Scripting Language | JavaScript and Ajax | - | Windows XP
Database | MySQL | 5.0 | Windows 2000 Server / Windows 2000 Professional
Browser | IE | 6.0 | Windows XP

HARDWARE

Machine Type | HDD | RAM | CPU
Web Server | 40 GB | 1 GB | Intel Pentium 4, 2.66 GHz
Database Server (MySQL) | 80 GB | 1 GB | Intel Pentium 4, 2.8 GHz
QA
The offshore QA Environment corresponds to the environment on which Integration/System testing of the International-kids.com application is performed.

SOFTWARE

Type | Name | Version | OS
Web Server | Apache | 2.0 | Windows XP
Scripting Language | JavaScript and Ajax | - | Windows XP
Database | MySQL | 5.0 | Windows 2000 Server
Browser (Base) | IE | 6.0 | Windows XP
Browser (Certification) | IE | 6.0/7.0 | Windows XP
Browser (Certification) | Mozilla Firefox | 4.0/5.0 | Windows XP
Browser (Certification) | Opera | 9.22 | Windows XP
HARDWARE

Machine Type | HDD | RAM | CPU | OS | Browser
QA Web Server | 40 GB | 1 GB | Intel Pentium 4, 2.8 GHz | Windows XP Professional | -
QA Database Server (MySQL) | 280 GB | 1 GB | Intel Pentium 4, 2.8 GHz | Windows 2000 Server | -
Test 1 (Desktop class) / Base | 40 GB | 1 GB | Intel Pentium 4, 2.4 GHz | Windows XP Professional | IE 6.0
Test 2 (Desktop class) / Certification | 80 GB | 1 GB | Intel Pentium 4, 2.4 GHz | Windows XP Professional | IE 7.0
Bugzilla Server | 40 GB | 1 GB | Intel Pentium 4, 2.66 GHz | Windows XP Professional | -
3.3 Human Resources
The QA lead is responsible for preparing the daily test plan, which shall include the following:
5 Test Strategy
The Test Strategy presents the recommended approach to testing the International-kids.com development project. The previous section on Test Requirements described what will be tested; this section describes how it will be tested.
5.2 Test Organize/Review Project Documentation
Documentation reviews provide a means of testing the accuracy and completeness of the planning, requirements, and specifications. Throughout the project, periodic reviews will be held to assure the quality of project documentation. These reviews will:
- ensure project plans have adequate time allocated for testing activities, and determine limitations; and
- ensure that the Business Requirements, Information Site Flow, Use Cases, Business Rules, and Technical Design documents clearly articulate the functionality of International-kids.com.
5.3 Develop System Test Plan
This step of the testing process involves creation of the System Test Plan (this document). It will serve as the guidepost for development of test cases and for integration of testing with other project activities. The plan describes at a high level the overall testing plan and strategy for the International-kids.com application.
- Professional Access will follow this plan to develop the test scenarios/cases and scripts that will be used for system testing.
- Test scenarios will be described in separate document(s).
- Test cases will be described in separate document(s).
- Professional Access will obtain test accounts and IDs for interface testing (see Scope).
5.4 Test Design/Development
A brief explanation of the Test Design/Development workflow, with respect to the process flow diagram, follows:

T1, T2: The Test Lead takes part in preparation of Elaboration-phase deliverables such as the Test Plan, and updates the artifacts in CVS for further reference.

T3, T4: From the Post-elaboration phase to the Pre-construction phase, Test Scenarios are designed by the Test Lead for the modules/features available in the SRD. Once the final draft version of the SRD, with the specifications for all modules/features, is received, the Test Scenarios are designed and completed during the Pre-construction phase. All created/updated Test Scenarios are stored in CVS.

T5: The Test Lead assigns the task of test case/test script creation to test team members during the Construction phase.

T6, T7: For all the Test Scenarios created earlier during the Post-elaboration/Pre-construction phases, the test team members design test cases during the Construction phase. All created test cases/test scripts are stored in CVS.

T8, T9: All created test cases/test scripts are reviewed by the Test Lead, and all review comments are updated in CVS.

T10: Test team members check the review comments, update the respective test cases, and store them in CVS.

T11: The Test Lead maps the requirements to test cases in the Traceability Matrix. The objective of this matrix is to document which test case(s) exercise which functionality and which structural attribute of the software; it maps test requirements to the test cases/test scenarios that implement them. (A small illustration is sketched below.)
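
As an illustration of the traceability idea, the following minimal sketch builds a requirement-to-test-case mapping and flags coverage gaps. The requirement and test case IDs are hypothetical examples; the real matrix is maintained as a reviewed document in CVS, not generated in code.

    import csv

    # Hypothetical requirement IDs mapped to the test cases that cover them.
    traceability = {
        "REQ-001 Application Submission": ["TC_APP_001", "TC_APP_002"],
        "REQ-002 Peer Review": ["TC_PR_001"],
        "REQ-003 Report Generation": [],  # no coverage yet: a gap to close
    }

    with open("traceability_matrix.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Requirement", "Covering test cases", "Covered?"])
        for requirement, cases in traceability.items():
            # A requirement with no mapped test case is flagged as a coverage gap.
            writer.writerow([requirement, ", ".join(cases), "Yes" if cases else "NO"])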
Written test cases and scripts will be used to direct system testing efforts. Professional Access test team will
write these in accordance with the System Test Plan.
Tests will be developed to exercise the required functionality for the website, validate data integrity, and
ensure that data is passed or received successfully from external interfaces. Test Cases will be written in
a separate document appended to this plan.
Each test case will document the steps or actions required to exercise a specified area of functionality.
The test cases will be reviewed to verify that they properly validate the intended functionality. Actual
testing will be performed by executing the steps of the test case. A pass/fail notation will be made for
each step.
Each test case will be executed manually, and also using an automated testing tool (for Performance/Load testing), on the browser versions mentioned in the Testing Environment section. A pass/fail notation will be recorded for each condition tested, noting the severity and reason for each instance of failure. Test scripts for Performance/Load testing will be executed automatically during the System testing phase.
5.5 Unit Test Execution
Unit testing verifies that each module, component, object, or program developed is functionally correct and conforms to requirements. A unit is defined as a single program function in terms of inputs, processes, and outputs. A program unit is small enough that the developer who developed it can test it in great detail.
The developer who wrote the code is responsible for creating, updating, and executing the unit tests after each successful build in the development environment. A separate document drafting the unit test strategy has been prepared. (An illustrative unit test is sketched below.)
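
The sketch below illustrates the shape of a unit test: one small function exercised in terms of its inputs and outputs, including boundary and error cases. It is illustrative only; the actual unit tests would target real International-kids.com modules (written in PHP) per the separate unit test strategy document, and the function here is hypothetical.

    import unittest

    def age_category(age):
        """Hypothetical unit under test: classifies an applicant's age band."""
        if age < 0:
            raise ValueError("age cannot be negative")
        if age < 13:
            return "child"
        return "teen" if age < 18 else "adult"

    class AgeCategoryTest(unittest.TestCase):
        # Each test exercises one input/output pair, including boundaries.
        def test_child_boundary(self):
            self.assertEqual(age_category(12), "child")

        def test_teen_boundary(self):
            self.assertEqual(age_category(13), "teen")

        def test_adult_boundary(self):
            self.assertEqual(age_category(18), "adult")

        def test_negative_age_rejected(self):
            with self.assertRaises(ValueError):
                age_category(-1)

    if __name__ == "__main__":
        unittest.main()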
5.6 Integration/System Test Execution

5.6.1 Integration Testing
The objective of these tests is to ensure that all the components of the system function properly together and that the application interfaces properly with external applications.

Entrance Criteria

Exit Criteria
- All components delivered and tested function as detailed in the documents in the References portion of this document.
- Test cases have been updated if and when functionality has changed.
- The test results report is developed/updated.
- All new defects have been logged into the issue-tracking database.
5.6.2 System Testing
The test team will conduct a system test to verify that the software matches the defined requirements. Once the application has executed successfully under integration test, each test suite will be executed against the other supported configurations to ensure defects are not created because the system configuration has changed. A separate test environment must be established for all hardware, software, and browser configurations supported.

Entrance Criteria

Exit Criteria
- All Severity 1 and 2 defects are fixed and have successfully passed regression testing.
- The risks associated with not correcting any outstanding Severity 3 and 4 defects have been identified and signed off by the Project Manager, Technical Lead, and QA Lead.
- All components delivered and tested function as detailed in the documents in the References portion of this document.
- Regression tests have been performed and executed successfully.
- The test results report is developed/updated.
- All new defects have been logged into the issue-tracking database.
5.6.3 Testing Types
The following types of testing will be performed:
- Functionality
- Database
- Smoke
- Security
- User Interface/Usability
- Compatibility
- Performance/Load/Volume
- Adhoc
- Regression
Functionality Testing
The objective of this test is to ensure that each element of the application meets the functional requirements of the business as outlined in:
- the Software Requirement Document/use cases;
- the Software Design Document; and
- other functional documents produced during the course of the project, i.e., resolutions to issues/change requests/clarifications/feedback.
This also includes specific functional testing, which aims to test individual process and data flows. This stage will also include validation testing, which is intensive testing of the new front-end fields and screens.
Functionality testing will be performed on every build, from when the build series (two-week test process cycle) commences until the final system-testing pass. In other words, functionality testing will be performed by the testing team just after the development of each set of features, as decided by the Technical Lead/Project Manager, essentially as part of integration testing. This process will continue until the completion of the System testing phase.
Database Testing
- Testing the database schema (stored procedures, triggers, views, etc.) after migration (done by the MySQL DBA developer).
- Testing the database that houses the content the International-kids.com application manages, runs queries against, and uses to fulfill user requests for data storage (done by the testing team).
Issues to test are:
- Data integrity errors (missing or wrong data in tables).
- Output errors (errors in writing, editing, or reading/retrieving/querying operations on the tables).
Database testing will be performed along with functionality testing on every build, from the first build series (two-week test process cycle) until the final build series. (An illustration of the two issue classes is sketched below.)
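
To make the two issue classes concrete, the sketch below runs the kind of checks involved: an orphaned-reference query (data integrity) and a mandatory-column NULL check (missing data). The schema and table names are hypothetical, and sqlite3 is used only as a stand-in so the example is self-contained; the real checks would run against the MySQL 5.0 QA database.

    import sqlite3  # stand-in for a MySQL 5.0 connection, keeps the sketch self-contained

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE reviewers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
        CREATE TABLE applications (id INTEGER PRIMARY KEY,
                                   reviewer_id INTEGER,
                                   status TEXT);
        INSERT INTO reviewers VALUES (1, 'R. Smith');
        INSERT INTO applications VALUES (10, 1, 'submitted'), (11, 99, NULL);
    """)

    # Data integrity: applications pointing at a reviewer that does not exist.
    orphans = conn.execute("""
        SELECT a.id FROM applications a
        LEFT JOIN reviewers r ON r.id = a.reviewer_id
        WHERE r.id IS NULL
    """).fetchall()

    # Missing data: mandatory columns left NULL.
    missing_status = conn.execute(
        "SELECT id FROM applications WHERE status IS NULL").fetchall()

    print("orphaned applications:", orphans)               # [(11,)]
    print("applications missing status:", missing_status)  # [(11,)]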
GUI/Usability Testing
Usability testing will be accomplished by verifying that the information in each window is accurate. Menu, icon, and toolbar functionality will be tested as applicable to the navigation and results panes. Importance will be given to graphics, contents, data presentation, feedback and error messages, design approach, user interface controls, formatting, instructions, etc. Multi-window overlapping will be tested because the product supports opening multiple documents.
GUI/usability testing will be performed along with functionality testing on every build, from the first build series (two-week test process cycle) until the final build series.
Adhoc Testing
Adhoc testing is done on every build, from the first build series until the last. It is mostly experience-based testing, carried out from the application-usage perspective. The test team member performs this test based purely on knowledge of the functionality, without referring to any test case, scenario, or plan, concentrating on navigations that are unusual or negative, within and across components.
Adhoc testing will be performed during the second week of every build series (two-week test process cycle). This test will be performed during the Integration test phase as well as the System test phase.
Smoke Testing
During integration testing, which is performed in parallel with the development phase, the development team performs smoke testing every time before releasing a build to the QA team, to check whether the planned set of features has been implemented, without getting into details. Once the build is released to the QA team, and before the build is accepted for the further testing process, smoke testing is performed to check whether the application's most crucial planned functionalities work, without bothering with finer details. This means that for a released build, the availability of all the features mentioned in the release notes will be tested.
Once the system passes the smoke test, it is subjected to further tests. Before commencing system testing too, the QA team performs smoke testing to check, at a high level, whether all functionalities have been implemented in the system. (A sketch of such a check follows.)
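
The sketch below shows one way such a build-acceptance smoke check could be automated: request the most crucial pages and accept the build only if every one of them responds. The URLs are hypothetical placeholders; in practice the list would come from the release notes for the build.

    from urllib.request import urlopen
    from urllib.error import URLError

    # Hypothetical URLs for the most crucial pages named in the release notes.
    SMOKE_URLS = [
        "http://qa-server/international-kids/index.php",
        "http://qa-server/international-kids/login.php",
        "http://qa-server/international-kids/application/submit.php",
    ]

    def smoke_test(urls):
        """Return the list of crucial pages that failed to answer with HTTP 200."""
        failures = []
        for url in urls:
            try:
                if urlopen(url, timeout=10).status != 200:
                    failures.append(url)
            except URLError:
                failures.append(url)
        return failures

    if __name__ == "__main__":
        failed = smoke_test(SMOKE_URLS)
        if failed:
            print("Build rejected; failing pages:", failed)
        else:
            print("Build accepted for further testing.")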
Compatibility Testing
Browsers:
A compatibility matrix, in which different brands and versions of browsers are tested against a certain number of components and settings (for example applets, client-side scripting, ActiveX controls, HTML specifications, graphics, or browser settings), is given in Section 3.2.
Settings, Preferences:
Depending on the settings and preferences of the client machine, a web application may behave differently. Options such as screen resolution and color depth will be considered while testing.
Printing:
Despite the paperless society the web was supposed to introduce, printing is done more than ever. Testing will be performed to check whether the pages are printable, with consideration of:
- text and image alignment;
- colors of text, foreground, and background; and
- scalability to fit the paper size, etc.
A selected set of usability/GUI test cases will be executed as part of compatibility testing during the System testing phase.
Security Testing
Security tests will determine how secure the new International-kids.com system is. The tests will verify that unauthorized user access to confidential data is prevented. This type of testing will check (see the sketch after this list):
- that for each known user type the appropriate functions/data are available, and all transactions function as expected and as exercised in prior application function tests;
- the directory setup;
- that, without authorization, access permissions are not granted to edit scripts on the server;
- the time-out limit; and
- bypassing the login page by typing the URL of an inner page directly into the browser, etc.
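
The last item in the list could be automated along these lines: request a protected page with no session and pass only if the server refuses it or redirects to the login page. The URL and the login-page detection rule are hypothetical assumptions for illustration.

    from urllib.request import Request, urlopen
    from urllib.error import HTTPError

    # Hypothetical protected page that should require an authenticated session.
    PROTECTED_URL = "http://qa-server/international-kids/admin/programs.php"

    def rejects_direct_url_access(url):
        """Pass only if an anonymous request is refused or sent to the login page."""
        try:
            response = urlopen(Request(url), timeout=10)
            # Assumed convention: unauthenticated users are redirected to a
            # URL containing "login"; being served the page itself is a failure.
            return "login" in response.geturl()
        except HTTPError as err:
            # 401/403 is also an acceptable way to refuse unauthorized access.
            return err.code in (401, 403)

    if __name__ == "__main__":
        ok = rejects_direct_url_access(PROTECTED_URL)
        print("PASS" if ok else "FAIL: protected page served without login")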
Performance/Load/Volume Testing
The general approach to load testing is to set up a test website configuration and run selected test scripts against it to measure performance. The configuration and test environment should mirror the production environment. Individual tests will first be run to verify correct operation of the scripts. The scripts will then be run again in several cycles, with each cycle increasing the number of concurrent users, until the required system capacity has been successfully demonstrated.
The testing process is inherently iterative, since early tests may encounter bottlenecks or defects. The tests will need to be repeated after the system has been tuned or reconfigured or the defects have been corrected. In many cases, one bottleneck may obscure the presence of another; thus, once problems have been corrected, it is possible (even likely) that others will be encountered on subsequent trials.
The goals of performance testing are to:
1) Determine whether the customer will experience unacceptable response times when the site is under load.
2) Determine whether the web server, application server, or database server will crash under load.
3) Tune the application based on the performance issues found.
The response time from the point when the web server receives a page request to the point when it serves the requested page is a metric used to test performance. This metric will be revisited once the pages have been built, to determine an acceptable response time. There will be separate metrics for the search results pages vs. the other pages.
Load targets are expressed in terms of concurrent users* and active users**. (A sketch of the ramp-up cycle described above follows.)
* Concurrent users are users who are maintaining an active session with the site and may or may not be actively clicking on the site. (Please see the Technical Specification for details.)
** Active users are users who are actually clicking on the site at any given time. (Please see the Technical Specification for details.)
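
As a minimal illustration of the cycle described above (ramping concurrent users and measuring response time), the following sketch drives a page with increasing numbers of simultaneous requests. It is not the real tool: the tool is still to be decided (see Scope), the URL and user counts are hypothetical, and one thread per user is only a rough stand-in for true virtual users.

    import statistics
    import threading
    import time
    from urllib.request import urlopen

    URL = "http://qa-server/international-kids/search.php?q=program"  # hypothetical page

    def one_user(timings):
        """Simulate one user issuing a request; record its response time."""
        start = time.perf_counter()
        try:
            urlopen(URL, timeout=30).read()
            timings.append(time.perf_counter() - start)
        except OSError:
            timings.append(None)  # treat errors/timeouts as failures

    # Each cycle increases the number of concurrent users, as described above.
    for users in (10, 25, 50, 100):
        timings = []
        threads = [threading.Thread(target=one_user, args=(timings,))
                   for _ in range(users)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        ok = [s for s in timings if s is not None]
        if ok:
            print(f"{users:4d} users: {len(ok)}/{users} succeeded, "
                  f"median response {statistics.median(ok):.2f}s")
        else:
            print(f"{users:4d} users: all requests failed")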
Regression Testing
A regression test will be performed subsequent to the release of each build, from the second release onwards, to ensure that:
- there is no impact on previously released software from the addition of new functionality;
- there is an increase in the functionality and stability of the software; and
- there is no impact on previously released software from the resolution of defects.
Test Method:
The following activities will be performed during the test process:
- The development team will verify through unit testing that each module, component, object, and program is functionally correct and conforms to the use case definitions document.
- The test team will conduct a functionality/integration test of the larger system to ensure that all the functionalities/components of the system function properly together and that the application interfaces properly with external applications.
- The test team will conduct a system test to verify that the software matches the defined requirements. All the test cases/scripts executed during previous QA cycles will be re-executed to check the correctness of the system. Once the application has executed successfully under integration test, each test suite will be executed against the other supported configurations to ensure defects are not created because the system configuration has changed. A separate test environment must be established for all hardware, software, and browser configurations supported.
- The test team recommends that performance testing be done using a performance testing tool. The purpose of load testing is to ensure the stability of the application under simulated load conditions. Automated performance testing tools can simulate the load on the system being tested, eliminating the need to employ hundreds of users, huge volumes of data, and many transactions, or to obtain the required equipment.
- The test team will conduct the tests by executing the test cases and scripts. Each test case will test a specific area of functionality and will be comprised of several test scripts that detail that functionality. The test cases will be reviewed to ensure that they cover the scenarios needed to adequately test the site and its functionality.
- Each test case will have an expected result and a pass/fail column. If the expected result is achieved, a value of "Y" will be recorded in the actual results column. If the expected result is not achieved, a value of "N" will be recorded in the actual results column, and the defect will be logged in the issue-tracking database. The actions that led to the failure and an assessment of its severity will also be noted in the issue-tracking database. (A sketch of this recording step follows this list.)
- The development team will fix defects based on the level of severity assigned by the test team. The defect information will be recorded in the issue-tracking database (Bugzilla), and the developers will be informed of each new issue via email. The severity levels to be used during the test are described in the Defect Management portion of this document.
- The test team will receive notification via email after each defect has been corrected and unit tested by the development team. The test team will retest the defect by re-executing the test case and script in which the defect was found. The regression test will verify that the altered code has not adversely impacted previously working functionality.
- The test team will track all the test cases and test scripts using a Traceability document.
- Included within the scope of the test is an external interface test, designed to verify that all components provided by third-party providers interface and interact according to specifications.
- A separate test environment will be established for all hardware, software, and browser configurations supported. Refer to the Hardware and Software Requirements section for more information.
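
The pass/fail recording convention above could be captured in a simple results log like the sketch below. The test case IDs and file name are hypothetical; in practice the results column lives in the test case documents themselves.

    import csv
    from datetime import date

    # Hypothetical execution results: (test case ID, expected result achieved?).
    results = [
        ("TC_APP_001", True),
        ("TC_APP_002", False),  # "N": a defect must be logged in Bugzilla
        ("TC_PR_001", True),
    ]

    with open("test_results.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Test case", "Actual result (Y/N)", "Run date", "Follow-up"])
        for case_id, passed in results:
            # "Y" when the expected result is achieved, "N" otherwise; every
            # "N" needs a matching defect entry in the issue-tracking database.
            writer.writerow([case_id,
                             "Y" if passed else "N",
                             date.today().isoformat(),
                             "" if passed else "log defect in Bugzilla"])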
5.6.4 Test Execution Workflow
The following diagram explains the flow of test types/phases followed for the International-kids.com application.
Test Flow:
Testing of the International-kids.com application will be performed at the feature level. A two-week internal build release approach will be adopted for testing. Integration/functionality testing starts as soon as the first set of features is developed and released by the development team, following the build series procedure, and this process continues until the completion of system testing. The development team will decide and inform the testing team about the set of features planned for every build release, so that test scenarios/test cases can be developed and reviewed well in advance.
The typical flow of activities in a two-week QA test process cycle (build series) is summarized in the table below.
Day | Series Phase | Activities
Monday | Start of Build Series N | Build Series N: test initialization activities; receive Build N and release notes by 1 P.M.; deploy the build; run smoke test cases and begin Round 1 testing
Tuesday | | Build Series N: Round 1 testing
Wednesday | | Build Series N: Round 1 testing
Thursday | | Build Series N: Round 1 testing
Friday | | Build Series N: end of Round 1 testing. Build Series N+1: feature/module acquisition, planning, effort estimation, resource allocation
Saturday | | -
Sunday | | -
Monday | | Build Series N: start Round 2 testing
Tuesday | | Build Series N: Round 2 testing
Wednesday | | Build Series N: Round 2 testing. Build Series N+1: submit test scenarios/cases for review
Thursday | | Build Series N: Round 2 testing
Friday | End of Build Series N | Build Series N: test summary/conclusion report generated by end of day. Build Series N+1: update test cases based on review feedback; prepare for Series N+1
Assuming that test case/script execution begins on 15-Jan-2007 (subject to change), the QA team will execute the following testing cycles:
5.7 Defect Tracking and Management
The defect management process ensures maximum efficiency in defect recognition and resolution. The objectives of this process are:
The QA team will use Bugzilla (a defect-tracking tool), which allows PA developers and QA members to carry out a full defect cycle: find, log, assign, fix, verify, resolve, and close.
The number of defects that surface during the QA testing period, including their potential impact and the complexity of implementing fixes, can be quite unpredictable. The PA Technical Lead / Project Manager will respond to defects in the minimum time possible and assign fixes to a particular build. Careful review of the impact of an implemented fix will minimize recurrence and/or the introduction of new problems.
However, since testing alone cannot fully verify that software is complete and correct, PA takes a comprehensive validation approach. QA processes are integrated into all stages of PA development from the start of the engagement (e.g., large-scale planning, unit testing, etc.).
The Bugzilla defect-tracking tool will be used for defect tracking and reporting. It can be accessed via the web:
URL =
Project name = International-kids.com
Each team member will be given a user ID and password.
The defect tracking workflow is as follows:
1. A test engineer executes the test case/script and compares the actual result with the expected result. He/she enters the test results in the results column of the test case document for each test case, marking "Pass" or "Fail".
2. When a test case fails, after the result is updated in the test case document, a defect is entered into Bugzilla and the corresponding defect reference number is noted in the test report (the test case document used for testing).
3. The following information is entered for every defect (see the sketch after this workflow):
   1. Bug number
   2. Summary
   3. Description
   4. Steps to re-create the problem
   5. Attachments, if any
   6. Configuration the problem was found in (browser/OS/version)
   7. Function/component/module the problem was found in
   8. Severity of the problem
   9. Owner/assigned to
   10. URL
   11. Status
   12. Submit date
   13. Submitter/reporter
   14. Resolution
4. The defect is assigned to the QA Lead, who will in turn check all defects for completeness before submission to the Development Tech Lead.
5. All defects will be checked against Bugzilla for duplicates before submission to the Development Tech Lead.
6. Defects should be reproducible before being submitted to the Development Tech Lead.
7. The QA Lead will monitor all defects that are in the escalation process. Defects will be classified, managed, and escalated using a process agreed upon between the Client and Professional Access.
8. The Tech Lead, along with the module lead, reviews the defects. If a defect is valid, the Tech Lead assigns it to the respective developer; otherwise he rejects it, specifying the reason, and re-assigns it to the respective reporter/submitter.
9. Defects will be fixed based on severity. Defects entered as Severity 1 (Critical/Showstopper) or Severity 2 (High) must be corrected prior to the application being deployed. Severity 3 (Medium) defects will be corrected based on consensus agreement among the Project Manager, Technical Lead, and QA Test Lead regarding their criticality.
10. The person who has been assigned the defect carries out the impact analysis (identifies the cause of the problem, the impacted components, and the fix to be carried out) and then fixes the defect appropriately. He records the impact analysis briefly in Bugzilla.
11. Integration/system test cases are updated by the respective submitter/reporter if the defect escaped due to the lack of a corresponding integration/system test case.
12. Any further defects are captured and tracked to closure using Bugzilla.
13. Regression testing is performed, ideally by re-running the integration/system tests of the changed programs. The modified components are re-baselined on successful conclusion of these tests.
14. The product is re-integrated, the revised components are built, and the full system and integration testing is re-run.
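
For illustration, the sketch below shows a defect record carrying the fields listed in step 3, plus a trivial completeness check of the kind the QA Lead performs before forwarding to the Tech Lead. All values are hypothetical; in practice the record is entered through the Bugzilla web interface, not built in code.

    # Hypothetical defect record with the fields listed in step 3 above.
    defect = {
        "bug_number": 101,
        "summary": "Application form accepts empty mandatory fields",
        "description": "Submitting the form with all fields blank creates an empty record.",
        "steps_to_recreate": ["Open the application submission page",
                              "Leave all fields blank",
                              "Click Submit"],
        "attachments": ["blank_submission.png"],
        "configuration": "IE 6.0 / Windows XP",
        "component": "Application Submission",
        "severity": "Severity 2 (High)",
        "assigned_to": "dev.lead",
        "url": "http://qa-server/international-kids/application/submit.php",
        "status": "New",
        "submit_date": "2007-10-15",
        "reporter": "test.engineer",
        "resolution": "",  # empty until the defect is fixed or rejected
    }

    # Completeness check before submission to the Development Tech Lead:
    # every field except "resolution" must be filled in.
    missing = [field for field, value in defect.items()
               if value in ("", []) and field != "resolution"]
    print("Ready for submission" if not missing else f"Incomplete: {missing}")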
Test cases are re-executed under the following circumstances:
- after a fix, a change, or an enhancement;
- to re-verify all functions of each build of the application;
- to confirm that no new problem has been introduced by a fix or change ("ripple effect"); and
- during system testing.
Defect Classification:
Defects identified by the PA testing team will be classified based on the guidelines explained in the subsequent
sections. Apart from the guidelines, the context of a defect also has to be considered for proper classification of
the defect. The defects can fall into one of the following categories:
Priority:
Priority describes the importance of a bug and the order in which it should be fixed. The available priorities are:

Priority | Level | Guideline
P1 | High | Affects other features
P2 | Medium | Resolve the defect at the earliest, before the intermediate release (if any)
P3 | Low | Normal defect; resolve before the final client release
P4 | Very low | Could be fixed based on triage, which weighs factors such as whether the report is an enhancement and the necessity of the fix for the final client release
5.8 Update Documents and Results
- Update the test scenarios, test cases, and scripts if and when the functionality workflow changes.
- Update the test case documents with results (Pass/Fail) every time test cases are executed.
- Update the test case documents when there is no test case corresponding to a defect raised by unusual flows, if any.
- Update the Traceability Matrix every time scenarios/cases are updated or added.
- Develop the test results report (daily).
- Prepare and review the conclusion report.
5.9 Test Reports
Status Reporting
1) Bugzilla will be used to log bugs. A bug report should contain sufficient information to reproduce the bug.
2) QA testing will be reported to the Project Manager on a daily/weekly basis by producing test results reports.
Test results reports should include, but are not restricted to, the following:

Report name | Fields to include
Individual project status report | Name of tester; types of testing performed; number of test cases/scripts executed by him/her; number of test cases/scripts not executed by him/her; number of defects logged (valid, invalid, duplicate)
Test case/script execution report | Number of features available for testing; total number of test cases/scripts generated; number of test cases/scripts executed per tester; types of testing performed; percentage of total test scripts completed
Defect status report | Total number of defects logged; total number of defects verified/closed; total number of open defects; issues, if any; total number of Severity 1 defects; total number of Severity 2 defects; total number of Severity 3 defects
Defects requiring escalation report | Components/functional areas affected; date detected; current status
3) The QA team and Project Manager will conduct daily/weekly bug scrub meetings, in which the following information will be discussed:
- current status vs. planned (are we on schedule?);
- test case/script execution completed (can be at the feature level);
- number of open defects and their severity (Bugzilla);
- summary of QA progress; and
- issues that need clarification/action.

Conclusion Report
Upon conclusion of the QA test cycle, the QA/Test Lead will document the results of the test phase of the International-kids.com system in the Conclusion Report. This report contains information such as:
The Test Summary Report will be a combination of all the above reports, presenting the final testing status at the intermediate/final release.
5.10 UAT and Closure
Testing will be deemed complete upon the execution of all of the following:
o System testing is 100% complete, and all fixed issues have been regressed and closed.
o The closure calculation divides the total number of valid open bugs by the total number of bugs in the system. Total open bugs include bugs that are Unconfirmed, Assigned, New, or Reopened, whereas total bugs include all bugs in the database, with no exclusions. (A worked example follows this list.)
o All Major, Critical, and Blocker/Showstopper bugs are closed.
o A 95% closure rate must be maintained upon the release of the project.
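
Interpreting the criterion above as "closed bugs must be at least 95% of all bugs in the database", a worked example: with 120 total bugs and 4 open (2 Unconfirmed + 1 Assigned + 1 New + 0 Reopened), the open ratio is 4/120 ≈ 3.3%, so the closure rate is ≈ 96.7% and the gate passes. The sketch below computes the same numbers; the counts are hypothetical.

    def open_bug_ratio(unconfirmed, assigned, new, reopened, total_bugs):
        """Valid open bugs divided by all bugs in the database (no exclusions)."""
        return (unconfirmed + assigned + new + reopened) / total_bugs

    ratio = open_bug_ratio(2, 1, 1, 0, 120)   # hypothetical counts
    closure_rate = 1 - ratio
    print(f"open: {ratio:.1%}, closed: {closure_rate:.1%}")  # open: 3.3%, closed: 96.7%
    assert closure_rate >= 0.95, "release gate: 95% closure must be maintained"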
6 Configuration Management
Please refer to the Configuration Management document, which describes the complete configuration management workflow to be followed.
7 Deliverables
NOTE: The following dates are projected on the assumption that the Construction phase begins on 11th Dec 2006. Actual dates will be modified as per the project plan once the Construction phase begins.