Software Quality Testing
Analysis:
Inputs: User Requirements Document
Business Rules
This phase involves planning and requirement gathering
1. Planning
Understanding of client requirements and specifications
Performing feasibility analysis
Developing a solution strategy
Determining acceptance criteria
Planning the development process
The output of this process is the project plan
2. Requirement Gathering
Analyze allocated requirements
Segregate requirements into technical and non-technical categories
Prepare a traceability matrix for each gathered requirement
Review all allocated requirements and traceability matrix
The outputs of this process are the SRS (Software Requirements Specification) and the RTM (Requirement Traceability Matrix)
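A minimal illustrative RTM (the requirement IDs, descriptions, and test case IDs below are hypothetical):

Requirement ID | Requirement Description  | Test Case ID(s) | Status
REQ-001        | User can log in          | TC-001, TC-002  | Covered
REQ-002        | User can reset password  | TC-003          | Covered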
Design:
The design is prepared based on the SRS and the project plan
The design is divided into two levels: HLD (High-Level Design) and LLD (Low-Level Design)
Record the architectural design and detailed design in the DDD (Detailed Design Document)
Prepare system test plan based on SRS and DDD
Prepare unit test cases, integration test cases and system test cases
The outputs of this phase are the detailed design document and the system test plan
Coding:
Component diagrams are used in this phase
The outputs of this phase are executable code and the unit test report
Testing:
Testing validates the software against client specifications
Setup test environment
All test cases are executed
Review and close all reported bugs
Maintenance:
Error corrections
Code modifications
Software Models
Waterfall Model
V Model
Prototype Model
Iterative & Incremental Model
Spiral Model
Agile Scrum
Waterfall Model:
It is the classical approach to the software development life cycle
The approach is linear and sequential
Each phase begins only after the previous phase is complete
This model suits projects with well-known requirements or repetitive work
Advantage:
Easy to understand and implement
Phases are processed and completed one at a time
Disadvantage:
It is a document-driven model
Clients find it hard to understand the system from documents alone
Assumes feasibility before implementation
Working software is produced only after the last phase
Requirements are rarely received all at once; the customer keeps adding
requirements even after the "Requirement Gathering and Analysis" phase ends,
which negatively affects the development process and its success
V Model:
The V model is an extension of the waterfall model
This model is also called the verification and validation model
It means verification and validation are done side by side
The testing procedures are developed early in the life cycle
Testing starts from requirements phase
Each phase must be completed before the next phase begins
Disadvantage:
This model is very rigid
If any changes happen in the middle, not only the requirements documents but
also the testing documents must be updated
It is costly and requires more human resources
It needs an established process to implement
It can be implemented only by big companies
Prototype Model:
This approach is used to develop a software product quickly
Based on the client's requirements, a prototype is designed and sent for client
feedback; only after approval does actual engineering start
Advantages:
Reduced time and cost
Savings of development resources
Client involvement will be more
Disadvantage:
Insufficient analysis
Excessive development time of the prototype
Iterative & Incremental Model:
The product is built in repeated cycles (iterations), each delivering a working increment
Advantages:
Generates working software quickly and early during the software life cycle.
More flexible – less costly to change scope and requirements
Easier to test and debug during a smaller iteration.
Easier to manage risk because risky pieces are identified and handled during
its iteration.
Each iteration is an easily managed milestone
Disadvantage:
Design issues may arise because not all requirements are gathered up front
for the entire lifecycle
Spiral Model:
The spiral model is similar to the incremental model, with more emphasis placed
on risk analysis.
The spiral model has four phases: Planning, Risk Analysis, Engineering and
Evaluation
This was the first model to explain why iteration matters
Iterations were typically 6 months to 2 years long
Requirements are gathered during the planning phase.
In the risk analysis phase, a process is undertaken to identify risk and alternate
solutions.
A prototype is produced at the end of the risk analysis phase.
Software is produced in the engineering phase, along with testing at the end of
the phase.
The evaluation phase allows the customer to evaluate the output of the project to
date before the project continues to the next spiral
Disadvantage:
Can be a costly model to use
Risk analysis requires highly specific expertise.
Project’s success is highly dependent on the risk analysis phase.
Doesn’t work well for smaller projects
Agile - Scrum:
Agile testing can be defined as a testing practice that follows the Agile Manifesto,
treating development as the customer of testing
Agile testing is used whenever customer requirements are changing dynamically
In the agile process, the customer is present throughout development and testing,
which makes testing easier. Testing starts with the exploration of the requirements
and what the customer really wants.
Scrum Glossary:
Product Backlog: A to-do list that contains the project goals and priorities, managed
by the product owner
Product Owner: The person responsible for the product backlog, who makes sure
the project works on the right things from a business perspective
Release Backlog: The same as the product backlog, but restricted to a specific release of the product
Scrum Master: The Team Leader of the scrum team. The Scrum Master does not
manage the team. Instead, he or she works to remove any impediments that are
obstructing the team from achieving its sprint goals
Sprint: Iteration
Sprint Backlog: To-do list for a specific sprint
Sprint Review: An informal meeting (about 4 hours) at the end of a sprint, in which
the team presents what has been done in that sprint
Sprint Retrospective: A meeting (about 3 hours) held after each sprint. The Scrum
Master and the Scrum team review what went well and what should be improved in
the next sprint
Timebox: A period during which something is to be carried out. A sprint is a result of
timebox thinking
Burn-down chart: A diagram that monitors how much work remains to implement a
segment of the software being developed during a sprint
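As a rough illustration of the idea (all numbers hypothetical), a burn-down can be produced from the story points remaining at the end of each day; a minimal Python sketch:

    # Hypothetical burn-down data: story points remaining at the end of
    # each day of a sprint.
    remaining = [40, 36, 30, 27, 21, 14, 9, 4, 0]

    for day, points in enumerate(remaining, start=1):
        # A simple text "chart": one '#' per remaining story point.
        print(f"Day {day:2d}: {'#' * points} ({points} points left)")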
Why Iterative?
Prototype leads to product
Rapid Feedback
Reduced Risk
Requirement Analysis
During this phase, the test team studies the requirements from a testing point of view to
identify the testable requirements. The QA team may interact with various stakeholders to
understand the requirements in detail
Identify types of tests to be performed
Gather details about testing priorities
Prepare Requirement Traceability Matrix (RTM)
Identify test environment details where testing is supposed to be carried out.
Automation feasibility analysis (if required).
Deliverables
RTM
Automation feasibility report (If applicable)
Test Planning
This phase is also called Test Strategy phase. Typically, in this stage, a Senior QA manager
will determine effort and cost estimates for the project and would prepare and finalize the
Test Plan.
Preparation of test plan/strategy document for various types of testing
Test tool selection
Test effort estimation
Resource planning and determining roles and responsibilities
Training requirement
Deliverables
Test plan /strategy document
Effort estimation document
Test Case Development
In this phase the test team creates and reviews test cases and test scripts, and prepares test data
Deliverables
Test cases/scripts
Test data
Test Execution
During this phase the test team will carry out the testing based on the test plans and the test
cases prepared. Bugs will be reported back to the development team for correction, and
retesting will be performed.
Execute tests as per plan
Document test results, and log defects for failed cases
Map defects to test cases in RTM
Retest the defect fixes
Track the defects to closure
Deliverables
Completed RTM with execution status
Test cases updated with results
Defect reports
Test Closure
The testing team meets and analyzes the testing artifacts to identify lessons for future test cycles
Deliverables
Test Closure report
Test metrics
Software Testing
The goal of a software tester is to find bugs and make sure they get fixed
Testing: It’s a process of executing a program with the intent of finding bugs
Software Testing: It’s a process used to identify the correctness, completeness and the
quality of the software
Note: If we want to improve our software, we should not test more; we should develop better
Goals of Testing:
Find the cases where the program does not do what it is supposed to do
Find the cases where the program does what it is not supposed to do
To ensure that system performs all the functions that are listed in the specifications
Principles of Testing:
Early testing
Testing shows presence of defects
Exhaustive testing is impossible
Testing is context dependent
Defect clustering
Pesticide paradox
Absence of errors fallacy
Early testing
Testing activities should start as early as possible in the software or system development life
cycle and should be focused on defined objectives
Defect clustering
A small number of modules contain most of the defects discovered during pre-release
testing or show the most operational failures
Pesticide paradox
If the same tests are repeated over and over again, eventually the same set of test cases
will no longer find any new bugs. To overcome this 'pesticide paradox', the test cases need
to be regularly reviewed and revised, and new and different tests need to be written to
exercise different parts of the software or system to potentially find more defects.
1. Static Testing
Testing by reviewing work products (requirements, design, code) without executing the application
2. Dynamic Testing
Testing by executing and validating the application
Cyclomatic Complexity:
Cyclomatic complexity is a software metric used to measure the complexity of
a program.
It directly measures the number of linearly independent paths through a
program's source code
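For structured code, the complexity equals the number of decision points plus one (equivalently V(G) = E - N + 2P over the control-flow graph). A minimal sketch, with a hypothetical function:

    # Hypothetical example: cyclomatic complexity of a small function.
    def classify(score):
        if score >= 90:        # decision 1
            return "A"
        elif score >= 75:      # decision 2
            return "B"
        elif score >= 60:      # decision 3
            return "C"
        return "F"

    # 3 decisions + 1 = 4 linearly independent paths, so at least four
    # test cases are needed to cover every independent path:
    assert [classify(s) for s in (95, 80, 65, 50)] == ["A", "B", "C", "F"]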
Statement Testing:
Ensuring that all statements have been executed at least once
Condition Testing
It is a test case design method that tests the logical conditions contained in a
procedural specification. It focuses on testing each condition in the program by
providing possible combinations of values
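A minimal sketch covering both ideas above (the function and values are hypothetical): the three calls execute every statement at least once and also drive each sub-condition of the compound condition both true and false:

    def can_withdraw(active, balance, amount):
        # Compound condition with two sub-conditions.
        if active and balance >= amount:
            return True
        return False

    # Statement coverage: both branches are executed.
    # Condition coverage: each sub-condition takes both truth values.
    assert can_withdraw(True, 100, 50) is True    # active=T, funds=T
    assert can_withdraw(False, 100, 50) is False  # active=F
    assert can_withdraw(True, 10, 50) is False    # funds=F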
Error Guessing:
Test cases are developed with invalid data, using experience to guess where errors are likely
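For instance (a hedged sketch; these inputs are typical guesses, not a prescribed set), a tester might probe a numeric routine with empty, null, non-numeric, and negative values:

    import math

    # Error guessing: feed inputs that experience suggests often break code.
    for bad in ["", None, "abc", -1]:
        try:
            math.sqrt(bad)
        except (TypeError, ValueError) as e:
            # Each guessed input should fail in a controlled way.
            print(f"{bad!r} rejected: {e}")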
Exploratory Testing:
Testing that is not based on a formal test plan or test cases. Testers learn the software as
they test it, using their experience
Levels of testing
1. Unit / Component Testing
2. Integration Testing
3. System Testing
4. Acceptance Testing
1. Unit Testing:
Unit testing is also called component testing
It is a method of testing the correctness of a particular module of source code
The goal of unit testing is to isolate each part of the program and show that the
individual parts are correct
White box testing techniques are applied here
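A minimal unit test sketch using Python's built-in unittest module (the add function is a hypothetical unit under test):

    import unittest

    def add(a, b):
        # The unit (module/function) under test.
        return a + b

    class TestAdd(unittest.TestCase):
        # Each test isolates one behaviour of the unit.
        def test_positive_numbers(self):
            self.assertEqual(add(2, 3), 5)

        def test_negative_numbers(self):
            self.assertEqual(add(-2, -3), -5)

    if __name__ == "__main__":
        unittest.main()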
2. Integration Testing:
Software units of an application are combined and tested to evaluate the
interaction between them
Integration Approach:
Top-Down
Bottom-Up
Big Bang / Sandwich
Top-Down:
Integrating individual components starting from the top level
Test stubs are needed
Bottom-Up:
Integrating individual components starting from the bottom level
Test drivers are needed
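A hedged sketch of both aids (all names hypothetical): in top-down integration a stub stands in for a lower-level module that is not ready; in bottom-up integration a driver exercises the module under test from above:

    def payment_gateway_stub(amount):
        # Stub: canned response replacing a lower-level module that is
        # not yet integrated (used in top-down integration).
        return {"status": "approved", "amount": amount}

    def checkout(amount, gateway):
        # Module under test; calls whichever gateway it is given.
        return gateway(amount)["status"] == "approved"

    def checkout_driver():
        # Driver: calls checkout() from above (used in bottom-up integration).
        assert checkout(25.0, payment_gateway_stub)
        print("integration check passed")

    checkout_driver()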
3. System Testing:
The integrated software is tested as a complete, overall system
Alpha Testing:
Testing is done with test data in a controlled environment
Testing is done at the developer's location
Beta Testing:
Testing is done with live data in the live environment
Testing is done at the client's location
Reviews:
A review is a process or meeting during which a work product is examined
The main goal is to identify defects at an early stage
There are three types of review:
Peer Review
Walkthrough
Inspection
Peer Review:
It is generally a one-to-one meeting
Generally we exchange our test cases with our teammates and perform a
review to see if anything was missed
Walkthrough:
It is an informal meeting for evaluation or informational purposes
We can discuss/raise the issue at peer level
A team of 8 to 10 people
The issues raised are captured and published in a report distributed to the
participants
Inspection:
Inspections are formal reviews
Inspections are strict and close examinations conducted on specifications,
requirements, design, code and testing
Validation:
It is dynamic testing
Validation is done to ensure that the product meets client requirements
Validation physically ensures that the system operates according to plan by
executing a series of tests
It answers "Did we build the right product?"
QC (Quality Control):
QC is product oriented and focused on detection
QC activities focus on finding defects in specific deliverables
Inspecting and ensuring the quality of the work product
QC makes sure that "what we have done is what we expected"
Smoke Testing:
When a build is received, a smoke test is run to determine whether the build is
stable and can be considered for further testing
Testing of major functionalities
The test lead (TL) typically performs smoke testing
Smoke test cases are positive test cases only
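A minimal smoke-suite sketch (the application functions here are stand-ins, purely hypothetical): a few positive checks on major functionality decide whether the build proceeds to deeper testing:

    def app_starts():
        return True  # stand-in for launching the new build

    def login(user, password):
        return bool(user and password)  # stand-in for the login feature

    def smoke_test():
        # Positive test cases on major functionality only.
        assert app_starts(), "build did not start"
        assert login("admin", "secret"), "login failed"
        print("Smoke test passed: build accepted for further testing")

    smoke_test()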
System Testing:
It is a methodology to validate the system as a whole
Testing covers both functional and non-functional requirements
Regression:
Testing of all test cases/functionality to ensure that a code fix hasn't
introduced any problems into existing code/functionality that was working fine
earlier
Regression testing is carried out both manually and with automation; automated
tools are mainly used for regression testing
Regression testing typically occurs at three times:
1. After bug fixes
2. After new functionality is added
3. After changes to existing functionality
Test Case:
It is a document which describes the inputs, expected results and actual results used to
determine whether a feature of an application is working correctly or not
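An illustrative test case (all values hypothetical):

Test Case ID: TC-001
Description: Verify login with valid credentials
Steps: 1. Open the login page 2. Enter a valid username and password 3. Click Login
Expected Result: The user lands on the home page
Actual Result: The user lands on the home page
Status: Pass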
Test Scenario:
A set of test cases that ensures the business process flow is tested from end to end
Test Suite:
A collection of related test cases
Test Environment:
An environment that is created for testing purpose
Test Bed:
It’s an environment where testing is supposed to done
Ad hoc Testing:
Testing that is not based on a formal test plan or test case; however, the tester should have
significant knowledge of the software before testing it
Setup Testing:
Testing the installation and uninstallation of the software
Normal Installation:
It typically uses an installer wizard such as InstallShield
It prompts for "Next" in each window while installing
It may ask to restart the machine
Registry:
Start → Run → regedit → HKEY_LOCAL_MACHINE → SOFTWARE → (Installed Product)
Services:
Start → Run → services.msc → (Installed Product); the status should be "Started" and the
startup type should be "Automatic"
Test Metrics:
Metrics are measurements used to track testing performance
Examples: number of defects found in a specific cycle, cost/effort estimates
Version Control:
Whenever there is a change in software or documentation, its version is
changed accordingly.
Commonly used version control tools are Win-CVS, Perforce, Microsoft VSS (Visual
Source Safe)
Test Plan:
It is a document which describes the objective, scope, approach and focus of
software testing effort
It states which activities need to be done on what schedule
What resources are needed
What products are to be delivered, considered in advance
Test Strategy:
It is a formal description of how software product will be tested
Test strategy is developed for all levels of testing
It defines what methods, techniques and tools to be used
Test strategy indicates how testing is carried out
References:
List all documents that support Test Plan
Project Plan
Requirement Specifications Document
High level Design Document
Low Level Design Document
Methodology guidelines and examples
Features to be tested:
This is a listing of what is to be tested from the user's viewpoint of what the system
does.
This is not a technical description of the software, but a user's view of the functions.
Set the level of risk for each feature. Use a simple rating scale such as (H, M, L): High,
Medium and Low. These types of levels are understandable to a User. You should be
prepared to discuss why a particular level was chosen
Test Tools
Are any special tools to be used and what are they?
Will the tool require special training?
Meetings
Suspension Criteria:
Hardware/software is not available at the time of testing
The application contains one or more show-stopper defects
Exit Criteria:
When all test cases are executed
When all defects are closed
When deadlines are reached
Resumption Criteria:
If testing is suspended, it will resume only when the cause of the suspension
has been resolved
Test Deliverables:
Unit test Plan
Integration test plan
System test plan
Acceptance test plan
Defect reports and summaries
Test logs
Environmental Needs:
Are there any special requirements for this test plan, such as:
Special hardware such as simulators, static generators, etc.
How will test data be provided? Are there special collection requirements or specific
ranges of data that must be provided?
How much testing will be done on each component of a multi-part feature?
Specific versions of other supporting software
Restricted use of the system during testing
Responsibilities:
Who is in charge?
Setting risks
Selecting features to be tested and not tested.
Setting overall strategy for this level of plan
Ensuring all required elements are in place for testing.
Providing for resolution of scheduling conflicts, especially, if testing is done on the
production system.
Who provides the required training?
Who makes the critical go/no go decisions for items not covered in the test plans?
Schedule:
Test estimations
Estimations for test plan
Glossary:
Used to define terms and acronyms used in the document
Web Testing:
1. Usability Testing
2. Checking Links (see the sketch after this list)
3. Browser Compatibility Testing
4. Functionality Testing
5. Credit Card Testing
6. Security Testing
7. Performance Testing
8. Database Testing
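A hedged sketch for item 2, Checking Links (the URL is a placeholder assumption): fetch a page with the Python standard library, collect anchor targets, and report links that fail to load:

    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        # Collects href targets from <a> tags on the page.
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links += [v for k, v in attrs if k == "href" and v]

    def check_links(page_url):
        html = urllib.request.urlopen(page_url, timeout=10).read().decode()
        collector = LinkCollector()
        collector.feed(html)
        for link in collector.links:
            if link.startswith("http"):
                try:
                    urllib.request.urlopen(link, timeout=10)
                except Exception as exc:
                    print("BROKEN:", link, exc)

    check_links("http://localhost:8080/")  # hypothetical site under test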
Browser Compatibility Testing: Testing to validate the application across different browsers
and configurations
Fonts and Graphics position
Screen resolution 1024 x 768 …
Support for different scripts and software (e.g., Flash)
Performance Testing: To determine how fast the system performs under a particular workload
Types of performance testing
Baseline Testing
Load Testing
Stress Testing
Soak Testing
Scalability Testing
Baseline Testing
Baseline Testing examines how a system performs under expected or normal load and
creates a baseline against which the other types of tests can be related
Load Testing
Load testing involves increasing the load and observing how the system behaves under the
higher load (a sketch follows at the end of this section)
Stress Testing
The goal of a stress test is to find the volume of load at which the system actually
breaks or is close to breaking. The load can also be applied by decreasing the resources
(RAM / processor)
Soak Testing
In order to find system instabilities that occur over time, we need to conduct tests that run
over a long period
Scalability Testing
Scalability Testing is very much like Load Testing, but instead of increasing the number of
requests, we increase the size or complexity of the requests sent. This involves
sending large requests, large attachments, or deeply nested requests.
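As referenced under Load Testing above, a hedged sketch using only the Python standard library (the endpoint and user count are placeholder assumptions): fire concurrent requests and report response times:

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost:8080/health"  # hypothetical endpoint under test
    CONCURRENT_USERS = 20                 # simulated load level

    def timed_request(url):
        # One simulated user: request the URL and measure elapsed time.
        start = time.time()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()
        return time.time() - start

    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        timings = list(pool.map(timed_request, [URL] * CONCURRENT_USERS))

    print(f"avg {sum(timings) / len(timings):.3f}s, max {max(timings):.3f}s")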
Database Testing:
Verify the consistency, accuracy and correctness of data stored in the database
Entity Integrity:
Entity integrity ensures that each row in a table is uniquely identified
Example: no two customers have the same ID
Domain Integrity:
Here we check for correct data types, null status, and field size
Referential Integrity:
Maintaining the relationships between tables. This ensures that every foreign key matches a
primary key
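A hedged sketch using Python's built-in sqlite3 module (the schema is hypothetical), showing all three checks: the primary key enforces entity integrity, the NOT NULL/type constraints enforce domain integrity, and the foreign key enforces referential integrity:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # enable FK enforcement
    conn.execute("""CREATE TABLE customer (
        id   INTEGER PRIMARY KEY,  -- entity integrity: unique row identity
        name TEXT NOT NULL)        -- domain integrity: type and null status
    """)
    conn.execute("""CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(id))
    """)
    conn.execute("INSERT INTO customer VALUES (1, 'Alice')")
    try:
        # Referential integrity: customer 99 does not exist, so the
        # foreign key constraint rejects this row.
        conn.execute("INSERT INTO orders VALUES (1, 99)")
    except sqlite3.IntegrityError as exc:
        print("rejected:", exc)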
Bug Statuses
New
Open
Assigned
Fixed
Deferred
Verified
Closed
Re-Open
Invalid
Duplicate
Hold
Defect Age: The time gap between when a defect is introduced and when it is closed
Defect Density: The number of defects relative to the size of the program (e.g., 30 defects
in 15 KLOC gives a density of 2 defects per KLOC)
Latent Bug: A bug that is found only after two or more releases
Bug Leakage: A bug that should have been found in an earlier phase (e.g., analysis) but is
found in design, code, or test
Bug Triage: A meeting to examine the open bugs and divide them into categories:
o Bugs to fix now
o Bugs to fix later
o Bugs we will never fix
Localization Testing:
It checks how well the build has been translated into a particular target language
It includes the translation of the application user interface
Internationalization Testing:
It makes sure that the code can handle international content without breaking
functionality, losing data, or causing display problems
It checks proper functionality of the product with any locale settings, using all types
of international input
Linguistic Testing:
Checking for grammatical and contextual errors, language-specific settings, and spelling