
Software Testing

ISTQB / ISEB Foundation Exam Practice

1 Principles · 2 Lifecycle · 3 Static testing · 4 Test techniques · 5 Management · 6 Tools

Chapter 6: Tool Support for Testing
CONTENT

• Test tool considerations

• Effective use of tools


Chapter 6 overview – Tool Support for Testing
• Test Tool Classification
  o Management of Testing & Testware
  o Static Testing
  o Test Design & Specification
  o Performance & Dynamic Analysis
  o Specialised Needs
• Test Tool Considerations
  o Benefits & Risks of Test Automation
  o Execution & Management Tools Considerations
    - Test Execution: Capture/Replay, Data-driven, Keyword-driven, Model-based
    - Test Management
• Effective Use of Tools
  o Principles for Tool Selection
  o Pilot Project
  o Success Factors for Tools
CONTENT

• Test tool considerations
  o Benefits & risks of test automation
  o Special considerations for test execution & test management tools
• Test tool classifications
• Effective use of tools
[Chapter 6 overview map – Tool Support for Testing (repeated)]
Tool Classification

Common objectives / purpose for use of tools:


• Improve the efficiency of test activities by automating repetitive
tasks or tasks that require significant resources when done
manually
• Improve the efficiency of test activities by supporting manual test
activities throughout the test process
• Improve the quality of test activities by achieving more consistent
testing and a higher level of defect reproducibility
• Automate activities that cannot be executed manually
• Increase reliability of testing
Tool Classification

How are tools classified?


• By their purpose
• By licensing model
• By price
• By technology used
• By testing activities/areas supported by a set of tools
Tool Classification

Intrusive tools & Probe effect


• Intrusive tools may have unexpected side effects in the SUT
• The consequence of using intrusive tools is called the probe effect
o Performance tools may add extra time to the measured response time of a
transaction
o Coverage tools insert instrumentation code into the real code, which can
alter its behaviour and yield different coverage measures
o Heisenbugs: incidents that change or disappear when you try to observe them
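Illustration (added sketch, not from the slides): a minimal Python example of the probe effect, where the measuring instrumentation itself changes the timing being measured.

```python
import functools
import timeit

calls = []

def instrumented(func):
    """Wrap a function with a simple probe that records every call."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        calls.append(func.__name__)      # the probe itself costs time and memory
        return func(*args, **kwargs)
    return wrapper

def work():
    return sum(range(1000))

# Same workload, measured with and without the probe: the instrumented run is slower,
# so the act of observing has changed the behaviour being observed (the probe effect).
plain = timeit.timeit(work, number=10_000)
probed = timeit.timeit(instrumented(work), number=10_000)
print(f"plain: {plain:.4f}s   instrumented: {probed:.4f}s   calls recorded: {len(calls)}")
```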
Tool Support for Management of Testing & Testware

[Tool categories: Test Management & ALM Tools · Requirements Management Tools · Defect Management Tools · Configuration Management Tools · Continuous Integration Tools (D)]

Test Management & ALM Tools
• Test management tools support the test management and control parts of the test process, with several capabilities built in
• ALM (Application Lifecycle Management) tools manage testing, development and deployment; they focus on communication, collaboration and task tracking, and are popular in Agile development
Tool Support for Management of Testing & Testware

Requirements Management Tools
• Tests are created based on requirements
• Support traceability between requirements and tests
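Illustration (added sketch with hypothetical requirement and test IDs): the kind of requirements-to-tests traceability a requirements management tool maintains, with a simple check for uncovered requirements.

```python
# Hypothetical traceability matrix: requirement ID -> IDs of tests covering it.
traceability = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],               # not yet covered by any test
}

uncovered = [req for req, tests in traceability.items() if not tests]
print("Requirements without tests:", uncovered)   # ['REQ-003']
```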
Tool Support for Management of Testing & Testware

Defect Management Tools
• Sometimes referred to as incident management tools or bug tracking/management tools
• Information recorded during testing describes failures (what was observed), not defects (the underlying cause)
• Defect management tools facilitate the incident report life cycle
Tool Support for Management of Testing & Testware

Configuration Management Tools
• Not exactly testing tools, but…
• Configuration management tools greatly aid the testing process, especially in complex environments
Tool Support for Management of Testing & Testware

Continuous Integration (CI) Tools (D)
• More likely to be used by developers
• An essential part of the Agile toolkit
• Integration of new/changed code with the existing code base happens very frequently
• Unit tests are often run automatically when a new build is made
Tool Support for Static Testing

[Tool categories: Review Tools · Static Analysis Tools (D)]

Review Tools
• More beneficial when reviews are formal
• Example: monitor reviewers' checking rate (calculate it and flag exceptions)
Tool Support for Static Testing

Static Analysis Tools (D)
• Normally used by developers (for coding, unit testing, code structure comprehension, and coding standard enforcement)
• An extension of compiler technology
• Can be used on work products other than source code (e.g., requirements, websites)
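Illustration (hypothetical snippet; the slides name no specific analyser): issues of the kind a static analysis tool such as flake8 or pylint typically reports without executing the code.

```python
import os  # unused import: commonly flagged (flake8 F401 / pylint W0611)

def discount(price, rate):
    total = price * rate      # local variable assigned but never used (flake8 F841)
    if rate == None:          # comparison to None should use 'is None' (flake8 E711)
        return price
    return price * (1 - rate)
```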
Tool Support for Test Design & Implementation

[Tool categories: Test Design Tools · Model-Based Testing Tools · Test Data Preparation Tools · Test-Driven Development Tools (D) · ATDD & BDD Tools]

Test Design Tools
• Help construct test cases (or at least test inputs)
• Tests can be derived from:
  o formal requirements
  o elements on a screen
  o a model of the system
• Challenges: generating expected results, and the tools generating too many tests
Tool Support for Test Design & Implementation

Model-Based Testing Tools
• Generate test inputs and test cases from models of the system/software (e.g., state transition models)
• Changes in the system trigger automatic generation of new tests
Tool Support for Test Design & Implementation

Test Data Preparation Tools
• Can be used by developers and testers (in system and/or acceptance testing)
• Especially useful in performance and reliability testing
• Useful for anonymising data (to conform to data protection rules)
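Illustration (added sketch, hypothetical field names): the anonymisation idea, replacing personal values with stable pseudonyms before data is used for testing.

```python
import hashlib

def pseudonymise(value: str) -> str:
    """Replace a personal value with a stable, non-reversible pseudonym."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:10]

customer = {"name": "Jane Doe", "email": "jane@example.com", "balance": 120.50}

anonymised = {
    "name": pseudonymise(customer["name"]),
    "email": pseudonymise(customer["email"]),
    "balance": customer["balance"],     # non-personal fields are kept as-is
}
print(anonymised)
```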
Tool Support for Test Design & Implementation

Test-Driven Development (TDD) Tools (D)
• In TDD, tests are written first; then code is created to pass the tests
• TDD tools provide a framework to write and run the tests, which are basically unit tests
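Illustration (added sketch; the slides do not prescribe a language or framework): a test-first example using Python's built-in unittest, where the tests were written before the hypothetical shipping_cost function existed.

```python
import unittest

# Step 2 (written after the tests below initially failed): the simplest code that passes.
def shipping_cost(weight_kg: float) -> float:
    return 5.0 if weight_kg <= 1.0 else 5.0 + (weight_kg - 1.0) * 2.0

# Step 1: the tests are written first and drive the implementation.
class ShippingCostTest(unittest.TestCase):
    def test_small_parcel_flat_rate(self):
        self.assertEqual(shipping_cost(0.5), 5.0)

    def test_heavier_parcel_adds_per_kilo_charge(self):
        self.assertEqual(shipping_cost(3.0), 9.0)

if __name__ == "__main__":
    unittest.main()
```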
Tool Support for Test Design & Implementation

ATDD & BDD Tools
• ATDD captures requirements by writing acceptance tests in conjunction with users
• BDD focuses on system behaviour and functionality (often used by the development team)
• These tools have a syntax (rules) that reads like natural language
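Illustration (added sketch, assuming the behave framework; the slides name no tool): a Gherkin scenario shown as a comment, with each step mapped to a Python step definition.

```python
# Feature file (Gherkin), e.g. features/login.feature:
#   Scenario: Successful login
#     Given a registered user "jane"
#     When she logs in with a valid password
#     Then she sees her dashboard

from behave import given, when, then  # step-definition decorators provided by behave

@given('a registered user "{name}"')
def step_registered_user(context, name):
    context.user = {"name": name, "password": "secret"}

@when("she logs in with a valid password")
def step_login(context):
    context.logged_in = (context.user["password"] == "secret")

@then("she sees her dashboard")
def step_dashboard(context):
    assert context.logged_in
```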
Tool Support for Execution & Logging

[Tool categories: Test Execution Tools · Coverage Tools (D) · Test Harnesses (D) · Unit Test Framework Tools (D)]

Test Execution Tools
• Interface to the SUT and run tests as if executed by a real user
• Test scripts are written in a programmable scripting language
• Data, test inputs and expected results are held in a test repository
• Most often used for automated regression testing
• Capture/replay is one way to create scripts (but problematic)
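Illustration (added sketch, assuming Selenium WebDriver's Python bindings; URL and element IDs are hypothetical): a scripted test that drives the SUT through its UI and compares actual with expected results.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()                       # assumes a local Chrome/driver setup
try:
    driver.get("https://example.test/login")      # hypothetical SUT URL
    driver.find_element(By.ID, "username").send_keys("jane")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()

    # Expected result is defined in the script (or read from a test repository).
    actual = driver.find_element(By.ID, "welcome").text
    assert actual == "Welcome, jane", f"unexpected text: {actual}"
finally:
    driver.quit()
```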
Tool Support for Execution & Logging

Coverage Tools (D)
• Provide an objective measure of which parts of the software structure were executed by the tests
• Identify structural elements that can be counted → report what has and has not been covered by those tests
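Illustration (toy sketch; real coverage tools such as coverage.py are far more sophisticated): using Python's sys.settrace hook to record which lines a test executes and report what was not covered.

```python
import sys

def grade(score):
    if score >= 50:
        return "pass"
    return "fail"                      # this line is never reached by the test below

executed = set()

def tracer(frame, event, arg):
    if event == "line" and frame.f_code.co_name == "grade":
        executed.add(frame.f_lineno)   # record each source line as it runs
    return tracer

sys.settrace(tracer)
assert grade(70) == "pass"             # the 'test' exercises only the passing branch
sys.settrace(None)

body_lines = range(grade.__code__.co_firstlineno + 1, grade.__code__.co_firstlineno + 4)
print("not covered:", sorted(set(body_lines) - executed))   # the line returning "fail"
```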
Tool Support for Execution & Logging

Test Harnesses (D)
• A test harness is a test environment that provides drivers and stubs so that the item under test (IUT) can be tested on as small a scale as possible
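Illustration (added sketch with a hypothetical payment example): the driver-and-stub idea using Python's standard unittest.mock, so the item under test runs in isolation from its real dependency.

```python
from unittest.mock import Mock

# Item under test: depends on an external payment gateway.
def checkout(cart_total, gateway):
    response = gateway.charge(cart_total)      # in production this calls a real service
    return "confirmed" if response["status"] == "ok" else "declined"

# Stub: stands in for the real gateway so checkout can be tested in isolation.
gateway_stub = Mock()
gateway_stub.charge.return_value = {"status": "ok"}

# Driver: the test code that invokes the item under test and checks the result.
assert checkout(42.0, gateway_stub) == "confirmed"
gateway_stub.charge.assert_called_once_with(42.0)
```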
Tool Support for Execution & Logging

Unit Test Framework Tools (D)
• Software tools that support writing and running unit tests
• Can be used in Agile development to automate tests in parallel with development
• Tend to be used in component testing and component integration testing
Tool Support for Performance & Dynamic Analysis

[Tool categories: Performance Testing Tools (D) · Monitoring Tools · Dynamic Analysis Tools (D)]

Performance Testing Tools (D)
• Focus on testing at system level to see whether the SUT will stand up to a high volume of usage
• Load generation: simulate multiple users and/or high volumes of input data
• Reports: based on logs, e.g. graphs of load against response time
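Illustration (added sketch using only the standard library; URL and user counts are hypothetical): simple load generation with concurrent virtual users and a summary of response times.

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://example.test/api/health"   # hypothetical endpoint of the SUT
VIRTUAL_USERS = 20
REQUESTS_PER_USER = 10

def one_request(_):
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start      # response time in seconds

with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
    times = list(pool.map(one_request, range(VIRTUAL_USERS * REQUESTS_PER_USER)))

times.sort()
print(f"requests: {len(times)}  avg: {sum(times)/len(times):.3f}s  "
      f"p95: {times[int(0.95 * len(times)) - 1]:.3f}s")
```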
Tool Support for Performance & Dynamic Analysis

Monitoring Tools
• Continuously keep track of the status of the system in use → provide the earliest warnings and improve service
• Monitoring tools exist for servers, networks, databases, security, performance, website and internet usage, etc.
Tool Support for Performance & Dynamic Analysis

Dynamic Analysis Tools (D)
• Provide run-time information on the software while it is running
• Examples:
  o allocation, use and de-allocation of resources (e.g., detecting memory leaks)
  o flagging unassigned pointers or pointer arithmetic faults
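Illustration (toy sketch; dedicated dynamic analysis tools such as Valgrind target native code): using Python's built-in tracemalloc to expose a growing, never-released cache.

```python
import tracemalloc

_cache = []

def handle_request(payload):
    _cache.append(payload * 1000)   # deliberate 'leak': the cache grows forever

tracemalloc.start()
before = tracemalloc.take_snapshot()

for i in range(1000):
    handle_request(f"request-{i}")

after = tracemalloc.take_snapshot()
for stat in after.compare_to(before, "lineno")[:3]:
    print(stat)      # the line inside handle_request dominates the memory growth
```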
Tool Support for Specialised Testing Needs

[Tool categories: Data Quality Assessment · Data Conversion & Migration · Usability Testing · Accessibility Testing · Localisation Testing · Security Testing · Portability Testing]

Data Quality Assessment Tools
• In IT-centric organisations, very large volumes of complex and interrelated data need to be managed
• These tools can check data against given validation rules (e.g., that a particular field is numeric or of a given length) → data failing a check gets reported
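Illustration (added sketch with hypothetical fields and rules): checking records against simple validation rules and reporting the ones that fail.

```python
records = [
    {"customer_id": "10042", "postcode": "AB1 2CD"},
    {"customer_id": "10x43", "postcode": "TOOLONGPOSTCODE"},   # fails both rules
]

rules = {
    "customer_id": lambda v: v.isdigit(),          # must be numeric
    "postcode": lambda v: len(v) <= 8,             # must be at most 8 characters
}

for i, record in enumerate(records):
    for field, check in rules.items():
        if not check(record[field]):
            print(f"record {i}: field '{field}' failed validation ({record[field]!r})")
```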
Tool Support for Specialised Testing Needs

Data Conversion & Migration Tools
• These tools can verify data conversion and check whether data migration has been carried out according to the migration rules
• Help ensure the correctness, completeness and standards compliance of the processed data, regardless of its volume
Tool Support for Specialised Testing Needs

Usability Testing Tools
• These tools can help to assess the user experience (UX) of using the system:
  o after-use surveys
  o checking for broken links
  o monitoring usage through most-clicked links, video recording, key presses, screen capture, and eye-movement tracking
Tool Support for Specialised Testing Needs

Accessibility Testing Tools
• Support the practice of making sure software is accessible to people with disabilities:
  o screen readers
  o designs that account for users with colour-blindness
  o text that can be enlarged
  o alternative text for images
Tool Support for Specialised Testing Needs

Localisation Testing Tools
• Examine a system's behaviour in relation to a particular location, locality, or culture
• Goal: ensure that the software's linguistic and cultural characteristics are acceptable for a given location (translations, menu items, buttons, error messages)
• An area where human intelligence is still needed
Tool Support for Specialised Testing Needs

Security Testing Tools
• Used to test security by attempting to break into a system
• Help to identify viruses, detect intrusions, simulate external attacks, probe open ports, and identify weaknesses in passwords and password files
• Can also perform security checks during operation
Tool Support for Specialised Testing Needs

Portability Testing Tools
• Support testing on different platforms and environments
• Help run the same tests on different devices
Where tools fit

• Requirements analysis ↔ acceptance test: requirements testing tools; performance measurement
• Function ↔ system test: test running; comparison
• Design ↔ integration test: test design; test data preparation; test harnesses & drivers
• Code ↔ component test: static analysis; dynamic analysis; coverage measures; debugging
• Test management tools support all levels


[Chapter 6 overview map – Tool Support for Testing (repeated)]
Potential Benefits and Risks

• Potential benefits of using tools


• Risks of using tools
• Special considerations for some types of tools
Potential benefits of using tools

• Reduction of repetitive work


• Greater consistency and repeatability
• Objective assessment
• Ease of access to information about tests or testing.
Risks of using tools

There are many risks that are present when tool support for testing
is introduced and used, whatever the specific type of tool.
• Unrealistic expectations of the tool
• Underestimating the time, cost and effort for the initial introduction of a tool, for achieving continuing benefits from the tool, and for maintaining the test assets generated by the tool
• Over-reliance on the tool
• Failing to consider and manage issues related to relationships and interoperability between critical tools
Risks of using tools

• Possibility of the tool vendor going out of business, retiring the tool, or selling the tool to a different vendor (or, in the open-source world, of the project being abandoned)
• Poor or completely non-existent vendor response, whether for
support, upgrades, or defect fixes
• Various uncertainties and unforeseen problems, such as the
inability to support a new platform
[Chapter 6 overview map – Tool Support for Testing (repeated)]
Special Considerations for Test Execution & Test Management Tools

• Test execution tools


o Capture/Replay
o Data-driven
o Keyword-driven
o Model-based

• Test management tools


Capture/Replay Testing Tools

• Recording the actions of a manual tester seems an easy approach, but it has a number of drawbacks:
  o Does not scale to large numbers of test scripts
  o Does not store test cases (only test data & scripts)
  o Scripts may be unstable when unexpected events occur

• Capturing test inputs can still be useful (e.g., in exploratory testing, unscripted testing with experienced users, etc.)
Data-driven Testing Tools

• A data-driven testing approach separates out the test inputs and expected results, usually into a spreadsheet, and uses a more generic test script that reads the input data and executes the same test with different data.
• Testers who are not familiar with the scripting language can then
create new test data for these predefined scripts.

[Diagram: the generic test script reads input data from a test data file, enters it into the system/component under test, and compares the expected output with the actual output to produce the test result.]
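Illustration (added sketch with a hypothetical data file and function under test): one generic script executes the same test once per data row, comparing expected and actual results.

```python
import csv
import io

# Stand-in for a spreadsheet/CSV of test data: inputs and expected results.
TEST_DATA = io.StringIO(
    "a,b,expected\n"
    "2,3,5\n"
    "10,-4,6\n"
    "0,0,0\n"
)

def add(a, b):          # hypothetical function under test
    return a + b

# Generic script: one loop, many data rows.
for row in csv.DictReader(TEST_DATA):
    actual = add(int(row["a"]), int(row["b"]))
    expected = int(row["expected"])
    verdict = "PASS" if actual == expected else "FAIL"
    print(f"{verdict}: add({row['a']}, {row['b']}) = {actual} (expected {expected})")
```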
Keyword-driven Testing Tools

• A generic script processes keywords describing the actions to be


taken (aka. action words), which then calls keyword scripts to
process the associated test data
• Testers (even unfamiliar with the scripting language) can define
tests using the keywords and associated data, tailored to the SUT.

[Diagram: test data containing actions (keywords) and input data is read by an automation script, which maps each keyword to a keyword script that enters input into the SUT and checks its output.]
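Illustration (toy sketch with hypothetical keywords): a generic script dispatches each action word in the test table to its keyword implementation.

```python
# Keyword implementations: one small function per action word.
def open_account(state, name):
    state["accounts"][name] = 0

def deposit(state, name, amount):
    state["accounts"][name] += int(amount)

def check_balance(state, name, expected):
    assert state["accounts"][name] == int(expected), f"balance mismatch for {name}"

KEYWORDS = {"open_account": open_account, "deposit": deposit, "check_balance": check_balance}

# The test, written by a tester as a table of keywords and data.
test_table = [
    ("open_account", "jane"),
    ("deposit", "jane", "100"),
    ("deposit", "jane", "50"),
    ("check_balance", "jane", "150"),
]

# Generic script: dispatch each row to the keyword implementation.
state = {"accounts": {}}
for keyword, *args in test_table:
    KEYWORDS[keyword](state, *args)
print("all keyword steps passed")
```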
Model-based Testing Tools

• Model-based testing (MBT) tools generate test inputs and expected outputs, and often include test execution capability.
• These tools enable a functional specification to be captured in the
form of a model (e.g., state transition diagram, activity diagram) –
generally done by a system designer.
• The MBT tool interprets the model in order to create test case
specifications which can then be saved in a test management tool
and/or executed by a test execution tool.
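Illustration (toy sketch with a hypothetical login model): test sequences derived by walking a state transition model, so changing the model changes the generated tests.

```python
# State transition model: (state, event) -> next state.
MODEL = {
    ("logged_out", "enter_valid_credentials"): "logged_in",
    ("logged_out", "enter_invalid_credentials"): "logged_out",
    ("logged_in", "log_out"): "logged_out",
}

def generate_tests(model, start="logged_out", depth=2):
    """Derive test cases as event sequences of the given length, with expected end states."""
    tests = [([], start)]
    for _ in range(depth):
        tests = [
            (events + [event], nxt)
            for events, state in tests
            for (s, event), nxt in model.items()
            if s == state
        ]
    return tests

for events, expected_state in generate_tests(MODEL):
    print("test:", " -> ".join(events), "| expected end state:", expected_state)
```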
Test Management Tools

• These tools can generate a lot of useful data, but they need to interface with other tools in order to:
  o Produce meaningful information in an accessible format
  o Maintain consistent traceability to requirements (with requirements management tools)
  o Link with test object version information (with configuration management tools)

• ALM tools can provide different types of information for different user groups (managers, developers, testers, etc.)
• It is essential to monitor the information produced to ensure it stays current.
• Recommended approach: define the test process → consider the tool(s) to adopt → adapt the tool(s) to provide the highest benefit
CONTENT

• Test tool considerations
• Effective use of tools
  o Principles for tool selection
  o Pilot project
  o Success factors for tools
[Chapter 6 overview map – Tool Support for Testing (repeated)]
Considerations for Tool Selection

• Assessment of the maturity of the organization


• Identification of opportunities for an improved test process
supported by tools
• Understanding of the technologies used by the test object(s)
• Knowledge of current build and continuous integration tools, in
order to ensure tool compatibility and integration
• Evaluation of tools against clear requirements and objective
criteria
Considerations for Tool Selection

• Check if the tool(s) are available for a free trial period (and for
how long)
• Evaluation of the vendor
• Identification of coaching, mentoring, & training needs
• Consideration of pros and cons of various licensing models (e.g.,
commercial or open source)
• Estimation of a cost-benefit ratio
• Finally, a proof-of-concept evaluation should be done
Pilot project

The objectives for a pilot project for a new tool are:


• To learn more about the tool (more detail, more depth);
• To see how the tool would fit with existing processes or documentation, how those would need to change to work well with the tool, and how to use the tool to streamline existing processes.
Pilot project

Primary objectives:
• Gaining knowledge about the tool
• Evaluating how the tool fits with existing processes and practices
• Deciding on standard ways of using, managing, storing, and
maintaining the tool and the test assets
• Assessing whether the benefits will be achieved at reasonable
cost
• Understanding the metrics to be collected & reported by the tool,
and configuring the tool to ensure these metrics can be captured
and reported.
Success Factors for Tools

• Rolling out the tool to the rest of the organization incrementally


• Adapting and improving processes to fit with the use of the tool
• Providing training, coaching, and mentoring for tool users
• Defining guidelines for the use of the tool
• Implementing a way to gather usage information from the actual
use of the tool
• Monitoring tool use and benefits
• Providing support to the users of a given tool
• Gathering lessons learned from all users
