ATLM - Automated Software Testing
to test applications. Automated test tools, which perform this capability, were intro-
duced into the market to meet this need and slowly built up momentum. Although
test scenarios and scripts were still generally written down using a word-processing
application, the use of automated test tools nevertheless increased. The more com-
plex test effort required greater and more thorough planning. Personnel performing
the test were required to be more familiar with the application under test and to
have more specific skill requirements relevant to the platforms and network that also
apply to the automated test tools being used.
Automated test tools supporting screen capture and playback have since
matured and expanded in capability. Different kinds of automated test tools with
specific niche strengths continue to emerge. In addition, automated software testing
has become increasingly a programming exercise, although it continues to
involve the traditional test management functions such as requirements traceability,
test planning, test design, and test scenario and script development.
Phase                   Relative Cost
Definition              $1
High-Level Design       $2
Low-Level Design        $5
Code                    $10
Unit Test               $15
Integration Test        $22
System Test             $50
Post-Delivery           $100+
cause operational downtime. Table 1.1 outlines the cost savings of error detection
through the various stages of the development life cycle [3].
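To make the table concrete, the multipliers can be applied directly to a defect count. The following sketch (in Python, with hypothetical defect counts that are not drawn from [3]) compares the relative cost of catching the same errors at unit test versus after delivery:

```python
# Relative cost of error detection by life-cycle stage (per Table 1.1).
COST_BY_STAGE = {
    "definition": 1,
    "high_level_design": 2,
    "low_level_design": 5,
    "code": 10,
    "unit_test": 15,
    "integration_test": 22,
    "system_test": 50,
    "post_delivery": 100,  # lower bound; the table lists $100+
}

def relative_cost(stage: str, defect_count: int) -> int:
    """Total relative cost of correcting defect_count errors found at a stage."""
    return COST_BY_STAGE[stage] * defect_count

# Hypothetical example: 10 defects caught at unit test vs. after delivery.
at_unit_test = relative_cost("unit_test", 10)       # 150
post_delivery = relative_cost("post_delivery", 10)  # at least 1,000
savings_factor = post_delivery / at_unit_test
```

Even with the conservative $100 figure for post-delivery, the same ten defects cost more than six times as much to correct after delivery as at unit test.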
The Automated Test Life-cycle Methodology (ATLM) discussed throughout
this book and outlined in Figure 1.2 represents a structured approach for the imple-
mentation and performance of automated testing. The ATLM approach mirrors the
benefits of modern rapid application development efforts, where such efforts
engage the user early on throughout analysis, design, and development of each soft-
ware version, which is built in an incremental fashion.
In adhering to the ATLM, the test engineer becomes involved early on in the
system life cycle, during business analysis throughout the requirements phase,
design, and development of each software build. This early involvement enables the
test team to conduct a thorough review of requirements specification and software
design, more completely understand business needs and requirements, design the
most appropriate test environment, and generate a more rigorous test design. An
auxiliary benefit of using a test methodology, such as the ATLM, that parallels the
development life cycle is the development of a close working relationship between
software developers and the test engineers, which fosters greater cooperation and
makes possible better results during unit, integration, and system testing.
Early test involvement is significant because requirements or use cases constitute
the foundation or reference point from which test requirements are defined and
against which test success is measured. A system or application’s functional specifi-
cation should be reviewed by the test team. Specifically, the functional specifications
must be evaluated, at a minimum, using the criteria given here and further detailed
in Appendix A.
1.3 The Automated Test Life-Cycle Methodology (ATLM) 9
[Figure 1.2 Automated Testing Life-Cycle Methodology (ATLM): the six-step ATLM process, including 2. Test Tool Acquisition.]
testing tool. It considers the process needed to introduce and utilize an automated
test tool, covers test development and test design, and addresses test execution and
management. The methodology also supports the development and management of
test data and the test environment, and describes a way to develop test documenta-
tion so as to account for problem reports. The ATLM represents a structured
approach that lays out a process with which to plan and execute testing. This
structured approach is necessary to help steer the test team away from several com-
mon test program mistakes:
• Implementing the use of an automated test tool without a testing process in
place, which results in an ad hoc, nonrepeatable, nonmeasurable test pro-
gram.
• Implementing a test design without following any design standards, which
results in the creation of test scripts that are not repeatable and therefore
not reusable for incremental software builds.
• Attempting to automate 100% of test requirements, when the tools being
applied do not support automation of all tests required.
• Using the wrong tool.
• Initiating test tool implementation too late in the application development
life cycle, without allowing sufficient time for tool setup and test tool intro-
duction (that is, without providing for a learning curve).
• Involving test engineers too late in the application development life cycle,
which results in poor understanding of the application and system design
and thereby incomplete testing.
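The second mistake above, test scripts that are not repeatable or reusable, is commonly addressed by a design standard that separates test logic from test data. A minimal sketch of such a data-driven script follows; the compute_discount function and the data rows are hypothetical stand-ins for an application under test:

```python
# Design-standard sketch: test logic is written once and driven by external
# test data, so the same script runs unchanged against each incremental build.

def compute_discount(order_total: float) -> float:
    """Toy application function standing in for the system under test."""
    if order_total >= 100:
        return order_total * 0.10
    return 0.0

# Test data kept separate from the test logic (could live in a CSV file).
TEST_CASES = [
    # (test_id, input_total, expected_discount)
    ("TC-001", 50.0, 0.0),
    ("TC-002", 100.0, 10.0),
    ("TC-003", 200.0, 20.0),
]

def run_suite():
    """Execute every data row and return (passed, failed) test identifiers."""
    passed, failed = [], []
    for test_id, total, expected in TEST_CASES:
        actual = compute_discount(total)
        (passed if actual == expected else failed).append(test_id)
    return passed, failed

passed, failed = run_suite()
```

Adding a test case for a new build then means adding a data row, not writing a new script.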
The ATLM is geared toward ensuring successful implementation of automated
testing. As shown in Table 1.2, it includes six primary processes or components.
Each primary process is further composed of subordinate processes as described
here.
Test Process Analysis. Test process analysis ensures that an overall test process and
strategy are in place and are modified, if necessary, to allow successful introduction
of the automated test. The test engineer defines and collects test process metrics so
as to allow for process improvement. Test goals, objectives, and strategies must be
defined, and the test process must be documented and communicated to the test team.
In this phase, the kinds of testing applicable for the technical environment are
defined, as well as tests that can be supported by automated tools. Plans for user
involvement are assessed, and test team personnel skills are analyzed against test
requirements and planned test activities. Early test team participation is emphasized,
supporting refinement of requirements specifications into terms that can be ade-
quately tested and enhancing the test team’s understanding of application require-
ments and design.
Test Tool Consideration. The test tool consideration phase includes steps in which
the test engineer investigates whether incorporation of automated test tools or util-
ities into the test effort would be beneficial to a project, given the project testing
requirements, available test environment and personnel resources, the user environ-
ment, the platform, and product features of the application under test. The project
schedule is reviewed to ensure that sufficient time exists for test tool setup and
development of the requirements hierarchy; potential test tools and utilities are
mapped to test requirements; test tool compatibility with the application and envi-
ronment is verified; and work-arounds are investigated for any incompatibility
problems surfaced during compatibility tests.
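The mapping of potential test tools to test requirements can be sketched as a simple coverage check; the tool names and capability labels below are hypothetical:

```python
# Map candidate tools to test requirements and flag requirements no tool covers.

CANDIDATE_TOOLS = {
    "CapturePlaybackTool": {"gui_regression", "scripting"},
    "LoadTestTool": {"performance", "scripting"},
    "RequirementsMgmtTool": {"traceability"},
}

TEST_REQUIREMENTS = {"gui_regression", "performance", "traceability", "security"}

def map_tools(tools, requirements):
    """Return requirement -> supporting tools, plus requirements left uncovered."""
    coverage = {
        req: sorted(name for name, caps in tools.items() if req in caps)
        for req in sorted(requirements)
    }
    gaps = {req for req, supported in coverage.items() if not supported}
    return coverage, gaps

coverage, gaps = map_tools(CANDIDATE_TOOLS, TEST_REQUIREMENTS)
```

Here the security requirement ends up in the gap set, signaling a test that must be performed manually or with an additional utility, which is exactly the 100%-automation pitfall noted earlier.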
Test Planning. The test planning phase includes a review of long-lead-time test
planning activities. During this phase, the test team identifies test procedure
creation standards and guidelines; the hardware, software, and network required to
support the test environment; test data requirements; a preliminary test schedule; performance
measurement requirements; a procedure to control test configuration and environ-
ment; and a defect tracking procedure and associated tracking tool.
The test plan incorporates the results of each preliminary phase of the struc-
tured test methodology (ATLM). It defines roles and responsibilities, the project
test schedule, test planning and design activities, test environment preparation, test
risks and contingencies, and the acceptable level of thoroughness (that is, test accep-
tance criteria). Test plan appendixes may include test procedures, a description of
the naming convention, test procedure format standards, and a test procedure trace-
ability matrix.
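A test procedure traceability matrix can be sketched as a mapping between requirements and the test procedures that verify them; the identifiers below are hypothetical:

```python
# Build a requirement -> test procedure traceability matrix and flag
# requirements that no test procedure covers.

TEST_PROCEDURES = {
    "TP-01": ["REQ-1", "REQ-2"],
    "TP-02": ["REQ-2", "REQ-3"],
}
REQUIREMENTS = ["REQ-1", "REQ-2", "REQ-3", "REQ-4"]

def traceability(procedures, requirements):
    matrix = {req: [] for req in requirements}
    for proc, reqs in procedures.items():
        for req in reqs:
            matrix[req].append(proc)
    untested = [req for req, procs in matrix.items() if not procs]
    return matrix, untested

matrix, untested = traceability(TEST_PROCEDURES, REQUIREMENTS)
# REQ-4 has no covering procedure and would surface as a test planning gap.
```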
Setting up a test environment is part of test planning. The test team must plan,
track, and manage test environment setup activities for which material procurements
may have long lead times. It must schedule and monitor environment setup activi-
ties; install test environment hardware, software, and network resources; integrate
and install test environment resources; obtain and refine test databases; and develop
environment setup scripts and testbed scripts.
Test Design. The test design component addresses the need to define the number
of tests to be performed, the ways that tests will be approached (for example, the
paths or functions), and the test conditions that need to be exercised. Test design
standards need to be defined and followed.
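As one illustration of defining the test conditions to be exercised, the following sketch enumerates boundary-value conditions for a single input field; the field limits are hypothetical:

```python
# Sizing part of a test design: boundary-value analysis for one input field
# yields a fixed, countable set of test conditions per field.

def boundary_values(low: int, high: int) -> list:
    """Classic boundary-value conditions for an inclusive integer range."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

# A field accepting quantities from 1 to 99 yields six test conditions.
conditions = boundary_values(1, 99)  # [0, 1, 2, 98, 99, 100]
```

Applying such a rule uniformly across fields is one way a test design standard makes the number of tests both defined and defensible.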
[Figure 1.3 System Development Life Cycle—ATLM Relationship: the figure maps the ATLM steps (1–6), among them 1. Decision to Automate Test, 2. Test Tool Acquisition, and 6. Test Program Review/Assessment, onto the system development life-cycle phases (A–F), among them A. System Life-Cycle Process Requirements Phase and F. Production and Maintenance Phase, Evaluation and Improvement.]

Execution and management of tests (ATLM step 5) takes place in conjunction with the integration and test
phase of the system development life cycle. System testing and other testing activities,
such as acceptance testing, take place once the first build has been baselined.
Test program review and assessment activities (ATLM step 6) are conducted
throughout the entire life cycle, though they are finalized during the system development
production and maintenance phase.
1.4.2 Test Maturity Model (TMM)—Augmented by
Automated Software Testing Maturity
Test teams that implement the ATLM will make progress toward levels 4 and 5 of
the Test Maturity Model (TMM). The TMM is a testing maturity model that was
developed by the Illinois Institute of Technology [4]; it contains a set of maturity
16 Chapter 1 The Birth and Evolution of Automated Testing
levels through which an organization can progress toward greater test process matu-
rity. This model lists a set of recommended practices at each level of maturity above
level 1. It promotes greater professionalism in software testing, similar to the inten-
tion of the Capability Maturity Model for software, which was developed by the
Software Engineering Institute (SEI) at Carnegie Mellon University (see the SEI
Web site at http://www.sei.cmu.edu/).
Integration. Testing is no longer a phase that follows coding; rather, it is integrated into the entire software life cycle. Organizations can build on the test planning skills they have acquired at level 2. Unlike level 2 (planning for testing at TMM), level 3 begins at the requirements phase and continues throughout the life cycle, supported by a version of the V model [12]. Test objectives are established with respect to the requirements based on user and client needs and are used for test-case design and success criteria. A test organization exists, and testing is recognized as a professional activity. A technical training organization pursues a testing focus. Basic tools support key testing activities. Although organizations at this level begin to realize the important role of reviews in quality control, no formal review program has been established, and reviews do not yet take place across the life cycle. A test measurement program has not yet been established to qualify process and product attributes.

This level of test maturity is referred to as “intentional automation.” At the third level, automated testing becomes both well defined and well managed. The test requirements and the test scripts themselves proceed logically from the software requirement specifications and design documents. Automated test scripts are created based on test design and development standards, yet the test team does not review automated test procedures. Automated tests become more reusable and maintainable. At this level of automated testing, the return on investment is starting to pay off, and a break-even point can already be achieved by the second regression test cycle (see “Case Study: Value of Test Automation Measurement” in Chapter 2). The types of tools used at this level include requirement management tools, project planning tools, capture/playback tools, simulators and emulators, syntax and semantic analyzers, and debugging tools.
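The break-even claim above can be illustrated with simple cumulative-cost arithmetic; the dollar figures below are hypothetical and are not taken from the Chapter 2 case study:

```python
# Break-even sketch: automation carries a fixed setup cost but a lower cost
# per regression cycle than manual execution. Figures are hypothetical.

def cumulative_cost(fixed_cost: float, cost_per_cycle: float, cycles: int) -> float:
    """Total cost after running a given number of regression cycles."""
    return fixed_cost + cost_per_cycle * cycles

def break_even_cycle(automation_fixed, automated_per_cycle, manual_per_cycle):
    """First regression cycle at which automation costs no more than manual."""
    cycle = 1
    while cumulative_cost(automation_fixed, automated_per_cycle, cycle) > \
          cumulative_cost(0.0, manual_per_cycle, cycle):
        cycle += 1
    return cycle

# Automation: $8,000 setup + $1,000 per cycle; manual execution: $5,000 per cycle.
cycle = break_even_cycle(8000.0, 1000.0, 5000.0)  # break-even at cycle 2
```

With these assumed costs, automation is behind after the first cycle ($9,000 versus $5,000) but pulls even by the second ($10,000 each), matching the pattern the sidebar describes.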
1.4 ATLM’s Role in the Software Testing Universe 19
Optimization, Defect Prevention, and Quality Control. Because of the infrastructure provided by the attainment of maturity goals at levels 1 through 4 of the TMM, the testing process is now said to be defined and managed, and its cost and effectiveness can be monitored. At level 5, mechanisms fine-tune and continuously improve testing. Defect prevention and quality control are practiced. The testing process is driven by statistical sampling and measurements of confidence levels, trustworthiness, and reliability. An established procedure exists for the selection and evaluation of testing tools. Automated tools totally support the running and rerunning of test cases, providing support for test-case design, maintenance of test-related items, defect collection and analysis, and the collection, analysis, and application of test-related metrics.

By incorporating the guidelines of the ATLM described in this book and using the applicable tools in an efficient manner, TMM level 5 maturity can be achieved. Tools used at this highest level include the ones mentioned within previous levels plus test data generation tools and metrics collection tools, such as complexity and size measurement tools, coverage and frequency analyzers, and statistical tools for defect analysis and defect prevention. (All tools described at the various levels are discussed in detail in Chapter 3, and tool examples are provided in Appendix B.)