
ATLM - Automated Software Testing

The Automated Test Life-Cycle Methodology (ATLM) emphasizes early involvement of test engineers in the software development process to enhance product quality and reduce costs associated with error correction. It outlines a structured approach to implementing automated testing, including phases such as decision to automate, test tool acquisition, and test planning, design, and execution. By integrating testing activities with the system development life cycle, the ATLM aims to foster collaboration between developers and testers, ultimately leading to more effective and efficient testing outcomes.



to test applications. Automated test tools, which provide this capability, were
introduced into the market to meet this need and slowly built up momentum. Although
test scenarios and scripts were still generally written down using a word-processing
application, the use of automated test tools nevertheless increased. More complex
test efforts required greater and more thorough planning. Personnel performing the
test were required to be more familiar with the application under test and to have
more specific skills relevant to the platforms and network that also apply to the
automated test tools being used.
Automated test tools supporting screen capture and playback have since matured and
expanded in capability. Different kinds of automated test tools with specific niche
strengths continue to emerge. In addition, automated software testing has become
increasingly a programming exercise, although it continues to involve traditional
test management functions such as requirements traceability, test planning, test
design, and test scenario and script development.

1.3 The Automated Test Life-Cycle Methodology (ATLM)
The use of automated test tools to support the test process is proving beneficial in
terms of improved product quality and reduced project schedule and effort (see
"Case Study: Value of Test Automation Measurement" in Chapter 2). To achieve
these benefits, test activity and test planning must be initiated early in the project.
Thus test engineers need to be included during business analysis and requirements
activities and be involved in analysis and design review activities. These reviews can
serve as effective testing techniques, preventing subsequent analysis/design errors.
Such early involvement allows the test team to gain understanding of the customer
needs to be supported, which will aid in developing an architecture for the appro-
priate test environment and generating a more thorough test design.
Early test involvement not only supports effective test design, which is a criti-
cally important activity when utilizing an automated test tool, but also provides
early detection of errors and prevents migration of errors from requirement specifi-
cation to design, and thence from design into code. This kind of error prevention
reduces cost, minimizes rework, and saves time. The earlier in the development
cycle that errors are uncovered, the easier and less costly they are to fix. Cost is mea-
sured in terms of the amount of time and resources required to correct the defect. A
defect found at an early stage is relatively easy to fix, has no operational impact, and
requires few resources. In contrast, a defect discovered during the operational phase
can involve several organizations, can require a wider range of retesting, and can
cause operational downtime. Table 1.1 outlines the cost savings of error detection
through the various stages of the development life cycle [3].

Table 1.1 Prevention Is Cheaper Than Cure: Error Removal Cost Multiplies Over the
System Development Life Cycle

Phase               Cost
Definition          $1
High-Level Design   $2
Low-Level Design    $5
Code                $10
Unit Test           $15
Integration Test    $22
System Test         $50
Post-Delivery       $100+
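The multipliers in Table 1.1 lend themselves to a quick back-of-the-envelope estimate. The sketch below takes the phase names and multipliers from the table; the defect counts and the $1 baseline are illustrative assumptions.

```python
# Relative cost multipliers from Table 1.1 (Definition = $1 baseline).
COST_MULTIPLIER = {
    "definition": 1, "high_level_design": 2, "low_level_design": 5,
    "code": 10, "unit_test": 15, "integration_test": 22,
    "system_test": 50, "post_delivery": 100,
}

def correction_cost(defects_by_phase, base_cost=1):
    """Estimate total correction cost for a defect profile (counts per phase)."""
    return sum(COST_MULTIPLIER[phase] * count * base_cost
               for phase, count in defects_by_phase.items())

# Ten defects caught at definition time versus the same ten slipping
# through to post-delivery.
early = correction_cost({"definition": 10})
late = correction_cost({"post_delivery": 10})
```

With these figures, the same ten defects cost one hundred times more to remove after delivery than at definition time.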
The Automated Test Life-Cycle Methodology (ATLM) discussed throughout this
book and outlined in Figure 1.2 represents a structured approach for the
implementation and performance of automated testing. The ATLM approach mirrors the
benefits of modern rapid application development efforts, where such efforts
engage the user early on throughout analysis, design, and development of each soft-
ware version, which is built in an incremental fashion.
In adhering to the ATLM, the test engineer becomes involved early in the system
life cycle, during business analysis and throughout the requirements, design, and
development phases of each software build. This early involvement enables the
test team to conduct a thorough review of requirements specification and software
design, more completely understand business needs and requirements, design the
most appropriate test environment, and generate a more rigorous test design. An
auxiliary benefit of using a test methodology, such as the ATLM, that parallels the
development life cycle is the development of a close working relationship between
software developers and the test engineers, which fosters greater cooperation and
makes possible better results during unit, integration, and system testing.
Early test involvement is significant because requirements or use cases constitute
the foundation or reference point from which test requirements are defined and
against which test success is measured. A system or application’s functional specifi-
cation should be reviewed by the test team. Specifically, the functional specifications
must be evaluated, at a minimum, using the criteria given here and further detailed
in Appendix A.

• Completeness. Evaluate the extent to which the requirement is thoroughly defined.
• Consistency. Ensure that each requirement does not contradict other requirements.
• Feasibility. Evaluate the extent to which a requirement can actually be
implemented with the available technology, hardware specifications, project budget
and schedule, and project personnel skill levels.
• Testability. Evaluate the extent to which a test method can prove that a
requirement has been successfully implemented.
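As a concrete illustration of applying these four criteria, the following sketch records a pass/fail evaluation per criterion for a single requirement. The requirement ID, findings, and record layout are hypothetical, not taken from Appendix A.

```python
# The four evaluation criteria listed above.
CRITERIA = ("completeness", "consistency", "feasibility", "testability")

def review_requirement(req_id, findings):
    """Build a review record; any criterion not explicitly passed counts as failed."""
    results = {criterion: findings.get(criterion, False) for criterion in CRITERIA}
    return {"id": req_id, "results": results, "accepted": all(results.values())}

# A hypothetical requirement that fails only the testability check.
review = review_requirement(
    "REQ-042",
    {"completeness": True, "consistency": True,
     "feasibility": True, "testability": False},
)
```

A requirement is accepted only when all four criteria pass, which mirrors the "at a minimum" wording above.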
Test strategies should be determined during the functional specification/requirements
phase. Automated tools that support the requirements phase can help produce
functional requirements that are testable, thus minimizing the effort and cost of
testing. With test automation in mind, product design and coding standards can
provide the proper environment to get the most out of the test tool. For example,
the development engineer can design and build testability into the application code.
Chapter 4 further discusses building testable code.
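One common way to build testability into application code, shown here as a hedged sketch rather than the approach Chapter 4 prescribes, is to give time-dependent logic an injectable clock so an automated script can exercise boundary conditions deterministically.

```python
import datetime

def is_session_expired(started_at, timeout_minutes=30, now=None):
    """Return True when the session has exceeded its timeout.

    `now` is a testability seam: production callers omit it, while an
    automated test pins it to a known value.
    """
    now = now or datetime.datetime.utcnow()
    return (now - started_at) > datetime.timedelta(minutes=timeout_minutes)

# An automated script can exercise the timeout boundary deterministically.
start = datetime.datetime(2024, 1, 1, 12, 0)
expired = is_session_expired(start, now=datetime.datetime(2024, 1, 1, 12, 31))
active = not is_session_expired(start, now=datetime.datetime(2024, 1, 1, 12, 29))
```

Without the seam, a test of this function would depend on the real system clock and could not be repeated reliably.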
Figure 1.2 Automated Test Life-Cycle Methodology (ATLM): 1. Decision to Automate
Test; 2. Test Tool Acquisition; 3. Automated Testing Introduction Process; 4. Test
Planning, Design, and Development; 5. Execution and Management of Tests; 6. Test
Program Review and Assessment.

The ATLM, which is invoked to support test efforts involving automated test tools,
incorporates a multistage process. This methodology supports the detailed and
interrelated activities that are required to decide whether to employ an automated
testing tool. It considers the process needed to introduce and utilize an automated
test tool, covers test development and test design, and addresses test execution and
management. The methodology also supports the development and management of
test data and the test environment, and describes a way to develop test documenta-
tion so as to account for problem reports. The ATLM represents a structured
approach that depicts a process with which to approach and execute testing. This
structured approach is necessary to help steer the test team away from several com-
mon test program mistakes:
• Implementing the use of an automated test tool without a testing process in place,
which results in an ad hoc, nonrepeatable, nonmeasurable test program.
• Implementing a test design without following any design standards, which results
in the creation of test scripts that are not repeatable and therefore not reusable
for incremental software builds.
• Attempting to automate 100% of test requirements, when the tools being applied do
not support automation of all tests required.
• Using the wrong tool.
• Initiating test tool implementation too late in the application development life
cycle, without allowing sufficient time for tool setup and test tool introduction
(that is, without providing for a learning curve).
• Involving test engineers too late in the application development life cycle, which
results in a poor understanding of the application and system design and, in turn,
incomplete testing.
The ATLM is geared toward ensuring successful implementation of automated
testing. As shown in Table 1.2, it includes six primary processes or components.
Each primary process is further composed of subordinate processes as described
here.

1.3.1 Decision to Automate Test


The decision to automate test represents the first phase of the ATLM. This phase is
addressed in detail in Chapter 2, which covers the entire process that goes into the
automated testing decision. The material in Chapter 2 is intended to help the test
team manage automated testing expectations and outlines the potential benefits of
automated testing, if implemented correctly. An approach for developing a test tool
proposal is outlined, which will be helpful in acquiring management support.

Table 1.2 ATLM Process Hierarchy

1. Decision to Automate
   1.1 Automated Test Expectations (Chapter 2)
   1.2 Benefits of Automated Test (Chapter 2)
   1.3 Acquiring Management Support (Chapter 2)
2. Test Tool Acquisition
   2.1 Review System Engineering Environment (Chapter 3)
   2.2 Review Tools Available on the Market (Chapter 3)
   2.3 Tool Research and Evaluation (Chapter 3)
   2.4 Tool Purchase (Chapter 3)
3. Introduction of Automated Testing
   3.1 Test Process Analysis (Chapter 4)
   3.2 Test Tool Consideration (Chapter 4)
4. Test Planning, Design, and Development
   4.1 Test Plan Documentation (Chapter 6)
   4.2 Test Requirements Analysis (Chapter 7)
   4.3 Test Design (Chapter 7)
   4.4 Test Development (Chapter 8)
5. Execution and Management of Automated Test
   5.1 Automated Test Execution (Chapter 9)
   5.2 Testbed Baseline (Chapter 8)
   5.3 Defect Tracking (Chapter 9)
   5.4 Test Progress Tracking (Chapter 9)
   5.5 Test Metrics (Chapter 9)
6. Process Evaluation and Improvement
   6.1 Post-Release: Test Process Improvement (Chapter 10)


1.3.2 Test Tool Acquisition


Test tool acquisition represents the second phase of the ATLM. Chapter 3 guides the
test engineer through the entire test tool evaluation and selection process, starting
with confirmation of management support. As a tool should support most of the
organization’s testing requirements whenever feasible, the test engineer will need to
review the systems engineering environment and other organizational needs. Chap-
ter 3 reviews the different types of tools available to support aspects of the entire
testing life cycle, enabling the reader to make an informed decision with regard to
the types of tests to be performed on a particular project. It next guides the test
engineer through the process of defining an evaluation domain to pilot the test tool.
After completing all of those steps, the test engineer can make vendor contact to
bring in the selected tool(s). Test personnel then evaluate the tool, based on sample
criteria provided in Chapter 3.
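Tool evaluation of this kind often boils down to a weighted score across criteria. The sketch below is purely illustrative; the criteria, weights, ratings, and tool names are assumptions, not the sample criteria from Chapter 3.

```python
def score_tool(weights, ratings):
    """Weighted sum of 1-5 ratings; weights are assumed to sum to 1.0."""
    return sum(weights[criterion] * ratings[criterion] for criterion in weights)

# Hypothetical evaluation criteria and their relative importance.
weights = {"platform_support": 0.3, "script_reuse": 0.3,
           "ease_of_use": 0.2, "vendor_support": 0.2}

# Hypothetical ratings gathered during a tool pilot.
candidates = {
    "Tool A": {"platform_support": 5, "script_reuse": 3,
               "ease_of_use": 4, "vendor_support": 4},
    "Tool B": {"platform_support": 3, "script_reuse": 5,
               "ease_of_use": 3, "vendor_support": 3},
}

best = max(candidates, key=lambda tool: score_tool(weights, candidates[tool]))
```

Making the weights explicit before rating the candidates keeps the evaluation tied to the organization's needs rather than to any single tool's strengths.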

1.3.3 Automated Testing Introduction Phase


The process of introducing automated testing to a new project team represents the
third phase of the ATLM. Chapter 4 outlines the steps necessary to successfully
introduce automated testing to a new project, which are summarized here.

Test Process Analysis. Test process analysis ensures that an overall test process and
strategy are in place and are modified, if necessary, to allow successful introduction
of automated testing. The test engineer defines and collects test process metrics so
as to allow for process improvement. Test goals, objectives, and strategies must be
defined, and the test process must be documented and communicated to the test team.
In this phase, the kinds of testing applicable for the technical environment are
defined, as well as tests that can be supported by automated tools. Plans for user
involvement are assessed, and test team personnel skills are analyzed against test
requirements and planned test activities. Early test team participation is emphasized,
supporting refinement of requirements specifications into terms that can be ade-
quately tested and enhancing the test team’s understanding of application require-
ments and design.

Test Tool Consideration. The test tool consideration phase includes steps in which
the test engineer investigates whether incorporation of automated test tools or util-
ities into the test effort would be beneficial to a project, given the project testing
requirements, available test environment and personnel resources, the user
environment, the platform, and product features of the application under test. The
project schedule is reviewed to ensure that sufficient time exists for test tool setup and
development of the requirements hierarchy; potential test tools and utilities are
mapped to test requirements; test tool compatibility with the application and
environment is verified; and work-around solutions are investigated for incompatibility
problems surfaced during compatibility tests.

1.3.4 Test Planning, Design, and Development


Test planning, design, and development is the fourth phase of the ATLM. These sub-
jects are further addressed in Chapters 6, 7, and 8, and are summarized here.

Test Planning. The test planning phase includes a review of long-lead-time test
planning activities. During this phase, the test team identifies test procedure
creation standards and guidelines; the hardware, software, and network required to
support the test environment; test data requirements; a preliminary test schedule; performance
measurement requirements; a procedure to control test configuration and environ-
ment; and a defect tracking procedure and associated tracking tool.
The test plan incorporates the results of each preliminary phase of the struc-
tured test methodology (ATLM). It defines roles and responsibilities, the project
test schedule, test planning and design activities, test environment preparation, test
risks and contingencies, and the acceptable level of thoroughness (that is, test accep-
tance criteria). Test plan appendixes may include test procedures, a description of
the naming convention, test procedure format standards, and a test procedure trace-
ability matrix.
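A test procedure traceability matrix can be kept as simply as a mapping from each requirement to the test procedures that cover it. A minimal sketch follows, with hypothetical requirement and procedure IDs.

```python
def build_matrix(requirements, coverage):
    """Build a traceability matrix.

    coverage: iterable of (test_procedure_id, requirement_id) pairs.
    Returns {requirement_id: [test_procedure_ids]}.
    """
    matrix = {req_id: [] for req_id in requirements}
    for test_id, req_id in coverage:
        matrix[req_id].append(test_id)
    return matrix

matrix = build_matrix(
    ["REQ-1", "REQ-2", "REQ-3"],
    [("TP-01", "REQ-1"), ("TP-02", "REQ-1"), ("TP-03", "REQ-2")],
)

# Requirements with no covering procedure stand out immediately.
uncovered = [req for req, tests in matrix.items() if not tests]
```

Keeping the matrix current makes test coverage gaps visible at a glance, which is the point of including it in the test plan appendixes.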
Setting up a test environment is part of test planning. The test team must plan,
track, and manage test environment setup activities for which material procurements
may have long lead times. It must schedule and monitor environment setup activi-
ties; install test environment hardware, software, and network resources; integrate
and install test environment resources; obtain and refine test databases; and develop
environment setup scripts and testbed scripts.

Test Design. The test design component addresses the need to define the number of
tests to be performed, the ways that testing will be approached (for example, the
paths or functions), and the test conditions that need to be exercised. Test design
standards need to be defined and followed.

Test Development. For automated tests to be reusable, repeatable, and maintainable,
test development standards must be defined and followed.

1.3.5 Execution and Management of Tests


The test team must execute test scripts and refine the integration test scripts,
based on a test procedure execution schedule. It should also evaluate test execution
outcomes, so as to avoid false positives and false negatives. System problems should
be documented via system problem reports, and efforts should be made to support
developer understanding of system and software problems and replication of the
problem. Finally, the team should perform regression tests and all other tests and
track problems to closure.
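The execution-and-evaluation loop described above can be sketched as follows. The script names and problem-report fields are illustrative, and a crashing script is recorded as a failure rather than silently skipped, so it cannot slip through as a false negative.

```python
def run_suite(scripts):
    """Run each script and collect a problem report for every failure.

    scripts: mapping of script name to a zero-argument callable that
    returns True on pass.
    """
    reports = []
    for name, script in scripts.items():
        try:
            passed = script()
            detail = "" if passed else "expected result not observed"
        except Exception as exc:  # a crash is also a failure
            passed, detail = False, repr(exc)
        if not passed:
            reports.append({"script": name, "detail": detail, "status": "open"})
    return reports

# Stand-in scripts; real ones would drive the application under test.
reports = run_suite({
    "login_smoke": lambda: True,
    "checkout_regression": lambda: False,
})
```

Each open report would then feed the defect-tracking procedure defined during test planning and be tracked to closure.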

1.3.6 Test Program Review and Assessment


Test program review and assessment activities need to be conducted throughout the
testing life cycle to allow for continuous improvement. Throughout the life cycle,
and again following test execution, metrics need to be evaluated and final review
and assessment activities conducted to allow for process improvement.
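Two of the simplest metrics evaluated during such reviews are test execution rate and pass rate. A minimal sketch, with assumed counts:

```python
def progress_metrics(total_planned, executed, passed):
    """Compute basic test progress metrics from execution counts."""
    return {
        "execution_rate": executed / total_planned,
        "pass_rate": passed / executed if executed else 0.0,
    }

# Illustrative figures: 150 of 200 planned tests run, 120 of them passed.
metrics = progress_metrics(total_planned=200, executed=150, passed=120)
```

Tracking these figures per build makes trends visible, which is what turns review and assessment into process improvement.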

1.4 ATLM’s Role in the Software Testing Universe


1.4.1 ATLM Relationship to System Development Life Cycle
For maximum test program benefit, the ATLM approach needs to be pursued in
parallel with the system life cycle. Figure 1.3 depicts the relationship between the
ATLM and the system development life cycle. Note that the system development
life cycle is represented in the outer layer in Figure 1.3. Displayed in the bottom
right-hand corner of the figure is the process evaluation phase. During the system
life cycle process evaluation phase, improvement possibilities often determine that
test automation is a valid approach toward improving the testing life cycle. The asso-
ciated ATLM phase is called the decision to automate test.
During the business analysis and requirements phase, the test team conducts
test tool acquisition activities (ATLM step 2). Note that test tool acquisition can
take place at any time, but preferably when system requirements are available. Ide-
ally, during the automated testing introduction process (ATLM step 3), the devel-
opment group supports this effort by developing a pilot project or small prototype
so as to iron out any discrepancies and conduct lessons learned activities.
Test planning, design and development activities (ATLM step 4) should take
place in parallel to the system design and development phase. Although some test
planning will already have taken place at the beginning and throughout the system
development life cycle, it is finalized during this phase. Execution and management
of tests (ATLM step 5) takes place in conjunction with the integration and test
phase of the system development life cycle. System testing and other testing
activities, such as acceptance testing, take place once the first build has been
baselined. Test program review and assessment activities (ATLM step 6) are
conducted throughout the entire life cycle, though they are finalized during the
system development production and maintenance phase.

Figure 1.3 System Development Life Cycle—ATLM Relationship. The figure shows the
ATLM phases (1–6) running in parallel with the system development life-cycle phases
(A–F): A. System Life-Cycle Process Evaluation and Improvement; B. Business Analysis
and Requirements Phase; C. Small Tool Pilot/Prototype; D. System Design and
Development Phases; E. Integration and Test Phase; F. Production and Maintenance
Phase.
1.4.2 Test Maturity Model (TMM)—Augmented by Automated Software Testing Maturity

Test teams that implement the ATLM will make progress toward levels 4 and 5 of the
Test Maturity Model (TMM). The TMM is a testing maturity model that was developed
by the Illinois Institute of Technology [4]; it contains a set of maturity levels
through which an organization can progress toward greater test process maturity.
This model lists a set of recommended practices at each level of maturity above
level 1. It promotes greater professionalism in software testing, similar to the inten-
tion of the Capability Maturity Model for software, which was developed by the
Software Engineering Institute (SEI) at Carnegie Mellon University (see the SEI
Web site at http://www.sei.cmu.edu/).

1.4.2.1 Correlation Between the CMM and TMM


The TMM was developed as a complement to the CMM [5]. It was envisioned that
organizations interested in assessing and improving their testing capabilities would
likely be involved in general software process improvement. To have directly corre-
sponding levels in both maturity models would logically simplify these two parallel
process improvement drives. This parallelism is not entirely present, however,
because both the CMM and the TMM level structures are based on the individual
historical maturity growth patterns of the processes they represent. The testing
process is a subset of the overall software development process; therefore, its matu-
rity growth needs support from the key process areas (KPAs) associated with general
process growth [6–8]. For this reason, any organization that wishes to improve its
testing process through implementation of the TMM (and ATLM) should first commit
to improving its overall software development process by applying the CMM guidelines.
Research shows that an organization striving to reach a particular level of the
TMM must be at least at the same level of the CMM. In many cases, a given TMM
level needs specific support from KPAs in the corresponding CMM level and the
CMM level beneath it. These KPAs should be addressed either prior to or in parallel
with the TMM maturity goals.
The TMM model adapts well to automated software testing, because effective
software verification and validation programs grow out of development programs
that are well planned, executed, managed, and monitored. A good software test
program cannot stand alone; it must be an integral part of the software development
process. Table 1.3 displays the levels 1 through 5 of the TMM in the first column,
together with corresponding automated software testing levels 1 through 5 in the
second column. Column 2 addresses test maturity as it specifically pertains to auto-
mated software testing.
The test team must determine, based on the company’s environment, the TMM
maturity level that best fits the organization and the applicable software applications
or products. The level of testing should be proportional to complexity of design,
and the testing effort should not be more complex than the development effort.
Table 1.3 Testing Maturity and Automated Software Testing Maturity Levels 1–5

TMM Level 1: Initial. Testing is a chaotic process; it is ill defined and not
distinguished from debugging. Tests are developed in an ad hoc way after coding is
complete. Testing and debugging are interleaved to get the bugs out of the software.
The objective of testing is to show that the software works [9]. Software products
are released without quality assurance. Resources, tools, and properly trained staff
are lacking. This type of organization would be at level 1 of the CMM developed by
the Software Engineering Institute. There are no maturity goals at this level.

Automated Software Testing Level 1. This level of automated software testing is
referred to as "accidental automation." At the first level, automated testing is not
done at all or only on an ad hoc basis. An automated test tool might be used on an
experimental basis. With a capture/playback tool, automated test scripts are
recorded and played back, with only tool-generated scripts being used. Scripts are
not modified for reusability or maintainability. No automated script design or
development standards are followed. The resulting scripts are not reusable, are
difficult to maintain, and must be recreated with each software build. This type of
automation can actually increase testing costs by 125% or more—for example, 150% of
manual test costs with each test cycle (see "Case Study: Value of Test Automation
Measurement" in Chapter 2).

TMM Level 2: Phase Definition. Testing is separated from debugging and is defined
as a phase that follows coding. Although it is a planned activity, test planning at
level 2 may occur after coding for reasons related to the immaturity of the test
process. For example, at level 2 there is the perception that all testing is
execution-based and dependent on the code, and therefore it should be planned only
when the code is complete. The primary goal of testing at this level of maturity is
to show that the software meets its specifications [10]. Basic testing techniques
and methods are in place. Many quality problems at this TMM level occur because
test planning takes place late in the software life cycle. In addition, defects
propagate into the code from the requirements and design phases, as no review
programs address this important issue. Post-code, execution-based testing is still
considered the primary testing activity.

Automated Software Testing Level 2. "At this level, testing is becoming a planned
activity. This implies a commitment to the completion of testing activities. A
project planning tool will aid the project manager in defining test activities,
allocating time, money, resources, and personnel to the testing process" [11].
This level of automated software testing is referred to as "incidental automation."
At the second level, automated test scripts are modified, but no documented
standards or repeatability exists. The types of tools used at this level can include
project planning tools, capture/playback tools, simulators and emulators, syntax
and semantic analyzers, and debugging tools. The introduction of automated test
tools to a new project is not planned, and a process is not being followed. Test
design or development standards do not exist. Test schedules or test requirements
are not taken into consideration or are not reliable when contemplating the use of
an automated test tool. As with level 1, this type of test automation does not
provide much return on investment and can actually increase the testing effort.

TMM Level 3: Integration. Testing is no longer a phase that follows coding; rather,
it is integrated into the entire software life cycle. Organizations can build on
the test planning skills they have acquired at level 2. Unlike level 2, test
planning at level 3 begins at the requirements phase and continues throughout the
life cycle, supported by a version of the V model [12]. Test objectives are
established with respect to the requirements based on user and client needs and are
used for test-case design and success criteria. A test organization exists, and
testing is recognized as a professional activity. A technical training organization
pursues a testing focus. Basic tools support key testing activities. Although
organizations at this level begin to realize the important role of reviews in
quality control, no formal review program has been established, and reviews do not
yet take place across the life cycle. A test measurement program has not yet been
established to qualify process and product attributes.

Automated Software Testing Level 3. This level of test maturity is referred to as
"intentional automation." At the third level, automated testing becomes both well
defined and well managed. The test requirements and the test scripts themselves
proceed logically from the software requirement specifications and design documents.
Automated test scripts are created based on test design and development standards,
yet the test team does not review automated test procedures. Automated tests become
more reusable and maintainable. At this level of automated testing, the return on
investment is starting to pay off, and a break-even point can already be achieved
by the second regression test cycle (see "Case Study: Value of Test Automation
Measurement" in Chapter 2). The types of tools used at this level include
requirements management tools, project planning tools, capture/playback tools,
simulators and emulators, syntax and semantic analyzers, and debugging tools.

TMM Level 4: Management and Measurement. Testing is a measured and quantified
process. Reviews at all phases of the development process are now recognized as
testing and quality-control activities. Software products are tested for quality
attributes such as reliability, usability, and maintainability. Test cases from all
projects are collected and recorded in a test-case database to support test-case
reuse and regression testing. Defects are logged and given a severity level.
Deficiencies in the test process now often follow from the lack of a
defect-prevention philosophy and the paucity of automated support for the
collection, analysis, and dissemination of test-related metrics.

Automated Software Testing Level 4. This level of test maturity, referred to as
"advanced automation," can be achieved when adopting many aspects of the ATLM
described in this book. This testing maturity level represents a practiced and
perfected version of level 3 with one major addition: post-release defect tracking.
Defects are captured and sent directly back through the fix, test creation, and
regression test processes. The software test team is now an integral part of
product development, and test engineers and application developers work together to
build a product that will meet test requirements. Any software bugs are caught
early, when they are much less expensive to fix. In addition to the tools mentioned
at the previous testing levels, defect and change tracking tools, test procedure
generation tools, and code review tools are used at this level.

TMM Level 5: Optimization, Defect Prevention, and Quality Control. Because of the
infrastructure provided by the attainment of maturity goals at levels 1 through 4
of the TMM, the testing process is now said to be defined and managed, and its cost
and effectiveness can be monitored. At level 5, mechanisms fine-tune and
continuously improve testing. Defect prevention and quality control are practiced.
The testing process is driven by statistical sampling and measurements of
confidence levels, trustworthiness, and reliability. An established procedure
exists for the selection and evaluation of testing tools. Automated tools totally
support the running and rerunning of test cases, providing support for test-case
design, maintenance of test-related items, defect collection and analysis, and the
collection, analysis, and application of test-related metrics.

Automated Software Testing Level 5. When incorporating the guidelines of the ATLM
described in this book and using the applicable tools in an efficient manner, TMM
level 5 maturity can be achieved. Tools used at this highest level include the ones
mentioned at previous levels plus test data generation tools and metrics collection
tools, such as complexity and size measurement tools, coverage and frequency
analyzers, and statistical tools for defect analysis and defect prevention. (All
tools described at the various levels are discussed in detail in Chapter 3, and
tool examples are provided in Appendix B.)
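The break-even behavior contrasted in the level 1 and level 3 descriptions above can be made concrete with a small cumulative-cost model. The figures below are assumptions chosen for illustration, not the case-study numbers from Chapter 2.

```python
def cycles_to_break_even(manual_per_cycle, setup, automated_per_cycle):
    """Return the first regression cycle at which cumulative automated
    cost drops below cumulative manual cost (None if not within 100)."""
    for cycle in range(1, 101):
        manual = manual_per_cycle * cycle
        automated = setup + automated_per_cycle * cycle
        if automated < manual:
            return cycle
    return None

# Assumed figures: one-time automation setup equal to one manual cycle,
# and an automated per-cycle cost of 40% of the manual cost.
breakeven = cycles_to_break_even(manual_per_cycle=100, setup=100,
                                 automated_per_cycle=40)
```

Under these assumptions automation pays for itself on the second regression cycle; poorly maintained scripts (the level 1 case) raise the per-cycle cost above the manual cost, so no break-even point is ever reached.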

1.4.3 Test Automation Development


Modularity, shared function libraries, use of variables, parameter passing, condi-
tional branching, loops, arrays, subroutines—this is now the universal language of
not only the software developer, but also the software test engineer.
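That shared vocabulary can be seen in even a minimal data-driven test script: a shared library routine, parameter passing, and a loop over test data. All names here are illustrative, and the application under test is faked for the sake of a self-contained sketch.

```python
def verify_login(username, password, expected_ok, login_fn):
    """Shared library routine: drive a login attempt and check the outcome."""
    return login_fn(username, password) == expected_ok

def fake_login(username, password):
    # Stand-in for the application under test.
    return (username, password) == ("admin", "s3cret")

# Test data drives the loop, so new cases need no new script logic.
TEST_DATA = [
    ("admin", "s3cret", True),
    ("admin", "wrong", False),
    ("guest", "s3cret", False),
]

results = [verify_login(user, pwd, ok, fake_login)
           for user, pwd, ok in TEST_DATA]
```

Because the verification logic lives in one shared routine, the script remains reusable and maintainable across builds, unlike raw capture/playback recordings.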
Automated software testing, as conducted with today’s automated test tools, is
a development activity that includes programming responsibilities similar to those of
the application-under-test software developer. Whereas manual testing is an activity
that is often tagged on to the end of the system development life cycle, efficient
automated testing emphasizes that testing should be incorporated from the begin-
ning of the system development life cycle. Indeed, the development of an auto-
mated test can be viewed as a mini-development life cycle. Like software application
development, automated test development requires careful design and planning.
Automated test engineers use automated test tools that generate code while they
develop test scripts that exercise a user interface. The generated code is written in
third-generation languages such as BASIC, C, or C++. This code (which comprises the
automated test scripts) can be modified and reused to serve as automated test scripts
for other applications in less time than if the test engineer were to use the automated
