
Methodology to Evaluate Automated Testing Tools

This part provides an outline of the steps involved in acquiring, implementing, and using testing tools. The management of any significant project requires that the work be divided into tasks for which completion criteria can be defined. The transition from one task to another occurs in steps; to permit the orderly progress of the activities, the scheduling of these steps must be determined in advance. A general outline for such a schedule is provided by the steps described below. The actual time schedule depends on many factors that must be determined for each specific tool use.

STEP 1: DEFINE YOUR TEST REQUIREMENTS


The goals to be accomplished should be identified in a format that permits later determination that they have been met (see Step 15). Typical goals include reducing the
average processing time of C++ programs by one fifth, achieving complete
interchangeability of programs or data sets with another organization, and adhering to an
established standard for documentation format. The statement of goals should also
identify responsibilities, particularly the role that headquarters staff may have, and
specify coordination requirements with other organizations. When a centralized
management method is employed, the statement of goals may include a budget and a
desired completion date. Once these constraints are specified, funding management may
delegate the approval of the acquisition plan to a lower level.

STEP 2: SET TOOL OBJECTIVES


The goals generated in Step 1 should be translated into desired tool features and
requirements that arise from the development and operating environment identified.
Constraints on tool cost and availability may also be added at this step. For example, a
typical tool objective for a program formatter is to provide header identification, uniform indentation, and the facility to print listings and comments separately for all Pascal programs. In addition, the tool must be able to run on the organization’s specific
computer under its operating system. Only tools that have been in commercial use for at
least one year and at no fewer than N sites should be considered. (The value of N is
determined by the number of sites the organization has.)
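
One way to make such objectives checkable later (in Steps 5 and 15) is to record them in a structured form. The following Python sketch is illustrative only; the field names and the specific objectives listed are assumptions based on the formatter example above, not part of the methodology itself.

from dataclasses import dataclass

@dataclass
class ToolObjective:
    """One desired tool feature or constraint, stated so it can be checked later."""
    description: str        # what the tool must do or satisfy
    mandatory: bool = True  # mandatory requirement versus merely desirable feature

# Illustrative objectives for the Pascal program formatter example above.
objectives = [
    ToolObjective("Provides header identification for all Pascal programs"),
    ToolObjective("Applies uniform indentation"),
    ToolObjective("Prints listings and comments separately"),
    ToolObjective("Runs on the organization's computer under its operating system"),
    ToolObjective("In commercial use for at least one year at no fewer than N sites"),
    ToolObjective("Low purchase or lease cost", mandatory=False),
]

Recording objectives this way also feeds directly into the candidate filtering of Step 3a, Task 3 and the scoring of Task 5.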

STEP 3A: CONDUCT SELECTION ACTIVITIES FOR INFORMAL PROCUREMENT

The following tasks should be performed when an informal procurement plan is in effect.

Task 1: Develop the Acquisition Plan


The acquisition plan communicates the actions of software management both up and
down the chain of command. The plan may also be combined with the statement of tool
objectives (Step 2). The acquisition plan includes the budgets and schedules for
subsequent steps in the tool introduction, a justification of resource requirements in light
of expected benefits, contributions to the introduction expected from other organizations
(e.g., the tool itself, modification patches, or training materials), and the assignment of
responsibility for subsequent events within the organization, particularly the
identification of the software engineer. Minimum tool documentation requirements are
also specified in the plan.

Task 2: Define Selection Criteria


The selection criteria include a ranked listing of attributes that should support effective
tool use. Typical selection criteria include the following:
■ The ability to accomplish specified tool objectives
■ Ease of use
■ Ease of installation
■ Minimum processing time
■ Compatibility with other tools
■ Low purchase or lease cost
Most of these criteria must be considered further to permit objective evaluation, but this
step may be left to the individual who does the scoring. Constraints that have been
imposed by the preceding events or are generated at this step should be summarized
together with the criteria.

Task 3: Identify Candidate Tools


This is the first step for which the software engineer is responsible. The starting point for
preparing a list of candidate tools is a comprehensive tool catalogue. Two lists are usually
prepared, the first of which does not consider the constraints and contains all tools that
meet the functional requirements. For the feasible candidates, literature should be
requested from the developer and then examined for conformance with the given
constraints. At this point, the second list is generated, which contains tools that meet both
the functional requirements and the constraints. If this list is too short, some constraints
may be relaxed.
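
The two-list process amounts to filtering a tool catalogue twice: once on the functional requirements and once on the constraints. A minimal Python sketch, assuming a hypothetical catalogue in which each entry records the requirements and constraints the tool is known to satisfy:

# Hypothetical catalogue entries: each tool lists the functional requirements
# and constraints it is known to satisfy.
catalogue = {
    "ToolA": {"functional": {"format", "headers"}, "constraints": {"os", "one_year_use"}},
    "ToolB": {"functional": {"format"},            "constraints": {"os"}},
    "ToolC": {"functional": {"format", "headers"}, "constraints": set()},
}

required_functions = {"format", "headers"}
required_constraints = {"os", "one_year_use"}

# First list: all tools that meet the functional requirements (constraints ignored).
first_list = [name for name, tool in catalogue.items()
              if required_functions <= tool["functional"]]

# Second list: feasible candidates that also meet the constraints.
second_list = [name for name in first_list
               if required_constraints <= catalogue[name]["constraints"]]

print(first_list)   # ['ToolA', 'ToolC']
print(second_list)  # ['ToolA']

If the second list is too short, relaxing a constraint simply means removing it from required_constraints and filtering again.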

Task 4: Conduct the Candidate Review


The user must review the list of candidate tools prepared by the software engineer.
Because few users can be expected to be knowledgeable about software tools, specific
questions should be raised by software management, including the following:
■ Will this tool handle the present file format?
■ Are tool commands consistent with those of the editor?
■ How much training is required?
Adequate time should be allowed for this review, and a due date for responses should be
indicated. Because users often view this as a low-priority, long-term task, considerable
follow-up by line management is required. If possible, tools should be obtained for trial
use, or a demonstration at another facility should be arranged.

Task 5: Score the Candidates


For each criterion identified in Task 2, a numeric score should be generated on the basis
of the information obtained from the vendor’s literature, tool demonstrations, the user’s
review, observation in a working environment, or the comments of previous users. Once
weighting factors for the criteria have been assigned, the score for each criterion is
multiplied by the appropriate factor; the sum of the products represents the overall tool
score. If the criteria are merely ranked, the scoring will consist of a ranking of each
candidate under each criterion heading. Frequently during this process, a single tool will
be recognized as clearly superior.
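
The weighted-scoring arithmetic can be written out directly. In the Python sketch below, the criteria weights and candidate scores are invented purely for illustration; only the computation (weight times score, summed per candidate) reflects the procedure described above.

# Weights assigned to the criteria from Task 2 (illustrative values).
weights = {
    "meets_objectives": 5,
    "ease_of_use": 4,
    "ease_of_installation": 3,
    "processing_time": 2,
    "compatibility": 2,
    "cost": 1,
}

# Raw scores per criterion for each candidate (e.g., 1-10), drawn from vendor
# literature, demonstrations, the user's review, and previous users' comments.
candidates = {
    "ToolA": {"meets_objectives": 9, "ease_of_use": 7, "ease_of_installation": 6,
              "processing_time": 8, "compatibility": 7, "cost": 5},
    "ToolB": {"meets_objectives": 7, "ease_of_use": 9, "ease_of_installation": 8,
              "processing_time": 6, "compatibility": 6, "cost": 8},
}

# Overall score = sum over all criteria of (weight x criterion score).
overall = {
    name: sum(weights[criterion] * score for criterion, score in scores.items())
    for name, scores in candidates.items()
}

for name, total in sorted(overall.items(), key=lambda item: item[1], reverse=True):
    print(f"{name}: {total}")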

Task 6: Select the Tool


This decision is reserved for software managers, who can review the scoring and consider additional factors that are not expressed in the criteria. For
example, a report from another agency may state that the selected vendor did not provide
adequate service. If the selected tool did not receive the highest score, the software
engineer must review the tool characteristics thoroughly to avoid unexpected installation
difficulties. (Tool selection concludes the separate procedure for informal procurement.
The overall procedure continues with Step 4.)

STEP 3B: CONDUCT SELECTION ACTIVITIES FOR FORMAL PROCUREMENT

The following tasks should be performed when a formal tool procurement plan is in
effect.

Task 1: Develop the Acquisition Plan


This plan must include all the elements mentioned for Task 1 of Step 3a, as well as the
constraints on the procurement process and the detailed responsibilities for all
procurement documents (e.g., statement of work and technical and administrative
provisions in the request for proposal).

Task 2: Create the Technical Requirements Document


The technical requirements document is an informal description of tool requirements and
the constraints under which the tool must operate. It uses much of the material from the
acquisition plan, but should add enough detail to support a meaningful review by the tool
user.

Task 3: Review Requirements


The user must review the technical requirements for the proposed procurement. As in the
case of Step 3a, Task 4, the user may need to be prompted with pertinent questions, and
there should be close management follow-up for a timely response.

Task 4: Generate the Request for Proposal


The technical portions of the request for proposal should be generated from the technical
requirements document and any user comments on it. Technical considerations typically
include the following:
■ A specification of the tool as it should be delivered, including applicable
documents, a definition of the operating environment, and the quality assurance
provisions.
■ A statement of work for which the tool is procured. This includes any applicable
standards for the process by which the tool is generated (e.g., configuration
management of the tool) and documentation or test reports to be furnished with the
tool. Training and operational support requirements are also identified in the
statement of work.
■ Proposal evaluation criteria and format requirements. These criteria are listed in
order of importance. Subfactors for each may be identified. Any restrictions on the
proposal format (e.g., major headings, page count, or desired sample outputs) may
be included.

Task 5: Solicit Proposals


This activity should be carried out by administrative personnel. Capability lists of
potential sources are maintained by most purchasing organizations. When the software
organization knows of potential bidders, those bidders’ names should be submitted to the
procurement office. Responses should be screened for compliance with major legal
provisions of the request for proposal.

Task 6: Perform the Technical Evaluation


Each proposal received in response to the request for proposal should be evaluated in
light of the previously established criteria. Failure to meet major technical requirements
can lead to outright disqualification of a proposal. Those deemed to be in the competitive
range are assigned point scores that are then considered together with cost and schedule
factors, which are separately evaluated by administrative personnel.
The automated testing tool may need to be customized to the test automation environment. To demonstrate the capability and compatibility of the tool with the application, a proof of concept (POC) should be requested from the vendor. In the POC, one of the business scenarios should be automated using the tool, covering the relevant business verification points and actions.
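
What the POC looks like depends entirely on the candidate tool and the application. Purely as an illustration, the Python sketch below automates a single login scenario with Selenium WebDriver (one possible candidate tool) and checks two business verification points; the URL, element IDs, credentials, and expected texts are hypothetical.

# Proof-of-concept: automate one business scenario and check its verification points.
# Selenium WebDriver is used here only as an example candidate tool; the URL and
# element locators are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    # Action: open the application and log in.
    driver.get("https://example.org/login")
    driver.find_element(By.ID, "username").send_keys("poc_user")
    driver.find_element(By.ID, "password").send_keys("poc_password")
    driver.find_element(By.ID, "submit").click()

    # Verification point 1: the landing page title is as expected.
    assert "Dashboard" in driver.title, "Unexpected page title after login"

    # Verification point 2: the logged-in user's name is displayed.
    banner = driver.find_element(By.ID, "welcome-banner").text
    assert "poc_user" in banner, "Welcome banner does not show the user"

    print("POC scenario passed")
finally:
    driver.quit()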

Task 7: Select a Tool Source


On the basis of the combined cost, schedule, and technical factors, a source for the tool is
selected. If this is not the highest-rated technical proposal, managers should require
additional reviews by software management and the software engineer to determine
whether the tool is acceptable. (Source selection concludes the separate procedure for
formal procurement. The overall procedure continues with Step 4.)

STEP 4: PROCURE THE TESTING TOOL


In addition to verifying that the cost of the selected tool is within the approved budget,
the procurement process considers the adequacy of licensing and other contractual
provisions and compliance with the fine print associated with all government
procurements. The vendor must furnish the source program, meet specific test and
performance requirements, and maintain the tool. In informal procurement, a trial-use period may be considered if this has not already taken place under one of the previous steps.
If the acquisition plan indicates the need for outside training, the ability of the vendor to
supply the training and any cost advantages from the combined procurement of the tool
and the training should be investigated. If substantial savings can be realized through
simultaneously purchasing the tool and training users, procurement may be held up until
outside training requirements are defined (Step 7).

STEP 5: CREATE THE EVALUATION PLAN


The evaluation plan is based on the goals identified in Step 1 and the tool objectives
derived in Step 2. It describes how the attainment of these objectives should be evaluated
for the specific tool selected. Typical items to be covered in the plan are milestones for installation, and dates and performance levels for the initial operational capability and for subsequent enhancements. When improvements in throughput, response time, or
turnaround time are expected, the reports for obtaining these data should be identified.
Responsibility for tests, reports, and other actions must be assigned in the plan, and a
topical outline of the evaluation report should be included.
The acceptance test procedure is part of the evaluation plan, although for a major tool
procurement it may be a separate document. The procedure lists the detailed steps that are
necessary to test the tool in accordance with the procurement provisions when it is
received, to evaluate the interaction of the tool with the computer environment (e.g.,
adverse effects on throughput), and to generate an acceptance report.
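
Where parts of the acceptance test procedure can be scripted, doing so makes the test repeatable when the tool is received (Step 9). The Python sketch below uses pytest-style test functions; the tool's command name, the sample input file, and the five-second time budget are placeholders, not actual procurement provisions.

# Illustrative acceptance checks, written as a pytest module. The tool command
# name ("fmtpas"), sample input, and thresholds are hypothetical placeholders.
import shutil
import subprocess
import time

TOOL = "fmtpas"  # hypothetical command-line name of the procured tool

def test_tool_is_installed():
    # The tool must be present on the host computer.
    assert shutil.which(TOOL) is not None

def test_tool_processes_sample_input():
    # The tool must process a representative sample without error.
    result = subprocess.run([TOOL, "samples/hello.pas"], capture_output=True, text=True)
    assert result.returncode == 0, result.stderr

def test_tool_runtime_is_acceptable():
    # Guard against adverse effects on throughput: the sample run must finish
    # within an agreed time budget (five seconds is an arbitrary placeholder).
    start = time.monotonic()
    subprocess.run([TOOL, "samples/hello.pas"], capture_output=True)
    assert time.monotonic() - start < 5.0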

STEP 6: CREATE THE TOOL MANAGER’S PLAN


The tool manager’s plan describes how the tool manager is selected, the responsibilities
for the adaptation of the tool, and the training that is required. The tool manager should
be an experienced systems programmer who is familiar with the current operating
system. Training in the operation and installation of the selected tool in the form of
review of documentation, visits to the tool’s current users, or training by the vendor must
be arranged. The software engineer is responsible for the tool manager’s plan, and the
tool manager should work under the software engineer’s direction. The tool manager’s
plan must be approved by software management.

STEP 7: CREATE THE TRAINING PLAN


The training plan should first consider the training that is automatically provided with the
tool (e.g., documentation, test cases, and online diagnostics). These features may be
supplemented by standard training aids supplied by the vendor for in-house training (e.g.,
audio- or videocassettes and lectures). Because of the expense, training sessions at other
locations should be considered only when nothing else is available. The personnel to
receive formal training should also be specified in the plan, and adequacy of in-house
facilities (e.g., number of terminals and computer time) should be addressed. If training
by the tool vendor is desired, this need should be identified as early as possible to permit
training to be procured along with the tool (see Step 4). Users must be involved in the
preparation of the training plan; coordination with users is essential. The training plan
must be prepared by the software engineer and approved by software management.
Portions of the plan must be furnished to the procurement staff if outside personnel or
facilities are used.

STEP 8: RECEIVE THE TOOL


The tool is turned over by the procuring organization to the software engineer.

STEP 9: PERFORM THE ACCEPTANCE TEST


The software engineer or staff should test the tool in an as-received condition with only
those modifications made that are essential for bringing the tool up on the host computer.
Once a report on the test has been issued and approved by the software manager, the tool
is officially accepted.
STEP 10: CONDUCT ORIENTATION
When it has been determined that the tool has been received in a satisfactory condition,
software management should hold an orientation meeting for all personnel involved in
the use of the tool and tool products (e.g., reports or listings generated by the tool). The
objectives of tool use (e.g., increased throughput or improved legibility of listings) should
be directly communicated. Highlights of the evaluation plan should be presented, and any
changes in duties associated with tool introduction should be described. Personnel should
be reassured that allowances will be made for problems encountered during tool
introduction and reminded that the tool’s full benefits may not be realized for some time.

STEP 11: IMPLEMENT MODIFICATIONS


This step is carried out by the tool manager in accordance with the approved tool
manager plan. It includes modifications of the tool, the documentation, and the operating
system. In rare cases, some modification of the computer (e.g., channel assignments) may
also be necessary. Typical tool modifications involve deletion of unused options, changes
in prompts or diagnostics, and other adaptations made for efficient use in the current
environment. In addition, the modifications must be thoroughly documented.
Vendor literature for the tool should be reviewed in detail and tailored to the current
computer environment and to any tool modifications that have been made. Deleting
sections that are not applicable is just as useful as adding material that is required for the
specific programming environment. Unused options should be clearly marked or
removed from the manuals. If the tool should not be used for some resident software
(e.g., because of language incompatibility or conflicts in the operating system interface),
warning notices should be inserted in the tool manual.

STEP 12: TRAIN TOOL USERS


Training is a joint responsibility of the software engineer and the tool users and should
help promote tool use. The software engineer is responsible for the content (in
accordance with the approved training plan), and the tool user controls the length and
scheduling of sessions. The tool user should be able to terminate training steps that are
not helpful and to extend portions that are helpful but need further explication. Retraining
or training in the use of additional options may be necessary and can provide an
opportunity for users to talk about problems associated with the tool.

STEP 13: USE THE TOOL IN THE OPERATING ENVIRONMENT

The first use of the tool in the operating environment should involve the most qualified
user personnel and minimal use of options. This first use should not be on a project with
tight schedule constraints. Resulting difficulties must be resolved before expanded
service is initiated. If the first use is successful, use by additional personnel and use of
further options may commence.
User comments on training, first use of the tool, and the use of extended capabilities
should be prepared and furnished to the software engineer. Desired improvements in the
user interface, in the speed or format of response, and in the use of computer resources
are all appropriate topics. Formal comments may be solicited shortly after the initial use,
after six months, and again after one year.

STEP 14: WRITE THE EVALUATION REPORT


Using the outline generated in Step 5, the software engineer prepares the evaluation
report. User comments and toolsmith observations provide important input to this
document. Most of all, the document must discuss how the general goals and tool
objectives were met. The report may also include observations on the installation and use
of the tool, cooperation received from the vendor in installation or training, and any other
lessons learned.
Tool and host computer modifications are also described in this report. It may contain a
section of comments useful to future tool users. The report should be approved by
software management and preferably by funding management as well.

STEP 15: DETERMINE WHETHER GOALS HAVE BEEN MET

Funding management receives the evaluation report and should determine whether the
goals that were established in Step 1 have been met. This written determination should
address the following:
■ Attainment of technical objectives
■ Adherence to budget and other resource constraints
■ Timeliness of the effort
■ Cooperation from other agencies
■ Recommendations for future tool acquisitions
