
CHAPTER 3

COMPARATIVE ANALYSIS OF TEST DATA GENERATION TECHNIQUES

In this chapter, automated test data generation techniques are studied against several criteria: the objective function employed, the number of experiments conducted with each technique, the results of those experiments, comparisons with other techniques, the types of parameters employed, and the overall performance of the algorithm.

3.1 Analysis of Test Data Generation Techniques

Table 3.1 Comparative study of different test data generation techniques

[107] Technique: ABC-based approach combining scout, employed, and onlooker bees. Objective function: independent path coverage. Experiments: triangle classification. Compared with: ABC, GA, and ACO. Findings: fewer tests required, low time complexity for test data generation, faster and more efficient. Performance measures: path coverage, path sequence criteria comparison.

[108] Technique: static symbolic execution. Objective function: branch distance. Experiments: 10 real-world problems. Compared with: average test cases generated per path (ATCPP) and average percentage coverage (APC) metrics. Findings: does not perform well for high values of ATCPP. Performance measures: ATCPP and APC metrics.

[120] Technique: comparative study. Objective function: none. Experiments: 9 real-world programs. Compared with: GA and ACO. Findings: yields better results for large and complex problems. Performance measures: number of paths covered, number of iterations, number of test cases, time taken per generation.

[109] Technique: regression augmentation testing. Objective function: branch coverage. Experiments: 8 test suites. Findings: yields 100% coverage. Performance measures: path coverage.

[121] Technique: ABC-based approach with a heuristic in each test case. Objective function: path coverage. Experiments: 6. Compared with: ACO. Findings: generates optimal results and converges with a smaller number of test runs. Performance measures: time complexity, path coverage.

[99] Technique: Immune Genetic Algorithm with improved PSO. Objective function: branch coverage and PSO fitness. Experiments: triangle classification. Compared with: IGA and PSO. Findings: outperforms IGA and PSO in convergence speed, efficiency, and performance. Performance measures: average iteration time, convergence rate.

[100] Technique: hybrid GA and PSO (GPSCA). Objective function: multi-objective. Experiments: 7. Compared with: GA and PSO. Findings: high coverage ratio, fewer generations. Performance measures: coverage ratio, number of test cases, number of generations.

[101] Technique: PSO with a new objective function. Objective function: new all-path objective function. Experiments: triangle classification and binary search. Compared with: traditional single-path data generation. Findings: time cost is half that of the single-path approach. Performance measures: cost and time of test data generation.

[102] Technique: PSO-TVAC (time-varying acceleration coefficients). Objective function: code coverage. Experiments: 5 benchmark programs. Compared with: PSO variants. Findings: better performance, greater code coverage capability, control over local and global optima. Performance measures: code coverage and test case generation.

[103] Technique: PSO-based test data generation (TDGen_PSO). Objective function: branch coverage. Experiments: 5 real-world programs. Compared with: TDGen_GA, CL- and CA-PSO. Findings: outperforms TDGen_GA and CA-PSO in terms of coverage and generations. Performance measures: average coverage, success rate, average generations, average time.

[104] Technique: modified Genetic Algorithm. Objective function: branch fitness. Experiments: triangulation network. Compared with: traditional GA. Findings: avoids premature convergence, fast convergence, high test data generation efficiency. Performance measures: total coverage, time, coverage rate.

[105] Technique: GA-based test data generator. Objective function: multi-path fitness. Experiments: 7. Compared with: Lin's and Pei's GA-based work. Findings: synthesizes multiple test data; more effective and efficient than similar tools. Performance measures: average generation, average coverage.

[122] Technique: GA. Objective function: branch. Experiments: 11. Compared with: random testing. Findings: finds more error-prone paths, reduces development cost, improves efficiency. Performance measures: path identification, cost, and efficiency.

[106] Technique: PSO used inside GA. Objective function: individual as a test case. Experiments: triangle classification. Compared with: GA and ACO. Findings: maintains colony polymorphism, avoids premature convergence, improves convergence speed. Performance measures: test data generation, test data convergence, colony maintenance.

[37] Technique: annealing mechanism incorporated into GA. Objective function: similarity-based fitness function (Hamming distance). Experiments: triangle classification. Compared with: GA and random testing. Findings: preserves the best probability; more effective and efficient than other techniques. Performance measures: selection and elitist crossover, mutation, simulated annealing, convergence.

[39] Technique: three-level evolutionary approaches (GA, SA, and PSO). Objective function: approximation and branch distance (evaluating the distance to the actual path). Experiments: 10. Compared with: GA, SA, and PSO. Findings: SA generated quality data. Performance measures: convergence.

[123] Technique: ACO with redefined local transfer, global transfer, and pheromone update. Objective function: customized branch fitness function. Experiments: 8 benchmark programs. Compared with: GA, SA, and PSO. Findings: outperforms the Genetic Algorithm and Simulated Annealing; comparable to PSO. Performance measures: average coverage, success rate, average convergence, average time.

[110] Technique: single ACO Markov model. Objective function: NA. Experiments: telephone experiment. Compared with: data-flow testing and ACO Markov chain. Findings: generates quality test data. Performance measures: pheromone factor, cost, and user parameters.

[124] Technique: ACO with redefined local transfer, global transfer, and pheromone update. Objective function: branch fitness function. Experiments: 5 benchmark programs. Compared with: GA, SA, and ACO. Findings: outperforms the compared techniques. Performance measures: average coverage, success rate, average convergence, average time.

[95] Technique: improved local pheromone strategy, improved pheromone volatilization coefficient, and global path pheromone. Objective function: statement coverage, branch coverage, and modified condition/decision coverage. Experiments: triangle classification and collision detection. Compared with: random algorithm and Genetic Algorithm. Findings: improved coverage and generation. Performance measures: average coverage and average generation.

[97] Technique: state-transition-based testing and its transition coverage. Objective function: NIL. Experiments: enrolment state machine and system-level state machine. Compared with: GA and STT (software transition testing). Findings: better coverage than GA. Performance measures: complete coverage, generation of optimal test sequences, enhancement of the tool.

[29] Technique: Ant Colony Optimization. Objective function: statement, branch, and modified decision coverage. Experiments: classic triangle classification and collision avoidance system. Compared with: RND, GA, SACO, and ACO. Findings: generates optimal and minimal test sequences for complete coverage. Performance measures: average coverage, average minimal generation.

[125] Technique: NSA. Objective function: Hamming distance. Experiments: benchmark program (triangle classifier). Compared with: random testing and GA. Findings: NSA is efficient in execution time and effective in the generation of test data. Performance measures: test data generation, execution time.

[56] Technique: application of NSA. Objective function: Hamming distance. Experiments: 11 real-world benchmark programs. Compared with: random testing, GA, and ACO. Findings: outperforms other methods in reducing the number of test data that cover all program paths. Performance measures: path coverage, effectiveness and efficiency.

[53] Technique: Negative Selection Algorithm. Objective function: Hamming distance. Experiments: benchmark program (triangle classifier). Compared with: random testing. Findings: outperforms random testing for path coverage. Performance measures: automated test case generation, effectiveness and efficiency.

[46] Technique: Clonal Selection Algorithm (immune algorithm). Objective function: AL and NBD (approximation level with normalized branch distance). Experiments: triangle classifier problem, one hundred experiments. Compared with: Sthamer. Findings: poor generation (runs) and coverage. Performance measures: mean number of generations, mean percentage coverage.

[47] Technique: GA and CSA. Objective function: Korel distance function for branch predicates. Experiments: 9 benchmark programs. Compared with: random, GA, and CSA. Findings: elite test data generation technique; generates optimal test data. Performance measures: performance of test data generation.

[57] Technique: hybrid NSA and GA. Objective function: Hamming distance. Experiments: 11 benchmark programs. Compared with: random, NSA, and NSA-GA. Findings: high path coverage with a minimum number of generations. Performance measures: average coverage, average test data, average generation.
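Branch distance is among the most frequently used objective functions in the surveyed work. The idea can be sketched as follows; this is a generic, Korel-style illustration for the triangle-classification benchmark, not the exact formulation of any one cited paper:

```python
# Branch distance guides search-based test data generation: it tells the
# optimizer how "close" an input is to taking a not-yet-covered branch.

def eq_distance(a, b):
    """Distance for the predicate a == b: 0 when true, else |a - b|."""
    return abs(a - b)

def equilateral_distance(a, b, c):
    """Distance to covering the 'equilateral' branch (a == b and b == c).
    A conjunction is commonly scored as the sum of operand distances."""
    return eq_distance(a, b) + eq_distance(b, c)

# A meta-heuristic (GA, PSO, ABC, ...) minimizes this value; a distance
# of 0 means the input (a, b, c) covers the target branch.
print(equilateral_distance(3, 3, 3))  # 0: branch covered
print(equilateral_distance(3, 4, 5))  # 2: search must move closer
```

Any of the population-based algorithms in Table 3.1 can plug such a distance in as its fitness; they differ mainly in how candidate inputs are sampled and evolved.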

3.2 Year-wise Trends in Test Data Generation

[Bar chart: number of publications (0–70) per year of publication, 2006–2016.]

Figure 3.1 Publication frequency of test data generation articles

3.3 Type of Experiments Deployed in Test Data Generation

[Bar chart: number of experiments per experiment type (values: 11, 11, 1, 1, 1, 1).]

Figure 3.2 Types of experiments used in test data generation

3.4 Type of Parameters used in Test Data Generation

[Bar chart: number of experiments per parameter type (values: 13, 12, 5, 3, 2, 2, and twelve parameters used once each).]

Figure 3.3 Types of parameters used in test data generation

3.5 Comparison Conducted by Different Approaches

[Bar chart: number of comparisons per approach (values: 16, 6, 6, 5, 3, and seven approaches with one comparison each).]

Figure 3.4 Number of comparisons conducted by various approaches

3.6 Summary

In this chapter, a comparative analysis of meta-heuristic and artificial immune algorithms has been performed on the generation of software test data, primarily for structural testing, because structural testing is generally regarded as the preferred method for detecting errors and bugs in software code. Nevertheless, the question of how to generate test data with significantly higher code coverage capability remains open. Particle Swarm Optimization, Artificial Bee Colony, the Genetic Algorithm, Ant Colony Optimization, and the Firefly algorithm were chosen for comparative analysis owing to their wide use in automated software test data generation and in many engineering applications. The Negative Selection Algorithm (NSA) and the Clonal Selection Algorithm were also chosen, from the artificial immune system class, due to their advantages over meta-heuristic algorithms. The majority of work in automated software test data generation has been evaluated on measures such as average coverage, success rate, average generation, and average time. The objective function is also essential for validating the test data. Different objective functions, such as statement coverage, single-path coverage, branch coverage, and multi-path coverage, have been examined in this study, with branch coverage and path coverage being the most commonly selected objective functions; these also significantly improve the quality of the test data. Instead of an objective function, the artificial immune algorithm (NSA) uses Hamming distance to evaluate test data. Several benchmark programs, including triangle classification, even-odd, greatest number, leap year, quadratic equation, telephone system, call day, binary search, linear search, and bubble sort, have been used extensively in experiments to confirm the efficiency and efficacy of an approach for producing test data. The adapted ACO methodology outperforms ACO, IACO, ABC, and GA in terms of coverage capability, convergence speed, and consistency, and is significant compared to the PSO-based method in a few experimental setups. Furthermore, a survey of experimental results from numerous research papers revealed that the NSA is more efficient and cost-effective than meta-heuristic algorithms: it has superior search capability with fewer generations, and the results indicate that the NSA can reduce the quantity of test data required. The hybrid approach based on NSA and GA also shows a significant improvement in coverage ratio and a reduction in the number of generations. This study thus points to a viable strategy for generating test data.
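The Hamming-distance evaluation used by the NSA-based approaches can be sketched as follows. This is a simplified illustration with an assumed bit-string encoding of test data, not the exact published procedure of any surveyed paper:

```python
def hamming(x, y):
    """Hamming distance between two equal-length bit strings:
    the number of positions at which they differ."""
    return sum(b1 != b2 for b1, b2 in zip(x, y))

def is_novel(candidate, detectors, threshold):
    """Negative-selection-style check: a candidate test datum is kept
    only if it lies at least `threshold` bits away from every stored
    pattern of already-covered behavior."""
    return all(hamming(candidate, d) >= threshold for d in detectors)

covered = ["1100", "1010"]            # patterns already covered (assumed encoding)
print(is_novel("0011", covered, 2))   # True: far from both stored patterns
print(is_novel("1100", covered, 1))   # False: matches a stored pattern exactly
```

Rejecting candidates that match already-covered patterns is what lets the NSA reach uncovered paths with fewer generations than fitness-driven search, which is consistent with the comparative results summarized above.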

