This document is an internship report submitted by Kommineni Dharma Teja for the Bachelor of Technology in Computer Science & Engineering (Data Science) at Sri Venkateshwara College of Engineering and Technology. It outlines the internship conducted at Q Spiders, detailing the automation testing frameworks and methodologies used, along with acknowledgments and a structured index of the report's contents. The report emphasizes the importance of automated testing in software development to enhance efficiency and reduce the costs associated with manual testing.


SRI VENKATESHWARA COLLEGE OF ENGINEERING

AND TECHNOLOGY
(AUTONOMOUS)
R.V.S Nagar, Chittoor – 517 127. (A.P)
(Approved by AICTE, New Delhi, Affiliated to JNTUA, Anantapur)
(Accredited by NBA, New Delhi & NAAC A+, Bangalore)
(An ISO 9001:2000 Certified Institution)
2023-2024

INTERNSHIP REPORT
A report submitted in partial fulfilment of the requirements for the award of the
Degree of
BACHELOR OF TECHNOLOGY
IN
COMPUTER SCIENCE & ENGINEERING
(DATA SCIENCE)
BY
KOMMINENI DHARMA TEJA
Regd.No.21781A3256
Under the supervision of

Q SPIDERS
(Duration: 01/02/2025 to 30/04/2025)
SRI VENKATESHWARA COLLEGE OF ENGINEERING
AND TECHNOLOGY
(AUTONOMOUS)
R.V.S Nagar, Chittoor – 517 127. (A.P)
(Approved by AICTE, New Delhi, Affiliated to JNTUA, Anantapur)
(Accredited by NBA, New Delhi & NAAC A+, Bangalore)
(An ISO 9001:2000 Certified Institution)
2023-2024

CERTIFICATE
This is to certify that the “Internship Training Report” submitted by
KOMMINENI DHARMA TEJA (Regd. No.: 21781A3256) is the work done by
him and submitted during the 2024-2025 academic year, in partial fulfilment
of the requirements for the award of the Degree of BACHELOR OF
TECHNOLOGY in COMPUTER SCIENCE & ENGINEERING
(DATA SCIENCE), at Q SPIDERS.

MR. M. NAVALAN                          MR. A. LIBONCE
Internship Coordinator                  Head of Department
CSE (Data Science)                      CSE (Data Science)
ACKNOWLEDGEMENT

 A grateful thanks to Dr. R. VENKATASWAMY, Chairman of Sri
Venkateshwara College of Engineering & Technology (Autonomous),
for providing education in this esteemed institution. I wish to record
my deep sense of gratitude and profound thanks to our beloved Vice
Chairman, Sri R. V. Srinivas, for his valuable support throughout the
course.
 I express my sincere thanks to Dr. M. MOHAN BABU, our beloved
Principal, for his encouragement and suggestions during the course of
study.
 With a deep sense of gratefulness, I acknowledge Dr. A. LIBONCE,
Head of the Department, Computer Science & Engineering (Data Science),
for giving us inspiring guidance in undertaking the internship.
 I express my sincere thanks to the internship coordinator,
Mr. M. NAVALAN, for his keen interest, stimulating guidance and
constant encouragement during all stages of this work, bringing this
report to fruition.
 I wish to convey my gratitude and sincere thanks to all members for
their support and cooperation in the successful submission of this
report.
 Finally, I would like to express my sincere thanks to all teaching and
non-teaching faculty members, our parents, and friends, and all those
who have supported us in completing the internship successfully.

(NAME: KOMMINENI DHARMA TEJA)


(ROLL NO:21781A3256)
INDEX

1. ABSTRACT

2. INTRODUCTION
2.1. AUTOMATED TESTING
2.2. AUTOMATION TESTING FRAMEWORKS
2.3. PROGRAMMABLE AUTOMATION

3. TYPES OF TESTING FRAMEWORKS
3.1. KEYWORD DRIVEN AUTOMATION TESTING FRAMEWORKS
3.2. RELATED WORK FOR TESTING FRAMEWORKS
3.3. HYBRID TESTING FRAMEWORKS

4. PROJECT DISCUSSION
4.1. TECHNOLOGIES USED
4.2. SOFTWARE REQUIREMENTS

5. SCREENSHOTS
6. FRAMEWORKS
7. CONCLUSION AND FUTURE WORK
8. REFERENCES
1. ABSTRACT

Manual software testing has traditionally been used in the software industry. It
depends completely on human testers, without the help of any tool, to detect the
unexpected behavior of an application. However, the main problem with the manual
testing approach is that it is a time-consuming task, in addition to the fact that tests
cannot be reused. Automated software testing has been introduced to reduce
testing effort and detect as many faults as possible. Test cases are executed not
only to test the functional requirements for the first time, but also to check the
functions which have already been tested. This study aims to present the main
features of different automation testing frameworks. In addition, an overview of
different scripting techniques is presented in the study.

Keywords

Software Testing, Automated Software Testing, Test Data, Test Case, Test Script,
Manual Testing, Software Under Test.
2. INTRODUCTION

There are many ways through the Software Development Life Cycle (SDLC) to
control product quality, such as careful design, process management, analysis
and implementation. However, software testing is the major method to control
and monitor quality. Software testing helps software programmers fix bugs as
early as possible in the SDLC to decrease the bug-fixing cost. This opens research
on how to achieve the best possible quality in less time. The National Institute of
Standards and Technology mentioned in a report that software failures cost nearly
$60 billion every year. Testing comes exactly before the delivery of the final
version of the software. Software testing activities often consume from 30% to
40% of the total development costs. Studies illustrate that during the past ten years,
the software testing field has grown rapidly because applications are getting more
and more complex.

Software testers face a problem in manual testing when they need to run test cases
repeatedly, especially if the application versions change frequently. The same
problem occurs if the tester wants to run test cases over multiple browsers or
multiple platforms. Therefore, manual testing is a tedious job because testers
repeat the testing with every change in the SUT. Moreover, manual tests cannot be
reused. Manual testing is most suitable for non-repeatable tasks, and it is usually
used for revealing new and unexpected defects.

Automation of software testing is the process of creating a program (test script)
that simulates the manual test case steps in any programming/scripting
language, with the help of other external automation assisting tools. Since
automating tests means automating the manual process which is currently in use,
automation testing requires a clear manual testing process to be able to automate
it. Testing engineers must implement and run a program to test the SUT. In other
words, they implement toolkits to test the already implemented source code.
Automated testing is a development activity which involves automating an
already existing manual process.

Automation focuses on the execution phase. It increases the test execution speed,
as a test can be reused many times with no extra effort. For sure, the first run will
take a long time to achieve this. However, after the test scripts are ready, the human
tester can execute them automatically on the SUT. This has a very high impact on
saving the cost of the software testing phase.

2.1. AUTOMATED TESTING

A test script is a sequence of processing steps executed by the application. Each
step may have parameters, such as the value to be entered into a specific HTML
control. These steps are implemented using any high-level programming language.
Creating test scripts is a programming activity that describes the test case input,
output and expected behavior. Any test script is composed of three main
components. The first component is responsible for starting up the SUT, the
second one is responsible for exercising the main scenario steps, and the last one
is responsible for verification of the expected results.
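
The three components above can be sketched in a few lines of Python. This is a minimal illustration, not the report's code: the `login` function is a hypothetical stand-in for the SUT, used here instead of a real browser so the structure stays visible without any automation tool.

```python
def login(username, password):
    # Stand-in system under test: accepts one known credential pair.
    return "Welcome" if (username, password) == ("admin", "secret") else "Invalid login"

def test_valid_login():
    # Component 1: start up the SUT (a real script would launch the browser here).
    # Component 2: exercise the main scenario steps with the test case input.
    actual = login("admin", "secret")
    # Component 3: verify the actual result against the expected behavior.
    expected = "Welcome"
    assert actual == expected, f"expected {expected!r}, got {actual!r}"
    return "passed"

print(test_valid_login())  # → passed
```

In a real test script, component 1 would open the browser on the SUT and component 2 would drive the HTML controls; only the verification logic stays essentially the same.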

1. Automation feasibility and planning: involves discussing the scope of
testing and the practices to be applied, and deciding whether or not to
automate the project.

2. Automation design: involves selecting the specific test cases to be covered
by the automation, since not all test cases are good candidates for automation,
as well as selecting an automation tool. In addition, it involves assigning
automated testing tasks to the appropriate team members.

3. Test script development: involves the implementation of test scripts that
simulate the test case steps.

4. Test script deployment: getting the automation project ready for use.

5. Automation execution: the implemented test scripts are executed on the
SUT.

6. Test verification: the actual results extracted from the execution step are
compared against the expected results to mark every test case as either
passed or failed.

7. Automation maintenance: test scripts need to be updated frequently to
match any update in the source code.
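
Steps 5 and 6 above can be sketched as follows. This is a hedged illustration rather than a real framework: the `add` function is a hypothetical stand-in for the SUT, and the table of tuples stands in for a prepared test suite.

```python
def add(a, b):
    # Stand-in SUT: a trivial function in place of a real application.
    return a + b

test_cases = [
    # (test case id, inputs, expected result)
    ("TC01", (2, 3), 5),
    ("TC02", (-1, 1), 0),
    ("TC03", (10, 5), 15),
]

results = {}
for case_id, inputs, expected in test_cases:
    actual = add(*inputs)                                             # step 5: automation execution
    results[case_id] = "passed" if actual == expected else "failed"   # step 6: test verification

print(results)  # {'TC01': 'passed', 'TC02': 'passed', 'TC03': 'passed'}
```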

2.2. AUTOMATION TESTING FRAMEWORKS


Historically, early automated software testing frameworks adopted the
record/playback approach, then moved to the data-driven approach, and are
nowadays moving to the keyword-driven approach. These approaches can be
divided into two main automation testing approaches.

Record/Playback Automation

The first approach, the record/playback automation testing framework, is attractive
particularly for non-programmers because of its ease of use. All that is needed
is simply clicking the record button to record user actions, then clicking the
playback button to replay the auto-generated scripts. There are many
record/playback testing tools that record all user actions and data input to the
different web pages of the SUT. Actions may vary, such as clicking buttons,
selecting values, inputting data values, etc. These auto-generated scripts are used
later to run automatically without any user interaction, as shown in Fig. 1. Creating
test cases in the record/playback approach does not require any advanced testing
skills or any programming skills. Testers just need to run the web application and
record their actions. However, the auto-generated test scripts are very fragile and
sensitive to any simple change. Any minor change in the application GUI might
break the auto-generated test script. This means that the test scripts are tightly
coupled to the web pages. For example, a test script may fail to locate a hyperlink,
an input field that is changed from a dropdown list to a checkbox, or a submission
button moved because of a layout change. The solution to this problem is either to
repair the test script to match the new UI change, or to re-record the user scenario
on the new release of the application and generate the test script from scratch.

Fig. 1. Record/playback Automation Approach

2.3. PROGRAMMABLE AUTOMATION


The second approach is the programmable automation testing framework, which
aims to automate applications by using all the features, guidelines and best
practices of traditional development. In this type, the testing engineer can use
conditional execution to select one path from multiple paths, loops to execute a
specific portion of code many times, exception handling and logging, reuse of
common methods, referenced elements, and parameterized methods. It is built on
the concept of encapsulation.
Nguyen and Robbins named programmable automation testing script-based
automation testing, which asks testing engineers to implement test scripts
to control the GUI.

The programmable approach requires elevated programming skills because it is
a normal development project, so it requires a high initial effort in script
development. The test scripts implemented using this approach are more flexible
than the scripts generated by record/playback tools. The approach is based on the
manual implementation of test scripts, as shown in Fig. 2.

Test scripts can be implemented using any general-purpose programming
language (such as C++, Java, and Ruby). They also use specific UI libraries that
can catch the browser instance and provide commands that deal with HTML UI
objects.

Fig. 2. Test Script Life Cycle
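
The programmable features listed above (loops, conditionals, exception handling, reusable parameterized methods) can be sketched in Python. The `Page` class here is a hypothetical in-memory stand-in for a UI library such as Selenium, so the example runs without a browser; it illustrates the style only, not any specific tool's API.

```python
class Page:
    """Hypothetical stand-in for a browser page driven by a UI library."""
    def __init__(self):
        self.fields = {}
    def type_into(self, field, value):
        self.fields[field] = value
    def read(self, field):
        if field not in self.fields:
            raise KeyError(f"no element named {field!r}")
        return self.fields[field]

def fill_form(page, data):
    # Reusable, parameterized method: many test cases can share it.
    for field, value in data.items():          # loop over the test data
        page.type_into(field, value)

page = Page()
fill_form(page, {"username": "admin", "password": "secret"})

try:
    page.read("email")                         # exception handling for a missing element
except KeyError:
    email_present = False
else:
    email_present = True

# Conditional execution selecting one path from multiple paths.
status = "filled" if page.read("username") == "admin" else "empty"
print(status, email_present)  # → filled False
```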


3. TYPES OF TESTING FRAMEWORKS

3.1. KEYWORD DRIVEN AUTOMATION TESTING FRAMEWORKS

Keyword-driven frameworks are based on the concept of separating not only the
test data but also the keywords. Keywords are translated into actions using an
automation driver. The keyword-driven framework is an extension of the
data-driven framework in which user actions are separated out as keywords in
addition to the test data. Every keyword is related to a specific functionality, and
the sequence of keywords is automatically run using a driver program. The suite
of automated test cases will later run without any human intervention. The
framework works at a higher level of abstraction, as it implements reusable
functionalities in the form of keywords that represent test case steps. The test
script engine is responsible for calling the corresponding method for the
appropriate keyword. Fig. 3 illustrates the high-level architecture diagram for a
keyword-driven framework.

A keyword-driven automation framework consists of three main components:

o External data files: consist of keywords and test data.
  - Keywords: the keyword sequence represents the test case flow. Based on
    these keywords, specific functions will be called.
  - Test data: includes the test case inputs and outputs. Input values can
    either be stored with the keyword repository or separated into an external
    data file.

Fig. 3. General Keyword Driven Framework

o Test function libraries: these functions open and read the external data
source line by line and then map each keyword to its corresponding function.
The library is also responsible for mapping each test step to the automation
source code (e.g. Selenium, WatiN, QTP) that integrates with the framework.

o Driver test scripts: responsible for initiating the function library to start
execution.

The main benefit of keyword-driven frameworks is that they reduce the overall
cost of test script maintenance because of the high level of abstraction. Moreover,
the tests are easier for inexperienced testers/users to understand. Using keyword-
driven automation frameworks, the tester can create new tests without having
programming knowledge.
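
The three components above can be sketched together in a few lines. This is an illustrative minimum, with assumed names throughout: the keyword table stands in for the external data file, the three functions for the function library, and the final loop for the driver script; the SUT is a hypothetical in-memory form rather than a web page.

```python
form = {}  # stand-in SUT: an in-memory web form

# Function library: one function per keyword.
def open_page(target, value):
    form.clear()

def type_text(target, value):
    form[target] = value

def verify_text(target, value):
    assert form.get(target) == value, f"{target}: expected {value!r}"

keyword_map = {"OpenPage": open_page, "TypeText": type_text, "VerifyText": verify_text}

# External data file, represented here as rows of (keyword, target, test data).
test_steps = [
    ("OpenPage",   "login",    ""),
    ("TypeText",   "username", "admin"),
    ("TypeText",   "password", "secret"),
    ("VerifyText", "username", "admin"),
]

# Driver script: reads each row and calls the function mapped to its keyword.
for keyword, target, value in test_steps:
    keyword_map[keyword](target, value)

print("all steps executed")
```

Adding a new test case here means only adding new rows of keywords and data, which is exactly the maintenance benefit the section describes.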

3.2. RELATED WORK FOR TESTING FRAMEWORKS


Leotta et al. performed an experimental analysis to calculate the cost/benefit
tradeoff of the record/playback scripting approach (using Selenium IDE) and the
programmable scripting approach (using Selenium WebDriver). They calculated
not only the cost of creating the test scripts from scratch, but also the cost of
maintaining the test scripts after publishing a new release of the application. They
assessed the two approaches over both the short term and the long term. The
results of experiments on testing six different web applications indicated that:

o Development of test scripts using the programmable scripting technique is
more expensive than using the record/playback scripting technique, requiring
from 32% to 112% additional time. However, test script maintenance in the
programmable approach costs less than in the record/playback approach,
saving from 16% to 51% of the required time. They noticed that after about
one to three releases of the same application, the cost of developing test
scripts in the programmable approach becomes less than the cumulative cost
of maintaining record/playback scripts. The saved cost increases gradually
after each release. This means that for any web application which is expected
to have three or more releases over its lifetime, programmable test scripts will
have a higher return on investment than record/playback test scripts.

o The more page objects are reused across test cases, the lower the
maintenance cost needed to update the test scripts, because shared page
objects are maintained only once. This depends on the modularity of the web
application under test.

Bhondokar et al. propose the usage of a hybrid testing framework which combines
both data-driven and keyword-driven concepts. This type of framework can be
widely used for automation testing in any type of web application, as shown in
Fig. 4.
Fig. 4. Hybrid Testing Framework

They propose another solution to reduce the cost of automation testing by
transforming manually written English test cases into a well-organized keyword-
driven form sheet. These auto-generated keywords are used later by the
automation framework to test the SUT. The main issue in this solution is that the
test steps output from the English-written test cases are not guaranteed to be
100% correct, because the user can express a test case in many different forms.
This requires human intervention to revise the auto-generated steps before running
them. Revision of hundreds of steps is a difficult and time-consuming task.
Therefore, recent research focuses on detecting user intent from natural language.

Lau proposes the CoTester system, suggesting a new language called ClearScript.
The tester should provide the CoTester system with segmented test steps so that
the system can handle them. Little and Miller proposed another solution to
transform the tester’s keywords to the user interface of a specific system.

Fei Wang proposes an automated framework for testing web applications based
on Selenium and JMeter. It is used for performance testing by simulating a heavy
load on a server. This framework has four main components. The first component
is the model, which is responsible for converting each test case into object models
such as elements, actions and assertions. The second component is the translator,
which is responsible for converting each test case into a set of actions; these
actions are then converted into the corresponding test script. The third component
is the ActionWorker, which is responsible for calling the testing tool to execute the
actions. The last component is the comparator, which is responsible for comparing
the actual test results against the expected results.

Anuja proposes a keyword-driven framework called WAT (Web-Based
Automation Testing), developed in Java. It depends on generating
GUIWebObjects for the web page to be tested in order to perform GUI actions on
these HTML web objects. Then, functional testing is performed using these GUI
action events. The framework architecture diagram is shown in Fig. 5.

The WAT framework consists of the following components:

o Web objects: a test case consists of test steps; each test step works on a
different HTML web object, such as a button, dropdown list, radio button,
checkbox or tab, to perform the required test case step.


Fig. 5. Keyword Driven Framework Architecture

o JSoupParser and WebOperation: each HTML web object has different
attributes, such as id, name and class. The JSoup parser is responsible for
creating an XPath for each HTML control. This XPath is used later to locate
the web element in the web page of the SUT so that the automation framework
can use the HTML control to achieve the business scenario, such as Click,
SelectValue, Type, etc.

o Configuration file: this file specifies which web browser each test script
will run on. The file is editable so that the tester can update it to run a test
script on a different browser.

o Client: it sends the commands of the test script to the Selenium engine to
run them on the web browser.

Two further works propose frameworks based on integrating the keyword-driven
scripting technique with the Selenium automation tool. Both authors propose
almost the same main features and components. Keyword-driven testing simulates
user actions on the SUT. It is used by testers to execute test cases and then extract
the final test results. Using this framework, testers do not need any programming
skills. The main idea is the use of keywords which are related to functions. These
functions are parameterized, so the tester can update keyword parameters and
create new test cases using a keyword lookup. The framework integrates with
Selenium. Singla lists the main common components of a keyword-driven
automation testing framework as follows:

o Functionality class library: each functionality in the SUT has a
corresponding method.

o Test data sheet: this sheet holds the test data needed to execute the test
case. The main columns of this sheet are: test case id, object type, object
identifier, keyword, and data.

o Selenium WebDriver: to start test case execution, the code must have a
web driver to initialize an instance of the web browser.

o Result reports: the report holds the final testing result for the executed test
cases.

o Driver script: this script is responsible for reading the keywords and
mapping them to their corresponding methods to be executed on the SUT.
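
The test data sheet layout listed above can be illustrated by parsing a small CSV fragment the way a driver script might. The column names and row contents here are illustrative assumptions, not taken from Singla's actual sheet.

```python
import csv
import io

# A hypothetical test data sheet with the columns named in the list above.
sheet = """test_case_id,object_type,object_identifier,keyword,data
TC01,textbox,username,TypeText,admin
TC01,textbox,password,TypeText,secret
TC01,button,login_btn,Click,
"""

# A driver script would read the sheet row by row and extract the
# (keyword, object identifier, data) triple that drives each step.
rows = list(csv.DictReader(io.StringIO(sheet)))
steps = [(r["keyword"], r["object_identifier"], r["data"]) for r in rows]
print(steps)
```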

Ashutosh proposes the development of a software automation testing framework
for avionics systems only. However, its main problem is that this model cannot be
used to test other kinds of applications. Yalla and Shanbhag [46] state that the
best way to reduce testing effort is to mix a reusable testing framework with an
open-source automation tool.

Stresnjak demonstrates the usage of the Robot Framework in automation testing.
It is a keyword-driven framework which manipulates test cases stored in an
external source. The keywords used should be implemented in test libraries to be
executed on the SUT. Pajunen [34] describes the Robot Framework as a generic
keyword-driven software test automation framework. Test cases are composed of
higher-level user keywords, which are composed of lower-level keywords, which
in turn are translated into test scripts. First, lower-level keywords are packed
together in framework libraries. Then, the tester can create new user keywords by
using different combinations of the lower-level keywords. After that, the test cases
are executed on the SUT.

Madhavan proposes a semi-automated keyword-driven automation framework
called “Autotestbot” to be used in the acceptance testing phase. However, it is
tightly coupled to the Selenium automation tool and the Firefox web browser only.
Fig. 6 shows the framework architecture with its main components:

o Test case repository: the framework needs a repository of acceptance
test cases from similar application types as input. These test cases are then
manipulated using the framework’s NLP engine. This repository is the
knowledge base for the proposed framework during execution.
Fig. 6. Semi-automated Keyword Driven Automation
4. PROJECT DISCUSSION

Today, the online examination system has become a fast-growing
examination method because of its speed and accuracy. It also needs
less manpower to conduct an examination. Almost all organizations
nowadays conduct their objective exams through an online examination
system, which saves students' time in examinations. Organizations can also
easily check the performance that students give in an examination. As a
result, organizations are releasing results in less time. It also helps the
environment by saving paper.
Given today's requirements, it is very useful to learn to build an online
examination project in PHP.

What is an Online Examination System?

In an online examination system, examinees get their user id and password
with their admit card. This id is already saved on the examination server.
When examinees log in to the server, they see their already registered
profile. At a certain time, examinees get the message to start the
examination. All answers given by an examinee are saved on the server
along with his/her profile information. The online examination system also
allows an examinee to correct an answer if he/she needs to change it within
the examination time duration; however, after the time duration no change
is allowed. This also makes checking the answers easy and error-proof,
as computers are more accurate than humans and provide fast results too.
PHP is a web-based language, so we can use it to create an online
examination system.
The administrator of the online examination system has multiple features,
such as adding, deleting and updating topics and questions.
To log in as admin, enter "www.applicationname/admin" in your browser.
The user will automatically get the updated version by logging in
using the user ID and password provided at the time of
registration.
There is no need for reprinting, appearance or vigilance, and the job is done.

Online examination system features

1. A login system must be present and secured by a password.
2. Ability to save the answer given by the candidate along with the question.
3. An answer checking system should be available.
4. Ability to update the profile.
5. Log out after the examination is over.
6. Admin panel.
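
Features 2 and 3 above (saving answers with their questions, then checking them) can be sketched with plain Python data structures as a stand-in for the project's PHP/MySQL backend; the question IDs and answer key here are illustrative.

```python
answer_key = {"Q1": "B", "Q2": "D", "Q3": "A"}  # assumed correct answers
submitted = {}                                  # answers saved along with the question

def save_answer(question, choice):
    # Overwriting is allowed, matching the rule that candidates may
    # correct an answer within the examination time duration.
    submitted[question] = choice

def score():
    # Answer checking: count submissions that match the answer key.
    return sum(1 for q, a in submitted.items() if answer_key.get(q) == a)

save_answer("Q1", "B")
save_answer("Q2", "C")
save_answer("Q2", "D")   # corrected before the time is up
save_answer("Q3", "A")
print(score())  # → 3
```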

Project objective:

The online examination system is an irreplaceable examination pattern of
today's life. We need a more time-saving and more accurate examination
system as the number of applicants increases day by day. For all IT
students and professionals, it is very important to have a basic
understanding of the online examination system.
4.1. TECHNOLOGIES USED

1. Tools to be used
 Database design: MySQL
 Website design: Bootstrap with custom designing using CSS3, WordPress
 Coding (logic): PHP and JavaScript
 Server: XAMPP
 Platform: Windows
 Application: Notepad++

2. Requirements and setting up the system for PHP development
o What we need to know:
 The designing part of the website is done with the help of
Bootstrap 4.0 and CSS3, and for the database design we use MySQL.
o What we need to have (system requirements):
 To run the website we need a browser, and to code we need an
application like Notepad++, Atom, etc.

4.2. SOFTWARE REQUIREMENTS

1. Software Requirements
Initially, we need a development machine running any of the
following operating systems:
o Windows XP, Vista, Windows 7 or 8
o Development environment (Notepad++)
o XAMPP server

2. Software Requirement Analysis: The software requirement
specification is produced at the culmination of the analysis task. The
function and performance allocated to software as part of system
engineering are refined by establishing a complete information
description, a detailed functional description, a representation of
system behavior, an indication of performance requirements and design
constraints, appropriate validation criteria, and other information
pertinent to the requirements.

The introduction to the software requirements specification states the
goals and objectives of the software, describing it in the context of the
computer-based system. The information description provides a
detailed description of the problem that the software must solve.
Information content, flow and structure are documented.
5. SCREENSHOTS

1) Home Page

2) Registration Page

3) Log In Page

4) Dashboard for creating a test with the test name and a description of the test

5) Creating Multiple Choice Questions

6) Giving the Online Test as a registered user

7) Result Page after giving the online test

8) Variety of tests to choose from


6. FRAMEWORKS

o Keywords repository: a mapping is created between the actions in the test
case steps and their appropriate Selenium WebDriver keywords. Thus, each
action is mapped to a Selenium keyword in a dictionary.

o Preprocessing module: this module has the responsibility of reading the
input test cases and uses an existing toolkit for text preprocessing operations.

o POS tagger module: this module is responsible for reading the processed
test cases and assigning part-of-speech tags to these tokenized test cases
using the test case repository.

o Keyword mapper module: this module is responsible for selecting the
corresponding Selenium action that matches the test case step.

o Code generator module: this module is responsible for generating the
test script in Python. A test case is mapped to a Python test method that works
upon Selenium WebDriver.
7. CONCLUSION AND FUTURE WORK

Maintenance of an application is always required to resolve defects, add new
features, and enhance existing features. Therefore, regression testing is an
important software testing phase that is executed after every application change
[50] [51], especially for large-scale applications that are frequently updated.
However, regression testing consumes large amounts of time as well as effort
because it requires re-running test cases which were already executed [52].
Chittimalli et al. [53] mentioned that regression testing consumes about 80% of
the total estimated software testing budget. For these reasons, it is better to
automate test cases that will be reused in later software testing phases [54].

This study presents an overview of the main software automated testing
frameworks. From the review, we emphasize that using the programmable
automation testing approach is preferable. However, due to its high cost, new
techniques should be developed to overcome this problem.
8. REFERENCES

1. Q. A. Malik, "Combining Model-Based Testing and Stepwise Formal
Development," PhD Thesis, Department of Information Technologies,
Abo Akademi University, 2010.

2. Banerjee, B. Nguyen, V. Garousi and A. Memon, "Graphical User
Interface (GUI) Testing: Systematic Mapping and Repository," Information
and Software Technology, vol. 55, no. 10, pp. 1679–1694, 2013.

3. P. Yadav and A. Kumar, "An Automation Testing Tool Using Selenium,"
International Journal of Emerging Trends & Technology in Computer
Science (IJETTCS), vol. 4, no. 5, pp. 68–71, 2015.

4. G. Tassey, "The Economic Impacts of Inadequate Infrastructure for
Software Testing," National Institute of Standards and Technology,
Acquisition and Assistance Division, 2002.

5. Jain and S. Sharma, "An Efficient Keyword Driven Test Automation
Framework for Web Applications," International Journal of Engineering
Science & Advanced Technology, vol. 2, no. 3, pp. 600–604, 2012.

6. K. M. Mustafa, R. E. Al-Qutaish and M. I. Muhairat, "Classification of
Software Testing Tools Based on the Software Testing Methods," Second
International Conference on Computer and Electrical Engineering, vol. 2,
2009.

7. O. A. Lemos, F. C. Ferrari, M. M. Eler, C. J. Maldonado and P. C. Masiero,
"Evaluation Studies of Software Testing Research in Brazil and in the World:
A Survey of Two Premier Software Engineering Conferences," Journal of
Systems and Software, vol. 86, no. 4, pp. 951–969, 2013.

8. Santiago, W. P. Silva and N. L. Vijaykumar, "Shortening Test Case
Execution Time for Embedded Software," Second International
Conference on Secure System Integration and Reliability Improvement,
2008.

9. R. K. Chauhan and I. Sing, "Latest Research and Development on Software
Testing Techniques and Tools," International Journal of Current
Engineering and Technology, vol. 4, no. 4, 2014.

10. Singh and B. Tarika, "Comparative Analysis of Open Source Automated
Software Testing Tools: Selenium, Sikuli and Watir," International Journal of
Information and Computation Technology, vol. 4, pp. 1507–1518, 2015.

11. Divya and S. D. Mahalakshmi, "An Efficient Framework for Unified
Automation Testing: A Case Study on Software Industry," International
Journal of Advanced Research in Computer Science & Technology, vol. 2,
2014.

12. T. Kanstrén, "A Review of Domain-Specific Modelling and Software
Testing," The Eighth International Multi-Conference on Computing in
the Global Information Technology, 2013.

13. Pillai, "Designing Keyword Driven Framework Mapped at Operation
Level," 2017. [Online]. Available:
http://www.automationrepository.com/2012/08/keyworddriven-framework-mapped-at-operation-level-part-1/.

14. S. Thummalapenta, S. Sinha, N. Singhania and S. Chandra, "Automating
Test Automation," 34th International Conference on Software Engineering
(ICSE), 2012.

15. V. N. Maurya and R. Kumar, "Analytical Study on Manual vs. Automated
Testing Using a Simplistic Cost Model," International Journal of
Electronics and Electrical Engineering, vol. 2, no. 1, 2012.