QTP Framework
With the functional decomposition approach, test scripts are decomposed into their primary tasks, such as
login, navigation, data processing, logout, reporting, etc. These scripts can then be combined in a hierarchical fashion to
build larger tests. The functional components are reusable.
The above example will need a lot of different input data (e.g. username and password for login) and verification data files
(e.g. the count of emails in Trash) for the various screens.
On an error, a "FALSE" condition is returned to the calling script. The calling script in turn returns FALSE to its own
caller, until control is returned to the driver script. The driver can then continue with the next Test Case or exit, depending
on how you have handled it.
Advantages:
It provides a division between data and scripts, which in turn improves maintainability and reduces redundancy and
repetition in creating automated test scripts.
It delivers script reusability: individual scripts, each implementing one business function, can be combined into higher-level
scripts to create large, composite tests.
It provides a single maintenance point for each functionality or screen. If a functionality changes, we only need to
update the corresponding "Business Function" script.
Limitations:
The team must have a programming background or expertise in a scripting language.
Multiple data files may be required for each Test Case, so a change to a Test Case must be propagated to
several sets of input/verification files.
Or
Keyword-driven testing is an application-independent framework that utilizes data tables and self-explanatory keywords.
It splits the test procedure into logical components, which can then be reused to assemble test cases.
Below I have given 3 examples of a Keyword Driven Automation Framework. They may not be complete in themselves, but they are
a sort of starter, intended to give you enough of a feel to begin well on your own.
Example 1
Framework Pseudo-Code
Textbox.VerifyValue Function:
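The pseudo-code for the Textbox.VerifyValue function is not reproduced here. As a hedged sketch of what such a component function typically does (Python stand-in; the property name, logging, and stubbed textbox are all assumptions):

```python
# Hypothetical sketch of a Textbox.VerifyValue-style keyword function:
# compare the textbox's actual value against the expected value from the
# test table, log the outcome, and return True/False to the caller.

def textbox_verify_value(get_property, expected, log=print):
    actual = get_property("text")   # in QTP, roughly Object.GetROProperty("text")
    if actual == expected:
        log("PASS: textbox value is '%s'" % actual)
        return True
    log("FAIL: expected '%s' but found '%s'" % (expected, actual))
    return False

# Usage with a stubbed textbox whose "text" property is "John":
fake_textbox = {"text": "John"}.get
print(textbox_verify_value(fake_textbox, "John"))
```

The function reports its own pass/fail result, so the keyword driver only has to check the returned boolean.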
Example 2
Code:
Logout
Here, the keywords are Launch, Verify, Login, and Logout, but they could be any actions
relevant to your testing context.
Example 3
Test 1
VerifyScreen WelcomeScreen
Logoff
Test 2
Test 3
Above, the commands in the first column are called keywords or action words.
The test driver reads the test file, looks up the library function associated with the keyword (Login), and then executes
it, using the remaining data on the row (John, AAA) as arguments to the function.
Advantages:
A spreadsheet can be used to write a detailed test plan, including all input and verification data.
"Generic" utility scripts created for testing an application can be reused to test another application.
Higher levels of maintainability and efficiency can be attained by applying the same principles of functional decomposition
(modularity) within the keyword-driven framework, together with the separation of scripts and data (data driving).
Another advantage of the keyword driven approach is that testers can develop tests without a functioning application as
long as preliminary requirements or designs can be determined.
Disadvantages:
Over the past several years there have been numerous articles done on various approaches to test automation.
Anyone who has read a fair, unbiased sampling of these knows that we cannot and must not expect pure capture
and replay of test scripts to be successful for the life of a product. We will find nothing but frustration there.
Sometimes this reality is hard to explain to people who have not yet performed significant test automation
with these capture\replay tools. But it usually takes less than a week, often less than a day, to hear the most
repeated phrase: "It worked when I recorded it, but now it fails when I play it back!"
Data driven scripts are those application-specific scripts captured or manually coded in the automation tool's
proprietary language and then modified to accommodate variable data. Variables will be used for key
application input fields and program selections allowing the script to drive the application with external data
supplied by the calling routine or the shell that invoked the test script.
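As a hedged illustration of that pattern (Python, with CSV as the external data source; the field names and the stubbed login function are assumptions), a data driven script might look like:

```python
# Data driven script sketch: the coded steps stay fixed, while the input
# values for the key application fields come from an external data source
# supplied by the calling routine. The "application" here is a stub.
import csv
import io

def login(user, password):
    return bool(user and password)       # stand-in for driving the real UI

def run_data_driven(data_file):
    results = []
    for row in csv.DictReader(data_file):
        # Variables, not hard-coded literals, feed the key input fields.
        results.append(login(row["user"], row["password"]))
    return results

# The calling routine supplies the data; a file object would normally be used.
data = io.StringIO("user,password\nJohn,AAA\nMary,\n")
print(run_data_driven(data))
```

The script itself never changes when new data rows are added, which is precisely what distinguishes it from a plain captured script.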
Nearly everything discussed so far defining our ideal automation framework has been describing the best
features of "keyword driven" test automation. Sometimes this is also called "table driven" test automation. It is
typically an application-independent automation framework designed to process our tests. These tests are
developed as data tables using a keyword vocabulary that is independent of the test automation tool used to
execute them. This keyword vocabulary should also be suitable for manual testing, as you will soon see.
The most successful automation frameworks generally accommodate both keyword driven testing as well as
data driven scripts. This allows data driven scripts to take advantage of the powerful libraries and utilities that
usually accompany a keyword driven architecture.
The framework utilities can make the data driven scripts more compact and less prone to failure than they
otherwise would have been. The utilities can also facilitate the gradual and manageable conversion of existing
scripts to keyword driven equivalents when and where that appears desirable.
On the other hand, the framework can use scripts to perform some tasks that might be too difficult to re-
implement in a pure keyword driven approach, or where the keyword driven capabilities are not yet in place.
Some commercially available keyword driven frameworks are making inroads in the test automation markets.
These generally come from 3rd party companies as a bridge between your application and the automation tools
you intend to deploy. They are not out-of-the-box, turnkey automation solutions just as the capture\replay tools
are not turnkey solutions.
They still require some up-front investment of time and personnel to complete the bridge between the
application and the automation tools, but they can give some automation departments and professionals a huge
jumpstart in the right direction for successful long-term test automation.
Two particular products to note are the TestFrame product led by Hans Buwalda of CMG Corp, and the
Certify product developed with Linda Hayes of WorkSoft Inc. These products each implement their own
version of a keyword driven framework and have served as models for the subject at international software
testing conferences, training courses, and user-group discussions worldwide. I'm sure there are others.
It really is up to the individual enterprise to evaluate if any of the commercial solutions are suitable for their
needs. This will be based not only on the capabilities of the tools evaluated, but also on how readily they can be
modified and expanded to accommodate your current and projected capability requirements.
The following automation framework model is the result of over 18 months of planning, design, coding, and
sometimes trial and error. That is not to say that it took 18 months to get it working--it was actually a working
prototype at around 3 person-months. Specifically, one person working on it for 3 months!
The model focuses on implementing a keyword driven automation framework. It does not include any additional
features like tracking requirements or providing traceability between automated test results and any other
function of the test process. It merely provides a model for a keyword driven execution engine for automated
tests.
The commercially available frameworks generally have many more features and much broader scope. Of
course, they also have the price tag to reflect this.
The project was informally tasked to follow the guidelines or practices below:
o Implement a test strategy that will allow reasonably intuitive tests to be developed and
executed both manually and via the automation framework.
o The test strategy will allow each test to include the step to perform, the input data to use, and
the expected result all together in one line or record of the input source.
o Implement a framework that will integrate keyword driven testing and traditional scripts,
allowing both to benefit from the implementation.
The first thing we did was to define standards for source code files and headers that would provide for in-context
documentation intended for publication. This included standards for how we would use headers and
what type of information would go into them. Each source file would start with a structured block of
documentation describing the purpose of the module. Each function or subroutine would likewise have a leading
documentation block describing the routine, its arguments, possible return codes, and any errors it might
generate. Similar standards were developed for documenting the constants, variables, dependencies, and other
features of the modules. We then developed a tool that would extract and publish the documentation in HTML
format directly from the source and header files. We did this to minimize synchronization problems between the
source code and the documentation, and it has worked very well.
It is beyond the scope of this work to illustrate how this is done. In order to produce a single HTML document
we parse the source file and that source file's primary headers. We format and link public declarations from the
headers to the detailed documentation in the source as well as link to any external references for other
documentation. We also format and group public constants, properties or variables, and user-defined types into
the appropriate sections of the HTML publication. One nice feature about this is that the HTML publishing tool
is made to identify the appropriate documentation blocks and include them pretty much "as is". This enables the
inclusion of HTML tags within the source documentation blocks that will be properly interpreted by a browser.
Thus, for publication purposes, we can include images or other HTML elements by embedding the proper tags.
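The extraction step can be sketched roughly as follows (Python; the block delimiters, the stub source, and the HTML layout are all illustrative assumptions, since the real tool and its conventions are not shown here):

```python
# Minimal sketch of extracting documentation blocks from a source file and
# publishing them as HTML "as is", so embedded HTML tags survive into the
# published page. The "#**" delimiter is purely an assumed convention.
import io

def extract_doc_blocks(source):
    blocks, current = [], None
    for line in source:
        line = line.rstrip("\n")
        if line.strip() == "#**":            # assumed doc-block delimiter
            if current is None:
                current = []                  # opening delimiter
            else:
                blocks.append("\n".join(current))
                current = None                # closing delimiter
        elif current is not None:
            current.append(line.lstrip("# "))
    return blocks

def publish_html(blocks, title="Module"):
    # Blocks are included verbatim, so tags like <b> reach the browser intact.
    body = "\n".join("<div class='doc'>%s</div>" % b for b in blocks)
    return "<html><head><title>%s</title></head><body>%s</body></html>" % (title, body)

src = io.StringIO("#**\n# Logs a <b>message</b>.\n#**\ndef log(msg): pass\n")
print(publish_html(extract_doc_blocks(src)))
```

Because the doc blocks pass through unescaped, an author can embed images or other HTML elements directly in the source comments, as the text above notes.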
o File Handling
o String Handling
o Buffer Handling
o Variable Handling
o Database Access
o Logging Utilities
o System\Environment Handling
o Application Mapping Functions
o System Messaging or System API Enhancements and Wrappers
The essential guiding principles we should follow when developing our overall test strategy (or when evaluating the
test strategy of a tool we wish to consider) are:
The test design and the test framework are totally separate entities.
The test framework should be application-independent.
The test framework must be easy to expand, maintain, and perpetuate.
The test strategy/design vocabulary should be framework independent.
The test strategy/design should isolate testers from the complexities of the test framework.