WinRunner
Softsmith Infotech
WinRunner Overview
What is WinRunner?
WinRunner is a test automation tool designed to help customers save testing time and effort by automating the manual testing process
manual process: perform operations by hand, visually check results, and log results by hand
automated process: create a test script that performs the same operations as a human operator, checks the same information, and creates a summary report showing the test status
Recording
Recording Modes
Context-sensitive mode
Analog mode
Tests can combine both recording modes
Context-Sensitive is the default mode
Switch between modes using the same record key (F2)
Context-Sensitive Mode
Object-based
Unaffected by minor UI changes
Maintainable (readable/editable)
Generally used with GUI applications
Portable script
Context-Sensitive Mode
# Set focus to the window, then fill in the file name and press OK.
set_window ("Save As");
edit_set ("File Name", "output14");
button_press ("OK");
Analog Mode
Position-dependent
Works with any application
UI changes force test script changes
Usually drives tests with mouse, keyboard and
other such manual user inputs
Less maintainable
Analog Mode
# mouse drag (the <T..> tokens record timing between events)
move_locator_track (1);
mtype ("<T55><kLeft>-<kLeft>+");

# keyboard input: type the file name "output14"
type ("<t3>output14");

# mouse click
move_locator_track (2);
mtype ("<T35><kLeft>-<kLeft>+");
Recording Modes
Context-Sensitive mode statements can be recorded or programmed
record: button_press, win_activate
program: list_get_num_items, edit_get_text
Context-Sensitive mode is recommended for most situations due to its greater robustness (a short sketch follows)
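A minimal sketch of mixing recorded and programmed Context-Sensitive statements; the window and object names below are illustrative, not taken from a specific script in this deck:

# recorded statements drive the application under test
set_window ("Flight Reservation", 10);
button_press ("Insert Order");

# programmed statements query the application instead of driving it
list_get_num_items ("Fly From:", count);   # item count of a list object
edit_get_text ("Order No:", order_no);     # contents of an edit field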
Recording Tips
plan your work
decide exactly what actions / data to record
Run Modes
Debug
used while the test script itself is being developed and debugged
these test results are overwritten with each new run
Verify
compares actual results against the stored expected results
generally used when executing testing sessions where results need to be stored
Update
captures expected results, the benchmarks used to verify test results
test runs in Update mode generate the expected results that subsequent runs in Verify mode compare against
Synchronization
Enhances a test script to ensure reliable replay
accounts for delays, preventing the automated script from running faster than the tested application
critical for successful test automation implementation
lack of synchronization is among the main reasons why plain record-and-playback is not reliable
In Context-Sensitive mode
examples: wait for a window or object bitmap, wait for an object property value, wait for a specified time (see the following slides)
In Analog mode
examples: wait for a window bitmap to appear / refresh, wait for a specific amount of time
Window Synchronization
Bitmap Synchronization
button_press ("Submit");
obj_wait_bitmap ("Object", "Img1", 10);
button_press ("Confirm");
win_wait_bitmap ("Screen", "Img2", 10, 209, 170, 81, 20);
win_wait_bitmap, obj_wait_bitmap
Waits for a bitmap to be drawn onscreen. Bitmap may be
complete window/object or partial area. Bitmap is captured
and stored during recording.
Object Synchronization
win_wait_info, obj_wait_info
Waits for a window or object attribute to reach a specified
value.
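A short sketch of object synchronization; the object name, attribute value, and timeout are assumptions:

# wait up to 10 seconds for the OK button to become enabled before pressing it
obj_wait_info ("OK", "enabled", 1, 10);
button_press ("OK");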
Time Synchronization
wait(10);
wait
Waits for the specified amount of time.
Analog Synchronization
win_wait_bitmap
Waits for a window bitmap to appear onscreen. Bitmap may
be full/partial window area. Optionally, bitmap filename may
be omitted, thus synchronizing on window refresh/redraw.
In analog mode, this is invoked using softkeys.
Synchronization Controls
GUI Map
Recording
object is stored in GUI map first
object is assigned a name
based on object class and name, a statement is generated in the WinRunner script
Replay
WinRunner searches the current window context in
the GUI map (set_window)
WinRunner searches window for the object name
Physical description is used to locate object
Regular Expressions
metacharacters:  .   [0-9]   [A-Z]   [a-z]   [mf]   ^   |   &   *   .*
Example: given the string "$30,000,000 lottery pot", which regular expression matches it?
a) $[2-8].*0^[a-z]
b) $[2345].*0.*[a-z]
c) ..0.*[aeiou]ey.* pot
d) .*lottery.
[answer: b]
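In WinRunner, a property value can be treated as a regular expression by prefixing it with "!". A hedged sketch of a GUI map physical description that uses this; the window label is hypothetical:

# matches "Flight Reservation", "Flight Reservation - Untitled", and so on
{
   class: window,
   label: "!Flight Reservation.*"
}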
Modify the script to automatically load and use the GUI Map file you've created (a sketch follows)
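A sketch of loading a GUI map file at the start of a test; the file path is an assumption:

# load the GUI map so logical names in the script can be resolved
# to physical descriptions
GUI_load ("C:\\qa\\maps\\flight.gui");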
window/object functions
environment functions
reporting functions
database query functions
file/spreadsheet functions
Win32 functions
Function Generator
Language Syntax
largely the same as C
Variables
Basic Rules
Arrays
single dimension: cust[1], cust[2], cust[3]
multi-dimension: address[1,1], address[1,2]
Can be indexed with a number
address[1], address[2]
Operators
Math
+  -  *  /  ^  %  ++  --
Logical
&&  ||  !
Relational
== != >= <= > <
Assignment
= += -= *= /= ^= %=
Concatenation
&
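A short sketch that pulls variables, arrays, and operators together; all names and values are illustrative:

# variables need no type declaration; arrays are associative
cust[1] = "Smith";
cust[2] = "Jones";
address[1,1] = "12 High St";            # multi-dimension index

total = 0;
for (i = 1; i <= 2; i++)                # math / increment operators
   total++;

msg = "customers found: " & total;      # & concatenates strings
if (total >= 2 && cust[1] != "")        # relational and logical operators
   report_msg (msg);                    # write a message to the test report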
Test Verification
Enhancing a test script to verify data onscreen
Context-Sensitive verification
Analog verification
Checkpoints
Definition: A checkpoint is a WinRunner statement that determines whether a particular object property is as expected, either by comparing previously captured results to current results or by defining an expected result to compare against the actual result. Expected results are captured when running in Update mode.
GUI
Bitmap
Text
Database
GUI Checkpoints
set_window ("Insert Order");
button_press ("OK");
obj_check_gui ("ProgressBar", "list1.ckl", "gui1", 25);

set_window ("Reports", 10);
menu_select_item ("Analysis;Reports");
win_check_gui ("Reports", "list2.ckl", "gui2", 4);
win_check_gui, obj_check_gui
Verifies that an object's or window's properties match the expected results. Properties to verify are saved in a checklist. The checklist is used to capture the expected results during recording, and is also used to capture the actual results for comparison.
Bitmap Checkpoints
set_window ("Insert Order");
button_press ("OK");
obj_check_bitmap ("ProgressBar", "Img1", 25);
obj_check_bitmap ("StatusBar", "Img2", 25, 0, 10, 50, 10);

set_window ("Reports", 10);
win_check_bitmap ("Reports", "Img3", 4);
win_check_bitmap, obj_check_bitmap
Verifies that an object/window bitmap matches its expected image. The bitmap may be a full or partial window area. If a partial area is selected, the coordinates of the partial area are captured (relative to the object).
Text Checkpoints
obj_get_text
retrieves the text within an area (absolute coordinates)
tl_step
logs a message to the WinRunner report and changes the test status
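A minimal sketch of a text check built from these two functions; the object name and expected string are assumptions:

set_window ("Insert Order", 10);
obj_get_text ("Status:", txt);       # read the status field into txt
if (txt == "Insert Done...")
   tl_step ("status text", 0, "status message displayed as expected");   # 0 = pass
else
   tl_step ("status text", 1, "unexpected status text: " & txt);         # non-zero = fail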
Error Handling
addresses specific predictable errors
Using error-handler routines
error codes
most TSL statements have a return code, which is used as the basis for error checking (see the sketch below)
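A sketch of return-code checking; the window name is hypothetical, and E_OK is the TSL constant for a successful return:

rc = set_window ("Login", 10);
if (rc != E_OK)
{
   tl_step ("open login", 1, "Login window not found, rc = " & rc);
   texit;     # abandon the test run
}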
Exception/Recovery Handling
object exceptions
object property value changes
TSL exceptions
TSL error codes
Functions
public function flight_login( in uid, in passwd )
{
   set_window ("Login", 10);
   edit_set ("Agent Name:", uid);
   edit_set ("Password:", passwd);
   button_press ("OK");
}
function type
public (global)
static (local)
function name
first character cannot be numeric
parameters can be overloaded
Functions
public function flight_login( in uid, in passwd )
{
   set_window ("Login", 10);
   edit_set ("Agent Name:", uid);
   edit_set ("Password:", passwd);
   button_press ("OK");
}
function parameters
in
out
inout
arrays must be indicated with []
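A call to the function defined above might look like this; the argument values are made up:

flight_login ("john", "mercury");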
Functions
public function flight_login( in uid, in passwd )
{
   auto x;
   set_window ("Login", 10);
   edit_set ("Agent Name:", uid);
   edit_set ("Password:", passwd);
   button_get_info ("OK", "state", x);
   if (x == ON)
      button_press ("OK");
}
Compiled Modules
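Compiled modules are function libraries that are loaded once and then shared by test scripts. A sketch of loading one and calling a function from it; the module path and function are assumptions:

# load the compiled module so its public functions become available
load ("C:\\qa\\lib\\flight_utils");

flight_login ("john", "mercury");

# remove the module's functions when no longer needed
unload ("C:\\qa\\lib\\flight_utils");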
Support
Some testing environments are friendlier towards
automated testing tools than others
Good
Bad
Ugly
Out-of-the-Box support
Support
Poorly programmed custom environments
Custom objects (3rd party APIs)
Unrecognized objects
every object is displayed as a generic object, making it difficult to map to a class and work with reliably
Conclusion
It should be obvious at this point that test automation is not as simple as record-and-playback, regardless of how good the test automation tool may be. The more powerful the test automation tool, the greater the rewards that can be reaped, and the more pitfalls that can be encountered; it all depends on the skill and training of the automation specialist. Hopefully, this presentation has provided a grounding in the basics required to effectively implement test automation with WinRunner.