
Conference Paper, December 2009. DOI: 10.1109/ICIINFS.2009.5429865



Developing and using an automated testing software for real time systems

Arunkumar Balakrishnan, Anand NK


Mettler Toledo Turing Softwares, India
[email protected], [email protected]

Abstract

An automated software testing tool was built in PERL to test the eVC++ software running on the WinCE platform on the embedded devices. An explicit model was built of the core functions of the software on the device. Test sequences were then generated that could call on these core functions. The PERL software received as input the test sequence(s) to be run and the device(s) on which they were to be run. It would then retrieve the corresponding test sequence and expand the functions called from the core function set. Each of the instructions was converted into the format recognized by the device and sent over the network. The architecture displayed scalability by being extended to test the software on an enhanced device.

1. Introduction

The embedded device that was tested is basically a weighing device (point of sale) used in retail stores. At the base level the device is capable of storing product details and doing transactions, which involve retrieving product details, accessing the weight value of the product, computing the total price and printing the details of the transaction on a label printer. The device can also be configured to match the environment requirements, like network parameters, load cell characteristics, operator requirements, database entries and printer settings. During a transaction the operator may also be allowed to override certain stored values of the product.

Testing the embedded device involved checking for the correct working of all the functionalities provided by the software and ensuring good performance and robustness in the environment of usage. The software on the device allows storage of product information in a database, handles keyboard and touch screen interfaces and produces printouts on a label printer. In addition to testing the functionalities, there is also a need to monitor the Random Access Memory available (to check for memory leaks in the application) and to check that the software responds in appropriate time to user actions. These latter requirements are due to the embedded nature of the software.

The testing was planned using both manual and automated testing. Manual testing was necessitated to verify the displays on the screen of the device, retention of configuration data over upgrades, and for verification of the layouts and field rendering in the printed outputs. Automated testing was considered beneficial for the following reasons:

• The domain of application involved two entities that required precision: Weight and Price. These fields have constraints / dependencies on accuracy and displayable (recordable) values. Specific rules have also been formulated by statutory bodies (NTEP) [1] concerning the rounding off to be done on these values. The software was to follow these rules strictly. Automated testing was used to ensure that these rules were implemented correctly, by giving a complete range of weight and unit price combinations and verifying that the total price calculated met the constraints imposed by the rules (a sketch of such a check is given after this list).
• The software rendered around eighty screens on the device. Each of these screens had around ten to sixteen fields / action buttons. Manually testing every one of these "screen x field" combinations would take too much effort and would not provide a totally dependable result.
• Repeated database retrieval over many runs had to be exercised to check for correct retrieval and presentation.
• Various configuration settings have to be tried to ensure that they can be recorded and used.
• With regular cycles of development and releases, regression testing of the functionalities of the software on the device required a major effort. This was facilitated by running the regression tests using the automated test software, while the test engineers focused on testing the new features of the iteration from a functional perspective.
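As an illustration of the first point above, the following sketch, written in PERL (the implementation language of the tool), shows the kind of total price check that can be automated. The rounding rule, the value ranges and the helper routines are illustrative assumptions made for the sketch; they are not the actual NTEP rules or the routines used in the tool.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Illustrative rounding rule: total price = weight x unit price,
    # rounded to the nearest cent (the actual NTEP rules are more detailed).
    sub expected_total {
        my ($weight, $unit_price) = @_;
        return sprintf("%.2f", $weight * $unit_price);
    }

    # Placeholder for sending a transaction to the device and reading back
    # the total price it computed; stubbed here so the sketch is self-contained.
    sub device_total {
        my ($weight, $unit_price) = @_;
        return expected_total($weight, $unit_price);
    }

    # Sweep weight / unit price combinations and compare the device result with the rule.
    for my $step (1 .. 1000) {
        my $weight = $step * 0.005;                    # 5 g increments up to 5 kg
        for my $unit_price (0.01, 0.99, 1.25, 9.99, 49.90) {
            my $want = expected_total($weight, $unit_price);
            my $got  = device_total($weight, $unit_price);
            print "FAIL: weight=$weight unit=$unit_price want=$want got=$got\n"
                if $got ne $want;
        }
    }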
We decided to develop two automated testing software tools: one for testing screen navigation of the embedded software and another for testing the functionality of the embedded software.

This paper describes the architecture, development and use of an automated testing software built to test embedded software on a set of weighing devices. The next section describes the architecture of the automated test software. This is followed by the process/stages of developing the software. Finally, the testing of the devices by the automated software is presented.
2. Issues in using Off-the-Shelf Automated Test Tool

Environment of the software: We surveyed available automation test tools (including embedded test tools) like WINRUNNER, MessageMagic and VectorCAST. We found that these tools fell short of our requirements primarily because they did not support testing eVC++ code on the WinCE platform.

Development platform: We decided to use the PERL language to develop the automated testing tool, to provide portability between various development platforms.

3. Architecture of the Automated Testing Tool - Screen Navigation

One of the basic checks on embedded devices/software is to ensure that the user is always able to navigate between all screens. To check for this we developed an automated testing tool that used model based testing ideas [2][4][5]. We recorded the screen attributes as screen-id, number of buttons and fields, and the identity of the navigation buttons. The automated testing tool would then start from a specified screen and use this screen definition file to systematically try each of the navigation buttons and check that the next screen is the expected one. The testing tool handled this search in a depth first fashion.
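A minimal sketch of the depth first navigation check described above is given below. The screen definition hash and the press_button stub are hypothetical stand-ins for the screen definition file and the device communication of the actual tool.

    use strict;
    use warnings;

    # Hypothetical screen definition: screen-id => { button-name => expected next screen }.
    my %screens = (
        HOME     => { PLU => 'PLU_LIST', CONFIG => 'CONFIG', BACK => 'HOME' },
        PLU_LIST => { BACK => 'HOME' },
        CONFIG   => { BACK => 'HOME' },
    );

    # Stub: would send the button press to the device and return the screen-id it reports.
    sub press_button { my ($from, $button) = @_; return $screens{$from}{$button}; }

    my %visited;
    sub dfs_check {
        my ($screen) = @_;
        return if $visited{$screen}++;
        for my $button (sort keys %{ $screens{$screen} }) {
            my $expected = $screens{$screen}{$button};
            my $actual   = press_button($screen, $button);
            print "ERROR: $screen/$button reached $actual, expected $expected\n"
                if $actual ne $expected;
            dfs_check($expected);
        }
    }
    dfs_check('HOME');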
4. Architecture of the Automated Testing Tool - Functional Testing

The embedded software on the device was architected with testability in mind. A test interface was provided in the embedded device, which could receive test commands and send responses back to the test tool. Prior to the start of the test activity the test tool was required to authenticate itself with the embedded software. This authentication allowed test specific commands to be responded to.

The automated testing tool was architected focusing on re-usability. We first decided to separate test sequences from the software that would actually send the test commands to the embedded software. This would enable the communicating software to be re-used. This now required another module that could parse the test sequences and pass the command onto the communication module (this could also be re-used).

The second factor that decided the architecture of the system was the need to be event driven. The automated testing tool was required to be capable of catching any unexpected response from the embedded device. To achieve this the automated testing tool was constantly listening on a separate port where all messages and state change indications were posted by the test interface running on the embedded device. The automated testing tool would then check whether the message was as expected or not and accordingly write an error log (a sketch of this two-port arrangement is given after the list below).

The third factor that influenced the architecture was the need for the automated testing tool to represent and use the core functionality provided by the embedded software. This representation provided a model of the embedded software being tested. This core model also allowed re-usability. Test sequences could now be built based on the core functionality. A test sequence could be composed of a set of core functions called in a specific manner.

At the top level the automated testing tool had:
1) A set of core functionality specified in a readable format, similar to the specification of the test cases.
2) Test sequences, also specified in the above format.
3) A parser software that could read the test sequences and pick up the test command to be sent to the embedded software.
4) A communication software that would be responsible for sending the command to the embedded device and would wait for the response from the device. This would be checking for any response on both the ports (normal and unexpected) of communication.
5) A logging module that would record the test action sent and the response received, along with any noted errors.
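The event driven requirement above, a normal response port together with a second port that is watched continuously for unexpected messages, can be realized in PERL with non-blocking socket checks. The sketch below uses IO::Select for this purpose; the address, the port numbers and the message format are assumptions made for the sketch and are not the actual protocol of the tool.

    use strict;
    use warnings;
    use IO::Socket::INET;
    use IO::Select;

    # Illustrative host and ports; in the tool the two ports are returned by the
    # authentication step on a pre-designated port.
    my $device = '192.168.0.10';
    my $normal = IO::Socket::INET->new(PeerAddr => $device, PeerPort => 5001, Proto => 'tcp')
        or die "normal port: $!";
    my $events = IO::Socket::INET->new(PeerAddr => $device, PeerPort => 5002, Proto => 'tcp')
        or die "event port: $!";

    open(my $log, '>>', 'error.log') or die "log: $!";
    my $watch = IO::Select->new($events);

    # Poll the second port without blocking; log anything posted there.
    sub check_unexpected {
        for my $sock ($watch->can_read(0)) {
            my $line = <$sock>;
            print {$log} "UNEXPECTED: $line" if defined $line;
        }
    }

    print {$normal} "BUTTON OK\n";   # send one test action on the normal port
    my $response = <$normal>;        # wait for its response
    check_unexpected();              # then look for unsolicited events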

4.1. Representation for core functionalities and test sequences

The first step towards developing the automated testing tool was to identify and record the core functionalities provided by the embedded software. The second step was to create Test Cases. This was done by generating Use Cases from studying the Requirements Specification document and then generating Test Cases based on them. We used an extension of Use Cases enhanced by contracts [3]. The format of the test case document, including the relevant Use Case, is given in Figure 1 (the same format is used for the representation of the core functionality).

    Spec | UC Id | UC | Variable Fields | TC | Test Data | Test Action

    "Spec": Requirement Specification Identification; "UC Id": Use Case Identification
    "UC": Use Case; "Variable Fields": The set of variables (parameters) in the Use Case
    "TC": Test Case; "Test Data": The data to be used for the test action

    Figure 1. Use Case and Test Case Document

Requirements traceability was provided by the requirement specification name being mentioned in the first column. Each Use Case can result in one or more Test Cases based on the number of variable fields involved in the Use Case. Each Test Case can result in one or more Test Actions based on the set of Test Data applicable for the test action. The Test Action included the validation actions to check for the expected result. The development of the test cases in this manner helped both the manual testing as well as the automated testing. In the process of developing the test cases the test engineers built up their familiarity and knowledge of the embedded software, while the automated testing tool could directly use these test sequences once the parser and communication software were developed. Both the test case generation and the parser/communicator software were developed in parallel.

    Figure 2. Flow of the Automated Testing Software

4.2. Flow of the automated testing tool

The automated testing software starts by communicating on a pre-designated port to the test interface running on the embedded software. First, an authentication message is sent. The appropriate message from the embedded software includes two new port numbers. The first of these port numbers is used for normal communications while the second one is constantly listened to for any unexpected messages. The automated test software now opens the test sequence to be sent to the device. The first command is retrieved and parsed. If the command is a direct action (specified by an action on a button name) then the command is directly converted to an encoded communication protocol and sent to the embedded software. If the command is preceded by FUNC, then the parser software identifies it as a function call of one of the core functions available in the core function list. The automated testing software then accesses the appropriate entry in the core function list. This is now parsed similarly.

After sending the command to the embedded software the automated testing software reads the second (unexpected communication) port to check for any event. If a message is found in that port the automated testing software reads it and checks its current action to confirm whether it is an acceptable one. If it is not, then an appropriate entry is made in the error log. The automated testing software then closes the current action sequence and starts the next one. The testing is therefore able to handle unexpected events. After every action is sent the automated testing software queries the embedded device for the amount of available Random Access Memory and records the same. Also, the time at which every action is sent and the time at which every response is received are recorded.
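The sketch below condenses the flow just described: authenticate on the pre-designated port, obtain the two new port numbers, read the test sequence, expand FUNC entries from the core function list, send each action, and record the response time and the available memory after every action. The message formats, the RAM query command and the core function contents are assumptions made for the sketch; the actual encoding used by the tool is not given in this paper.

    use strict;
    use warnings;
    use IO::Socket::INET;
    use Time::HiRes qw(time);

    my $device = '192.168.0.10';                     # illustrative address

    # Authenticate on the pre-designated port; the reply is assumed to carry
    # the two new port numbers, e.g. "OK 5001 5002".
    my $auth = IO::Socket::INET->new(PeerAddr => $device, PeerPort => 5000,
                                     Proto => 'tcp') or die "connect: $!";
    print {$auth} "AUTH testtool secret\n";
    my $reply = <$auth> or die "no authentication reply";
    my ($ok, $normal_port, $event_port) = split ' ', $reply;
    die "authentication failed" unless $ok eq 'OK';
    # ($event_port would be watched for unexpected messages, as sketched earlier.)

    my $normal = IO::Socket::INET->new(PeerAddr => $device, PeerPort => $normal_port,
                                       Proto => 'tcp') or die "connect: $!";

    # Core function list: name => list of direct actions (illustrative contents).
    my %core = (
        STANDARD_SALE => [ 'BUTTON PLU', 'FIELD PLU 1234', 'BUTTON TOTAL', 'BUTTON PRINT' ],
    );

    open(my $log, '>>', 'run.log') or die "log: $!";
    while (my $cmd = <DATA>) {                       # the test sequence, one command per line
        chomp $cmd;
        # Expand a FUNC entry into the direct actions of the named core function.
        my @actions = $cmd =~ /^FUNC\s+(\w+)/ ? @{ $core{$1} } : ($cmd);
        for my $action (@actions) {
            my $sent = time;
            print {$normal} "$action\n";
            my $response = <$normal>;
            printf {$log} "%s|%.3fs|%s", $action, time - $sent, $response // "no response\n";
            print {$normal} "QUERY RAM\n";           # assumed memory query command
            my $ram = <$normal>;
            print {$log} "RAM|" . ($ram // "no response\n");
        }
    }

    __DATA__
    BUTTON CONFIG
    FUNC STANDARD_SALE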
4.3. Commands sent to the embedded software in test sequences

The functions in the test sequence will be of six types:
1) Action functions / button activation on the device.
2) Get a response from the device.
3) Validate the values that are read from the device.
4) Read the cell data in the core functionalities / test sequence file.
5) Parse the data read.
6) Write data into the log file.

In addition there will be four commands in the test sequence:
• "Repeat up to" will be used to mark the set of functions to be repeated.
• "Vary x" will describe how a parameter value is to be changed in the course of the repetition.
• "While condition" will describe what is to be checked for to allow the set of functions to repeat.
• "If condition" will provide a control flow during the execution of the test sequence. The condition being checked could refer to either a particular message, screen or field value.

These four commands provide the capacity for control flow within a test action and within a test sequence. The parser software has appropriate actions done when it receives these commands in the test sequence.
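To make the control commands concrete, the sketch below expands a repeated, parameter-varying block into the flat list of specific actions that is later handed to each device thread (the expanded form mentioned in Section 4.4). The exact command syntax and field names shown are assumptions based on the descriptions above and are not the tool's actual format.

    use strict;
    use warnings;

    # A small, hypothetical test sequence fragment using the control commands.
    my @sequence = (
        'Repeat up to 5',
        'Vary WEIGHT 0.100 step 0.100',
        'While SCREEN eq SALE',
        'FIELD WEIGHT $WEIGHT',
        'BUTTON TOTAL',
        'End repeat',
    );

    # Expand the repeated block into specific actions, substituting the varied value.
    sub expand {
        my (@out, @block);
        my ($count, $var, $start, $step) = (1, undef, 0, 0);
        for my $line (@_) {
            if    ($line =~ /^Repeat up to (\d+)/)                { $count = $1 }
            elsif ($line =~ /^Vary (\w+) ([\d.]+) step ([\d.]+)/) { ($var, $start, $step) = ($1, $2, $3) }
            elsif ($line =~ /^While /)                            { next }   # condition is checked at run time
            elsif ($line =~ /^End repeat/) {
                for my $i (0 .. $count - 1) {
                    for my $action (@block) {
                        my $expanded = $action;
                        $expanded =~ s/\$\Q$var\E/$start + $i * $step/e if defined $var;
                        push @out, $expanded;
                    }
                }
                @block = ();
            }
            else { push @block, $line }
        }
        return @out;
    }

    print "$_\n" for expand(@sequence);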
4.4. Handling multiple devices

Discussions with the test team raised a need to run the same or different test sequences on many embedded devices in the same session. To provide this facility we decided to use multiple threads. Each thread would handle the test sequence being transmitted / checked with a particular device. To implement this we faced some difficulties, primarily in sending OLE (Object Linking and Embedding) files over threads in the PERL environment (our test cases were created as Excel files, which could be sent as OLE files only). We bypassed this issue by creating array files of each test sequence and sending these to each thread. The architecture was now enhanced with a master thread that would take the information of the number of embedded devices to be tested in a session and the test sequence to be sent to each of these devices, and would then create a thread for each of these devices and send the test sequence array to the thread. The test sequence array would be in the expanded form, meaning that references to core functions and multiplicity of execution would all be specified exactly in the form of specific actions to the embedded device. Each of the child threads would now send the test actions to the device, get the responses (on the two communication ports) and log the results.
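A minimal sketch of the master / child thread arrangement follows, using the PERL threads module. Each child receives a plain expanded action array rather than an OLE object, in line with the workaround described above; the device addresses and the body of run_sequence are illustrative.

    use strict;
    use warnings;
    use threads;

    # Expanded test sequences (plain arrays, safe to pass to threads).
    my %session = (
        '192.168.0.11' => [ 'BUTTON PLU', 'FIELD PLU 1234', 'BUTTON TOTAL' ],
        '192.168.0.12' => [ 'BUTTON CONFIG', 'BUTTON BACK' ],
    );

    sub run_sequence {
        my ($device, @actions) = @_;
        for my $action (@actions) {
            # Here the real tool would send $action to $device on its normal port,
            # read both communication ports and write the per-device log.
            print "[$device] $action\n";
        }
        return 0;
    }

    # Master: one child thread per device, each given its own expanded array.
    my @workers = map { threads->create(\&run_sequence, $_, @{ $session{$_} }) } sort keys %session;
    $_->join for @workers;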
5. Automated Testing Software - Setup

We also developed a test setup software tool, which would be the primary user interface. This would receive the input from the user of the number of embedded devices to test and the test sequence to test each with. Each test session could also be saved, allowing easy re-running of test sessions during regression testing. The saved sessions also reduced the pre-processing time created by the need to generate the test sequence arrays for each test sequence. Saved sessions immediately started execution, while new sessions would experience a few seconds of delay while the test sequence arrays are being built.

6. Test Case Generation Aid

To aid the generation of test cases, and primarily to avoid syntactical errors while writing the test cases, we developed a software tool which would help the user write the test cases in the required format. The user could choose whether the current entry is an action to the device or a test sequence control entry (referring to control flow commands). If the choice is a device action, then the user is presented with the option of sending a Read command or a Button press command, or of choosing a Core function command. If the last option is chosen then the list of core functions is shown and the user can choose one of them. If the choice is a control flow one, then the options of If, While and Repeat are presented. The choice is entered into the test sequence file. If the chosen entry requires data to be given then the user is given a set of default values with the choice to override them with specific values.

7. Advantages of Model Based Testing

We found many changes during the development time in the screen layouts and the flow. We could handle these easily by making changes to the screen definitions, rather than to actual code. We also had changes to the flow of the software, which were easily handled by changes to the explicitly defined core set of functionalities.

8. Randomized Automated Testing Tool

The basic set of test sequences and core functionalities exercised the actions on the device in the correct order. We checked for robustness of the embedded software by using data that went out of the allowed range and checking for appropriate recognition and action by the software. We then added another feature to the automated testing tool that expanded its testing range. This was to simulate situations where a novice user would be using the device. To simulate such situations we introduced a random element into the actions sent to the embedded device. This randomness was also facilitated by the model-based architecture of the testing tool.

At each stage of a test action we generated a random number. If this was greater than 0.5 then the action specified in the test sequence was sent to the embedded device. If the random number was less than 0.5 then a wrong (un-specified) action was sent. To choose the (un-specified) action the screen definition file was used. Of the set of buttons that were not to be used at that instance, one was selected randomly. This action was then sent to the embedded device. The response from the device was received and the process repeated.
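A sketch of this random choice is given below, reusing the kind of screen definition shown in the Section 3 sketch: with probability 0.5 the scripted action is sent, otherwise a randomly selected button that the sequence did not intend to press is sent instead. The screen contents are illustrative.

    use strict;
    use warnings;

    # Hypothetical screen definition: screen-id => available buttons.
    my %screens = (
        SALE => [ qw(PLU TOTAL PRINT CONFIG BACK) ],
    );

    # Return the action to actually send: the intended one, or a random wrong one.
    sub randomized_action {
        my ($screen, $intended) = @_;
        return $intended if rand() > 0.5;
        my @others = grep { $_ ne $intended } @{ $screens{$screen} };
        return $others[ int rand @others ];
    }

    print randomized_action('SALE', 'TOTAL'), "\n" for 1 .. 10;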
9. Usage of the Automated Testing Tool

Once the initial core functionalities and the parser / communication software were built, we executed test sequences on a daily basis. We developed around 100 test sequences and core functionalities. These exercised the basic functionalities of the device, like:
• Add product information records to the database
• Do standard transactions
• Do operator override transactions
• Do configuration changes

These test sequences were run regularly. As has been mentioned, the automated test tool was also recording the memory usage and time of response. Thus the performance and the functionality of the embedded device were constantly monitored. The behavior of the embedded software was improved constantly. The randomized testing in particular was effective in finding performance and robustness issues.

10. Scalability Provided

The next automated test activity involved testing an enhanced device in the same genre. Here we only had to change the core functionality set. The test sequences of the first device could be reused directly. We also added test sequences specific to the extra features in the new device, primarily its client server feature.

11. Lessons Learned

The use of the screen model for the screen navigation tests allowed the many changes in the screens to be handled with ease. This showed the advantage of using a model of the components of the application for testing purposes. Separating the test cases from the software that did the automation allowed easy upgrading of the test cases to match the enhanced features of the next version in the application genre. This showed the advantage of keeping business logic separate from the implementation.

12. Related Work

In [7], the authors propose a model based approach for validation testing of a model based embedded system for automotive applications. The advantages of using a model based approach for development are described, and similar advantages for model based validation are expected. [8] describes the usage of an external model checker to generate unit tests. Applying operational research and artificial intelligence techniques to test case design for embedded systems is discussed in [6]. The test case design is transformed into an optimization problem that is solved using evolutionary techniques and simulated annealing techniques. This approach is adaptable to the domain. These approaches have emphasized the need for a model based approach for automating the testing of embedded systems. This paper has extended this emphasis by creating models of the embedded software being tested: first as a screen model that captures the attributes of each screen and the navigation between screens, and second as a core set of functional test cases that represent the basic behavior of the embedded software. This latter model highlighted re-usability in being used entirely in the second generation of the embedded product.

13. Conclusion

We have presented the architecture of an automated testing tool that we developed for testing the embedded software developed for a retail point of sale device. We have highlighted how usage of a model base made the task more manageable and scalable, even though there was a gestation time for the project while the models were being defined and represented. We are looking at extending the methods towards making the automated test tool capable of testing a family of products. There is also a line of action to have the test sequence generation aid software become a test case generation software for the product family.

References

[1] NTEP, National Conference on Weights and Measures, Digital Electronic Scales, NCWM Publication 14, 2007.

[2] Bill Hasling, Helmut Goetz and Klaus Beetz, Model Based Testing of System Requirements using UML Use Case Models, First International Conference on Software Testing, Verification and Validation, 2008.

[3] B. Meyer, Applying Design by Contract, Computer (IEEE), vol. 25, no. 10, October 1992, pp. 40-51.

[4] Eckard Bringmann and Andreas Krämer, Model-based Testing of Automotive Systems, International Conference on Software Testing, Verification, and Validation, IEEE, 2008.

[5] César Andrés, Luis Llana and Ismael Rodríguez, Formally comparing user and implementer model-based testing methods, International Conference on Software Testing Verification and Validation Workshop (ICSTW'08), IEEE, 2008.

[6] Harmen Sthamer, André Baresel and Joachim Wegener, Evolutionary testing of embedded systems, 14th International Internet and Software Quality Week, 2001.

[7] Kum Dae-Hyun, Automated testing for automotive embedded systems, SICE-ICASE International Joint Conference, South Korea.

[8] Zoltán Micskei and István Majzik, Model based automatic test generation for event driven embedded systems using model checkers, IEEE Proceedings of the International Conference on Dependability of Computer Systems, 2006.
