User Manual: R&S Contest Advanced User Procedures
1175.6026.02 ─ 02
Test & Measurement
User Manual
© 2016 Rohde & Schwarz GmbH & Co. KG
Mühldorfstr. 15, 81671 München, Germany
Phone: +49 89 41 29 - 0
Fax: +49 89 41 29 12 164
Email: [email protected]
Internet: www.rohde-schwarz.com
Subject to change – Data without tolerance limits is not binding.
R&S® is a registered trademark of Rohde & Schwarz GmbH & Co. KG.
Trade names are trademarks of the owners.
The following abbreviations are used throughout this manual: R&S® CONTEST is abbreviated as R&S CONTEST, Microsoft Windows® is abbreviated as Windows, Jenkins® is abbreviated as Jenkins.
Contents
1 Infrastructure..........................................................................................7
1.1 System Controller......................................................................................................... 7
1.2 User................................................................................................................................ 9
1.3 Central Report Server................................................................................................. 10
2 Data Handling....................................................................................... 11
2.1 Test Report Directory................................................................................................. 11
2.1.1 Introduction................................................................................................................... 11
2.1.2 Stored Information.........................................................................................................12
2.1.3 Changing the Name or Location of the Test Report Directory...................................... 13
2.1.4 Importing Test Reports into the Test Report Directory..................................................15
2.2 Central Report Server................................................................................................. 16
2.2.1 Introduction................................................................................................................... 16
2.2.2 Setting up the Database Server.................................................................................... 16
2.2.3 Setting up the File Server..............................................................................................17
2.2.4 Configuring R&S CONTEST......................................................................................... 17
2.2.5 Transferring Reports..................................................................................................... 21
2.3 Report Transfer Service............................................................................................. 23
2.3.1 Introduction................................................................................................................... 23
2.3.2 Status Application......................................................................................................... 23
2.3.3 Advanced Configuration................................................................................................28
3 Reporting.............................................................................................. 30
3.1 Test Result Header Reports....................................................................................... 30
3.1.1 Introduction................................................................................................................... 30
3.1.2 Stored Information.........................................................................................................30
3.1.3 Configuring the Test Result Header Generator.............................................................33
3.1.4 Generating Test Result Header Reports.......................................................................35
3.2 JSON Reports..............................................................................................................36
3.2.1 Introduction................................................................................................................... 36
3.2.2 Stored Information.........................................................................................................36
3.2.3 Generating JSON Reports............................................................................................ 37
3.3 PDF Reports................................................................................................................ 38
3.3.1 Introduction................................................................................................................... 38
3.3.2 Stored Information.........................................................................................................39
3.3.3 Generating PDF Reports...............................................................................................47
5 DUT Automation...................................................................................78
5.1 Custom DUT Remote Control Plugins.......................................................................78
5.1.1 Introduction................................................................................................................... 78
5.1.2 Interface Description..................................................................................................... 78
5.1.3 Configuring R&S CONTEST......................................................................................... 79
5.1.4 Example Visual Studio Solution.................................................................................... 81
5.2 DUT Automation Applications................................................................................... 81
5.2.1 Introduction................................................................................................................... 81
5.2.2 Interface Description..................................................................................................... 81
5.2.3 Automation Commands.................................................................................................82
5.2.4 Configuring R&S CONTEST......................................................................................... 85
5.3 IP Trigger Applications...............................................................................................86
5.3.1 Introduction................................................................................................................... 86
5.3.2 Interface Description..................................................................................................... 86
Index....................................................................................................102
1 Infrastructure
This chapter describes the infrastructure around R&S CONTEST by illustrating and explaining the relationships between the different topics of this documentation.
1.1 System Controller
R&S CONTEST
During the execution of a test plan, R&S CONTEST writes test report files into the test report directory, which is shared in the network. Additionally, measurement data and test related information is written into a database. If the R&S CONTEST Reportal server is running, the test report of a currently running test case is made available to R&S CONTEST Reportal and can thus be monitored remotely via web browser by any user within the same network as the system controller. Using the built-in Jenkins reporting plugin, a JUnit report is written and stored on the Jenkins server.
For more information refer to:
● Chapter 2.1, "Test Report Directory", on page 11
● the separate R&S CONTEST Reportal documentation
● Chapter 4.3, "Jenkins Integration", on page 75
Remote Server
R&S CONTEST Remote Server is used to remotely perform essential functions on the test system. Controlled by a (possibly remote) SOAP client, R&S CONTEST Remote Server uses the Command Line Interface to interact with R&S CONTEST. R&S CONTEST Remote Server acts as a single interface for all available R&S CONTEST versions.
For more information about R&S CONTEST Remote Server refer to Chapter 4.1, "Remote Server", on page 49.
Figure 1-2: Illustration of the Infrastructure of R&S CONTEST Command Line Interfaces
The applications listed in Figure 1-2 don't represent a comprehensive list, but rather an exemplary selection for illustration purposes. A complete list of all available applications and test cases can be obtained using the --info parameter of the respective Command Line Interface (see Table 4-5).
For more information about the Command Line Interface refer to Chapter 4.2, "Command Line Interface", on page 60.
Test Report Directory
Using R&S CONTEST Report Transfer Service, the files can be replicated to the test report directory on a Central Report Server.
For more information about the test report directory refer to Chapter 2.1, "Test Report Directory", on page 11.
Database
The database on the system controller holds measurement data and test related information written by R&S CONTEST during the execution of a test plan. It is used by R&S CONTEST Report Manager for finding, viewing and comparing past test case runs. Using R&S CONTEST Report Transfer Service, database entries can be replicated to the database on a Central Report Server.
Reportal
R&S CONTEST Reportal uses measurement data and test information provided by R&S CONTEST to remotely display test reports in a web browser. Similar to the Online Report, R&S CONTEST Reportal allows a live view of a currently running test report, as well as all test case reports of the most recently run test plan. It can be accessed from any client computer within the same network as the system controller.
For more information about R&S CONTEST Reportal refer to the separate R&S CONTEST Reportal documentation.
1.2 User
In this context, user refers to a client computer within the same network as the system
controller.
SOAP Client
Using a SOAP client, R&S CONTEST Remote Server can be accessed in order to remotely perform essential functions on a test system. The SOAP client is not provided by Rohde & Schwarz.
For more information about R&S CONTEST Remote Server refer to Chapter 4.1,
"Remote Server", on page 49.
Report Manager
R&S CONTEST Report Manager uses the test report directory and the database on the system controller to access measurement data and test reports generated by the test system. In case a Central Report Server is set up, R&S CONTEST Report Manager can also access the test report directory and database of the server. Using R&S CONTEST Report Transfer Service, R&S CONTEST Report Manager is furthermore able to replicate test reports and database entries from the system controller to a Central Report Server.
For more information about R&S CONTEST Report Manager refer to the separate
R&S CONTEST Report Manager documentation.
Browser
Via the web browser on a user's computer, the status of a currently running test plan, as well as its test case reports, can be monitored by accessing R&S CONTEST Reportal. Additionally, by accessing the web front end of Jenkins, the automatic and unattended execution of test plans can be set up and scheduled.
For more information refer to:
● Chapter 4.3, "Jenkins Integration", on page 75
● the separate R&S CONTEST Reportal documentation
1.3 Central Report Server
Database
The database on a Central Report Server contains copies of database entries that have been transferred from the database on a system controller using R&S CONTEST Report Transfer Service. This transfer can either be performed via R&S CONTEST directly or using R&S CONTEST Report Manager.
For more information about the database on a Central Report Server refer to Chapter 2.2, "Central Report Server", on page 16.
2 Data Handling
2.1 Test Report Directory
2.1.1 Introduction
R&S CONTEST uses a Windows share to handle test reports and related files. The default name of the network share is ContestReports, pointing to the test report directory ContestReports, located in the C:\ root directory on the system controller. This directory is read-only except for the system and authorized users. The name and location of the test report directory are configurable using the Report Folder Manager that comes with R&S CONTEST and R&S CONTEST Report Manager.
During the execution of a test plan, a folder is created for each test plan run, containing subfolders for each test case run in the test plan. The test report directory is structured as follows:
Figure 2-1: Example of the folder structure within the test report directory
The default network path of the shared test report directory is \\<ComputerName>\ContestReports. Note that in this path, <ComputerName> is a placeholder for the actual name of the computer.
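For example, the share can be mapped to a drive letter on a client computer using the Windows NET USE command; the computer name MyController and the drive letter R: below are placeholders only:
Example:
NET USE R: \\MyController\ContestReports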
The following sections describe the information that is stored in the test report directory
and common tasks.
2.1.2 Stored Information
The following sections list the types of information stored in the test report directory. Depending on the configuration, parametrization and available licenses, not all elements may be present in the test report directory.
ContestReports Folder
Test plan (Folder)
    One folder per test plan run. Contains the respective test case folders and related files.
Test system error log (File, .txt)
    Error log files for each test system writing into the Windows share. These files are only generated when errors occur.
R&S CONTEST test plan (File, .rstt)
    A standard R&S CONTEST test plan file.
Error log (File, .txt)
    Error log files. These files are only generated when errors occur during the execution of the test plan.
Test Result Header Report (File, .xml)
    The Test Result Header report file.
2.1.3 Changing the Name or Location of the Test Report Directory
To change the name or the location of the test report directory, proceed as follows:
1. Close any open R&S CONTEST instances.
4. From the "Programs" section of the search results, select Report Folder Manager.
The "CONTEST Report Folder Manager" dialog appears.
6. Navigate to the desired folder, or create a new folder using the "Make New Folder" button.
2.1.4 Importing Test Reports into the Test Report Directory
To import test reports into the test report directory, proceed as follows:
1. Close any open R&S CONTEST instances.
4. From the "Programs" section of the search results, select Report Folder Manager.
The "CONTEST Report Folder Manager" dialog appears.
2.2 Central Report Server
2.2.1 Introduction
During the execution of a test plan, related meta information and measurement data from test case runs are stored in a local database. Associated test report files such as Online Reports or the Summary Report are written into the test report directory. By setting up a central report server, the database entries and their associated files can be transferred to an external machine in the network. The transfer can either be performed manually, using R&S CONTEST or R&S CONTEST Report Manager, or automatically, following specified rules. To achieve this, a database server and a file server need to be set up, and R&S CONTEST has to be configured accordingly. The database server and the file server can be configured separately and don't necessarily have to run on the same network computer.
The main benefit of the central report server is that it can be used to store information generated by multiple test systems in one central location. Users of R&S CONTEST Report Manager, for example, will then have control over all test reports generated by all test systems. If the Report Analyzer license (R&S TS8-KT150) has been acquired for one R&S CONTEST Report Manager accessing the central report server, it will automatically be inherited by all other R&S CONTEST Report Manager clients connecting to the same central report server.
2.2.2 Setting up the Database Server
As the database on the central report server serves as a backup for the local R&S CONTEST database, the schema of both databases has to be the same. To ensure the correct setup of the database server, it is recommended to perform the installation using R&S MCT Installation Manager. If you are unfamiliar with installing software using R&S MCT Installation Manager, refer to the Software Installation manual.
In R&S MCT Installation Manager, install the most recent official PostgreSQL version. After the installation of the database server using R&S MCT Installation Manager, the database setup matches that of the local R&S CONTEST database: the schema and the access data are the same.
After the database server is set up, R&S CONTEST can be configured accordingly.
2.2.3 Setting up the File Server
The central report server uses the file server functionality of the operating system by means of a network share. The easiest way to create a network share under Windows is by using the NET SHARE command with the following parameters:
NET SHARE <ShareName>=<PathToFolder> /GRANT:<UserName>,<Permissions>
Permissions can be either READ, CHANGE or FULL.
Example:
NET SHARE ContestReports=C:\ContestReports /GRANT:SampleUser,FULL
(written all in one line)
This command creates a network share named ContestReports, pointing to C:\ContestReports, with full permissions for the user SampleUser. A complete list of parameters for the NET SHARE command is given by entering NET SHARE /HELP.
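To verify the result, NET SHARE can also be called with just the share name; Windows then displays the properties of that share:
Example:
NET SHARE ContestReports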
After the file server is set up, R&S CONTEST can be configured accordingly.
2.2.4 Configuring R&S CONTEST
After setting up the database server and the file server, R&S CONTEST can be configured to access the respective servers and automatically transfer data and test reports to the central report server. The configuration is done via the "CONTEST Report Server Settings" dialog.
The report server settings can be saved and loaded into other R&S CONTEST and R&S CONTEST Report Manager applications to ensure consistency between the configurations.
In the "CONTEST Report Server Settings" dialog, the "Database Server Settings" tab page contains the following GUI elements:
"Database Host"
    The database host can be given either by the logical network name or the IP address of the server where the database server is running.
"Password"
    The password set for the database user. The password for the database user contest is contest@1sp1 and cannot be changed.
"Test Database Creation and Access"
    Button to test the database connection and the possible creation of a new database.
To configure the access to the database server on the central report server, proceed
as follows:
1. In the R&S CONTEST menu bar, select "Settings" → "CONTEST Report Server
Settings"
The "CONTEST Report Server Settings" dialog appears.
3. In the "Database Server Settings" tab page, enter the following information:
● Database Host
● Database Name
● Port
5. If the connectivity test was successful, click the "OK" button to confirm the database server settings.
R&S CONTEST is now able to copy data to the specified database server. In order
to configure an automatic transfer of data and reports, refer to Chapter 2.2.5.1,
"Transferring Reports Automatically", on page 21.
In the "CONTEST Report Server Settings" dialog, the "File Server Settings" tab page contains the following GUI elements:
"Use database server also for file server"
    Checkbox to specify whether the host specified in the "Database Server Settings" tab page should also be used as the host for the file server.
"Server Name or IP or Domain Name"
    The file server host. It can be specified by server name, IP address or domain name. Only available when "Use database server also for file server" is disabled.
"Share Name"
    The name of the network share. Only existing network shares can be specified. By default, the name of the network share is ContestReports.
"Report Folder"
    Path to the desired test report directory within the specified network share.
"Provide credentials for file server access"
    Checkbox to specify whether credentials are to be provided while accessing the file server. If no credentials are provided, the currently logged-in user's credentials are used for the connection attempt.
"Domain\User Name"
    Domain name or user name for accessing the file server. Only available if "Provide credentials for file server access" is enabled.
"UNC Path to Report Location"
    Indicates the UNC path to the test report directory on the central report server.
"Test File Server Access"
    Button to test the file server connection.
To configure the access to the file server on the central report server, proceed as fol-
lows:
1. In the R&S CONTEST menu bar, select "Settings" → "CONTEST Report Server
Settings"
The "CONTEST Report Server Settings" dialog appears.
3. In the "File Server Settings" tab page, enter the required information.
5. If the connectivity test was successful, click the "OK" button to confirm the file server settings.
R&S CONTEST is now able to copy test reports to the specified file server. In order
to configure an automatic transfer of data and reports, refer to Chapter 2.2.5.1,
"Transferring Reports Automatically", on page 21.
2.2.5 Transferring Reports
With R&S CONTEST and R&S CONTEST Report Manager, data and test reports can be transferred automatically or manually to a central report server. The actual transfer process is handled by R&S CONTEST Report Transfer Service. For more information about R&S CONTEST Report Transfer Service refer to Chapter 2.3, "Report Transfer Service", on page 23. During the transfer process, data and test reports are copied to a central report server. The original data and test reports remain on the local system controller.
2.2.5.1 Transferring Reports Automatically
In the "CONTEST Report Server Settings" dialog, the "Automatic Transfer Settings" tab page contains the following GUI elements:
"Enable automatic report transfer for the selected verdicts to this server"
    Checkbox to specify whether data and test reports should be automatically transferred to the central report server.
"Passed and Passed with Restrictions"
    Checkbox to specify whether data and test reports of test case runs with the verdicts Passed and Passed with Restrictions should be transferred. Only available when "Enable automatic report transfer for the selected verdicts to this server" is enabled.
"Failed and Failed with Restrictions"
    Checkbox to specify whether data and test reports of test case runs with the verdicts Failed and Failed with Restrictions should be transferred. Only available when "Enable automatic report transfer for the selected verdicts to this server" is enabled.
To configure the automatic transfer of data and test reports to the central report server,
proceed as follows:
1. In the R&S CONTEST menu bar, select "Settings" → "CONTEST Report Server
Settings"
The "CONTEST Report Server Settings" dialog appears.
3. In the "Automatic Transfer Settings" tab page, enable the checkbox "Enable auto-
matic report transfer for the selected verdicts to this server".
This setting does not affect data and test case reports that have been created prior to
enabling the automatic transfer. Previous reports (data and files) have to be transferred
manually to a central report server.
2.2.5.2 Transferring Reports Manually
With R&S CONTEST or R&S CONTEST Report Manager, data and test reports of test plan or test case runs can be manually transferred to a central report server. The manual transfer is performed via the context menu of a test plan or test case report. The following procedure describes the manual transfer in R&S CONTEST.
To manually transfer data and test reports to a central report server using R&S CONTEST, proceed as follows:
1. In the explorer pane (on the left), select the "Reports" tab.
A list of test plan runs appears in the tab page.
2. To transfer data and test reports of a whole test plan run, proceed as follows:
a) Right click on the desired test plan run.
b) In the context menu, select "Copy Selected Report(s) to Central Report Server
(DB: <DatabaseServer>- Files: <FileServer>)"
The data and test reports of the selected test plan run — including all test case
runs — will be transferred.
3. To transfer data and test reports of a test case run, proceed as follows:
a) Double click the desired test plan run.
b) In the test case overview in the right pane, select the desired test case run(s).
c) In the context menu, select "Copy Selected Report(s) to Central Report Server
(DB: <DatabaseServer>- Files: <FileServer>)"
The data and test reports of the selected test case run(s) will be transferred.
In the Windows system tray, the R&S CONTEST Report Transfer Service indicates
the status of the transfer.
2.3 Report Transfer Service
2.3.1 Introduction
R&S CONTEST Report Transfer Service is a service tool that manages the transfer of data and test reports between test systems and central report servers. It replaces the Report Handler integrated in R&S CONTEST and R&S CONTEST Report Manager. While the former Report Handler transferred data and test reports synchronously, R&S CONTEST Report Transfer Service performs the transfer sequentially and in an asynchronous manner, in order to operate independently of R&S CONTEST or R&S CONTEST Report Manager.
It comes with the installation of R&S CONTEST or R&S CONTEST Report Manager and is preconfigured to work "out of the box". The configuration can, however, be customized to match individual requirements. For more information refer to Chapter 2.3.3, "Advanced Configuration", on page 28. R&S CONTEST Report Transfer Service consists of a Windows Service and a status application that are started by R&S CONTEST or R&S CONTEST Report Manager. The service is restarted every 24 hours, and in the event of unexpected failures.
The following sections describe the status application of R&S CONTEST Report Trans-
fer Service and advanced configuration parameters.
2.3.2 Status Application
The status application of R&S CONTEST Report Transfer Service offers a brief status of the data and test report transfer and allows some basic control. It is started upon system start or once the R&S CONTEST Report Transfer Service becomes active. Alternatively, it can be started manually via the "Report Transfer Service" start menu entry. The status application runs as a system tray icon and consists of a status window and a report window. The system tray icon indicates the status of the service:
A left click on the system tray icon reveals the status window, while a right click opens
the context menu, giving access to the following controls:
● Start Report Transfer Service
● Stop service and abort all transfers
● Continue transferring reports
● Pause the transfer of reports
● Open Status Window
● Open Report Window
● Open current log file
● About
● Close Application and Service
2.3.2.1 Overview
Figure 2-3: Screenshot of the R&S CONTEST Report Transfer Service Status Window
In the title bar of the status window, several buttons for controlling R&S CONTEST Report Transfer Service are placed. In particular, these are:
Continue the transfer (after pause or stop) or start a new instance if currently no service is available
Open the Report Window (see Chapter 2.3.2.3, "Report Window", on page 26)
Close a warning
Depending on the current state of R&S CONTEST Report Transfer Service, not all items are displayed.
Default View
The transfer of a report consists of two steps: a transfer to the database server and a
transfer to the file server. In the default view, the number of pending transfers for both
steps is shown. Additionally, the average duration of a transfer and the duration of the
last transfer, as well as the times for the start and end of the last completed transfer
are given.
Warning View
Whenever a report could not be transferred successfully, or another problem occurs which doesn't terminate the service, the status window switches to the warning view and displays a failure message. For a short summary of the most common failure messages and their solutions, refer to Chapter 2.3.2.2, "Failure Outline", on page 26.
By closing the warning, the view changes back to the default view.
Error View
Whenever the service is not available, e.g. because it was terminated by an external event or by an internal error, the status window changes to the error view. As soon as the service is available again, the view changes back.
2.3.2.2 Failure Outline
The following table gives a summary of the most common failure messages and possible solutions:
Database or file server not available
    An error occurred while testing the connection to the destination servers.
    ● Check if the destination servers are reachable from the internet.
    ● Check if the central report server is enabled in R&S CONTEST or R&S CONTEST Report Manager.
    An automatic retry is scheduled.
Currently transferring
    A report with this ID and destination is already pending in R&S CONTEST Report Transfer Service. The newer report will be discarded.
A file is currently used by another process
    The file is still being used by R&S CONTEST or R&S CONTEST Report Manager. An automatic retry is scheduled.
A directory is currently not available
    R&S CONTEST or R&S CONTEST Report Manager have not completed creating all report files. An automatic retry is scheduled.
No temporary file can be created
    The database request returns no valid file path(s). An automatic retry is scheduled.
File transfer was marked as not successful
    The test report directory could not be mounted on the destination file server.
    ● Check if R&S CONTEST Report Transfer Service is running in the currently active user's session.
    ● Check if write permissions for the current user are available on the destination file server.
    An automatic retry is scheduled.
Aborted by an unknown reason
    The transfer of a report couldn't be performed properly for reasons not caused by R&S CONTEST Report Transfer Service.
    ● Check the log file for more information.
    An automatic retry is scheduled.
2.3.2.3 Report Window
The Report Window of R&S CONTEST Report Transfer Service is a tabular display of transfer jobs. Using the icon, the history of the last 7 days can be loaded from the server into the job list. By clicking on a row, further information about the selected job is displayed.
Figure 2-6: Screenshot of the Report Window within the status application of R&S CONTEST Report Transfer Service
"DB"
    Database transfer. The icon represents the status of the respective job.
"FS"
    File server transfer. The icon represents the status of the respective job.
"Enque-time"
    Date and time the job has been added to the queue.
"Finished"
    The time the respective job has been finished or aborted.
In the "DB" and "FS" columns, the different icons represent the status of the respective job:
Success
Pending
2.3.3 Advanced Configuration
R&S CONTEST Report Transfer Service inherits the configuration of the destination servers from R&S CONTEST or R&S CONTEST Report Manager. Therefore, changes to the configuration are not required to use R&S CONTEST Report Transfer Service. A customization of the configuration is, however, possible via modification of R&S CONTEST's global configuration file Contest.Settings.xml. For this, a node <ReportTransferService> has to be added directly within the <ConfigurationSettings> node:
Example:
<ConfigurationSettings>
<ReportTransferService>
<SpoolingBase>
C:\ProgramData\Rohde-Schwarz\Contest\Common\Data\ReportTransferService
</SpoolingBase>
<LogDirectory>C:\ContestReports</LogDirectory>
<OpenEmpty>False</OpenEmpty>
<RetryTimeout>00:01:00</RetryTimeout>
<WorkerTimeout>04:00:00</WorkerTimeout>
<AddressBase>
http://localhost:8181/RohdeSchwarz/Contest/ReportTransferService
</AddressBase>
<AddressExtension-Slave>Slave</AddressExtension-Slave>
<AddressExtension-Status>Status</AddressExtension-Status>
</ReportTransferService>
</ConfigurationSettings>
<SpoolingBase>
    Specifies the path where R&S CONTEST Report Transfer Service creates the file spoolers for queuing the pending transfers. The SpoolingBase does not need much space but has a high access frequency. The default value is <ProgramData>\Rohde-Schwarz\Contest\Common\Data\ReportTransferService. Note that in this path, <ProgramData> stands for the full path to the ProgramData folder.
<LogDirectory>
    Specifies the path where R&S CONTEST Report Transfer Service saves its log files. Initially this is the test report directory used by R&S CONTEST.
<OpenEmpty>
    R&S CONTEST Report Transfer Service queues all pending transfers persistently in the SpoolingBase and is able to restore reports that have not yet been transferred on the next startup. By setting <OpenEmpty> to True, R&S CONTEST Report Transfer Service ignores available items on startup and aborts their transfer. The default value is False.
<RetryTimeout>
    Each time the transfer of a report fails, the transfer is retried after the given timeout. The timeout has to be defined in the format "hh:mm:ss". By default the timeout is set to 1 minute (00:01:00).
<WorkerTimeout>
    The timespan for the execution of a job. After the given timeout, the job is aborted and a retry is enqueued. The timeout has to be defined in the format "hh:mm:ss". By default the timeout is set to 4 hours (04:00:00).
3 Reporting
3.1 Test Result Header Reports
3.1.1 Introduction
Test Result Header reports are XML-based report files that have been introduced by CETECOM in order to facilitate the exchange of data generated by a test system when a conformance test case is performed. This open interface standard allows the automatic processing and exchange of test case data consistently across different project management software tools.
These Test Result Header reports contain information about the test system used to perform the test case, as well as information about the tested DUT and the test case run. Rather than including actual measurement data, the file paths to the trace and test result files such as Summary Report and Online Report are listed.
3.1.2 Stored Information
The information stored in Test Result Header reports is structured in the following main nodes:
● Test equipment
● User equipment
● Test case
● Test variables
● Test execution
The following sections give an overview of the contents of each main node, followed by
an example XML file.
The <testequipment> node lists the test systems involved in the test and their respective hardware components, including firmware and serial number. Additionally, the test case name and version are listed, as well as the test system software.
The <userequipment> node provides information about the DUT and its configuration. This information can either be generated automatically from parameters set in the "DUT Configuration" dialog or entered manually in the "Test Result Header Generator" dialog.
The <testcase> node contains the name and title of the test case, as well as the test specification and its version.
The <testexecution> node contains information about the test case run: start time, duration, operator, result (verdict), additional information, and file paths to other test related files such as trace files, the Summary Report or the Online Report.
Because some elements or attributes are optional, the following example does not represent a comprehensive list of all available nodes. The elements have been left empty for demonstration purposes.
Example:
<?xml version="1.0" encoding="UTF-8"?>
<testresultheader headerversion="1.4" schemaversion="1.4.2">
<testequipment>
<testsystems>
<testsystem>
<name></name>
<manufacturer></manufacturer>
<hardwaredevices>
<hardwaredevice>
<name></name>
<firmware></firmware>
<serialnumber></serialnumber>
</hardwaredevice>
<hardwaredevice>
<name></name>
<firmware></firmware>
<serialnumber></serialnumber>
</hardwaredevice>
<hardwaredevice>
<name></name>
<firmware></firmware>
<serialnumber></serialnumber>
</hardwaredevice>
</hardwaredevices>
<softwareparts>
<softwarepart type="testcase">
<name></name>
<version></version>
</softwarepart>
</softwareparts>
</testsystem>
</testsystems>
<testplatformnumber></testplatformnumber>
</testequipment>
<userequipment>
<devicecode></devicecode>
<configurationcode></configurationcode>
</userequipment>
<testcase>
<name></name>
<title></title>
<testspecification>
<name></name>
</testspecification>
</testcase>
<testvariables>
<conditions>
<voltage value="normal"/>
<temperature value="normal"/>
<vibration value="none"/>
</conditions>
<bands>
<band type="FDD 1" index="1" />
<band type="GSM 900" index="2" />
</bands>
<parameters>
<parameter type="domain"></parameter>
<parameter type="other"></parameter>
</parameters>
<limitations>
<limitation></limitation>
</limitations>
</testvariables>
<testexecution>
<starttime></starttime>
<duration></duration>
<operator></operator>
<result type="pass"/>
<filepaths>
<filepath group="trace">*.*</filepath>
<filepath group="result">*.*</filepath>
<filepath group="any">*.*</filepath>
</filepaths>
</testexecution>
</testresultheader>
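As a minimal sketch (not part of the R&S CONTEST delivery), such a report could be processed in C# to collect the verdict and the referenced files; the report file name used here is hypothetical:
Example:
// Requires: using System; using System.Xml.Linq;
var doc = XDocument.Load(@"C:\ContestReports\SampleTestPlan\SampleTestCase\TestResultHeader.xml");
// Read the verdict from the <result> element of <testexecution>.
var result = doc.Root.Element("testexecution").Element("result");
Console.WriteLine("Verdict: " + (string)result.Attribute("type"));
// List all referenced trace and result files.
foreach (var fp in doc.Descendants("filepath"))
{
    Console.WriteLine((string)fp.Attribute("group") + ": " + fp.Value);
}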
3.1.3 Configuring the Test Result Header Generator
The "Test Result Header Generator" dialog can be accessed via "Settings" → "Test Result Header Generator". This dialog configures the generation of Test Result Header reports for conformance test cases.
The following table describes the parameters within the "Test Result Header Generator" dialog.
Table 3-1: Configuration Parameters for the Test Result Header Generator
"Enable Test-Result Header Generator"
    Enables or disables the generation of Test Result Header reports for conformance tests.
"Acknowledge at Test Plan Start"
    If this parameter is enabled, the generation of Test Result Header reports must be acknowledged once again at the beginning of the test plan execution.
"Combine Test Case Steps (Test Frequencies and Bandwidth)"
    When a conformance test plan is executed, test cases are typically executed several times due to test frequency loops and bandwidth loops.
    If this parameter is enabled, one single Test Result Header report is generated for the repeated execution of a test case due to loops.
    If this parameter is disabled, a Test Result Header report is generated for each execution of a test case.
"Destination Folder"
    Defines where the Test Result Header reports are stored. This is always a folder in the root directory used to store test reports and log files. The folder can be selected or created by clicking "Browse".
    The upload performance of Test Result Header reports can be increased by choosing a dedicated subdirectory in the test report directory.
"Take DUT Service Parameter"
    If this parameter is enabled, the "Device Code" and "Configuration Code" information is not entered manually, but generated from DUT parameters defined in the "DUT Configuration" dialog under "DUT Identity".
3.1.4 Generating Test Result Header Reports
To enable or disable the generation of Test Result Header reports, proceed as follows:
1. From the "Settings" menu, select "Test Result Header Generator".
The "Test Result Header Generator" dialog opens:
2. In this dialog, the generation of Test Result Header reports can be enabled and disabled:
a) To enable the generation of Test Result Header reports, check "Enable Test-Result Header Generator".
b) To disable Test Result Header reports, remove the checkmark.
3.2 JSON Reports
3.2.1 Introduction
JSON (json.org, pronounced "Jason", short for JavaScript Object Notation) is a simple and open data serialization format. Established libraries for reading and writing JSON exist for all major programming languages, which makes it well-suited for interchanging data between applications.
Contrary to Online Reports or Summary Reports, JSON reports are not intended as a human-readable format. The JSON report serves the purpose of passing measurement data on to post-processing tools or client-specific database systems. While the structure of both the Online Report and the Summary Report may change without notice in order to accommodate layout and design changes, the JSON report is intended as a stable format for measurement data.
For processing measurement data, please use JSON reports. Online Reports and Summary Reports are subject to change and should therefore not be used for processing measurement data.
3.2.2 Stored Information
If enabled, the JSON Report Generator creates a JSON file for each test case when a test case run is finished. The file is located in the respective test case report folder and named report.json.
Example:
c:\ContestReports\SampleTestPlan\SampleTestCase\report.json
In a JSON report, the data is stored in a tree structure, where internal nodes are either dictionaries or arrays, and leaves are either numbers, strings, booleans, or null. The data is divided into two main nodes:
● a header with basic information about test system, DUT, and test case parameters,
● and an array of consecutive measurement nodes, each containing the actual measured data and measurement parameters.
Example:
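The following sketch is illustrative only: the field names and values are hypothetical, chosen to mirror the structure described above, and are not taken from an actual report:
{
  "header": {
    "testsystem": "TS8980",
    "dut": "SampleDut",
    "testcase": "TestcaseLteAclr_6_6_2_3"
  },
  "measurements": [
    {
      "parameters": { "band": "FDD 1", "bandwidth": "5 MHz" },
      "results": [ -32.1, -33.4 ]
    }
  ]
}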
3.2.3 Generating JSON Reports
2. In this dialog, the generation of JSON reports can be enabled and disabled:
a) To enable the generation of JSON reports, check "Enable JSON report generation".
b) To disable JSON reports, remove the checkmark.
Following your selection, JSON reports will from now on be generated, or no longer be generated, when a test case run is finished.
3.3 PDF Reports
3.3.1 Introduction
PDF reports have been introduced in the context of Custom Summary Reports, in order to facilitate the exchange of test reports in a human-readable format. When "Pdf file" is selected in the "Save Custom Summary Report" dialog in R&S CONTEST Report Manager, a PDF file is created that lists an overview of the selected test cases, as known from SummaryReportsOverview.xml files. Additionally, the respective test case reports are embedded in the PDF file as PDF File Attachments.
Custom Summary Reports are generated on demand using R&S CONTEST Report Manager. They are comparable to Summary Reports Overview reports; the difference is that a Custom Summary Report is not the report of a test plan, but rather a project report, freely composed of the desired test case reports.
For more information about Custom Summary Reports, refer to the user manual of R&S CONTEST Report Manager.
3.3.2 Stored Information
Both the Custom Summary Report and the test case reports have a cover sheet giving meta information about the project or the test case, followed by the actual contents.
The information stored in a Custom Summary Report generally complies with the information in a SummaryReportsOverview.xml file:
● Number of test cases
● Verdict distribution
● List of all test cases within the custom project
– Test case title
– Verdict
– Limitations
– Software-Versions
– Start time and duration
– Observation
Additionally, a Custom Summary Report holds the following information:
● Custom project title
● Custom comment
● Custom logo
● Embedded PDF files of all test case reports
By double-clicking the paperclip icon of a test case in the list, the respective embedded test case report is opened. Alternatively, the embedded test case reports can be opened or saved individually from the PDF File Attachments panel of a PDF viewer. Using Adobe Reader, this panel is opened automatically when opening the Custom Summary Report PDF file.
Figure 3-5: Example of attached test case reports within a Custom Summary Report PDF file
Because the PDF versions of test case reports are generated from Summary Report XML files, the contents are identical.
Summary
Table 3-2: Test Case Information
The following overview lists the test case information items and the report styles ("Reduced", "Standard", "Extended") in which they are included:
Observation: Reduced, Standard, Extended
Operator Name: Extended
Operator Parameter: Extended
Identifier: Standard, Extended
Product Type: Standard, Extended
Interconnection Details: Standard, Extended
Calibration Information: Extended
DUT Identifier: Reduced, Standard, Extended
DUT Manufacturer: Standard, Extended
DUT HW Revision: Reduced, Standard, Extended
DUT SW Revision: Reduced, Standard, Extended
SIM Identifier: Reduced, Standard, Extended
SIM Manufacturer: Extended
SIM Revision: Extended
SIM Parameters: Standard, Extended
Tested Band(s): Reduced, Standard, Extended
Tested Bandwidth: Standard, Extended
Test Conditions: Reduced, Standard, Extended
Step Number: Standard, Extended
Description: Standard, Extended
Result: Standard, Extended
Supplementary Information:
    Reduced: not available
    Standard: only external links and charts
    Extended: all contents
3.3.3 Generating PDF Reports
PDF reports are generated using the Custom Summary Report functionality of R&S CONTEST Report Manager. If no PDF files for the selected test case report(s) exist, they are generated from the respective SummaryReport.xml files. In case PDF files have already been generated, they are used instead.
Because PDF reports are generated from SummaryReport.xml files, their generation is not possible without the available XML files.
Because the generation of PDF files for Summary Reports is a time-consuming process, a setting has been implemented in R&S CONTEST that allows the automatic generation of Summary Report PDF files at the end of a test case run. This way, the generation of Custom Summary Reports in PDF format is performed significantly faster.
Enabling this setting will increase the test case run time.
In order to activate the automatic generation of PDF test case reports, proceed as follows:
1. From the R&S CONTEST menu bar, select "Settings" → "Summary Report".
The "Summary Report" dialog appears.
4. From the "Select report style" field, select the desired information density option.
For more information about the information density options refer to Chapter 3.3.2.2,
"Test Case Report", on page 42.
The PDF files are located in the respective test case's report folder. The naming con-
vention is as follows:
TestCaseName_TestCaseNumber_PageOrientation_InformationDensity.pdf
Example:
TestcaseLteThroughput_7_6_1_portrait_extended.pdf
Using R&S CONTEST Report Manager, PDF reports can be generated on demand,
even if the automatic PDF generation is disabled in the Summary Report settings of
R&S CONTEST.
2. Select the desired test case run(s) from the right panel.
3. Right click on the selected test case run(s) to open the context menu.
6. Select the desired page orientation and information density option ("Style").
7. Specify a destination folder and filename (the file extension .pdf will be added
automatically).
4.1 Remote Server
4.1.1 Introduction
R&S CONTEST Remote Server is based on SOAP technology and serves the purpose of remotely performing essential functions on the test system. Controlled by a (possibly remote) SOAP client, R&S CONTEST Remote Server uses the Command Line Interface to interact with R&S CONTEST. It acts as a single interface for all available R&S CONTEST versions and their respective Command Line Interfaces.
For more information on the R&S CONTEST Command Line Interface and its functionality, refer to Chapter 4.2, "Command Line Interface", on page 60.
For the start of a test case with parameters, R&S CONTEST Remote Server acts as a single transparent interface to all installed R&S CONTEST application versions. The start of a test plan, however, is always tied to the version with which the test plan has been created and stored using the R&S CONTEST GUI.
The following figure illustrates the main architecture.
Figure 4-1: Illustration of the R&S CONTEST Remote Server and Command Line Interface architecture
The following prerequisites must be fulfilled in order to start R&S CONTEST Remote Server:
● The newest version of the package R&S CONTEST Remote Server must be installed on the system controller. Depending on the version, it comes as a single installer (e.g. RS-CONTEST-REMOTE-SERVER_14.00.0.170.msi).
● The test system must be configured by means of the R&S CONTEST GUI.
● All mandatory DUT properties (cabling, automation, standard properties) must be configured by means of the R&S CONTEST GUI. During the remote test plan or test case run, a DUT can be activated.
The SOAP server is now running with the standard IP port 65111. It is not necessary to start the R&S CONTEST GUI in order to use the R&S CONTEST Remote Server functionality.
Please ensure that the relevant port number is not blocked by any firewall.
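For example, an inbound rule for the default port can be added to the Windows Firewall with the following command (a sketch assuming administrative rights; the rule name is arbitrary):
Example:
netsh advfirewall firewall add rule name="CONTEST Remote Server" dir=in action=allow protocol=TCP localport=65111
(written all in one line)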
The SOAP interface description can be retrieved by means of a generated WSDL file. With R&S CONTEST Remote Server running, a WSDL file containing the interface description can be generated by invoking a URL with the following structure:
http://<SystemControllerHostNameOrIP>:<Port>/RemoteServer/RemoteServer?wsdl
Example:
http://ts8980:65111/RemoteServer/RemoteServer?wsdl
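From this WSDL file, a client proxy class such as the RemoteServerService used in the examples below can be generated. A minimal sketch, assuming the .NET Framework SDK tool wsdl.exe is available on the client computer:
Example:
wsdl.exe /language:CS /out:RemoteServerService.cs http://ts8980:65111/RemoteServer/RemoteServer?wsdl
(written all in one line)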
For demonstration purposes, a C# code sample is provided with the installation of R&S CONTEST by means of a Microsoft Visual Studio solution. A .zip file containing the solution is located in
C:\Program Files (x86)\Rohde-Schwarz\Contest\<ContestMajorVersionNumber>\Docs\ContestRemoteServerSampleClient.zip
Note that in this path, <ContestMajorVersionNumber> stands for the actual R&S CONTEST major version number (e.g. 14).
The file Program.cs illustrates the usage of the interface. For further details please refer to the in-code documentation.
The SOAP interface provides a set of functions to set parameters, to perform the requested actions, and to retrieve information.
4.1.4.1 DoSetParameterValue(<Parameter>,<Value>)
This function sets parameters which are saved and used within the following call of DoStartOperation(<Action>). The parametrization depends on the use case and differs between starting a test plan and starting a test case.
General Parameters
Testplan (Test plan)
    The path to the standard R&S CONTEST test plan (.rstt), relative to the test plan directory on the system controller (c:\ProgramData\Rohde-Schwarz\Contest\<ContestMajorVersionNumber>\Testplans), e.g. MySubFolder\MyTestPlan.rstt.
TestplanVersion (Contest major version number)
    Specifies the version that was used for creating and saving the test plan, e.g. 14.
ReportFolder (Report folder name)
    The given folder name must be a relative path. The directory will be created within the test report directory \\<SystemControllerHostNameOrIP>\ContestReports.
Simplified test plans (.xml) are not created beforehand and referenced via the Testplan parameter; instead, they are built up step by step with the following parameters:
AddTestcase (Test case)
    Adds a test case to the current simplified test plan as a string containing the XML element <testcase> and its child elements. For an example of the usage of this parameter refer to Chapter 4.1.5.2, "Starting a Simplified Test Plan", on page 57. For more information refer to "Simplified Test Plan" on page 65 or, for the exact spelling of the test case name and the possible parametrization, to "Available Test Cases" on page 71.
ReportFolder (Report folder name)
    The given folder name must be a relative path. The directory will be created within the test report directory \\<SystemControllerHostNameOrIP>\ContestReports.
4.1.4.2 DoStartOperation(<Action>)
This function performs the desired actions. Depending on the action, the parameters from DoSetParameterValue(<Parameter>,<Value>) are considered.
CreateAvailableDuts
    Requests a file with all available and configured DUTs listed. The output file is written into the local R&S CONTEST report folder and can be retrieved via \\<SystemControllerHostNameOrIP>\ContestReports\DUTs.xml. For the syntax of the generated file please see "Configured DUTs" on page 75.
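A minimal sketch of invoking this action from C#; it assumes the generated proxy exposes the synchronous DoStartOperation method described above and that the system controller is named ts8980, as in the examples below:
Example:
var client = new RemoteServerService();
client.Url = "http://ts8980:65111/RemoteServer";
// Request the list of configured DUTs; the server writes DUTs.xml
// into the shared test report directory.
client.DoStartOperation("CreateAvailableDuts");
// Read the generated file from the network share.
var duts = System.IO.File.ReadAllText(@"\\ts8980\ContestReports\DUTs.xml");
Console.Write(duts);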
4.1.4.3 DoGetOutput()
This function returns the latest console output of the running test case. Each call clears the internal buffer, so the next call returns only the output produced since the previous call.
4.1.4.4 DoGetLastExitCode()
This function will always return a code stating the verdict. A complete list of the return codes is available in Chapter 4.2.4.2, "Process Return Code", on page 70.
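A minimal sketch of querying the verdict code after a run has finished, using the same client setup as in the examples below; the interpretation of the individual values is deliberately left to the return code table:
Example:
var client = new RemoteServerService();
client.Url = "http://ts8980:65111/RemoteServer";
// Retrieve the verdict code of the last run; see Chapter 4.2.4.2,
// "Process Return Code", for the meaning of the individual values.
var exitCode = client.DoGetLastExitCode();
Console.WriteLine("Last exit code: " + exitCode);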
4.1.5 Examples
4.1.5.1 Starting a Test Plan
Example:
var client = new RemoteServerService();
client.Url = "http://ts8980:65111/RemoteServer";
client.DoSetParameterValue("TestplanVersion", "14");
client.DoSetParameterValue("Testplan", "Remote.rstt");
client.DoStartOperationAsync("StartTestplan");
while (true)
{
    var output = client.DoGetOutput();
    if (!string.IsNullOrEmpty(output))
    {
        Console.Write(output);
    }
    // Poll at a moderate rate; sleeping outside the if-block also
    // avoids busy-waiting while no new output is available.
    Thread.Sleep(100);
}
4.1.5.2 Starting a Simplified Test Plan
Note that in the following example, the same test case is added twice, with different <TestFrequencyIDs> for each.
Example:
var client = new RemoteServerService();
client.Url = "http://ts8980:65111/RemoteServer";
client.DoSetParameterValue("TestplanVersion", "14");
client.DoSetParameterValue("KeepContestAlive", "true");
client.DoSetParameterValue("AddTestcase", @"
<Testcase Name=""TestcaseLteAclr_6_6_2_3"">
<Name>LTE FDD 6.6.2.3 Adjacent Channel Leakage power Ratio</Name>
<Description>Adjacent Channel Leakage power Ratio</Description>
<GUID>95A27D80-0BAD-494D-B6FA-F28C818C60A3</GUID>
<Version>RF-LTE-3.40</Version>
<ReleaseDate />
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalConditions>NN;</EnvironmentalConditions>
<LTEBands>FDD 1;</LTEBands>
<Bandwidths>5 MHz;</Bandwidths>
<TestFrequencyIDs>Low;</TestFrequencyIDs>
<BaseVersion>14</BaseVersion>
</Testcase>
");
client.DoSetParameterValue("AddTestcase", @"
<Testcase Name=""TestcaseLteAclr_6_6_2_3"">
<Name>LTE FDD 6.6.2.3 Adjacent Channel Leakage power Ratio</Name>
<Description>Adjacent Channel Leakage power Ratio</Description>
<GUID>95A27D80-0BAD-494D-B6FA-F28C818C60A3</GUID>
<Version>RF-LTE-3.40</Version>
<ReleaseDate />
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalConditions>NN;</EnvironmentalConditions>
<LTEBands>FDD 1;</LTEBands>
<Bandwidths>5 MHz;</Bandwidths>
<TestFrequencyIDs>Mid;</TestFrequencyIDs>
<BaseVersion>14</BaseVersion>
</Testcase>
");
client.DoStartOperationAsync("StartXMLTestplan");
while (true)
{
    var output = client.DoGetOutput();
    if (!string.IsNullOrEmpty(output))
    {
        Console.Write(output);
    }
    // Poll at a moderate rate; sleeping outside the if-block also
    // avoids busy-waiting while no new output is available.
    Thread.Sleep(100);
}
4.1.5.3 Starting a Test Case
Example:
var client = new RemoteServerService();
client.Url = "http://ts8980:65111/RemoteServer";
client.DoSetParameterValue("TestplanVersion", "14");
client.DoSetParameterValue("Testcase", "TestcaseLteAclr_6_6_2_3");
client.DoSetParameterValue("TestcaseVersion", "RF-LTE-3.40");
client.DoSetParameterValue("EnvironmentalCondition", "NN");
client.DoSetParameterValue("PrimaryBand", "FDD 1");
client.DoSetParameterValue("PrimaryBandwidth", "5 MHz");
client.DoSetParameterValue("PrimaryFrequencyId", "Mid");
client.DoSetParameterValue("ForceAll", "true");
client.DoStartOperationAsync("StartTestcase");
while (true)
{
    var output = client.DoGetOutput();
    if (!string.IsNullOrEmpty(output))
    {
        Console.Write(output);
    }
    // Poll at a moderate rate; sleeping outside the if-block also
    // avoids busy-waiting while no new output is available.
    Thread.Sleep(100);
}
4.2 Command Line Interface
4.2.1 Introduction
4.2.2 Overview
Optionally, a small graphical user interface can be created to show the status of the CommandLineInterface.exe. The test case report messages are displayed as plain unformatted text. It is intended to support debugging or status retrieval. This command line interface GUI can be enabled with the --gui (short form: -g) parameter.
The following picture illustrates the (optional) command line interface GUI.
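For example, the GUI can be enabled when starting a test plan (an illustrative invocation; the test plan path follows the examples further below):
Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -tp c:\ProgramData\Rohde-Schwarz\Contest\<BASE-x.yz>\Testplans\MyTestplan.rstt --gui
(written all in one line)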
4.2.3 Parameters
4.2.3.1 General
--port (-p) <port number>
    Sets the port used to communicate with R&S CONTEST.
--closecontest (-cc)
    Ends the R&S CONTEST process when the command line interface process is ended.
--abort (-ab) <timeout in ms>
    Ends all command line interface processes and thereby aborts all running test plans that have been started with the command line interface. This parameter can be combined with --closecontest in order to end the R&S CONTEST process as well.
    First, the application is stopped in a controlled way, i.e. it waits for any running measurements, the report is closed, and the devices are reset properly. After the specified timeout the process is ended forcibly. In that case the software has no chance to react in a proper way, so please ensure a suitable timeout.
--testsystemsettings (-tss) <path name>
    Sets the file name with full path for the test system settings file. This is used for automated regression systems, when the current system configuration is replaced by the R&S CONTEST installer. This file holds the differences to the standard installation and can be generated automatically within the R&S CONTEST GUI test system dialog. Please refer to the R&S CONTEST Help as well.
--availabletestcases (-at) <.xml file name>
    Requests a file with all available test cases listed. The given .xml file name is the output file generated by the program, which can be created at any (writable) location. Each test case entry additionally contains the following information:
    ● Test case name
    ● Version
    ● Assembly Version
    ● Bands supported
    ● Bandwidth supported
    ● Frequency range value supported
    ● Environmental condition supported
    For the syntax of the generated file please see "Available Test Cases" on page 71.
--configuredduts (-cd) <.xml file name>
    Requests a .xml file with all available and configured DUTs (devices under test) listed. For the syntax of the generated file please see "Configured DUTs" on page 75. DUTs are configured by means of the R&S CONTEST GUI. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.
Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -at c:\Alltestcases.xml
(written all in one line)
This method is used to start test plans. There are two possibilities to start test plans:
● Start a standard R&S CONTEST test plan (.rstt)
● Start a simplified test plan (.xml)
The following parameters are applicable to both possibilities:
Table 4-3: Parameters for starting a test plan with the Command Line Interface
--keepalive (-ka)
    Keeps the command line interface GUI alive, even if an error occurred. This is for debugging purposes, to watch the latest error messages.
--activedut (-ad) <DUT name>
    Activates the given DUT (Device Under Test). The name must be identical to one given in the output of --configuredduts. DUTs are configured by means of the R&S CONTEST GUI. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.
--testplan (-tp) <test plan>
    Sets the test plan that should be executed. The test plan name must be specified with a fully qualified path! The file extension for a standard R&S CONTEST test plan is .rstt, while the file extension for a simplified test plan is .xml.
--runandrepeat (-rnr)
    Enables the run and repeat mode, known from the R&S CONTEST GUI. If this parameter is set, the settings defined in the "R&S CONTEST" → "Settings" → "Test Case Run Manager" dialog will be used for the test execution. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.
--campaign (-c) <campaign GUID>
    Assigns the test plan to the specified campaign. The campaign is specified by its identifier (GUID). The GUID of a campaign can be obtained from the Campaign Manager in the R&S CONTEST GUI or R&S CONTEST Report Manager. Campaigns are configured by means of the R&S CONTEST GUI. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.
Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -tp c:\ProgramData\Rohde-Schwarz\Contest\<BASE-x.yz>\Testplans\MyTestplan.rstt
(written all in one line)
Example:
In a simplified test plan, each parameter entry holds a single value:
Wrong: <TestFrequency>Low;Mid;High</TestFrequency>
Right: <TestFrequency>Low;</TestFrequency>
In addition to the parameters shared with starting a standard R&S CONTEST test
plan, a simplified test plan has the following parameters:
Table 4-4: Additional parameters for starting a simplified test plan with the Command Line Interface
For further details, refer to "Simplified Test Plan" on page 74.
The test cases run exactly as stated, i.e. no looping, sorting or repeating is done
within R&S CONTEST.
Starting a simplified test plan cannot be combined with Chapter 4.2.3.4, "Start Test
Case", on page 65.
Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -tp c:\ProgramData\Rohde-Schwarz\Contest\<BASE-x.yz>\Testplans\MyTestplan.xml
(written all in one line)
Example:
Multiple values for a parameter can be given separated by comma or semicolon:
5 MHz, 10 MHz, 15 MHz
or
5 MHz; 10 MHz; 15 MHz
If no parameter is given, the test case runs as known from the standard R&S
CONTEST test plan: the test case is started and, depending on the 3GPP
specification, the applicable combinations of the parameter values are executed.
The following parameters are applicable to starting a single test case:
Table 4-5: Parameters for starting a test case with the Command Line Interface
--keepalive (-ka)
Keeps the command line interface GUI alive, although an error occurred. This is for
debugging purposes, to watch the latest error messages.

--dut (-dt) <DUT number>
Sets the DUT number, as configured within the DUT MUX (only if the DUT MUX is
enabled).
Note that in the case of starting a single test case, the --dut parameter is used
differently than for starting a test plan!
DUTs are configured by means of the R&S CONTEST GUI. For more information,
refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

--activedut (-ad) <DUT name>
Activates the given DUT (device under test). The name must be identical to the name
given out by --configuredduts.
DUTs are configured by means of the R&S CONTEST GUI. For more information,
refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

--testcase (-tc) <test case>
Sets the test case that should be executed.

--version (-v) <version>
Sets the version of the test case. This parameter is mandatory.

--runandrepeat (-rnr)
Enables the run and repeat mode, known from the R&S CONTEST GUI. If this
parameter is set, the settings defined in the "R&S CONTEST" → "Settings" → "Test
Case Run Manager" dialog are used for the test execution.
For more information, refer to "Modifying Settings in the R&S CONTEST GUI" on
page 61.

--campaign (-c) <campaign GUID>
Assigns the test case to the specified campaign. The campaign is specified by its
identifier (GUID). The GUID of a campaign can be obtained from the Campaign
Manager in the R&S CONTEST GUI or the R&S CONTEST Report Manager.
Campaigns are configured by means of the R&S CONTEST GUI. For more
information, refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.
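A minimal invocation typically needs at least the test case and its version; the following
sketch reuses the test case name and version from the example further below (both as
produced by --availabletestcases):
Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -tc TestcaseLtePower_6_2_2 -v RF-LTE-2.70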
Additionally, test cases may have their own parameters. Using the --info parameter,
a list of available test case parameters can be shown.
The following list is only an example of test case parameters; it is not complete and
not applicable to all test cases.
Table 4-6: Example of parameters for starting a test case with the Command Line Interface
--primaryband (-pd) <band>
Sets the bands for the test case. For RRM or carrier aggregation test cases, this
applies to the primary cell only.
Values: FDD 1, FDD 2, TDD 41, …; not set = all

--primarybandwidth (-ph) <bandwidth>
Sets the bandwidth for the test case. This applies to LTE only.
Values: 1.4 MHz, 3 MHz, 5 MHz, 10 MHz, 15 MHz, 20 MHz; not set = all

--primaryfrequency (-py) <frequency>
Sets the frequencies for the test case.
Values: Low, Mid, High; not set = all

--secondaryband (-sd) <band>
Sets the bands of the secondary cell for the test case. This applies to RRM and
carrier aggregation test cases only.
Values: FDD 1, FDD 2, TDD 41, …; not set = all

--secondarybandwidth (-sh) <bandwidth>
Sets the bandwidth of the secondary cell for the test case. This applies to carrier
aggregation test cases only.
Values: 1.4 MHz, 3 MHz, 5 MHz, 10 MHz, 15 MHz, 20 MHz; not set = all

--secondaryfrequency (-sy) <frequency>
Sets the frequencies of the secondary cell for the test case. This applies to carrier
aggregation test cases only.
Values: Low, Mid, High; not set = all

--environment (-et) <environment>
Sets the environmental conditions for the test case.
Values: LL, LH, NN, ML, HL, HH; not set = all
Example:
These parameters are used as follows:
RohdeSchwarz.Contest.CommandLineInterface.exe -tc TestcaseLtePower_6_2_2 -v RF-LTE-2.70 -pd "FDD 5" -ph "5 MHz;10 MHz;15 MHz" -et NN;HL
Written all in one line, this starts the test case with the following setup:
Table 4-7: Overview of the example setup
Test case: TestcaseLtePower_6_2_2
Version: RF-LTE-2.70
Bands: FDD 5
Bandwidths: 5 MHz, 10 MHz, 15 MHz
Environmental conditions: NN, HL
4.2.4 Output
The following sections describe the output of the command line interface. Except for
the Online Report, everything that R&S CONTEST generates is also generated via
the CLI (e.g. the Summary Report).
By default the process writes all report messages to the standard stream stdout.
All test case report messages are presented as unformatted plain text and depend on
the particular test case; in particular, the different application types RF, RRM and PQA
are diverse by nature. The content of a particular test case report is subject to change
without notice.
The following entries are of highest interest for the client. They are common to all
applications and will be kept compatible.
Table 4-8: Output common for all applications
Testplan Report Directory local: <directory>
Reports the absolute path of the current test plan report directory, which contains the
test plan name and the time stamp.

Testplan Report Directory UNC: <directory>
Reports the directory in UNC notation:
\\ComputerName\ReportShare\TestPlanDirectory.
With that information, remote access to the report files is possible.

Verdict: <verdict>
The format in which the final verdict of the test case is reported. The possible values
are:
● Passed
● Passed with Restriction
● Failed
● Failed with Restriction
● Inconclusive
● Not Initialized
● Not Applicable
● Completed
● Aborted
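For clients that start the CLI programmatically, the verdict can be read from the
redirected stdout. The following C# sketch is illustrative only; the test plan path is a
placeholder.

using System;
using System.Diagnostics;

public class VerdictReader
{
    public static void Main()
    {
        // Start the command line interface with redirected stdout.
        var startInfo = new ProcessStartInfo
        {
            FileName = "RohdeSchwarz.Contest.CommandLineInterface.exe",
            Arguments = "-tp c:\\Testplans\\MyTestplan.rstt", // placeholder path
            RedirectStandardOutput = true,
            UseShellExecute = false
        };
        using (Process process = Process.Start(startInfo))
        {
            string line;
            while ((line = process.StandardOutput.ReadLine()) != null)
            {
                // The "Verdict:" entry is common to all applications and kept compatible.
                if (line.StartsWith("Verdict:"))
                {
                    Console.WriteLine("Final verdict: " + line.Substring(8).Trim());
                }
            }
            process.WaitForExit();
        }
    }
}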
Redirection of stdout
When calling the process, the output can be redirected into a file.
Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -tp Testplan.rstt >> TheContestReportMessages.txt
(written all in one line)
Code 130 reflects a missing license for the Remote Control product R&S TS8-KT130.
If a test case or test plan has been started, a code in the range 0 to 6 enumerates the
final verdict.
For a test plan, the verdicts of all single test cases are cumulated: Inconclusive
overrides all other verdicts, and Failed overrides Passed.
Table 4-9: Overview of process return codes
0 Passed
1 Passed with Restriction
2 Failed
3 Failed with Restriction
4 Inconclusive
5 Not Applicable
6 Not Initialized
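In a batch script, the return code is available via the ERRORLEVEL variable; a minimal
sketch (the test plan path is a placeholder):
Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -tp c:\Testplans\MyTestplan.rstt
echo Return code: %ERRORLEVEL%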
<Testcase Name="TestcaseLbsPerformance3gppSupl_7_1_2">
<Name>LTE A-GNSS 7.1.2 - Sensitivity Fine Time Assistance (SUPL, GPS)</Name>
<Description>Sensitivity Fine Time Assistance (SUPL, GPS)</Description>
<GUID>0BF1FBE3-A588-4D8B-9486-500220490CB7</GUID>
<Version>RF-LBS-1.30</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;HL;HH;</EnvironmentalCondition>
<LTEBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD12;FDD13;FDD14;FDD1
<Bandwidth>1.4;3;5;10;15;20;</Bandwidth>
<TestFrequency>Low;Mid;High;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseVzwSvd1xRttAclr_6_6_2_3">
<Name>VZW SVLTE 1xRTT 6.6.2.3 - Adjacent Channel Leakage Power Ratio (ACLR)</Name>
<Description>Adjacent Channel Leakage Power Ratio (ACLR)</Description>
<GUID>917D0023-3EFE-4a68-8C67-28029D914BCF</GUID>
<Version>RF-LTE-2.70</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;HL;HH;</EnvironmentalCondition>
<LTEBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD12;FDD13;FDD14;FDD1
<Bandwidth>1.4;5;10;15;20;</Bandwidth>
<TestFrequency>Low;Mid;High;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseLteBlocking_7_6_2_and_7_7">
<Name>LTE FDD 7.6.2 and 7.7 Out-of-band blocking and Spurious response</Name>
<Description>Out-of-band blocking and Spurious response</Description>
<GUID>8088B53E-A381-4f85-A0D4-F8562A862FF7</GUID>
<Version>RF-LTE-2.70</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<LTEBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD12;FDD13;FDD14;FDD1
<Bandwidth>1.4;5;10;15;20;</Bandwidth>
<TestFrequency>High;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseLteBlocking7_6_2_and_7_7_TDD">
<Name>LTE TDD 7.6.2 and 7.7 Out-of-band blocking and Spurious response</Name>
<Description>Out-of-band blocking and Spurious response</Description>
<GUID>45B58F49-25A0-4f0e-8DE4-3A11B7138260</GUID>
<Version>RF-LTE-2.70</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<LTEBand>TDD33;TDD34;TDD38;TDD39;TDD40;TDD41;TDD41AXGP;</LTEBand>
<Bandwidth>5;10;15;20;</Bandwidth>
<TestFrequency>High;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseRtteTrx_4_2_4C_TDD">
<Name>R&TTE TDD 4.2.4B Spurious emission band UE co-existence</Name>
<GUID>86815280-D78E-4BC9-B158-19B015A84064</GUID>
<Version>RRM-4.00</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IRrmTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;Vib;ML;HL;HH;</EnvironmentalCondition>
<PrimaryBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD19;</PrimaryBan
<SecondaryBand>NoBand;</SecondaryBand>
</Testcase>
</Testcases>
</TestcaseInformation>
Simplified Test Plan
A simplified test plan uses the same structure, but each parameter entry holds exactly
one value, for example:
<TestcaseInformation>
<Testcases>
<Testcase Name="TestcaseGsmErrorRateSamplesCs_14_2_3">
<Name>GSM 14.2.3 - Reference sensitivity - FACCH/F</Name>
<Description>Reference sensitivity - FACCH/F</Description>
<GUID>5BFA87DC-681D-4D7F-8FCC-6DA6F1A4E3E3</GUID>
<Version>RF-GSM-1.40</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IGsmTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<GSMBand>GSM850;</GSMBand>
</Testcase>
<Testcase Name="TestcaseGsmErrorRateSamplesCs_14_2_4">
<Name>GSM 14.2.4 - Reference sensitivity - FACCH/H</Name>
<Description>Reference sensitivity - FACCH/H</Description>
<GUID>59FA4746-8FE5-47C3-B9F6-C9F79988D048</GUID>
<Version>RF-GSM-1.40</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IGsmTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<GSMBand>E-GSM900;</GSMBand>
</Testcase>
<Testcase Name="TestcaseLbsPerformance3gppCplane_7_1_1">
<Name>LTE A-GNSS 7.1.1 - Sensitivity Coarse time assistance (C-Plane, GPS)</Name>
<Description>Sensitivity Coarse time assistance (C-Plane, GPS)</Description>
<GUID>31B6F77B-EC87-462F-8598-0E5413730C75</GUID>
<Version>RF-LBS-1.30</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>LL;</EnvironmentalCondition>
<LTEBand>FDD1;</LTEBand>
<Bandwidth>5;</Bandwidth>
<TestFrequency>Low;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseLbsPerformance3gppSupl_7_1_2">
<Name>LTE A-GNSS 7.1.2 - Sensitivity Fine Time Assistance (SUPL, GPS)</Name>
<Description>Sensitivity Fine Time Assistance (SUPL, GPS)</Description>
<GUID>0BF1FBE3-A588-4D8B-9486-500220490CB7</GUID>
<Version>RF-LBS-1.30</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>LL;</EnvironmentalCondition>
<LTEBand>FDD4;</LTEBand>
<Bandwidth>5;</Bandwidth>
<TestFrequency>Mid;</TestFrequency>
</Testcase>
</Testcases>
</TestcaseInformation>
Configured DUTs
The generated .xml file requested with --configuredduts is structured as follows:
<DUTs>
<DUT Name="DutName#1AsConfiguredWithinContestGUI"></DUT>
<DUT Name="DutName#2AsConfiguredWithinContestGUI"></DUT>
</DUTs>
The DUT name references a folder of the same name that holds all configuration data
of the relevant DUT. The folder can be found in
c:\ProgramData\Rohde-Schwarz\Contest\Common\DevicesUnderTest
4.3.1 Introduction
This chapter describes how to integrate R&S CONTEST into Jenkins in order to run
tests fully unattended. Because the Jenkins integration uses the Command Line
Interface to initiate the tests in R&S CONTEST without its GUI, everything that can be
accomplished with the Command Line Interface can be automated.
Before any automation using Jenkins, the following configurations must be done by
means of the R&S CONTEST GUI:
● Test system configuration, if changes to the hardware apply
● Parametrization, cabling and activation of the DUT
● Test system identity, to be updated once
For more information about the Command Line Interface, refer to Chapter 4.2,
"Command Line Interface", on page 60.
The following lines show an example of the different steps within Jenkins that are
used to execute a test plan. Adapt it to your infrastructure.
Build Step
set JenkinsReportName="%WORKSPACE%\JUnitReport.xml"
set JenkinsSummaryReportName="%WORKSPACE%\SummaryReportsOverview.html"
cd "C:\Program Files (x86)\Rohde-Schwarz\Contest\11\GUI\Bin"
5 DUT Automation
5.1.1 Introduction
In the context of DUT automation, R&S CONTEST offers an interface for remote con-
trol plugins that can be used to automate certain actions of the DUT. Along with the
diversity and the constant technological advancement of DUTs, the requirements of
remote control plugins change rapidly. Therefore, rather than providing actual plugins,
an API has been created that enables third-party developers to create their own
custom DUT remote control plugins, tailored to their specific needs.
Custom DUT remote control plugins can be programmed in any .Net programming lan-
guage and must be compiled for the target framework .Net Framework 4.
In addition to the following API documentation, the definition of the interface can be
found in RohdeSchwarz.Contest.DutRemoteControlPlugin.dll, located in
c:\Program Files (x86)\Rohde-Schwarz\Contest\
<ContestMajorVersionNumber>\Base\<ContestBaseVersionNumber>\.
Namespace: RohdeSchwarz.Contest.Contracts
Assembly: RohdeSchwarz.Contest.DutRemoteControlPlugin.dll
5.1.2.1 Syntax
In C#
public interface IDutRemoteControlPlugin : IDisposable
5.1.2.2 Properties
String Description { get; }
Gets the description of the plugin to be presented in the R&S CONTEST GUI or in
test reports.

String Resource { get; set; }
Gets or sets the resource of the DUT interface, formatted in VISA resource name
syntax.

String ErrorMessage { get; }
Gets the error message which is set in Read() and Write() upon an error.
5.1.2.3 Methods
String Read(String command)
Parameter command: the command to be executed.
Read access to the device to receive a response to the specified command. Sets the
property ErrorMessage if an error occurs.
Return value: the read response of the DUT.
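The following C# skeleton sketches a plugin that implements the members
documented above. It is illustrative only: the authoritative interface definition resides
in RohdeSchwarz.Contest.DutRemoteControlPlugin.dll and may contain
further members; in particular, the Write() signature shown here is an assumption.

using System;
using RohdeSchwarz.Contest.Contracts;

public class MyDutRemoteControlPlugin : IDutRemoteControlPlugin
{
    // Shown in the R&S CONTEST GUI and in test reports.
    public string Description
    {
        get { return "Example DUT remote control plugin"; }
    }

    // VISA resource name of the DUT interface, set by R&S CONTEST.
    public string Resource { get; set; }

    // Set by Read() (and Write()) when an error occurs.
    public string ErrorMessage { get; private set; }

    public string Read(string command)
    {
        try
        {
            // Send the command via the configured resource and return the
            // DUT's response. The actual transport (e.g. a serial COM port)
            // depends on the DUT and is omitted here.
            return string.Empty; // placeholder for the DUT-specific I/O
        }
        catch (Exception ex)
        {
            ErrorMessage = ex.Message;
            return null;
        }
    }

    // Assumed signature; see the interface definition in the DLL.
    public void Write(string command)
    {
        // Write access to the DUT, analogous to Read().
    }

    public void Dispose()
    {
        // Release the connection to the DUT here.
    }
}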
In order to make the assembly of a custom DUT remote control plugin available to R&S
CONTEST, it must be copied to the following location:
c:\Program Files (x86)\Rohde-Schwarz\Contest\
<ContestMajorVersionNumber>\DutPlugins\.
The plugin can then be activated by means of the R&S CONTEST GUI. During the
configuration, the following parameters must be set:
Plugin Description
Read-only field of the description provided by the loaded plugin.
In order to configure R&S CONTEST to use a custom DUT remote control plugin, pro-
ceed as follows:
1. In R&S CONTEST, select "Test-Environment Configuration" → "DUT Configura-
tion".
The "DUT Configuration" dialog appears.
2. Select the DUT to be remotely controlled from the "Available DUTs" list.
7. Click the "Save" button, then the "OK" button at the bottom of the dialog.
The specified DUT remote control plugin will from now on be used for the selected
DUT.
In order to facilitate the development of a custom DUT remote control plugin, an exem-
plary Microsoft Visual Studio 2013 solution is provided with the standard R&S CON-
TEST installation. The solution is located in
c:\Program Files (x86)\Rohde-Schwarz\Contest\
<ContestMajorVersionNumber>\DutPlugins\DutRemoteControlPlugin\
DutRemoteControlPluginExample.sln. The sample code is written in C#.
5.2.1 Introduction
This chapter describes the integration of external applications for DUT automation or
execution of customized tasks.
In order to provide an interface for DUT automation, the "Automation Manager Mode"
has been implemented in R&S CONTEST. This interface was originally developed for
the communication between R&S CONTEST and R&S Automation Manager (Option
R&S CMW-KT014) but can also be used by third party DUT automation applications.
R&S CONTEST can thus interact with any DUT automation application via TCP/IP
sockets. Both IPv4 and IPv6 addresses are supported. The desired R&S CONTEST
automation commands need to be interpreted by the DUT automation application and
mapped to the respective DUT-specific commands as illustrated by Figure 5-3 for
instance.
The following figure illustrates the interfaces between R&S CONTEST, a DUT automa-
tion application and the DUT.
Figure 5-2: Illustration of the interfaces between R&S CONTEST, a DUT automation application and
the DUT
The DUT automation application, acting as the server process, needs to bind itself to
a socket and provide unambiguous access via a well-known port number. Therefore,
the DUT automation application must be started before the execution of the test
applications. The R&S CONTEST test application connects to this socket each time
an automation command is sent to the DUT automation application. The DUT
automation application then has to connect to an interface provided by the DUT (e.g.
a serial COM port).
The interface to the DUT depends on the device manufacturer and is out of scope for
this document. DUT automation applications might need to send AT commands or
binary messages or even start and stop DUT-specific applications to control the DUT.
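In such an application, the command handling can be implemented as a simple
dispatch routine. The following C# sketch is illustrative only; the AT command strings
are hypothetical placeholders, as the real commands depend on the DUT
manufacturer.

using System;

public static class DutCommandDispatcher
{
    public static void Handle(string automationCommand)
    {
        // Map R&S CONTEST automation commands to DUT-specific commands.
        switch (automationCommand)
        {
            case "DUT_SWITCH_ON":
                SendToDut("AT+HYPOTHETICAL=POWERON"); // placeholder
                break;
            case "DUT_SWITCH_OFF":
                SendToDut("AT+HYPOTHETICAL=POWEROFF"); // placeholder
                break;
            case "RESET":
                SendToDut("AT+HYPOTHETICAL=RESET"); // placeholder
                break;
            default:
                Console.WriteLine("Unknown command: " + automationCommand);
                break;
        }
    }

    private static void SendToDut(string command)
    {
        // Write the command to the DUT interface, e.g. a serial COM port.
    }
}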
If no valid response message is received, the test application is aborted. In this case,
an error message is added to the test report.
The timeout for the reception of the response message can be set in "DUT Configura-
tion" → "Automation" tab → "Common" tab of the desired DUT. In the "Registration"
box, adjust the parameter "Maximum Registration Time" to your needs.
The following figures illustrate exemplary message sequences with the respective
automation commands.
DUT_SWITCH_ON Command
DUT_SWITCH_OFF Command
RESET Command
Before DUT automation applications can be used, R&S CONTEST needs to be config-
ured accordingly by means of the R&S CONTEST GUI.
In order to configure R&S CONTEST to interact with a DUT automation application,
proceed as follows:
1. In R&S CONTEST, select "Test-Environment Configuration" → "DUT Configura-
tion".
The "DUT Configuration" dialog appears.
6. In the "Resource" field, enter the TCP/IP string to the desired DUT automation
application.
7. Click the "Save" button, then the "OK" button at the bottom of the dialog.
R&S CONTEST will now interact with the DUT automation application at the speci-
fied TCP/IP socket.
For the combination of the "Automation Manager Mode" and the automatic operation of
power supplies please refer to the documentation of the "Automation" tab in the R&S
CONTEST Help.
5.3.1 Introduction
This chapter describes the integration of an external application which listens to R&S
CONTEST IP trigger events.
IP triggers allow external applications to start tasks like batch jobs or to execute tools
upon certain events that occur during a test run. For example an IP trigger can start a
job that copies DUT debug files to a dedicated location, when the end of a test case is
reached.
Because IP triggers are evaluated analogously to break conditions, they follow the
same naming as the break conditions. For more information on break conditions, refer
to "Break Conditions tab" in the R&S CONTEST user manual.
The interface between R&S CONTEST and the IP trigger application is analogous to
the interface of a DUT automation application, as described in Chapter 5.2, "DUT
Automation Applications", on page 81. Such an application needs to bind itself to a
socket and provide unambiguous access via a well-known port number. The R&S
CONTEST test application connects to this socket each time an IP trigger message is
sent to the application.
The external application as server process must be started before the execution of the
test application.
The following figure illustrates the interface between R&S CONTEST and an IP trigger
application.
Figure 5-8: Illustration of the interface between R&S CONTEST and an IP trigger application
5.3.2.1 Messages
When a trigger event occurs during test execution, a message is sent from the test
application to the connected IP trigger application. The latter must then acknowledge
each received message by sending the response message "OK". If no acknowledg-
ment is received, test plan execution is continued after the specified timeout.
The following figure shows a message sequence as an example to illustrate the con-
cept.
Figure 5-9: Illustration of an exemplary message sequence between R&S CONTEST and an IP trigger
application
The following messages are sent depending on the event that triggers the message:
Note that each time a positive response message "OK" is received or the timeout is
reached, the test application disconnects the socket.
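A minimal IP trigger application can be sketched as a TCP server that prints each
trigger message and acknowledges it with "OK". The port number and the ASCII
encoding used below are assumptions and must match your configuration.

using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

public class IpTriggerServer
{
    public static void Main()
    {
        // Bind to a well-known port; 50000 is a placeholder.
        var listener = new TcpListener(IPAddress.Any, 50000);
        listener.Start();
        Console.WriteLine("Waiting for IP trigger messages ...");
        while (true)
        {
            // R&S CONTEST connects for each trigger message and disconnects
            // after receiving the acknowledgment "OK" (or after the timeout).
            using (TcpClient client = listener.AcceptTcpClient())
            using (NetworkStream stream = client.GetStream())
            {
                var buffer = new byte[1024];
                int count = stream.Read(buffer, 0, buffer.Length);
                string message = Encoding.ASCII.GetString(buffer, 0, count);
                Console.WriteLine("Trigger received: " + message);

                // Start the desired task here, e.g. copy DUT debug files.

                byte[] ok = Encoding.ASCII.GetBytes("OK");
                stream.Write(ok, 0, ok.Length);
            }
        }
    }
}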
Figure 5-10: Screenshot of the IP Triggers tab in the Application Debugging dialog
"Client IP-Address" The IP address of the IP trigger application to which the message should
be sent.
"Connection Time-Out" Time until the test plan execution is resumed if no acknowledgement mes-
sage "OK" is received from the IP trigger application.
6.1.1 Introduction
Familiarity with the graphical user interface of the R&S AMU200A is assumed in this
chapter.
The "Fading" dialog is used to configure multipath fading signals. This dialog can be
accessed by either selecting the "Fader" block or by pressing the MENU key.
R&S CONTEST requires the insertion loss of the fading settings to calculate the cor-
rect power compensation value for the faded signal. The insertion loss is displayed in
the "Insertion Loss Configuration" submenu of the fading dialog.
In order to read out the insertion loss for the fading settings, proceed as follows:
1. Make sure the mode of the "BB Input Block" is set to "Digital Input":
a) Select "config...".
b) Select "BB Input Settings".
c) In the "Mode" field, select "Digital Input".
4. Set either the "Baseband" block or the "BB Input" block "On".
5. Open the "Fading" dialog and click the "Insertion Loss Configuration" button.
The "Fading: Insertion Loss Configuration" dialog appears.
User-defined fading profiles can be used in conformance test cases as well as in R&D
applications. The following parameters are applicable to both cases.
"Fading File Name" or "UserDefinedFadingFile- Name of the fading settings file on the instrument.
Name" The file extension .fad may be omitted. If the file
was stored in the default folder D:\Contest\, the
file name is sufficient. Otherwise, the complete path
must be specified.
"Fading Insertion Loss" or "UserDefinedFadingInser- Insertion loss for this fading profile.
tionLoss"
2. Select "Properties".
The "Test Case Parameters" dialog appears.
4. Enter the values described in Table 6-1 in the respective fields of the desired test
step.
6.2.1 Introduction
The basic settings of climate chambers are configured in the "Test System
Configuration" dialog: in the "Hardware Configuration" tab, the device's "Resource
String" and "Device Type" can be defined.
The resource string consists of the interface type and addressing information.
Please refer to the manual of the climate chamber to determine the correct interface
type and addressing information.
The "Device Type" field is only editable for the device "ClimateChamber". This field is
used to select the appropriate device driver for the chamber. Drivers are available for
the following climate chamber models:
Advanced settings for climate chambers are defined in the XML configuration file
ClimateChamberConfigurationParameters.xml. The file is located in:
c:\ProgramData\Rohde-Schwarz\Contest\<CONTEST-Major-Version>\
ConfigurationParameters\
Four sample configuration files are shipped with the R&S CONTEST-Base installation:
● ClimateChamberSuppressResetConfigurationParameters.xml
Using this configuration file, the climate chamber will not be switched off after the
test plan execution.
● ClimateChamberVoetschVT4002ConfigurationParameters.xml
Contains the (default) parameters for the model Vötsch VT4002. The driver uses
the same values if the configuration file is not present.
● ClimateChamberVoetschVTL4003ConfigurationParameters.xml
Contains the parameters for the model Vötsch VTL4003.
● ClimateChamberTestEquityConfigurationParameters.xml
Contains the parameters for the model Test Equity 105A.
These sample configuration files are located in:
c:\ProgramData\Rohde-Schwarz\Contest\<CONTEST-Major-Version>\
ConfigurationParameters\Templates\
The following parameters are applicable to all climate chambers:
SuppressReset
If enabled, the climate chamber is not reset at test plan start and finish.
Default value: false
3. In the context menu, select "Create", then the desired configuration parameter file.
The parameters defined in the selected configuration file are copied to the climate
chamber configuration file (ClimateChamberConfigurationParameters.xml)
and will from now on be used.
If the configuration file does not exist or if a parameter entry is not present in the file,
the driver uses the default value described in Table 6-3.
The following table lists the settings that cannot be configured within R&S CONTEST
and have to be set on the control panel of the chamber:
Parameter Value
Please refer to the operating manual of the chamber model before making changes to
the parameters.
In order to activate and edit advanced configuration parameters for Vötsch climate
chambers, follow the steps described in Chapter 6.2.3.1, "Activating a Configuration
File", on page 98 and Chapter 6.2.3.2, "Editing a Configuration File", on page 98.
In addition to the parameters described in Table 6-2, the following parameters are
applicable to Vötsch climate chambers:
The stated default values are those of the model Vötsch VT4002.
DeviceAddress
The device address of the chamber, in the manuals also referred to as "bus address".
Note that leading zeroes must be specified.
Permissible values: 01 to 32
Default value: 01

SetValuesCommand
The command string pattern for the "set values" command $01E. The command is
prefixed by a dollar sign and the DeviceAddress parameter. Parameters in curly
braces are dynamically replaced by the driver with the specified format.
Parameter {0:0000.0} is the target temperature in the format 4.1 digits, e.g. a target
temperature of 23.5 °C is formatted as 0023.5.
Parameter {1:0000.0} is the target humidity in percent. If the chamber does not
support humidity control, this value is taken from the reply to the $01I command. A
target humidity of 50 % is formatted as 0050.0.
Parameter {2:0000.0} is the target fan speed in percent. This value is taken from
the reply to the $01I command. A fan speed of 80 % is formatted as 0080.0.
Parameter {3} consists of the first 16 bits of the bit mask for the digital channels. It is
replaced by either the BitmaskEnabled or the BitmaskDisabled parameter,
depending on the climate chamber state. The last 16 bits are set to zero.
Default value: E {0:0000.0} {1:0000.0} {2:0000.0} 0000.0 0000.0 0000.0 0000.0
{3}0000000000000000
Example: To use a fixed fan speed of 75 %, modify this entry as follows:
E {0:0000.0} {1:0000.0} 0075.0 0000.0 0000.0 0000.0 0000.0
{3}0000000000000000

BitmaskEnabled
The first 16 bits of the bit mask used to enable the climate chamber. This string is
inserted as parameter {3} into SetValuesCommand.
Default value: 0100000000000000

BitmaskDisabled
The first 16 bits of the bit mask used to disable the climate chamber. This string is
inserted as parameter {3} into SetValuesCommand.
Default value: 0000000000000000

ModelName
The model name entry is merely used for informational purposes. It appears in the
R&S CONTEST reports and has the format <Vendor>,<Model Name>.
Default value: Voetsch,VT4002
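Because the placeholders use .NET composite formatting, their expansion can be
illustrated with a short C# sketch (the values are arbitrary examples):

using System;
using System.Globalization;

public class SetValuesCommandDemo
{
    public static void Main()
    {
        // Default SetValuesCommand pattern, prefixed by "$" and DeviceAddress 01.
        const string pattern =
            "E {0:0000.0} {1:0000.0} {2:0000.0} 0000.0 0000.0 0000.0 0000.0 {3}0000000000000000";
        string command = "$01" + string.Format(CultureInfo.InvariantCulture,
            pattern, 23.5, 50.0, 80.0, "0100000000000000");
        // Prints:
        // $01E 0023.5 0050.0 0080.0 0000.0 0000.0 0000.0 0000.0 01000000000000000000000000000000
        Console.WriteLine(command);
    }
}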
The Test Equity 105A climate chamber is controlled by a set of registers that R&S
CONTEST reads and writes.
Example:
1 3 0 100 0 1 197 213
or
1 3 2 0 74 57 179
If the configuration file does not exist or if a parameter entry is not present in the file,
the driver uses the default value described in Table 6-4.
In order to activate and edit advanced configuration parameters for the Test Equity
105A climate chamber, follow the steps described in Chapter 6.2.3.1, "Activating a
Configuration File", on page 98 and Chapter 6.2.3.2, "Editing a Configuration File",
on page 98.
In addition to the parameters described in Table 6-2, the following parameters are
applicable to the Test Equity 105A climate chamber:
Table 6-4: Advanced Parameters for the Test Equity 105A Climate Chamber
Parameter Description
Index
A
Automatic Report Transfer .......................... 21
Automation Manager Mode ...................... 81, 85

B
Browser .................................................... 10

C
Central Report Server ............................ 10, 16
    Configuration ....................................... 17
    Database ............................................ 10
    Database Server .................................... 16
    File Server ......................................... 17
    Settings ............................................ 17
    Test Report Directory ............................. 10
CETECOM ................................................. 30
CLI ....................................................... 60
Climate Chamber ....................................... 95
    Device Type ........................................ 97
    Resource String .................................... 96
    Test Equity 105A Configuration Parameters ...... 101
    Vötsch Configuration Parameters ................. 100
    XML configuration file ............................ 97
Command Line Interface ....................... 8, 49, 60
    Graphical User Interface ......................... 61
    Output Messages .................................... 69
    Parameters (General) .............................. 62
    Parameters for Information Retrieval ............ 63
    Parameters for Starting a Test Case ............. 66
    Parameters for Starting a Test Plan ............. 63
    Return Codes ....................................... 70
    Starting a Simplified Test Plan .................. 65
    Starting a Standard R&S CONTEST Test Plan ...... 63
    Starting a Test Case .............................. 65
CONTEST
    Command Line Interface ........................... 60
    Report Server Settings ........................... 17
    Report Transfer Service .......................... 23
ContestReports ......................................... 11
Custom Summary Report ............................... 38

D
Database ............................................. 9, 10
Device Type ............................................ 97
DUT Automation .................................... 78, 81
    Commands ........................................... 82
    Configuring R&S CONTEST ......................... 85
    DUT Remote Control Plugins ....................... 78
    DUT_SWITCH_OFF ............................... 82, 84
    DUT_SWITCH_ON ................................ 82, 83
    RESET .......................................... 82, 84
    Test Application-Specific Command Strings ...... 85
DUT Drivers ............................................ 82
DUT Remote Control Plugins .......................... 78
    API ................................................. 78
    Configuring R&S CONTEST ......................... 79
    Methods ............................................ 79
    Properties ......................................... 79
    Sample Code ........................................ 81
    Syntax ............................................. 78

F
Fading Profiles ........................................ 91
    Configuring R&S CONTEST ......................... 95
    Fading Settings File .............................. 91
    Insertion Loss ..................................... 92
Fading Settings File .................................. 91

G
Graphical User Interface
    Command Line Interface ........................... 61
GUI
    Command Line Interface ........................... 61

I
Infrastructure ........................................... 7
    Browser ............................................ 10
    Central Report Server ............................. 10
    Command Line Interface ............................ 8
    Database ........................................... 10
    Illustration ........................................ 7
    R&S CONTEST ........................................ 7
    Remote Server ....................................... 8
    Report Manager ...................................... 9
    Reportal ............................................. 9
    SOAP Client ......................................... 9
    System Controller ................................... 7
    System Controller Database ........................ 9
    System Controller Test Report Directory .......... 8
    Test Report Directory ............................. 10
    User ................................................. 9
Insertion Loss ......................................... 92
IP Trigger Applications ............................... 86
    Configuring R&S CONTEST ......................... 89
    Messages ........................................... 87
IP Triggers ............................................. 88

J
Jenkins ................................................. 75
    Build Step ......................................... 76
    Configuration ...................................... 76
    JUnit Report ....................................... 77
    Post Build Action .................................. 77
    Post Build Step .................................... 77
    Reporting .......................................... 76
JSON Report Generator ................................ 37
JSON Reports ........................................... 36
    Generator .......................................... 37
    Location ........................................... 36

L
Licenses
    R&S CMW-KT014 .................................... 81
    R&S TS8-KT115 ..................................... 16
    R&S TS8-KT120 ..................................... 95
    R&S TS8-KT130 ..................................... 60
    R&S TS8-KT140 ..................................... 38
    R&S TS8-KT150 ..................................... 16
    R&S TS8-KT155 ..................................... 86
    TS8-KT130 .......................................... 49

M
Manual Report Transfer ............................... 22

N
Network Share ......................................... 11

P
PDF File Attachments ............................. 38, 39
PDF Reports ............................................ 38
    Attachments ........................................ 39
    Embedded Reports .................................. 39
    Filename ........................................... 48
    Information Density ............................... 44
    Page Orientation ................................... 39
    Report Style ....................................... 44
    Requirements ....................................... 38

R
R&S AMU200A ........................................... 91
R&S Automation Manager .............................. 81
R&S CMW-KT014 ........................................ 81
R&S CONTEST ............................................. 7
R&S CONTEST Command Line Interface ................ 60
R&S CONTEST Remote Server .......................... 49
R&S CONTEST Report Transfer Service ............... 23
R&S TS8-KT115 ......................................... 16
R&S TS8-KT120 ......................................... 95
R&S TS8-KT130 ......................................... 60
R&S TS8-KT140 ......................................... 38
R&S TS8-KT150 ......................................... 16
R&S TS8-KT155 ......................................... 86
Remote Server ...................................... 8, 49
    Actions ............................................ 55
    Architecture ....................................... 50
    Code Samples ....................................... 56
    Interface Description ............................. 51
    Parameters ......................................... 52
    Port ................................................ 51
    Sample Code ........................................ 51
    Starting a Test Case .............................. 59
    Starting an .rstt Test Plan ....................... 57
    Starting an .xml Test Plan ........................ 57
    WSDL ............................................... 51
Report Analyzer ....................................... 16
Report Folder .......................................... 11
Report Folder Manager ................................ 11
Report Manager .......................................... 9
Report Server .......................................... 16
    Configuration ...................................... 17
    Database Server .................................... 16
    File Server ........................................ 17
    Settings ........................................... 17
Report Transfer Service .............................. 23
    Advanced Configuration ............................ 28
    Buttons ............................................ 24
    Error .............................................. 25
    Messages ........................................... 26
    Report Window ...................................... 26
    Status Application ................................. 23
    System Tray Icons .................................. 23
    Warning ............................................ 25
Reportal .................................................. 9
Reports
    Automatic Transfer ................................. 21
    Backup ............................................. 16
    Data Serialization ................................. 36
    Interchanging Data ................................. 36
    JSON Reports ....................................... 36
    JUnit Report ....................................... 76
    Manual Transfer .................................... 22
    PDF Report ......................................... 38
    Test Result Header Reports ....................... 30
    Transfer ........................................... 21
Resource String ....................................... 96
Return Codes ........................................... 70

S
Simplified Test Plan .............................. 53, 65
SOAP .................................................... 49
SOAP Client .............................................. 9
stdout .................................................. 69
Summary Report
    Custom Summary Report ............................ 38
System Controller ....................................... 7

T
Test Equity 105A Climate Chamber .................. 100
Test Report Directory ........................ 8, 10, 11
Test Result Header Generator ........................ 35
Test Result Header Reports .......................... 30
    Generator .......................................... 35
Test System Configuration ........................... 96
Transferring Reports ................................. 21
TS8-KT130 .............................................. 49

U
User ...................................................... 9
User Defined Fading Profiles ........................ 91

V
Visual Studio Solution ........................... 51, 81
Vötsch Climate Chambers ............................. 99

W
WSDL .................................................... 51