R&S® CONTEST

Advanced User Procedures


User Manual

1175.6026.02 ─ 02
Test & Measurement

User Manual
© 2016 Rohde & Schwarz GmbH & Co. KG
Mühldorfstr. 15, 81671 München, Germany
Phone: +49 89 41 29 - 0
Fax: +49 89 41 29 12 164
Email: [email protected]
Internet: www.rohde-schwarz.com
Subject to change – Data without tolerance limits is not binding.
R&S® is a registered trademark of Rohde & Schwarz GmbH & Co. KG.
Trade names are trademarks of the owners.

The following abbreviations are used throughout this manual: R&S® CONTEST is abbreviated as R&S CONTEST, Microsoft Windows® is abbreviated as Windows, Jenkins® is abbreviated as Jenkins.

Contents
1 Infrastructure..........................................................................................7
1.1 System Controller......................................................................................................... 7
1.2 User................................................................................................................................ 9
1.3 Central Report Server................................................................................................. 10

2 Data Handling....................................................................................... 11
2.1 Test Report Directory................................................................................................. 11
2.1.1 Introduction................................................................................................................... 11
2.1.2 Stored Information.........................................................................................................12
2.1.3 Changing the Name or Location of the Test Report Directory...................................... 13
2.1.4 Importing Test Reports into the Test Report Directory..................................................15
2.2 Central Report Server................................................................................................. 16
2.2.1 Introduction................................................................................................................... 16
2.2.2 Setting up the Database Server.................................................................................... 16
2.2.3 Setting up the File Server..............................................................................................17
2.2.4 Configuring R&S CONTEST......................................................................................... 17
2.2.5 Transferring Reports..................................................................................................... 21
2.3 Report Transfer Service............................................................................................. 23
2.3.1 Introduction................................................................................................................... 23
2.3.2 Status Application......................................................................................................... 23
2.3.3 Advanced Configuration................................................................................................28

3 Reporting.............................................................................................. 30
3.1 Test Result Header Reports....................................................................................... 30
3.1.1 Introduction................................................................................................................... 30
3.1.2 Stored Information.........................................................................................................30
3.1.3 Configuring the Test Result Header Generator.............................................................33
3.1.4 Generating Test Result Header Reports.......................................................................35
3.2 JSON Reports..............................................................................................................36
3.2.1 Introduction................................................................................................................... 36
3.2.2 Stored Information.........................................................................................................36
3.2.3 Generating JSON Reports............................................................................................ 37
3.3 PDF Reports................................................................................................................ 38

3.3.1 Introduction................................................................................................................... 38
3.3.2 Stored Information.........................................................................................................39
3.3.3 Generating PDF Reports...............................................................................................47

4 Remote Control and Test Automation............................................... 49


4.1 Remote Server.............................................................................................................49
4.1.1 Introduction................................................................................................................... 49
4.1.2 Setting up the Remote Server.......................................................................................50
4.1.3 Interface Description..................................................................................................... 51
4.1.4 Interface Functions........................................................................................................52
4.1.5 Examples...................................................................................................................... 56
4.2 Command Line Interface............................................................................................ 60
4.2.1 Introduction................................................................................................................... 60
4.2.2 Overview....................................................................................................................... 60
4.2.3 Parameters....................................................................................................................62
4.2.4 Output........................................................................................................................... 69
4.3 Jenkins Integration..................................................................................................... 75
4.3.1 Introduction................................................................................................................... 75
4.3.2 Jenkins Reporting......................................................................................................... 76
4.3.3 Configuring Jenkins.......................................................................................................76

5 DUT Automation...................................................................................78
5.1 Custom DUT Remote Control Plugins.......................................................................78
5.1.1 Introduction................................................................................................................... 78
5.1.2 Interface Description..................................................................................................... 78
5.1.3 Configuring R&S CONTEST......................................................................................... 79
5.1.4 Example Visual Studio Solution.................................................................................... 81
5.2 DUT Automation Applications................................................................................... 81
5.2.1 Introduction................................................................................................................... 81
5.2.2 Interface Description..................................................................................................... 81
5.2.3 Automation Commands.................................................................................................82
5.2.4 Configuring R&S CONTEST......................................................................................... 85
5.3 IP Trigger Applications...............................................................................................86
5.3.1 Introduction................................................................................................................... 86
5.3.2 Interface Description..................................................................................................... 86

5.3.3 Configuring R&S CONTEST......................................................................................... 89

6 Advanced Device Configuration.........................................................91


6.1 User Defined Fading Profiles..................................................................................... 91
6.1.1 Introduction................................................................................................................... 91
6.1.2 Creating a Fading Settings File on the Instrument........................................................91
6.1.3 Determining the Insertion Loss on the Instrument........................................................ 92
6.1.4 Configuring R&S CONTEST......................................................................................... 94
6.2 Climate Chamber Configuration................................................................................ 95
6.2.1 Introduction................................................................................................................... 95
6.2.2 Basic Settings............................................................................................................... 96
6.2.3 Advanced Settings........................................................................................................ 97

Index....................................................................................................102

1 Infrastructure
This chapter describes the infrastructure around R&S CONTEST by illustrating and explaining how the different topics of this documentation relate to each other.

Figure 1-1: Exemplary illustration of the test system infrastructure

1.1 System Controller


The system controller is the computer attached to a test system, running R&S CONTEST and services for controlling the test system.

R&S CONTEST
During the execution of a test plan, R&S CONTEST writes test report files into the test report directory, shared in the network. Additionally, measurement data and test related information is written into a database. If the R&S CONTEST Reportal server is running, the test report of a currently running test case is made available to R&S CONTEST Reportal and can thus be monitored remotely via web browser by any user within the same network as the system controller. Using the built-in Jenkins reporting plugin, a JUnitReport is written and stored on the Jenkins server.
For more information refer to:
● Chapter 2.1, "Test Report Directory", on page 11
● the separate R&S CONTEST Reportal documentation
● Chapter 4.3, "Jenkins Integration", on page 75

Remote Server
R&S CONTEST Remote Server is used to remotely perform essential functions on the
test system. Controlled by a (possibly remote) SOAP client, R&S CONTEST Remote
Server uses the Command Line Interface to interact with R&S CONTEST. R&S CON-
TEST Remote Server acts as a single interface for all available R&S CONTEST ver-
sions.
For more information about R&S CONTEST Remote Server refer to Chapter 4.1,
"Remote Server", on page 49.

Command Line Interface


Each R&S CONTEST major version comes with its own Command Line Interface. In
order to interact with R&S CONTEST remotely, each Command Line Interface is called
by the same R&S CONTEST Remote Server. The respective Command Line Interface
then runs the requested applications using the R&S CONTEST-BASE version they
depend upon:

Figure 1-2: Illustration of the Infrastructure of R&S CONTEST Command Line Interfaces

The applications listed in Figure 1-2 don't represent a comprehensive list, but rather an
exemplary selection for illustration purposes. A complete list of all available applica-
tions and test cases can be obtained using the --info parameter of the respective
Command Line Interface (see Table 4-5).

For more information about the Command Line Interface refer to Chapter 4.2, "Com-
mand Line Interface", on page 60.

Test Report Directory


The test report directory on the system controller holds test report files written by R&S CONTEST during the execution of a test plan. It is a network share used by R&S CONTEST Report Manager for accessing test reports generated by the test system. Using R&S CONTEST Report Transfer Service, the files can be replicated to the test report directory on a Central Report Server.
For more information about the test report directory refer to Chapter 2.1, "Test Report
Directory", on page 11.

Database
The database on the system controller holds measurement data and test related infor-
mation written by R&S CONTEST during the execution of a test plan. It is used by R&S
CONTEST Report Manager for finding, viewing and comparing past test case runs.
Using R&S CONTEST Report Transfer Service database entries can be replicated to
the database on a Central Report Server.

Reportal
R&S CONTEST Reportal uses measurement data and test information provided by
R&S CONTEST to remotely display test reports in a web browser. Similar to the Online
Report, R&S CONTEST Reportal allows a live view of a currently running test report,
as well as all test case reports of the most recently run test plan. It can be accessed
from any client computer within the same network as the system controller.
For more information about R&S CONTEST Reportal refer to the separate R&S CON-
TEST Reportal documentation.

1.2 User
In this context, user refers to a client computer within the same network as the system
controller.

SOAP Client
Using a SOAP client R&S CONTEST Remote Server can be accessed in order to
remotely perform essential functions on a test system. The SOAP client is not provided
by Rohde & Schwarz.
For more information about R&S CONTEST Remote Server refer to Chapter 4.1,
"Remote Server", on page 49.

Report Manager
R&S CONTEST Report Manager uses the test report directory and the database on
the system controller to access measurement data and test reports generated by the
test system. In case a Central Report Server is set up, R&S CONTEST Report Man-
ager also can access the test report directory and database of the server. Using R&S
CONTEST Report Transfer Service, R&S CONTEST Report Manager is furthermore
able to replicate test reports and database entries from the system controller to a Cen-
tral Report Server.
For more information about R&S CONTEST Report Manager refer to the separate
R&S CONTEST Report Manager documentation.

Browser
Via the web browser on a user's computer, the status of a currently running test plan, as well as its test case reports, can be monitored by accessing R&S CONTEST Reportal. Additionally, by accessing the web front end of Jenkins, the automatic and unattended execution of test plans can be set up and scheduled.
For more information refer to:
● Chapter 4.3, "Jenkins Integration", on page 75
● the separate R&S CONTEST Reportal documentation

1.3 Central Report Server


A Central Report Server allows the central storage of data and test reports from multi-
ple test systems while at the same time serving as a backup. As indicated by the dot-
ted line in Figure 1-1 the file server providing the test report directory and the database
server providing the database don't necessarily have to run on the same physical
machine.

Test Report Directory


The test report directory on a Central Report Server contains copies of test reports that
have been transferred from the test report directory on a system controller using R&S
CONTEST Report Transfer Service. This transfer can either be performed via R&S
CONTEST directly or using R&S CONTEST Report Manager.
For more information about the test report directory on a Central Report Server refer to
Chapter 2.2, "Central Report Server", on page 16 or Chapter 2.1, "Test Report Direc-
tory", on page 11.

Database
The database on a Central Report Server contains copies of database entries that
have been transferred from the database on a system controller using R&S CONTEST
Report Transfer Service. This transfer can either be performed via R&S CONTEST
directly or using R&S CONTEST Report Manager.
For more information about the database on a Central Report Server refer to Chap-
ter 2.2, "Central Report Server", on page 16.

2 Data Handling

2.1 Test Report Directory


Required additional license: none

2.1.1 Introduction

R&S CONTEST uses a Windows share to handle test reports and related files. The
default name of the network share is ContestReports, pointing to the test report direc-
tory ContestReports, located in the C:\ root directory on the system controller. This
directory is read-only except for the system and authorized users. The name and loca-
tion of the test report directory are configurable using Report Folder Manager that
comes with R&S CONTEST and R&S CONTEST Report Manager.
During the execution of a test plan a folder will be created for each test plan run, con-
taining subfolders for each test case run in the test plan. The test report directory is
structured as follows:

Figure 2-1: Example of the folder structure within the test report directory

The default network path of the shared test report directory is:
\\<ComputerName>\ContestReports. Note that in this path, <ComputerName> is
a placeholder for the actual name of the computer.

The following sections describe the information that is stored in the test report directory
and common tasks.

2.1.2 Stored Information

The following sections list the types of information stored in the test report directory.

Depending on the configuration, parametrization and available licenses, not all elements may be present in the test report directory.

ContestReports Folder

● Test plan (folder): One folder per test plan run. Contains the respective test case folders and related files.
● DBImport (.cet file): Encrypted database import trace files.
● TSTrace (.cet file): Encrypted test system trace files.
● TSTrace (.txt file): Test system trace files.
● TSTrace_ReportManager (.txt file): Test system trace files in the context of R&S CONTEST Report Manager.
● ContestStarter log (.txt file): Log file of the R&S CONTEST Configuration Selector.
● ErrorLog (.txt file): R&S CONTEST Base error log files. These files are only generated when errors occur during the execution of the program.
● GUI_ErrorLog (.txt file): R&S CONTEST GUI error log files. These files are only generated when errors occur during the execution of the program.
● ReportManagerErrorLog (.txt file): R&S CONTEST Report Manager error log files. These files are only generated when errors occur during the execution of the program.
● ReportTransferService (.txt file): R&S CONTEST Report Transfer Service log files.
● Test system error log (.txt file): Error log files for each test system writing into the Windows share. These files are only generated when errors occur.

Test Plan Run Folder

● ReferencedItems (folder): Contains referenced items such as .pem files.
● SummaryReportRepository (folder): Contains stylesheets, images and scripts required for the display of Summary Reports in a web browser.
● Test case (folder): One folder per test case. Contains the respective test case report and related files.
● TSTrace (.cet file): Encrypted trace files.
● R&S CONTEST test plan (.rstt file): A standard R&S CONTEST test plan file.
● Error log (.txt file): Error log files. These files are only generated when errors occur during the execution of the test plan.
● SummaryReportsOverview (.xml file): SummaryReportsOverview.xml is a summary of the test plan, its test cases and their verdicts.

Test Case Run Folder

● OnlineReportGraphics (folder): Contains graphics used in the online or summary reports.
● Signalling Logs (folder): Contains the signalling logs of the test case run.
● Test Result Header Report (.xml file): The Test Result Header report file.
● JSON Report (.json file): The JSON report file.
● Summary Report (.xml file): The Summary Report file.
● Online Report (.html file): The Online Report file.
● TSTrace (.cet file): Encrypted trace files.

2.1.3 Changing the Name or Location of the Test Report Directory

To change the name or the location of the test report directory, proceed as follows:
1. Close any open R&S CONTEST instances.

2. Click the Windows Start button.

3. In the search field, enter Report Folder Manager.

A list of search results appears.

4. From the "Programs" section of the search results, select Report Folder Manager.
The "CONTEST Report Folder Manager" dialog appears.

Tip: Alternatively, the executable can be found under


C:\Program Files (x86)\Rohde-Schwarz\Contest\
<MajorVersionNumber>\GUI\Bin\Tools\
RohdeSchwarz.Contest.ReportFolderManager.exe or
C:\Program Files (x86)\Rohde-Schwarz\Contest\Report Manager\
Bin\RohdeSchwarz.Contest.ReportFolderManager.exe respectively.
Note that <MajorVersionNumber> is a placeholder for the actual version number
(e.g. 14).

5. Click the "Change" button.


The "Browser for Folder" dialog appears.

6. Navigate to the desired folder, or create a new folder using the "'Make New Folder"
button.

7. Click the "OK" button.


The "CONTEST User Interaction" dialog appears.

8. Choose between the following options:


● Click the "Copy" button to copy all existing report files into the new test report
directory.
● Click the "Move" button to move all existing report files from the previous test
report directory to the new one.
● Click the "No" button to keep all existing report files in the previous test report
directory.
Your selected option will be performed. The "Current Report Folder" box indicates the active test report directory, and the box underneath gives information about the status of the operation.

2.1.4 Importing Test Reports into the Test Report Directory

To import test reports into the test report directory, proceed as follows:
1. Close any open R&S CONTEST instances.

2. Click the Windows Start button.

3. In the search field, enter Report Folder Manager.


A list of search results appears.

4. From the "Programs" section of the search results, select Report Folder Manager.
The "CONTEST Report Folder Manager" dialog appears.

Tip: Alternatively, the executable can be found under


C:\Program Files (x86)\Rohde-Schwarz\Contest\
<MajorVersionNumber>\GUI\Bin\Tools\
RohdeSchwarz.Contest.ReportFolderManager.exe or
C:\Program Files (x86)\Rohde-Schwarz\Contest\Report Manager\
Bin\RohdeSchwarz.Contest.ReportFolderManager.exe respectively.
Note that <MajorVersionNumber> is a placeholder for the actual version number
(e.g. 14).

5. Click the "Import" button.


The "Browser for Folder" dialog appears.

6. Navigate to the folder containing the test report files to import.

7. Click the "OK" button.


The "CONTEST User Interaction" dialog appears.

8. Choose between the following options:


● Click the "Copy" button to copy all test report files from the selected folder into
the test report directory.
● Click the "Move" button to move all test report files from the selected folder to
the test report directory.

● Click the "No" button to cancel the import.


Your selected option will be performed. The status box gives information about the
status of the operation.

2.2 Central Report Server


Required additional license: R&S TS8-KT115

2.2.1 Introduction

During the execution of a test plan, related meta information and measurement data from test case runs are stored in a local database. Associated test report files such as Online Reports or Summary Reports are written into the test report directory. By setting up a central report server, the database entries and their associated files can be transferred to an external machine in the network. The transfer can be performed either manually, using R&S CONTEST or R&S CONTEST Report Manager, or automatically following specified rules. To achieve this, a database server and a file server need to be set up and R&S CONTEST has to be configured accordingly. The database server and the file server can be configured separately and don't necessarily have to run on the same network computer.
The main benefit of the central report server is that it can be used to store information generated by multiple test systems in one central location. Users of R&S CONTEST Report Manager, for example, then have control over all test reports generated by all test systems. If the Report Analyzer license (R&S TS8-KT150) has been acquired for one R&S CONTEST Report Manager accessing the central report server, it is automatically inherited by all other R&S CONTEST Report Manager clients connecting to the same central report server.

2.2.2 Setting up the Database Server

As the database on the central report server serves as a backup for the local R&S CONTEST database, the schema of both databases has to be the same. To ensure the correct setup of the database server, it is recommended to perform the installation using R&S MCT Installation Manager. If you are unfamiliar with installing software using R&S MCT Installation Manager, refer to the Software Installation manual.
In R&S MCT Installation Manager, install the most recent official POSTGRESQL version. After the installation of the database server using R&S MCT Installation Manager, the database setup matches that of the local R&S CONTEST database: the schema and the access data are the same.
After the database server is set up, R&S CONTEST can be configured accordingly.

2.2.3 Setting up the File Server

The central report server uses the file server functionality of the operating system by means of a network share. The easiest way to create a network share under Windows is by using the NET SHARE command with the following parameters:
NET SHARE <ShareName>=<PathToFolder> /GRANT:<UserName>,<Permissions>
Permissions can be either READ, CHANGE or FULL.

Example:
NET SHARE ContestReports=C:\ContestReports /GRANT:SampleUser,FULL
(entered as a single line)
This command creates a network share named ContestReports, pointing to C:\ContestReports with read and write permissions for the user SampleUser. A complete list of parameters for the NET SHARE command is displayed by entering NET SHARE /HELP.
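As a quick plausibility check, the share can then be tested from a system controller, for example with a short Python script. This is an illustrative sketch only; <ServerName> is a placeholder, and the share name assumes the default ContestReports.

Example (illustrative sketch):
# Verify that the network share on the central report server is reachable and writable.
# <ServerName> is a placeholder for the actual file server name.
import os
import tempfile

share = r"\\<ServerName>\ContestReports"
print("Share reachable:", os.path.isdir(share))

# Optional write test: create and immediately remove a temporary file in the share.
with tempfile.NamedTemporaryFile(dir=share) as handle:
    print("Write access OK:", handle.name)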

After the file server is set up, R&S CONTEST can be configured accordingly.

2.2.4 Configuring R&S CONTEST

After setting up the database server and the file server, R&S CONTEST can be config-
ured to access the respective servers and automatically transfer data and test reports
to the central report server. The configuration is done via the "CONTEST Report
Server Settings" dialog.

Figure 2-2: Screenshot of the CONTEST Report Server settings dialog

The report server settings can be saved and loaded into other R&S CONTEST and
R&S CONTEST Report Manager applications to ensure consistency between the con-
figurations.

To enable a central report server, proceed as follows:


1. In the R&S CONTEST menu bar, select "Settings" → "CONTEST Report Server
Settings".
The "CONTEST Report Server Settings" dialog appears.

2. Select the checkbox "Enable Central Report Server".


The database server, the file server and the automatic transfer of data and reports
can now be configured in the respective tab pages.

2.2.4.1 Configuring the Database Server

In the "CONTEST Report Server Settings" dialog, the "Database Server Settings" tab
page contains the following GUI elements:

● "Database Host": The database host can be given either by the logical network name or the IP address of the server where the database server is running.
● "Port": The port where the database server on the given host can be accessed. The default port 5432 is set automatically.
● "Database Name": The name of the database. The default name is Contestdb. If no database with the specified name exists on the given database host, it will be created (only in R&S CONTEST, not in R&S CONTEST Report Manager).
● "User": The name of the database user. The database user is contest and cannot be changed.
● "Password": The password set for the database user. The password for the database user contest is contest@1sp1 and cannot be changed.
● "Test Database Creation and Access": Button to test the database connection and the possible creation of a new database.

To configure the access to the database server on the central report server, proceed
as follows:
1. In the R&S CONTEST menu bar, select "Settings" → "CONTEST Report Server
Settings"
The "CONTEST Report Server Settings" dialog appears.

2. Make sure the "Enable Central Report Server" checkbox is enabled.

3. In the "Database Server Settings" tab page, enter the following information:
● Database Host
● Database Name
● Port

4. Click the "Test Database Creation and Access" button.


If the specified database already exists, the "Database connectivity test" dialog
appears.
If the specified database doesn't exist, the "CONTEST User Interaction" dialog
appears.

5. If the connectivity test was successful, click the "OK" button to confirm the database
server settings.
R&S CONTEST is now able to copy data to the specified database server. In order
to configure an automatic transfer of data and reports, refer to Chapter 2.2.5.1,
"Transferring Reports Automatically", on page 21.

2.2.4.2 Configuring the File Server

In the "CONTEST Report Server Settings" dialog, the "File Server Settings" tab page
contains the following GUI elements:

● "Use database server also for file server": Checkbox to specify whether the host specified in the "Database Server Settings" tab page should also be used as host for the file server.
● "Server Name or IP or Domain Name": The file server host. It can be specified either by server name, IP address or domain name. Only available when "Use database server also for file server" is disabled.
● "Share Name": The name of the network share. Only existing network shares can be specified. By default, the name of the network share is ContestReports.
● "Report Folder": Path to the desired test report directory within the specified network share.
● "Provide credentials for file server access": Checkbox to specify whether credentials are to be provided when accessing the file server. If no credentials are provided, the currently logged-in user's credentials are used for the connection attempt.
● "Domain\User Name": Domain name or user name for accessing the file server. Only available if "Provide credentials for file server access" is enabled.
● "Password": Password for the given domain name or user name for accessing the file server. Only available if "Provide credentials for file server access" is enabled.
● "UNC Path to Report Location": Indicates the UNC path to the test report directory on the central report server.
● "Test File Server Access": Button to test the file server connection.

To configure the access to the file server on the central report server, proceed as fol-
lows:
1. In the R&S CONTEST menu bar, select "Settings" → "CONTEST Report Server
Settings"
The "CONTEST Report Server Settings" dialog appears.

2. Make sure the "Enable Central Report Server" checkbox is enabled.

3. In the "File Server Settings" tab page, enter the required information.

4. Click the "Test File Server Access" button.


If the file server can be accessed, the "File Server access OK" dialog appears.
If the file server cannot be accessed, the "File Server access FAILED" dialog
appears, giving reasons for the connection problems.

5. If the connectivity test was successful, click the "OK" button to confirm the file
server settings.
R&S CONTEST is now able to copy test reports to the specified file server. In order
to configure an automatic transfer of data and reports, refer to Chapter 2.2.5.1,
"Transferring Reports Automatically", on page 21.

2.2.5 Transferring Reports

With R&S CONTEST and R&S CONTEST Report Manager, data and test reports can
be transferred automatically or manually to a central report server. The actual transfer
process is handled by R&S CONTEST Report Transfer Service. For more information
about R&S CONTEST Report Transfer Service refer to Chapter 2.3, "Report Transfer
Service", on page 23. During the transfer process, data and test reports are copied
to a central report server. The original data and test reports remain on the local system
controller.

Reports generated by system applications such as the PathTransmittanceCalibration (PTC) or InstrumentCheck cannot currently be transferred to a central report server.

2.2.5.1 Transferring Reports Automatically

In the "CONTEST Report Server Settings" dialog, the "Automatic Transfer Settings" tab
page contains the following GUI elements:

● "Enable automatic report transfer for the selected verdicts to this server": Checkbox to specify whether data and test reports should be automatically transferred to the central report server.
● "Passed and Passed with Restrictions": Checkbox to specify whether data and test reports of test case runs with the verdicts Passed and Passed with Restrictions should be transferred. Only available when "Enable automatic report transfer for the selected verdicts to this server" is enabled.
● "Inconclusive": Checkbox to specify whether data and test reports of test case runs with the verdict Inconclusive should be transferred. Only available when "Enable automatic report transfer for the selected verdicts to this server" is enabled.
● "Failed and Failed with Restrictions": Checkbox to specify whether data and test reports of test case runs with the verdicts Failed and Failed with Restrictions should be transferred. Only available when "Enable automatic report transfer for the selected verdicts to this server" is enabled.
● "Other": Checkbox to specify whether data and test reports of test case runs with other verdicts should be transferred. Only available when "Enable automatic report transfer for the selected verdicts to this server" is enabled.

To configure the automatic transfer of data and test reports to the central report server,
proceed as follows:
1. In the R&S CONTEST menu bar, select "Settings" → "CONTEST Report Server
Settings"
The "CONTEST Report Server Settings" dialog appears.

2. Make sure the "Enable Central Report Server" checkbox is enabled.

3. In the "Automatic Transfer Settings" tab page, enable the checkbox "Enable auto-
matic report transfer for the selected verdicts to this server".

4. Enable the checkboxes of the desired verdicts.

5. Click the "OK" button to confirm the automatic transfer settings.


R&S CONTEST will from now on automatically transfer data and test reports of test
case runs with the selected verdicts to the central report server.

This setting does not affect data and test case reports that have been created prior to
enabling the automatic transfer. Previous reports (data and files) have to be transferred
manually to a central report server.

2.2.5.2 Transferring Reports Manually

With R&S CONTEST or R&S CONTEST Report Manager, data and test reports of test
plan or test case runs can be manually transferred to a central report server. The man-
ual transfer is performed via the context menu of a test plan or test case report. The
following procedure describes the manual transfer in R&S CONTEST.

To manually transfer data and test reports to a central report server using R&S CON-
TEST, proceed as follows:
1. In the explorer pane (on the left), select the "Reports" tab.
A list of test plan runs appears in the tab page.

2. To transfer data and test reports of a whole test plan run, proceed as follows:
a) Right click on the desired test plan run.
b) In the context menu, select "Copy Selected Report(s) to Central Report Server
(DB: <DatabaseServer>- Files: <FileServer>)"
The data and test reports of the selected test plan run — including all test case
runs — will be transferred.

3. To transfer data and test reports of a test case run, proceed as follows:
a) Double click the desired test plan run.
b) In the test case overview in the right pane, select the desired test case run(s).
c) In the context menu, select "Copy Selected Report(s) to Central Report Server
(DB: <DatabaseServer>- Files: <FileServer>)"

The data and test reports of the selected test case run(s) will be transferred.
In the Windows system tray, the R&S CONTEST Report Transfer Service indicates
the status of the transfer.

2.3 Report Transfer Service


R&S CONTEST Report Transfer Service
Required additional license: none

2.3.1 Introduction

R&S CONTEST Report Transfer Service is a service tool that manages the transfer of data and test reports between test systems and central report servers. It replaces the Report Handler integrated in R&S CONTEST and R&S CONTEST Report Manager. While the former Report Handler transferred data and test reports synchronously, R&S CONTEST Report Transfer Service performs the transfer sequentially and in an asynchronous manner, in order to operate independently of R&S CONTEST or R&S CONTEST Report Manager.
It comes with the installation of R&S CONTEST or R&S CONTEST Report Manager and is preconfigured to work "out of the box". The configuration can however be customized to match individual requirements. For more information refer to Chapter 2.3.3, "Advanced Configuration", on page 28. R&S CONTEST Report Transfer Service consists of a Windows service and a status application that are started by R&S CONTEST or R&S CONTEST Report Manager. The service is restarted every 24 hours and in the event of unexpected failures.
The following sections describe the status application of R&S CONTEST Report Transfer Service and advanced configuration parameters.

2.3.2 Status Application

The status application of R&S CONTEST Report Transfer Service offers a brief status
of the data and test report transfer and allows some basic control. It is started upon
system start or once the R&S CONTEST Report Transfer Service becomes active.
Alternatively, it can be started manually via the "Report Transfer Service" start menu
entry. The status application runs as a system tray icon and consists of a status win-
dow and a report window. The system tray icon indicates the status of the service:

● The service is running and the queue is empty.
● The service is working through the queued transfers.
● The service crashed.

A left click on the system tray icon reveals the status window, while a right click opens
the context menu, giving access to the following controls:
● Start Report Transfer Service
● Stop service and abort all transfers
● Continue transferring reports
● Pause the transfer of reports
● Open Status Window
● Open Report Window
● Open current log file
● About
● Close Application and Service

2.3.2.1 Overview

Figure 2-3: Screenshot of the R&S CONTEST Report Transfer Service Status Window

The title bar of the status window contains several buttons for controlling R&S CONTEST Report Transfer Service:

● Close the status window
● Stop all transfers and abort all pending transfers
● Continue the transfer (after pause or stop) or start a new instance if currently no service is available
● Pause the transfer of all reports
● Open the Report Window (see Chapter 2.3.2.3, "Report Window", on page 26)
● Close a warning

Depending on the current state of R&S CONTEST Report Transfer Service, not all items are displayed.

Status Window

Default View
The transfer of a report consists of two steps: a transfer to the database server and a
transfer to the file server. In the default view, the number of pending transfers for both
steps is shown. Additionally, the average duration of a transfer and the duration of the
last transfer, as well as the times for the start and end of the last completed transfer
are given.

Figure 2-4: Annotated Screenshot of the Default View

Warning View
Whenever a report could not be transferred successfully or another problem occurs
which doesn’t terminate the service, the status window switches to the warning view
and displays a failure message. For a short summary of the most common failure mes-
sages and their solutions refer to Chapter 2.3.2.2, "Failure Outline", on page 26.

Figure 2-5: Screenshot of the Warning View

By closing the warning the view will change back to the default view.

Error View
Whenever the service is not available, e.g. because it was terminated by an external event or by an internal error, the status window changes to the error view. As soon as a new service instance is available (automatically started by R&S CONTEST, R&S CONTEST Report Manager or manually via the start/continue control button), the status window switches back to the default view.

2.3.2.2 Failure Outline

The following table gives a summary of the most common failure messages and possible solutions:

● "Database or file server not available": An error occurred while testing the connection to the destination servers. Check if the destination servers are reachable from the internet, and check if the central report server is enabled in R&S CONTEST or R&S CONTEST Report Manager. An automatic retry is scheduled.
● "Currently transferring": A report with this ID and destination is already pending in R&S CONTEST Report Transfer Service. The newer report will be discarded.
● "A file is currently used by another process": The file is still being used by R&S CONTEST or R&S CONTEST Report Manager. An automatic retry is scheduled.
● "A directory is currently not available": R&S CONTEST or R&S CONTEST Report Manager have not completed creating all report files. An automatic retry is scheduled.
● "No temporary file can be created": The database request returns no valid file path(s). An automatic retry is scheduled.
● "File transfer was marked as not successful": The test report directory could not be mounted on the destination file server. Check if R&S CONTEST Report Transfer Service is running in the currently active user's session, and check if write permissions for the current user are available on the destination file server. An automatic retry is scheduled.
● "Aborted by an unknown reason": The transfer of a report could not be performed properly for reasons not caused by R&S CONTEST Report Transfer Service. Check the log file for more information. An automatic retry is scheduled.

2.3.2.3 Report Window

The Report Window of R&S CONTEST Report Transfer Service is a tabular display of transfer jobs. Using the icon, the history of the last 7 days can be loaded from the server into the job list. By clicking on a row, further information about the selected job is displayed.

Figure 2-6: Screenshot of the Report Window within the status application of R&S CONTEST Report
Transfer Service

Table 2-1: Description of the columns in the Report Window

● "DB": Database transfer. The icon represents the status of the respective job.
● "FS": File server transfer. The icon represents the status of the respective job.
● "Enqueue time": Date and time the job has been added to the queue.
● "Type": The type of the transfer, represented by an icon (Copy, Move or Delete).
● "Name": Name of the job.
● "Source": IP address of the source server.
● "Finished": The time the respective job has been finished or aborted.

In the "DB" and "FS" columns, the different icons represent the status of the respective job:
● Success
● Pending
● Error. Retry scheduled.
● Error. Retry running or finished.
● Final abort. No further retry attempt.
● Duplicated job. Rejected.

2.3.3 Advanced Configuration

R&S CONTEST Report Transfer Service inherits the configuration of the destination servers from R&S CONTEST or R&S CONTEST Report Manager. Therefore, changes to the configuration are not required to use R&S CONTEST Report Transfer Service. The configuration can however be customized by modifying R&S CONTEST's global configuration file Contest.Settings.xml. To do so, a <ReportTransferService> node has to be added directly within the <ConfigurationSettings> node:

Example:
<ConfigurationSettings>
<ReportTransferService>
<SpoolingBase>
C:\ProgramData\Rohde-Schwarz\Contest\Common\Data\ReportTransferService
</SpoolingBase>
<LogDirectory>C:\ContestReports</LogDirectory>
<OpenEmpty>False</OpenEmpty>
<RetryTimeout>00:01:00</RetryTimeout>
<WorkerTimeout>04:00:00</WorkerTimeout>
<AddressBase>
http://localhost::8181/RohdeSchwarz/Contest/ReportTransferService
</AddressBase>
<AddressExtension-Slave>Slave</AddressExtension-Slave>
<AddressExtension-Status>Status</AddressExtension-Status>
</ReportTransferService>
</ConfigurationSettings>

The following table describes the available configuration parameters:

● <SpoolingBase>: Specifies the path where R&S CONTEST Report Transfer Service creates the file spoolers for queuing the pending transfers. The SpoolingBase does not need a lot of memory but has a high access frequency. The default value is <ProgramData>\Rohde-Schwarz\Contest\Common\Data\ReportTransferService. Note that in this path, <ProgramData> stands for the full path to the ProgramData folder.
● <LogDirectory>: Specifies the path where R&S CONTEST Report Transfer Service saves its log files. Initially this is the test report directory used by R&S CONTEST.
● <OpenEmpty>: R&S CONTEST Report Transfer Service queues all pending transfers persistently in the SpoolingBase and is able to restore untransferred reports on the next startup. By setting <OpenEmpty> to True, R&S CONTEST Report Transfer Service will ignore available items on startup and will abort their transfer. The default value is False.
● <RetryTimeout>: Each time the transfer of a report fails, the transfer is retried after the given timeout. The timeout has to be defined in the format "hh:mm:ss". By default the timeout is set to 1 minute (00:01:00).
● <WorkerTimeout>: The timespan for the execution of a job. After the given timeout, the job is aborted and a retry is enqueued. The timeout has to be defined in the format "hh:mm:ss". By default the timeout is set to 4 hours (04:00:00).
● <AddressBase>, <AddressExtension-Slave>, <AddressExtension-Status>: Since R&S CONTEST Report Transfer Service runs in a separate process independent from R&S CONTEST or R&S CONTEST Report Manager, .NET TCP channels are used for the interprocess communication. With <AddressBase> a general address can be defined that is used for all interprocess channels. <AddressExtension-Slave> and <AddressExtension-Status> specify an address extension to distinguish the channels by their purpose: for the status application or for the R&S CONTEST integration. The default values are:
AddressBase: http://localhost::8181/RohdeSchwarz/Contest/ReportTransferService
AddressExtension-Slave: Slave
AddressExtension-Status: Status

Risk of Software Malfunction


It is not recommended to make changes to <AddressBase>,
<AddressExtension-Slave> or <AddressExtension-Status>. When any of
these parameters differ from their default values, the following configuration files have
to be modified accordingly:
● <ProgramFiles(x86)>\Rohde-Schwarz\Contest\
Report Transfer Service\Bin\
RohdeSchwarz.Contest.ReportTransferService.Status.exe.config
● <ProgramFiles(x86)>\Rohde-Schwarz\Contest\
<MajorVersionNumber>\GUI\Bin\Tools\
RohdeSchwarz.Contest.ReportTransferService.Slave.dll.config
● <ProgramFiles(x86)>\Rohde-Schwarz\Contest\Report Manager\Bin\
RohdeSchwarz.Contest.ReportTransferService.Slave.dll.config
Note that in these paths <ProgramFiles(x86)> stands for the path of the
Program Files (x86) folder and <MajorVersionNumber> stands for the R&S
CONTEST major version number (e.g. 14).

3 Reporting

3.1 Test Result Header Reports


Required additional license: none

3.1.1 Introduction

Test Result Header reports are XML-based report files that have been introduced by CETECOM in order to facilitate the exchange of data generated by a test system when a conformance test case is performed. This open interface standard allows the automatic processing and exchange of test case data consistently across different project management software tools.
These Test Result Header reports contain information about the test system used to
perform the test case as well as information about the tested DUT and the test case
run. Rather than including actual measurement data, the file paths to the trace and test
result files such as Summary Report and Online Report are listed.

3.1.2 Stored Information

The information stored in Test Result Header reports is structured in the following main
nodes:
● Test equipment
● User equipment
● Test case
● Test variables
● Test execution
The following sections give an overview of the contents of each main node, followed by
an example XML file.

3.1.2.1 Test Equipment

The <testequipment> node lists the test systems involved in the test and their respective hardware components, including firmware and serial number. Additionally, the test case name and version are listed, as well as the test system software.

3.1.2.2 User Equipment

The <userequipment> node provides information about the DUT and its configuration. This information can either be generated automatically from parameters set in the "DUT Configuration" dialog or entered manually in the "Test Result Header Generator" dialog.

3.1.2.3 Test Case

The <testcase> node contains the name and title of the test case, as well as the test
specification and its version.

3.1.2.4 Test Variables

The <testvariables> node holds environmental conditions (voltage, temperature and vibration), the involved frequency bands, parameters and test limitations. The involved bands are reported in indexed order. In the case of carrier aggregation, index 1 represents the Primary Component Carrier (PCC) and index 2 the Secondary Component Carrier (SCC). In addition to the index attribute, the <band> node has optional attributes for channel and bandwidth.

3.1.2.5 Test Execution

The <testexecution> node contains information about the test case run: start time, duration, operator, result (verdict), additional information and file paths to other test-related files such as trace files, the Summary Report or the Online Report.

3.1.2.6 Structure of the XML File

Because some elements or attributes are optional, the following example does not represent a comprehensive list of all available nodes. The elements have been left empty for demonstration purposes.

Example:
<?xml version="1.0" encoding="UTF-8"?>
<testresultheader headerversion="1.4" schemaversion="1.4.2">
<testequipment>
<testsystems>
<testsystem>
<name></name>
<manufacturer></manufacturer>
<hardwaredevices>
<hardwaredevice>
<name></name>
<firmware></firmware>
<serialnumber></serialnumber>
</hardwaredevice>
<hardwaredevice>
<name></name>
<firmware></firmware>
<serialnumber></serialnumber>
</hardwaredevice>
<hardwaredevice>
<name></name>
<firmware></firmware>
<serialnumber></serialnumber>
</hardwaredevice>
</hardwaredevices>
<softwareparts>
<softwarepart type="testcase">
<name></name>
<version></version>
</softwarepart>
</softwareparts>
</testsystem>
</testsystems>
<testplatformnumber></testplatformnumber>
</testequipment>
<userequipment>
<devicecode></devicecode>
<configurationcode></configurationcode>
</userequipment>
<testcase>
<name></name>
<title></title>
<testspecification>
<name></name>
</testspecification>
</testcase>
<testvariables>
<conditions>
<voltage value="normal"/>
<temperature value="normal"/>
<vibration value="none"/>
</conditions>
<bands>
<band type="FDD 1" index="1" />
<band type="GSM 900" index="2" />
</bands>
<parameters>
<parameter type="domain"></parameter>
<parameter type="other"></parameter>
</parameters>
<limitations>
<limitation></limitation>
</limitations>
</testvariables>
<testexecution>
<starttime></starttime>
<duration></duration>
<operator></operator>
<result type="pass"/>
<filepaths>
<filepath group="trace">*.*</filepath>
<filepath group="result">*.*</filepath>
<filepath group="any">*.*</filepath>
</filepaths>
</testexecution>
</testresultheader>
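Since Test Result Header reports are plain XML, they can be processed with standard XML tooling. The following Python sketch reads the verdict and the referenced file paths based on the structure shown above; the file name is a placeholder.

Example (illustrative sketch):
# Extract the verdict and the referenced file paths from a Test Result Header
# report. "TestResultHeader.xml" is a placeholder file name.
import xml.etree.ElementTree as ET

root = ET.parse("TestResultHeader.xml").getroot()

# The verdict is stored in the "type" attribute of <result> within <testexecution>.
print("Verdict:", root.find("testexecution/result").get("type"))

# The report lists file paths to trace and result files instead of raw measurement data.
for filepath in root.findall("testexecution/filepaths/filepath"):
    print(filepath.get("group"), filepath.text)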

3.1.3 Configuring the Test Result Header Generator

The "Test Result Header Generator" dialog can be accessed via "Settings" → "Test
Result Header Generator". This dialog configures the generation of Test Result Header
reports for conformance test cases.

Figure 3-1: Screenshot of the Test Result Header Generator Dialog

The following table describes the parameters within the "Test Result Header Generator" dialog.
Table 3-1: Configuration Parameters for the Test Result Header Generator

● "Enable Test-Result Header Generator": Enables or disables the generation of Test Result Header reports for conformance tests.
● "Acknowledge at Test Plan Start": If this parameter is enabled, the generation of Test Result Header reports must be acknowledged once again at the beginning of the test plan execution.
● "Combine Test Case Steps (Test Frequencies and Bandwidth)": When a conformance test plan is executed, test cases are typically executed several times due to test frequency loops and bandwidth loops. If this parameter is enabled, one single Test Result Header report is generated for the repeated execution of a test case due to loops. If this parameter is disabled, a Test Result Header report is generated for each execution of a test case.
● "Destination Folder": Defines where the Test Result Header reports are stored. This is always a folder in the root directory used to store test reports and log files. The folder can be selected or created by clicking "Browse". The upload performance of Test Result Header reports can be increased by choosing a dedicated subdirectory in the test report directory.
● "Take DUT Service Parameter": If this parameter is enabled, the "Device Code" and "Configuration Code" information is not entered manually, but generated from DUT parameters defined in the "DUT Configuration" dialog under "DUT Identity".
● "Device Code": The information that appears in the <devicecode> node of the XML file.
● "Configuration Code": The information that appears in the <configurationcode> node of the XML file.

3.1.4 Generating Test Result Header Reports

To enable or disable the generation of Test Result Header reports proceed as follows:
1. From the "Settings" Menu, select "Test Result Header Generator".
The "Test Result Header Generator" dialog opens:

Figure 3-2: Screenshot of the Test Result Header Generator Dialog

2. In this dialog the generation of Test Result Header reports can be enabled and dis-
abled:
a) To enable the generation of Test Result Header reports check "Enable Test-
Result Header Generator".
b) To disable Test Result Header reports remove the checkmark.

3. Set the desired parameters.
For more information about the parameters of the Test Result Header Generator, refer to Table 3-1.
The upload performance of Test Result Header reports can be increased by choosing a dedicated subdirectory in the test report directory as "Destination Folder".
Following your selection, Test Result Header reports will from now on either be generated or no longer be generated when a test case run is finished.

This setting does not affect previously run test cases.

3.2 JSON Reports


Required additional license: none

3.2.1 Introduction

JSON (json.org, pronounced "Jason", short for JavaScript Object Notation) is a simple
and open data serialization format. Established libraries for reading and writing JSON
exist for all major programming languages, which makes it well-suited for interchanging
data between applications.
In contrast to Online Reports or Summary Reports, JSON reports are not intended as a
human-readable format. The JSON report serves the purpose of passing measurement
data on to post-processing tools or client-specific database systems. While the structure
of both the Online Report and the Summary Report may change without notice in order to
accommodate layout and design changes, the JSON report is intended as a stable format
for measurement data.

For processing measurement data, please use JSON reports. Online Reports or Sum-
mary Reports are subject to change and should therefore not be used for processing
measurement data.

If enabled, the JSON Report Generator creates a JSON file for each test case, when a
test case run is finished. The file is located in the respective test case report folder and
named report.json.

Example:
c:\ContestReports\SampleTestPlan\SampleTestCase\report.json
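
Such a report can be consumed with any standard JSON library. The following is a minimal C# sketch, assuming the example location above and using the .NET System.Text.Json API; no particular node names of the report schema are assumed:

using System;
using System.IO;
using System.Text.Json;

class JsonReportReader
{
    static void Main()
    {
        // Example location from above; adjust to the actual test case report folder.
        string path = @"c:\ContestReports\SampleTestPlan\SampleTestCase\report.json";

        using (JsonDocument doc = JsonDocument.Parse(File.ReadAllText(path)))
        {
            // Walk the top-level nodes of the report without assuming a particular schema.
            foreach (JsonProperty property in doc.RootElement.EnumerateObject())
            {
                Console.WriteLine(property.Name + ": " + property.Value.ValueKind);
            }
        }
    }
}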

3.2.2 Stored Information

In a JSON report the data is stored in a tree structure, where internal nodes are either
dictionaries or arrays, and leaves are either numbers, strings, booleans, or null. The
data is divided into two main nodes:
● a header with basic information about test system, DUT, and test case parameters,
● and an array of consecutive measurement nodes, each containing the actual mea-
sured data and measurement parameters.

Example:

Figure 3-3: Example of the tree structure of a JSON report

3.2.3 Generating JSON Reports

To enable or disable the generation of JSON reports proceed as follows:


1. From the "Settings" Menu, select "JSON Report Generator".

The "JSON Report Generator" dialog opens:

Figure 3-4: Screenshot of the JSON Report Generator dialog

2. In this dialog the generation of JSON reports can be enabled and disabled:
a) To enable the generation of JSON reports check "Enable JSON report genera-
tion".
b) To disable JSON reports remove the checkmark.
Following your selection, JSON reports will from now on either be generated or no longer
be generated when a test case run is finished.

This setting does not affect previously run test cases.

3.3 PDF Reports


Required additional license: R&S TS8-KT140 (Summary Report)
Other requirements: Available SummaryReport.xml file(s)

3.3.1 Introduction

PDF reports have been introduced in the context of Custom Summary Reports in
order to facilitate the exchange of test reports in a human-readable format. When "Pdf
file" is selected in the "Save Custom Summary Report" dialog in R&S CONTEST
Report Manager, a PDF file is created that lists an overview of the selected test cases,
as known from SummaryReportsOverview.xml files. Additionally, the respective
test case reports are embedded in the PDF file as PDF File Attachments.
Custom Summary Reports are generated on demand using R&S CONTEST Report
Manager. They are comparable to Summary Reports Overview reports; the difference
is that they are not the report of a test plan, but rather a project report, freely composed
of the desired test case reports.

For more information about Custom Summary Reports, refer to the user manual of
R&S CONTEST Report Manager.

Just like Summary Reports, PDF reports exist in two variants:


● Test case reports (similar to a Summary Report)
● Custom Summary Reports (similar to a Summary Reports Overview)
PDF reports are available in portrait or landscape format. The desired format can be
specified while creating a Custom Summary Report (see Chapter 3.3.3.2, "Generating
PDF Reports on Demand", on page 48).

3.3.2 Stored Information

Both the Custom Summary Report and the test case reports have a cover sheet, giving
meta information about the project or the test case followed by the actual contents.

3.3.2.1 Custom Summary Report

The information stored in a Custom Summary Report generally complies with the infor-
mation in a SummaryReportsOverview.xml file:
● Amount of test cases
● Verdict distribution
● List of all test cases within the custom project
– Test case title
– Verdict
– Limitations
– Software-Versions
– Start time and duration
– Observation
Additionally, a Custom Summary Report holds the following information:
● Custom project title
● Custom comment
● Custom logo
● Embedded PDF files of all test case reports
By double clicking the paperclip icon of a test case in the list, the respective embedded
test case report will be opened. Alternatively, the embedded test case reports can be
opened or saved individually from the PDF File Attachments panel of a PDF viewer.
Using Adobe Reader, this panel will be opened automatically when opening the Cus-
tom Summary Report PDF file.

Figure 3-5: Example of attached test case reports within a Custom Summary Report PDF file

Figure 3-6: Example of the cover sheet of a Custom Summary Report

Figure 3-7: Example of the contents of a Custom Summary Report

3.3.2.2 Test Case Report

Because the PDF versions of test case reports are generated from Summary Report
XML files, their contents are identical.

Figure 3-8: Example of the cover sheet of a test case report

Figure 3-9: Example of the contents of a test case report

As known from the SummaryReport.xml, different information density options are available:
● Reduced
● Standard
● Extended
The following tables give an overview of the availability of contents for each information
density option, based on the sections of a Summary Report.

Summary
Table 3-2: Test Case Information

Reduced Standard Extended

Test Specification and Version ✓ ✓ ✓

Test Case Number ✓ ✓ ✓

Test Case Description ✓ ✓

Test Case Software and Version ✓

Test Method and Version ✓

Test Method Parameter and Version ✓ ✓ ✓

Test Case Parameter ✓ ✓ ✓

Test Case Limitation ✓ ✓ ✓

Test Case Start Date and Time ✓ ✓ ✓

Test Case End Date and Time ✓ ✓ ✓

Test Case Duration ✓ ✓

Test Case Final Verdict ✓ ✓ ✓

Observation ✓ ✓ ✓

Total Number of Test Steps ✓ ✓ ✓

Table 3-3: Test System Description

Reduced Standard Extended

Test System Name ✓ ✓ ✓

Test Platform Number ✓ ✓

Test System Variant ✓ ✓

Test System Location ✓ ✓

Test System Serial Number ✓ ✓ ✓

Test System Calibration Information ✓ ✓

Test System Software and Version ✓ ✓

Test System Hardware Configuration ✓ ✓

Table 3-4: Operator Information

Reduced Standard Extended

Operator Name ✓

Operator Parameter ✓

Table 3-5: DUT Cable Information

Reduced Standard Extended

Identifier ✓ ✓

Product Type ✓ ✓

Interconnection Details ✓ ✓

Calibration Information ✓

Table 3-6: DUT Information

Reduced Standard Extended

DUT Identifier ✓ ✓ ✓

DUT Manufacturer ✓ ✓

DUT Product Type ✓ ✓

DUT Serial Number / IMEI ✓ ✓ ✓

DUT HW Revision ✓ ✓ ✓

DUT SW Revision ✓ ✓ ✓

DUT Parameters (PICS/PIXIT) ✓ ✓ ✓

Table 3-7: SIM Information

Reduced Standard Extended

SIM Identifier ✓ ✓ ✓

SIM Manufacturer ✓

SIM Product Type ✓

SIM Revision ✓

SIM Parameters ✓ ✓

Table 3-8: Static Test Conditions

Reduced Standard Extended

Tested Band(s) ✓ ✓ ✓

Tested Bandwidth ✓ ✓

Test Conditions ✓ ✓ ✓

Static Test Case Parameters ✓ ✓

Table 3-9: Test Step Results

Reduced Standard Extended

Step Number ✓ ✓

Description ✓ ✓

Result ✓ ✓

Test Case Information
Reduced: not available
Standard: not available
Extended: all contents

Test Step Details
Reduced: not available
Standard: all contents but trace messages
Extended: all contents

Supplementary Information
Reduced: not available
Standard: only external links and charts
Extended: all contents

3.3.3 Generating PDF Reports

PDF reports are generated using the Custom Summary Report functionality of R&S
CONTEST Report Manager. If no PDF files exist for the selected test case report(s),
they will be generated from the respective SummaryReport.xml files. If PDF files
have already been generated, they will be used instead.

Because PDF reports are generated from SummaryReport.xml files, they cannot be
generated if the corresponding XML files are not available.

3.3.3.1 Generating PDF Reports Automatically

Because the generation of PDF files for Summary Reports is a time-consuming process,
a setting has been implemented in R&S CONTEST that allows the automatic
generation of Summary Report PDF files at the end of a test case run. This way, the
generation of Custom Summary Reports in PDF format is performed significantly
faster.

Enabling this setting will increase the test case run time.

In order to activate the automatic generation of PDF test case reports, proceed as fol-
lows:
1. From the R&S CONTEST menu bar, select "Settings" → "Summary Report".
The "Summary Report" dialog appears.

2. Make sure that "Enable Summary Report" is checked.

3. Enable the checkbox "Create PDF".

4. From the "Select report style" field, select the desired information density option.
For more information about the information density options refer to Chapter 3.3.2.2,
"Test Case Report", on page 42.

5. Click the "OK" button.


From now on, PDF files of test case reports will be generated automatically at the
end of a test case run.

The PDF files are located in the respective test case's report folder. The naming con-
vention is as follows:
TestCaseName_TestCaseNumber_PageOrientation_InformationDensity.pdf

Example:
TestcaseLteThroughput_7_6_1_portrait_extended.pdf

3.3.3.2 Generating PDF Reports on Demand

Using R&S CONTEST Report Manager, PDF reports can be generated on demand,
even if the automatic PDF generation is disabled in the Summary Report settings of
R&S CONTEST.

In order to generate PDF reports on demand, proceed as follows:


1. In R&S CONTEST Report Manager, navigate to the desired test case runs.

2. Select the desired test case run(s) from the right panel.

3. Right click on the selected test case run(s) to open the context menu.

4. From the context menu, select "Create Custom Summary Report".


The "Save Custom Summary Report" dialog appears.

5. Select "Pdf File" as Creation Mode.

6. Select the desired page orientation and information density option ("Style").

7. Specify a destination folder and filename (the file extension .pdf will be added
automatically).

8. Optionally, add a "Project Name", "Comment" and "Additional Header Image".

9. Click the "OK" button.


The generation of the Custom Summary Report is started as indicated by a status
bar at the bottom of the right panel (not to be confused with the general status bar,
spanning both the left and the right panel!). Once the generation is complete, a
Windows Explorer window appears showing the contents of the specified output
folder.

4 Remote Control and Test Automation

4.1 Remote Server


R&S CONTEST Remote Server
Required additional license: R&S TS8-KT130

4.1.1 Introduction

R&S CONTEST Remote Server is based on SOAP technology and serves the purpose
of remotely performing essential functions on the test system. Controlled by a (possibly
remote) SOAP client, R&S CONTEST Remote Server uses the Command Line Inter-
face to interact with R&S CONTEST. It acts as a single interface for all available R&S
CONTEST versions and their respective Command Line Interfaces.

The command line interface does not support parallel testing.


In order to save any settings modified by means of the R&S CONTEST GUI into the
correct configuration file for the command line interface, R&S CONTEST GUI must be
started directly via Contest.exe. If R&S CONTEST GUI is started using the R&S
CONTEST Configuration Selector, any settings modified will not be considered by the
command line interface.

For more information on R&S CONTEST Command Line Interface and its functionality,
refer to Chapter 4.2, "Command Line Interface", on page 60.

For the start of a test case with parameters, R&S CONTEST Remote Server acts as a
single transparent interface to all installed R&S CONTEST application versions. The
start of a test plan, however, is always related to the version with which the test plan has
been created and stored with the help of the R&S CONTEST GUI.
The following figure illustrates the main architecture.

Figure 4-1: Illustration of the R&S CONTEST Remote Server and Command Line Interface architec-
ture

4.1.2 Setting up the Remote Server

The following prerequisites must be fulfilled in order to start R&S CONTEST Remote
Server:
● The newest version of the R&S CONTEST Remote Server package must be installed
on the system controller. Depending on the version, it comes as a single installer
(e.g. RS-CONTEST-REMOTE-SERVER_14.00.0.170.msi).
● The test system must be configured by means of the R&S CONTEST GUI.
● All mandatory DUT properties (cabling, automation, standard properties) must be
configured by means of the R&S CONTEST GUI. A DUT can be activated during the
remote test plan or test case run.

In order to start R&S CONTEST Remote Server proceed as follows:


► On the system controller, start the R&S CONTEST Remote Server executable:
"C:\Program Files (x86)\Rohde-Schwarz\ContestRemoteServer\
Bin\RohdeSchwarz.Contest.RemoteServer.exe".
The "Remote Server" status window appears.

Figure 4-2: Screenshot of the Remote Server dialog

The SOAP server is now running on the standard port 65111. It is not necessary
to start the R&S CONTEST GUI in order to use the R&S CONTEST Remote
Server functionality.

To start R&S CONTEST Remote Server completely unattended, link
RohdeSchwarz.Contest.RemoteServer.exe to the Windows autorun.

To adapt the port to your needs, RohdeSchwarz.Contest.RemoteServer.exe can
be started with the option -p <PortNumber>.
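
Example (the port number used below is an arbitrary assumption; any free port can be chosen):
RohdeSchwarz.Contest.RemoteServer.exe -p 50100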

Please ensure that the relevant port number is not blocked by any firewall.

4.1.3 Interface Description

The SOAP interface description can be retrieved by means of a generated WSDL file.
With R&S CONTEST Remote Server running, a WSDL file containing the interface
description can be generated by invoking a URL with the following structure:
http://<SystemControllerHostNameOrIP>:<Port>/RemoteServer/RemoteServer?wsdl

Example:
http://ts8980:65111/RemoteServer/RemoteServer?wsdl

For demonstration purposes, a C# code sample is provided with the installation of R&S
CONTEST by means of a Microsoft Visual Studio solution. A .zip file containing the
solution is located in
"C:\Program Files (x86)\Rohde-Schwarz\Contest\<ContestMajorVersionNumber>\Docs\ContestRemoteServerSampleClient.zip".

Note that in this path, <ContestMajorVersionNumber> stands for the actual R&S
CONTEST major version number (e.g. 14).

The proxy module RemoteClientProxy.cs has automatically been generated by
means of the WSDL file. If any other programming language is needed, the WSDL file
can be used to generate the code for the remote procedure calls.

For more information on using the WSDL file refer to
https://msdn.microsoft.com/en-us/library/7h3ystb6(v=vs.100).aspx.
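
For example, assuming the .NET Framework SDK tool wsdl.exe is available on the client, a C# proxy can be generated from the URL of the example above (the output file name is arbitrary):
wsdl.exe /out:RemoteClientProxy.cs http://ts8980:65111/RemoteServer/RemoteServer?wsdl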

The Program.cs illustrates the usage of the interface. For further details please refer
to the in-code documentation.

4.1.4 Interface Functions

The SOAP interface provides a set of functions to set parameters, to perform the
requested actions or to retrieve information.

4.1.4.1 DoSetParameterValue(<Parameter>,<Value>)

This function sets parameters which are saved and used within the following call of
DoStartOperation(<Action>). The parametrization depends on the use case
and differs between the start of a test plan and the start of a test case.

General Parameters

Parameter Value Description

KeepContestAlive True or False Optional. Keeps the R&S CONTEST service running after the test plan or the test case has finished. This speeds up the next start.

AbortTimeout Seconds Optional. This applies only for DoStartOperation("Abort"). The default value is 0, i.e. no timeout.

Test Plan Parameters


For a standard R&S CONTEST test plan (.rstt), the following parameters apply:

Parameter Value Description

Testplan Test plan The path to the standard R&S CONTEST test plan (.rstt), relative to the test plan directory on the system controller (c:\ProgramData\Rohde-Schwarz\Contest\<ContestMajorVersionNumber>\Testplans), e.g. MySubFolder\MyTestPlan.rstt.

TestplanVersion Contest major version number Specifies the version that was used for creating and saving the test plan, e.g. 14.

ActiveDut DUT name Optional. Activates the DUT for the current test plan run. The name must be identical to the one given in the output of "Configured DUTs" on page 75. The name of a DUT is set in the R&S CONTEST GUI. If not specified, the DUT activated within the R&S CONTEST GUI is used.

ReportFolder Report folder name The given folder name must be a relative path. The directory will be created within the test report directory \\<SystemControllerHostNameOrIP>\ContestReports.

Simplified test plans (.xml) are not created beforehand and referenced via the Test-
plan parameter, but built up step by step with the following parameter:

Parameter Value Description

AddTestcase Test case Adds a test case to the current simplified test plan
as a string, containing the XML element
<testcase> and its child elements.
For an example of the usage of this parameter refer
to Chapter 4.1.5.2, "Starting a Simplified Test Plan",
on page 57.
For more information refer to "Simplified Test Plan"
on page 65 or to "Available Test Cases"
on page 71 for the exact spelling of the test case
name and the possible parametrization.

Test Case Parameters


If a single test case shall be performed, the following parameters apply.
The test cases can be parametrized to be performed within a requested band or with
the desired bandwidth, frequency or environmental condition. Multiple parameter values
must be separated with a semicolon (;).
If no parameter is given, the test case performs as known from the R&S CONTEST GUI
test plan. The test case will be started and, depending on the 3GPP specification, the
applicable combinations of the parameter values will be executed.

Parameter Value Description

Testcase Test case e.g. TestcaseLtePower_6_2_2. Please see "Available Test Cases" on page 71 for the exact spelling of the test case name and the possible parametrization.

TestcaseVersion Version e.g. RF-LTE-3.40

ActiveDut DUT name Optional. Activates the DUT for the current test plan run. The name must be identical to the one given in the output of "Configured DUTs" on page 75. If not specified, the DUT activated within the R&S CONTEST GUI is used. Cannot be combined with DutForMux.

ReportFolder Report folder name The given folder name must be a relative path. The directory will be created within the test report directory \\<SystemControllerHostNameOrIP>\ContestReports.

DutForMux List of DUT indexes Optional. These values specify the index within the DUT MUX configuration. This applies only if a DUT MUX hardware is enabled and the relevant DUTs have been configured and activated within the R&S CONTEST GUI. Cannot be combined with ActiveDut. Values: 1, 2, …, 6, not set = all.

EnvironmentalCondition List of environmental conditions Optional. Sets the environmental conditions for the test case. Values: LL, LH, NN, ML, HL, HH, not set = all.

ForceEnvironmentalCondition Forces the environmental condition values as set above to be performed, independent of the 3GPP specification.

PrimaryBand List of band specifiers Optional. Sets the bands for the test case. For RRM or carrier aggregation test cases this applies to the primary cell only. Values: FDD 1, FDD 2, TDD 41, …, not set = all.

PrimaryBandwidth List of bandwidth specifiers Optional. Sets the bandwidth for the test case. For carrier aggregation test cases this applies to the primary cell only. Values: 1.4 MHz, 3 MHz, 5 MHz, 10 MHz, 15 MHz, 20 MHz, not set = all.

ForcePrimaryBandwidth Forces the bandwidth values as set above to be performed, independent of the 3GPP specification. For carrier aggregation test cases this applies to the primary cell only.

PrimaryFrequencyId List of frequency ID specifiers Optional. Sets the frequencies for the test case. For carrier aggregation test cases this applies to the primary cell only. Values: Low, Mid, High, not set = all.

ForcePrimaryFrequencyId Forces the frequency values as set above to be performed, independent of the 3GPP definition. For carrier aggregation test cases this applies to the primary cell only.

SecondaryBand List of band specifiers Optional. Sets the bands of the secondary cell for the test case. This applies to RRM and carrier aggregation test cases only. Values: FDD 1, FDD 2, TDD 41, …, not set = all.

SecondaryBandwidth List of bandwidth specifiers Optional. Sets the bandwidth for the secondary cell. This applies to carrier aggregation test cases only. Values: 1.4 MHz, 3 MHz, 5 MHz, 10 MHz, 15 MHz, 20 MHz, not set = all.

ForceSecondaryBandwidth Forces the bandwidth values as set above to be performed, independent of the 3GPP specification. This applies to carrier aggregation test cases only.

SecondaryFrequencyId List of frequency ID specifiers Optional. Sets the frequencies for the secondary cell. This applies to carrier aggregation test cases only. Values: Low, Mid, High, not set = all.

ForceSecondaryFrequencyId Forces the frequency values as set above to be performed, independent of the 3GPP specification. This applies to carrier aggregation test cases only.

4.1.4.2 DoStartOperation(<Action>)

This function performs the desired actions. Depending on the action, the parameters set
via DoSetParameterValue(<Parameter>,<Value>) will be considered.

Action Description

CreateAvailableTestcases Requests a file with all available test cases listed. The output file will be written into the local R&S CONTEST report folder and can be retrieved via \\<SystemControllerHostNameOrIP>\ContestReports\Testcases.xml.
Each test case entry will contain the following information in addition:
● Test case name
● Version
● Assembly Version
● Bands supported
● Bandwidth supported
● Frequency range value supported
● Environmental condition supported
For the syntax of the generated file please see "Available Test Cases" on page 71.

CreateAvailableDuts Requests a file with all available and configured DUTs listed. The output file will be written into the local R&S CONTEST report folder and can be retrieved via \\<SystemControllerHostNameOrIP>\ContestReports\DUTs.xml.
For the syntax of the generated file please see "Configured DUTs" on page 75.

StartTestplan Starts the test plan.

StartXMLTestplan Starts the simplified test plan. For more information on simplified test plans, refer to "Simplified Test Plan" on page 65.

StartTestcase Starts a single test case.

Abort Aborts the current run.

4.1.4.3 DoGetOutput()

This function returns the latest console output of the running test case. Each call clears
the internal buffer, so the next call returns only the output produced since the previous call.

4.1.4.4 DoGetLastExitCode()

This function will always return a code stating the verdict. A complete list of the return
codes is available in Chapter 4.2.4.2, "Process Return Code", on page 70.

4.1.5 Examples

The following sections contain exemplary C# code for:
● Starting a standard R&S CONTEST test plan (.rstt)
● Starting a simplified test plan (.xml)
● Starting a single test case

4.1.5.1 Starting a Standard R&S CONTEST Test Plan

Example:
var client = new RemoteServerService();

client.Url = "http://ts8980:65111/RemoteServer";

client.DoStartOperationCompleted += (sender, args) =>


{
Console.WriteLine("operation result: " + args.Result);
var code = client.DoGetLastExitCode();
Console.WriteLine("exit code: " + code);
Environment.Exit(code);
};

client.DoSetParameterValue("TestplanVersion", "14");
client.DoSetParameterValue("Testplan", "Remote.rstt");
client.DoStartOperationAsync("StartTestplan");

while (true)
{
var output = client.DoGetOutput();
if (!string.IsNullOrEmpty(output))
{
Console.Write(output);
Thread.Sleep(100);
}
}

4.1.5.2 Starting a Simplified Test Plan

Note that in the following example, the same test case is added twice with different
<TestFrequencyIDs> for each.

Example:
var client = new RemoteServerService();

client.Url = "http://ts8980:65111/RemoteServer";

client.DoStartOperationCompleted += (sender, args) =>


{
Console.WriteLine("operation result: " + args.Result);
var code = client.DoGetLastExitCode();
Console.WriteLine("exit code: " + code);
Environment.Exit(code);
};

client.DoSetParameterValue("TestplanVersion", "14");
client.DoSetParameterValue("KeepContestAlive", "true");

client.DoSetParameterValue("AddTestcase", @"
<Testcase Name=""TestcaseLteAclr_6_6_2_3"">
<Name>LTE FDD 6.6.2.3 Adjacent Channel Leakage power Ratio</Name>
<Description>Adjacent Channel Leakage power Ratio</Description>
<GUID>95A27D80-0BAD-494D-B6FA-F28C818C60A3</GUID>
<Version>RF-LTE-3.40</Version>
<ReleaseDate />
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalConditions>NN;</EnvironmentalConditions>
<LTEBands>FDD 1;</LTEBands>
<Bandwidths>5 MHz;</Bandwidths>
<TestFrequencyIDs>Low;</TestFrequencyIDs>
<BaseVersion>14</BaseVersion>
</Testcase>
");

client.DoSetParameterValue("AddTestcase", @"
<Testcase Name=""TestcaseLteAclr_6_6_2_3"">
<Name>LTE FDD 6.6.2.3 Adjacent Channel Leakage power Ratio</Name>
<Description>Adjacent Channel Leakage power Ratio</Description>
<GUID>95A27D80-0BAD-494D-B6FA-F28C818C60A3</GUID>
<Version>RF-LTE-3.40</Version>
<ReleaseDate />
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalConditions>NN;</EnvironmentalConditions>
<LTEBands>FDD 1;</LTEBands>
<Bandwidths>5 MHz;</Bandwidths>
<TestFrequencyIDs>Mid;</TestFrequencyIDs>
<BaseVersion>14</BaseVersion>
</Testcase>
");

client.DoStartOperationAsync("StartXMLTestplan");

while (true)
{
var output = client.DoGetOutput();
if (!string.IsNullOrEmpty(output))
{
Console.Write(output);
Thread.Sleep(100);
}
}

4.1.5.3 Starting a Single Test Case

Example:
var client = new RemoteServerService();

client.Url = "http://ts8980:65111/RemoteServer";

client.DoStartOperationCompleted += (sender, args) =>


{
Console.WriteLine("operation result: " + args.Result);
var code = client.DoGetLastExitCode();
Console.WriteLine("exit code: " + code);
Environment.Exit(code);
};

client.DoSetParameterValue("TestplanVersion", "14");
client.DoSetParameterValue("Testcase", "TestcaseLteAclr_6_6_2_3");
client.DoSetParameterValue("TestcaseVersion", "RF-LTE-3.40");
client.DoSetParameterValue("EnvironmentalCondition", "NN");
client.DoSetParameterValue("PrimaryBand", "FDD 1");
client.DoSetParameterValue("PrimaryBandwidth", "5 MHz");
client.DoSetParameterValue("PrimaryFrequencyId", "Mid");
client.DoSetParameterValue("ForceAll", "true");
client.DoStartOperationAsync("StartTestcase");

while (true)
{
var output = client.DoGetOutput();
if (!string.IsNullOrEmpty(output))
{
Console.Write(output);
Thread.Sleep(100);
}
}

4.2 Command Line Interface


R&S CONTEST Command Line Interface
Required additional license: R&S TS8-KT130

4.2.1 Introduction

R&S CONTEST provides a simple interface to be controlled by a customer client without
the R&S CONTEST GUI.
To be independent of certain technologies, the interface has been designed to be as
simple as possible, i.e. a standard Windows executable which can be controlled via the
command line interface.
The existing remote interface is still supported, but due to the complexity and the existing
dependencies (.NET, assembly version) this command line interface is preferable.

The command line interface does not support parallel testing.


In order to save any settings modified by means of the R&S CONTEST GUI into the
correct configuration file for the command line interface, R&S CONTEST GUI must be
started directly via Contest.exe. If R&S CONTEST GUI is started using the R&S
CONTEST Configuration Selector, any settings modified will not be considered by the
command line interface.

4.2.2 Overview

The command line interface
(<ProgramFiles>\Rohde-Schwarz\Contest\<BASE-Version>\GUI\Bin\RohdeSchwarz.Contest.CommandLineInterface.exe)
is installed with every available R&S CONTEST-Base version. Within this document it is
referred to only as CommandLineInterface.exe; however, the fully qualified path is required.
Any dependencies on the installed version are handled internally; the calling client only
needs to know the location (i.e. the relevant R&S CONTEST-Base version).
The command line interface supports:
● retrieval of test system information, e.g. the available test cases
● start of pre-configured test plans or single test cases
● parametrization of single test case runs
● report messages via standard output
● a simple GUI for status update

Modifying Settings in the R&S CONTEST GUI


To keep the interface simple, all additional settings should be done by means of the
R&S CONTEST GUI:
● configuring the test system
● enabling and cabling of the DUT
● maintaining the DUT MUX
● configuring the Test Case Run Manager
● managing campaigns
In order to save any settings modified by means of the R&S CONTEST GUI into the
correct configuration file for the command line interface, R&S CONTEST GUI must be
started directly via Contest.exe. If R&S CONTEST GUI is started using the R&S
CONTEST Configuration Selector, any settings modified will not be considered by the
command line interface.

Optionally, a small graphical user interface can be created to show the status of
CommandLineInterface.exe. The test case report messages will be displayed as
plain unformatted text. It is intended to support debugging or status retrieval. This
command line interface GUI can be enabled with the --gui (short form: -g) parameter.
The following figure illustrates the (optional) command line interface GUI.

Figure 4-3: Screenshot of the command line interface GUI

4.2.3 Parameters

The following sections describe the available input parameters. Calling
CommandLineInterface.exe without any input parameters or with --help (short
form: -?) displays a list of possible parameters.
The parameters are defined according to the POSIX standard and provide a short and
a long form; both are listed in the tables below.

4.2.3.1 General

Table 4-1: General parameters of the Command Line Interface

Parameter Value Description

--help -? Displays a list of possible input parameters.

--port -p port number Sets the port used to communicate with R&S CONTEST.

--closecontest -cc Ends the R&S CONTEST process when the command line interface process is ended.

--abort -ab timeout in ms Ends all command line interface processes and thereby aborts all running test plans that have been started with the command line interface. This parameter can be combined with --closecontest in order to end the R&S CONTEST process as well. First the application is stopped in a controlled way, i.e. any running measurements are waited for, the report is closed and the devices are reset properly. After the specified timeout the process is ended, which means the software has no chance to react in a proper way. So please ensure a correct timeout.

--testsystemsettings -tss path name Sets the file name with full path for the test system settings file. This is used for automated regression systems, when the current system configuration is replaced by the R&S CONTEST installer. This file holds the differences to the standard installation and can be generated automatically within the R&S CONTEST GUI test system dialog. Please refer to the R&S CONTEST Help as well.

--operator -or operator name Sets the name of the operator.

4.2.3.2 Information Retrieval

Table 4-2: Information retrieval parameters of the Command Line Interface

Parameter Value Description

--availabletestcases -at .xml file name Requests a file with all available test cases listed. The given .xml file name is the output file generated by the program, which can be written to any (writable) location.
Each test case entry will contain the following information in addition:
● Test case name
● Version
● Assembly Version
● Bands supported
● Bandwidth supported
● Frequency range value supported
● Environmental condition supported
For the syntax of the generated file please see "Available Test Cases" on page 71.

--configuredduts -cd .xml file name Requests a .xml file with all available and configured DUTs (devices under test) listed. For the syntax of the generated file please see "Configured DUTs" on page 75. DUTs are configured by means of the R&S CONTEST GUI. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -at c:\Alltestcases.xml
(written all in one line)

4.2.3.3 Start Test Plan

This method is used to start test plans. There are two possibilities to start test plans:
● Start standard R&S CONTEST test plan (.rstt)
● Start simplified test plan (.xml)
The following parameters are applicable to both possibilities:
Table 4-3: Parameters for starting a test plan with the Command Line Interface

Parameter Value Description

--gui -g The command line interface is started with an additional graphical user interface, see Chapter 4.2.2, "Overview", on page 60.

--keepalive -ka Keeps the command line interface GUI alive although an error occurred. This is for debugging purposes, to watch the latest error messages.

--activedut -ad DUT name Activates the given DUT (Device Under Test). The name must be identical to the one given in the output of --configuredduts. DUTs are configured by means of the R&S CONTEST GUI. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

--testplan -tp test plan Sets the test plan that should be executed. The test plan name must be specified with a fully qualified path! The file extension for a standard R&S CONTEST test plan is .rstt, while the file extension for a simplified test plan is .xml.

--testplanreportdir -td directory Sets the test plan report directory. If not set, a new directory will be created, named after the test plan name with the current date and time appended. The directory must be declared relative to the configured R&S CONTEST Report Root. The root directory is configured during the first start of R&S CONTEST by means of the Report Folder Manager. A share for remote access is created automatically.

--runandrepeat -rnr Enables the run and repeat mode known from the R&S CONTEST GUI. If this parameter is set, the settings defined in the "R&S CONTEST" → "Settings" → "Test Case Run Manager" dialog will be used for the test execution. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

--campaign -c campaign GUID Assigns the test plan to the specified campaign. The campaign is specified by its identifier (GUID). The GUID of a campaign can be obtained from the Campaign Manager in the R&S CONTEST GUI or R&S CONTEST Report Manager. Campaigns are configured by means of the R&S CONTEST GUI. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

Standard R&S CONTEST Test Plan


A standard R&S CONTEST test plan should be generated with the R&S CONTEST
GUI and must have the extension .rstt. The behaviour is identical to the R&S CON-
TEST GUI, i.e. the loops over bands, test frequencies and so on will be performed - as
visible within the R&S CONTEST GUI.

Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -tp c:\ProgramData\Rohde-Schwarz\Contest\<BASE-x.yz>\Testplans\MyTestplan.rstt
(written all in one line)

Simplified Test Plan


A simplified test plan should be generated by the user client software and must have
the extension .xml. The syntax is identical to the output file given with the option
--availabletestcases. For an example of this output file, refer to "Available Test Cases"
on page 71. However, while this output file returns all possible parameter values,
the simplified test plan must contain only one parameter value per test case loop,
which must still end with a semicolon (;).

Example:
Wrong: <TestFrequency>Low;Mid;High</TestFrequency>
Right: <TestFrequency>Low;</TestFrequency>

In addition to the parameters shared with starting a standard R&S CONTEST test plan, a
simplified test plan has the following parameter:
Table 4-4: Additional parameters for starting a simplified test plan with the Command Line Interface

Parameter Value Description

--savetestplan -st Converts a created simplified test plan to a standard R&S CONTEST test plan and saves it as an .rstt file in the Testplans directory.

For further details please refer to "Simplified Test Plan" on page 74.
The test cases will run as stated, i.e. no looping or sorting or repeating will be done
within R&S CONTEST.

Starting a simplified test plan cannot be combined with Chapter 4.2.3.4, "Start Test
Case", on page 65.

Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -tp c:\ProgramData\Rohde-Schwarz\Contest\<BASE-x.yz>\Testplans\MyTestplan.xml
(written all in one line)

4.2.3.4 Start Test Case

This method is used for starting a single test case.


The test cases can be parametrized to run within a requested band or with the desired
bandwidth, frequency or environmental condition. Multiple parameter values must be
separated with either commas (,) or semicolons (;).

Example:
5 MHz, 10 MHz, 15 MHz
or
5 MHz; 10 MHz; 15 MHz

If no parameter is given, the test case runs as known from the standard R&S CONTEST
test plan. The test case will be started and, depending on the 3GPP specification,
the applicable combinations of the parameter values will be executed.
The following parameters are applicable to starting a single test case:
Table 4-5: Parameters for starting a test case with the Command Line Interface

Parameter Value Description

--info -i Lists available application versions, available test cases within an application or available parameters of a specific test case.
Example:
CommandLineInterface.exe --info
Lists available application versions.
CommandLineInterface.exe --info RF-LTE-3.40
Lists all test cases available within RF-LTE 3.40.
CommandLineInterface.exe --info RF-LTE-3.40 TestcaseLteThroughput_7_5
Lists all available parameters of the test case TestcaseLteThroughput_7_5.

--gui -g The command line interface is started with an additional graphical user interface, see Chapter 4.2.2, "Overview", on page 60.

--keepalive -ka Keeps the command line interface GUI alive although an error occurred. This is for debugging purposes, to watch the latest error messages.

--dut -dt DUT number Sets the DUT number, as configured within the DUT MUX (only if the DUT MUX is enabled). Note that in the case of starting a single test case, the --dut parameter is used differently than for starting a test plan! DUTs are configured by means of the R&S CONTEST GUI. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

--activedut -ad DUT name Activates the given DUT (Device Under Test). The name must be identical to the one given in the output of --configuredduts. DUTs are configured by means of the R&S CONTEST GUI. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

--savetestplan -st Converts a created simplified test plan to a standard R&S CONTEST test plan and saves it as an .rstt file in the Testplans directory.

--testplanreportdir -td directory Sets the test plan report directory. If not set, a new directory will be created, named after the test plan name with the current date and time appended. The directory must be declared relative to the configured R&S CONTEST Report Root. The root directory is configured during the first start of R&S CONTEST by means of the Report Folder Manager. A share for remote access is created automatically.

--testcase -tc test case Sets the test case that should be executed.

--version -v version Sets the version of the test case. This parameter is mandatory.

--runandrepeat -rnr Enables the run and repeat mode known from the R&S CONTEST GUI. If this parameter is set, the settings defined in the "R&S CONTEST" → "Settings" → "Test Case Run Manager" dialog will be used for the test execution. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

--campaign -c campaign GUID Assigns the test case to the specified campaign. The campaign is specified by its identifier (GUID). The GUID of a campaign can be obtained from the Campaign Manager in the R&S CONTEST GUI or R&S CONTEST Report Manager. Campaigns are configured by means of the R&S CONTEST GUI. For more information please refer to "Modifying Settings in the R&S CONTEST GUI" on page 61.

Additionally, test cases may have their own parameters. Using the --info parameter,
a list of available test case parameters can be shown.

The following list is only an example of test case parameters, not a complete list and
not applicable to all test cases.

Example:
Table 4-6: Example of parameters for starting a test case with the Command Line Interface

Parameter Value Description

--primaryband -pd band Sets the bands for the test case. For RRM or carrier aggregation test cases this applies to the primary cell only. Values: FDD 1, FDD 2, TDD 41, …, not set = all

--primarybandwidth -ph bandwidth Sets the bandwidth for the test case. This applies to LTE only. Values: 1.4 MHz, 3 MHz, 5 MHz, 10 MHz, 15 MHz, 20 MHz, not set = all

--forceprimarybandwidth -fph Forces the bandwidth values as set above to be performed, independent of the 3GPP definition.

--primaryfrequency -py frequency Sets the frequencies for the test case. Values: Low, Mid, High, not set = all

--forceprimaryfrequency -fpy Forces the frequency values as set above to be performed, independent of the 3GPP definition.

--secondaryband -sd band Sets the bands of the secondary cell for the test case. This applies to RRM and carrier aggregation test cases only. Values: FDD 1, FDD 2, TDD 41, …, not set = all

--secondarybandwidth -sh bandwidth Sets the bandwidth of the secondary cell for the test case. This applies to carrier aggregation test cases only. Values: 1.4 MHz, 3 MHz, 5 MHz, 10 MHz, 15 MHz, 20 MHz, not set = all

--forcesecondarybandwidth -fsh Forces the bandwidth values as set above to be performed, independent of the 3GPP definition.

--secondaryfrequency -sy frequency Sets the frequencies of the secondary cell for the test case. This applies to carrier aggregation test cases only. Values: Low, Mid, High, not set = all

--forcesecondaryfrequency -fsy Forces the frequency values as set above to be performed, independent of the 3GPP definition. This applies to LTE-Advanced only.

--environment -et environment Sets the environmental conditions for the test case. Values: LL, LH, NN, ML, HL, HH, not set = all

--forceenvironment -fet Forces the environmental condition values as set above to be performed, independent of the 3GPP definition.

Example:
These parameters are used as follows:
RohdeSchwarz.Contest.CommandLineInterface.exe -tc TestcaseLtePower_6_2_2 -v RF-LTE-2.70 -pd "FDD 5" -ph "5 MHz;10 MHz;15 MHz" -et NN;HL
Written all in one line, this will start the test case with the following setup:
Table 4-7: Overview of the example setup

Option Value

Test case name TestcaseLtePower_6_2_2

Version RF-LTE-2.70

Bands FDD 5

Bandwidths 5 MHz, 10 MHz and 15 MHz (if applicable)

Environmental Conditions Normal Voltage/Normal Temperature and High Voltage/Low Temperature

Test Frequencies all (i.e. if applicable Low, Mid, High), because no test frequency parameters have been set.

4.2.4 Output

The following sections describe the output of the command line interface. Except for the
Online Report, everything that is generated by R&S CONTEST is also generated when
using the command line interface (e.g. the Summary Report).

4.2.4.1 Standard Output

By default the process writes all report messages to the standard stream stdout.
All test case report messages will be presented as unformatted plain text and depend
on the particular test case. In particular, the different application types (RF, RRM, PQA)
are diverse by nature. The content of the particular test case report is subject to
change without notice.
The following entries are of highest interest for the client. They are common for all
applications and will be kept compatible.
Table 4-8: Output common for all applications

Message Description

Testplan Report Directory local: directory Reports the absolute path of the current test plan report directory, which contains the test plan name and the time stamp.

Testplan Report Directory UNC: directory Reports the directory in UNC notation: \\ComputerName\ReportShare\TestPlanDirectory. With that information, remote access to the report files is possible.

Verdict: verdict This is the format in which the final verdict of the test case is reported. The possible values are:
● Passed
● Passed with Restriction
● Failed
● Failed with Restriction
● Inconclusive
● Not Initialized
● Not Applicable
● Completed
● Aborted

The stdout can be utilised in two different ways:
● Redirection of stdout
● Callback within client code

Redirection of stdout
When calling the process, the output can be redirected into a file.

Example:
RohdeSchwarz.Contest.CommandLineInterface.exe -tp Testplan.rstt >> TheContestReportMessages.txt
(written all in one line)

Callback within Client Code

If the command line interface is called by means of self-written client code, most
programming languages provide the possibility to redirect the output into a callback
function.
Please refer to the help of the respective programming language on how to configure
the process start information and how to establish output data handler methods.
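
A minimal C# sketch of such a callback, based on the standard .NET Process API, is shown below; the executable and test plan paths are placeholders and must be adapted to the actual installation:

using System;
using System.Diagnostics;

class CliCallbackExample
{
    static void Main()
    {
        var startInfo = new ProcessStartInfo
        {
            // Placeholder paths; use the actual R&S CONTEST-Base installation and test plan.
            FileName = @"C:\Program Files (x86)\Rohde-Schwarz\Contest\<BASE-Version>\GUI\Bin\RohdeSchwarz.Contest.CommandLineInterface.exe",
            Arguments = @"-tp C:\ProgramData\Rohde-Schwarz\Contest\<BASE-Version>\Testplans\MyTestplan.rstt",
            UseShellExecute = false,
            RedirectStandardOutput = true,
            CreateNoWindow = true
        };

        using (var process = new Process { StartInfo = startInfo })
        {
            // Callback that receives each report message line written to stdout.
            process.OutputDataReceived += (sender, args) =>
            {
                if (args.Data != null)
                {
                    Console.WriteLine(args.Data);
                }
            };

            process.Start();
            process.BeginOutputReadLine();
            process.WaitForExit();

            // The process return code enumerates the final verdict, see Chapter 4.2.4.2.
            Console.WriteLine("Return code: " + process.ExitCode);
        }
    }
}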

4.2.4.2 Process Return Code

The CommandLineInterface.exe process will return a code depending on the performed
action.

Code 130 reflects the missing license for the Remote Control product R&S TS8-KT130.

If a test case or test plan has been started, a code in the range of 0 to 6 indicates
the final verdict.
For a test plan, the verdicts of all single test cases are cumulated: Inconclusive overrides
all other verdicts, and Failed overrides Passed.
Table 4-9: Overview of process return codes

Return Code Description

0 Passed

1 Passed with Restriction

2 Failed

3 Failed with Restriction

4 Inconclusive

5 Not Applicable

6 Not Initialized

Any other value Internal error of the command line interface

If --abort has been performed, the return code will always be 0.

A successful run of --availabletestcases or --configuredduts returns 0; any
other value indicates an error.

4.2.4.3 XML Files

The command line interface produces three kinds of XML files:
● Available Test Cases
● Simplified Test Plan
● Configured DUTs
The following sections give an example of the structure of these XML files.

Available Test Cases

The .xml file generated by --availabletestcases is structured as follows:
<TestcaseInformation>
<Testcases>
<Testcase Name="TestcaseGsmErrorRateSamplesCs_14_2_3">
<Name>GSM 14.2.3 - Reference sensitivity - FACCH/F</Name>
<Description>Reference sensitivity - FACCH/F</Description>
<GUID>5BFA87DC-681D-4D7F-8FCC-6DA6F1A4E3E3</GUID>
<Version>RF-GSM-1.40</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IGsmTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<GSMBand>GSM850;P-GSM900;E-GSM900;GSM1800;GSM1900;</GSMBand>
</Testcase>
<Testcase Name="TestcaseGsmErrorRateSamplesCs_14_2_4">
<Name>GSM 14.2.4 - Reference sensitivity - FACCH/H</Name>
<Description>Reference sensitivity - FACCH/H</Description>
<GUID>59FA4746-8FE5-47C3-B9F6-C9F79988D048</GUID>
<Version>RF-GSM-1.40</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IGsmTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<GSMBand>GSM850;P-GSM900;E-GSM900;GSM1800;GSM1900;</GSMBand>
</Testcase>
<Testcase Name="TestcaseLbsPerformance3gppCplane_7_1_1">
<Name>LTE A-GNSS 7.1.1 - Sensitivity Coarse time assistance (C-Plane, GPS)</Name>
<Description>Sensitivity Coarse time assistance (C-Plane, GPS)</Description>
<GUID>31B6F77B-EC87-462F-8598-0E5413730C75</GUID>
<Version>RF-LBS-1.30</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;HL;HH;</EnvironmentalCondition>
<LTEBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD12;FDD13;FDD14;FDD1
<Bandwidth>1.4;3;5;10;15;20;</Bandwidth>
<TestFrequency>Low;Mid;High;</TestFrequency>
</Testcase>

<Testcase Name="TestcaseLbsPerformance3gppSupl_7_1_2">
<Name>LTE A-GNSS 7.1.2 - Sensitivity Fine Time Assistance (SUPL, GPS)</Name>
<Description>Sensitivity Fine Time Assistance (SUPL, GPS)</Description>
<GUID>0BF1FBE3-A588-4D8B-9486-500220490CB7</GUID>
<Version>RF-LBS-1.30</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;HL;HH;</EnvironmentalCondition>
<LTEBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD12;FDD13;FDD14;FDD1
<Bandwidth>1.4;3;5;10;15;20;</Bandwidth>
<TestFrequency>Low;Mid;High;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseVzwSvd1xRttAclr_6_6_2_3">
<Name>VZW SVLTE 1xRTT 6.6.2.3 - Adjacent Channel Leakage Power Ratio (ACLR)</Name>
<Description>Adjacent Channel Leakage Power Ratio (ACLR)</Description>
<GUID>917D0023-3EFE-4a68-8C67-28029D914BCF</GUID>
<Version>RF-LTE-2.70</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;HL;HH;</EnvironmentalCondition>
<LTEBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD12;FDD13;FDD14;FDD1
<Bandwidth>1.4;5;10;15;20;</Bandwidth>
<TestFrequency>Low;Mid;High;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseLteBlocking_7_6_2_and_7_7">
<Name>LTE FDD 7.6.2 and 7.7 Out-of-band blocking and Spurious response</Name>
<Description>Out-of-band blocking and Spurious response</Description>
<GUID>8088B53E-A381-4f85-A0D4-F8562A862FF7</GUID>
<Version>RF-LTE-2.70</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<LTEBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD12;FDD13;FDD14;FDD1
<Bandwidth>1.4;5;10;15;20;</Bandwidth>
<TestFrequency>High;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseLteBlocking7_6_2_and_7_7_TDD">
<Name>LTE TDD 7.6.2 and 7.7 Out-of-band blocking and Spurious response</Name>
<Description>Out-of-band blocking and Spurious response</Description>
<GUID>45B58F49-25A0-4f0e-8DE4-3A11B7138260</GUID>
<Version>RF-LTE-2.70</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<LTEBand>TDD33;TDD34;TDD38;TDD39;TDD40;TDD41;TDD41AXGP;</LTEBand>
<Bandwidth>5;10;15;20;</Bandwidth>
<TestFrequency>High;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseRtteTrx_4_2_4C_TDD">
<Name>R&amp;TTE TDD 4.2.4B Spurious emission band UE co-existence</Name>
<Description>Spurious emission band UE co-existence</Description>
<GUID>C5E301DD-08A2-4BDA-9A4B-2F1121175068</GUID>
<Version>RF-LTE-2.70</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>
</EnvironmentalCondition>
<LTEBand>
</LTEBand>
<Bandwidth>
</Bandwidth>
<TestFrequency>
</TestFrequency>
</Testcase>
<Testcase Name="RRM_LTE_6_1_2">
<Name>RRM-LTE 6.1.2: E-UTRAN FDD IntER-frequency RRC Re-establishment</Name>
<Description>E-UTRAN FDD IntER-frequency RRC Re-establishment</Description>
<GUID>B176FA04-6D6B-4da0-8619-6378ACF57EE2</GUID>
<Version>RRM-4.00</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IRrmTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;Vib;ML;HL;HH;</EnvironmentalCondition>
<PrimaryBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD12;FDD13;FDD14;
<SecondaryBand>NoBand;</SecondaryBand>
</Testcase>
<Testcase Name="RRM_LTE_6_2_1_Test1">
<Name>RRM-LTE 6.2.1 Test 1: E-UTRAN FDD - Contention Based Random Access Test 1: Correct
<Description>E-UTRAN FDD - Contention Based Random Access Test 1: Correct behaviour when
<GUID>ABF9B548-47AB-4c77-820C-7A75D1D89891</GUID>
<Version>RRM-4.00</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IRrmTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;Vib;ML;HL;HH;</EnvironmentalCondition>
<PrimaryBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD12;FDD13;FDD14;
<SecondaryBand>NoBand;</SecondaryBand>
</Testcase>
<Testcase Name="RRM_WCDMA_8_6_1_1A">
<Name>RRM-WCDMA 8.6.1.1A: FDD intra frequency measurements: Event triggered reporting in
<Description>FDD intra frequency measurements: Event triggered reporting in AWGN propaga
<GUID>43749AB9-2384-4728-A238-28BC36C0D851</GUID>
<Version>RRM-4.00</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IRrmTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;Vib;ML;HL;HH;</EnvironmentalCondition>
<PrimaryBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD19;</PrimaryBan
<SecondaryBand>NoBand;</SecondaryBand>
</Testcase>
<Testcase Name="RRM_WCDMA_8_6_1_4A">
<Name>RRM-WCDMA 8.6.1.4A: FDD intra frequency measurements: Correct reporting of neighbo
<Description>FDD intra frequency measurements: Correct reporting of neighbours in fading
<GUID>86815280-D78E-4BC9-B158-19B015A84064</GUID>
<Version>RRM-4.00</Version>
<ReleaseDate>28.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IRrmTestcase</Type>
<EnvironmentalCondition>LL;LH;NN;Vib;ML;HL;HH;</EnvironmentalCondition>
<PrimaryBand>FDD1;FDD2;FDD3;FDD4;FDD5;FDD6;FDD7;FDD8;FDD9;FDD10;FDD11;FDD19;</PrimaryBan
<SecondaryBand>NoBand;</SecondaryBand>
</Testcase>
</Testcases>
</TestcaseInformation>

Simplified Test Plan


The .xml file called with --testplan is identical in format to the listing stated above,
except that only one value is allowed per parameter. It is structured as follows:

Even a single value must have a trailing semicolon (;).

<TestcaseInformation>
<Testcases>
<Testcase Name="TestcaseGsmErrorRateSamplesCs_14_2_3">
<Name>GSM 14.2.3 - Reference sensitivity - FACCH/F</Name>
<Description>Reference sensitivity - FACCH/F</Description>
<GUID>5BFA87DC-681D-4D7F-8FCC-6DA6F1A4E3E3</GUID>
<Version>RF-GSM-1.40</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IGsmTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<GSMBand>GSM850;</GSMBand>
</Testcase>
<Testcase Name="TestcaseGsmErrorRateSamplesCs_14_2_4">
<Name>GSM 14.2.4 - Reference sensitivity - FACCH/H</Name>
<Description>Reference sensitivity - FACCH/H</Description>
<GUID>59FA4746-8FE5-47C3-B9F6-C9F79988D048</GUID>
<Version>RF-GSM-1.40</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.IGsmTestcase</Type>
<EnvironmentalCondition>NN;</EnvironmentalCondition>
<GSMBand>E-GSM900;</GSMBand>
</Testcase>
<Testcase Name="TestcaseLbsPerformance3gppCplane_7_1_1">
<Name>LTE A-GNSS 7.1.1 - Sensitivity Coarse time assistance (C-Plane, GPS)</Name>
<Description>Sensitivity Coarse time assistance (C-Plane, GPS)</Description>
<GUID>31B6F77B-EC87-462F-8598-0E5413730C75</GUID>
<Version>RF-LBS-1.30</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>LL;</EnvironmentalCondition>
<LTEBand>FDD1;</LTEBand>
<Bandwidth>5;</Bandwidth>
<TestFrequency>Low;</TestFrequency>
</Testcase>
<Testcase Name="TestcaseLbsPerformance3gppSupl_7_1_2">
<Name>LTE A-GNSS 7.1.2 - Sensitivity Fine Time Assistance (SUPL, GPS)</Name>
<Description>Sensitivity Fine Time Assistance (SUPL, GPS)</Description>
<GUID>0BF1FBE3-A588-4D8B-9486-500220490CB7</GUID>
<Version>RF-LBS-1.30</Version>
<ReleaseDate>19.06.2012</ReleaseDate>
<Type>RohdeSchwarz.Contest.CommonInterfacesForPlatformAndGUI.ILteTestcase</Type>
<EnvironmentalCondition>LL;</EnvironmentalCondition>
<LTEBand>FDD4;</LTEBand>
<Bandwidth>5;</Bandwidth>
<TestFrequency>Mid;</TestFrequency>
</Testcase>
</Testcases>
</TestcaseInformation>

Configured DUTs
The .xml file generated by the --configuredduts option is structured as follows:
<DUTs>
<DUT Name="DutName#1AsConfiguredWithinContestGUI"></DUT>
<DUT Name="DutName#2AsConfiguredWithinContestGUI"></DUT>
</DUTs>

The DUT name references a folder with the identical name, holding all configuration
data of the relevant DUT. The folder can be found in
c:\ProgramData\Rohde-Schwarz\Contest\Common\DevicesUnderTest

4.3 Jenkins Integration


Required additional license: none

4.3.1 Introduction

This chapter describes how to integrate R&S CONTEST into Jenkins in order to run
tests fully unattended. Because the Jenkins integration uses the Command Line Inter-
face to initiate the tests in R&S CONTEST without its GUI, everything that can be
accomplished with the Command Line Interface can be automated.
Before any automation using Jenkins, the following configurations must be done by
means of the R&S CONTEST GUI:
● Test system configuration, if changes on the hardware apply
● Parametrization, cabling and activation of the DUT
● Test system identity, to be updated once
● Automatic transfer of reports to a central report server (if applicable)

For more information about the Command Line Interface, refer to Chapter 4.2, "Com-
mand Line Interface", on page 60.

4.3.2 Jenkins Reporting

To serve the Jenkins reporting, a dedicated R&S CONTEST reporting plugin is
implemented. It is available with every standard installation and is activated by
means of the following environment variables in a Jenkins Build Step.

JenkinsSummaryReportName=<filename>
    An HTML file is written to the given location, providing a link to the SummaryReportsOverview.xml
    stored on the system controller. Using Microsoft Internet Explorer, an automated forwarding is
    possible; other browsers are more restrictive. An Online Report is not available if R&S CONTEST
    is started via the command line interface.

JenkinsReportName=<filename>
    An XML file compliant with the JUnit XML format. It is updated after each test case run. From this
    file, Jenkins is able to analyze the test case verdicts and to present the status as shown in
    Figure 4-4.

Figure 4-4: Example of the Test Result Trend shown in Jenkins

4.3.3 Configuring Jenkins

The following lines show an example of the different steps within Jenkins that are
used to execute a test plan. Please adapt them to your infrastructure.

Build Step
set JenkinsReportName="%WORKSPACE%\JUnitReport.xml"
set JenkinsSummaryReportName="%WORKSPACE%\SummaryReportsOverview.html"
cd "C:\Program Files (x86)\Rohde-Schwarz\Contest\11\GUI\Bin"

User Manual 1175.6026.02 ─ 02 76


R&S® CONTEST Remote Control and Test Automation
Jenkins Integration

RohdeSchwarz.Contest.CommandLineInterface.exe -tp "%WORKSPACE%\%TESTPLAN%"


IF errorlevel 1 IF NOT errorlevel 2 (
    ECHO PASSED with restrictions
    EXIT /B 0
)

Post Build Action


"Archive the Artifacts": JUnitReport.xml, SummaryReportsOverview.html.

Post Build Step


"Publish xUnit test result report" with "JUnit Pattern": JUnitReport.xml.

5 DUT Automation

5.1 Custom DUT Remote Control Plugins


Required additional license: none

5.1.1 Introduction

In the context of DUT automation, R&S CONTEST offers an interface for remote con-
trol plugins that can be used to automate certain actions of the DUT. Along with the
diversity and the constant technological advancement of DUTs, the requirements of
remote control plugins change rapidly. Therefore — rather than providing actual plu-
gins — an API has been created, enabling third party developers to easily create their
own custom DUT remote control plugins, tailored to their specific needs.
Custom DUT remote control plugins can be programmed in any .Net programming lan-
guage and must be compiled for the target framework .Net Framework 4.

5.1.2 Interface Description

In addition to the following API documentation, the definition of the interface can be
found in RohdeSchwarz.Contest.DutRemoteControlPlugin.dll, located in
c:\Program Files (x86)\Rohde-Schwarz\Contest\
<ContestMajorVersionNumber>\Base\<ContestBaseVersionNumber>\.

In this path, <ContestMajorVersionNumber> stands for the actual major version
number of R&S CONTEST (e.g. 14). <ContestBaseVersionNumber> stands for the
actual version number of R&S CONTEST-Base (e.g. 14.10).
Example:
c:\Program Files (x86)\Rohde-Schwarz\Contest\14\Base\14.10\
RohdeSchwarz.Contest.DutRemoteControlPlugin.dll

Namespace: RohdeSchwarz.Contest.Contracts
Assembly: RohdeSchwarz.Contest.DutRemoteControlPlugin.dll

5.1.2.1 Syntax

In C#
public interface IDutRemoteControlPlugin : IDisposable

5.1.2.2 Properties

String Description { get; }
    Gets the description of the plugin to be presented in R&S CONTEST GUI or in test reports.

String Resource { get; set; }
    Gets or sets the resource of the DUT interface formatted in VISA resource name syntax.

String ErrorMessage { get; }
    Gets the error message which is set in Read() and Write() upon an error.

5.1.2.3 Methods

String Read(String command);
    Parameter: command - the command to be executed
    Description: Read access to the device to receive the response upon the specified command.
    Sets the property ErrorMessage if an error occurs.
    Return value: the read response of the DUT

void Write(String command);
    Parameter: command - the command to be executed
    Description: Write access to the device to set the specified command.
    Sets the property ErrorMessage if an error occurs.
    Return value: none
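
As an illustration of how these properties and methods interact, the following minimal sketch
shows what an implementation could look like for a DUT that is controlled via a raw TCP/IP socket.
The class name, the newline-terminated text protocol and the simple resource parsing are
assumptions for this sketch only; the example solution shipped with R&S CONTEST (see
Chapter 5.1.4) should be used as the actual starting point.

using System;
using System.Net.Sockets;
using System.Text;
using RohdeSchwarz.Contest.Contracts;

// Minimal sketch of a custom DUT remote control plugin (illustration only, not the
// shipped example solution). It assumes a DUT that accepts newline-terminated text
// commands on a raw TCP socket and that Resource is set to
// "TCPIP::<IP address of DUT>::<port>::SOCKET".
public class ExampleTcpDutRemoteControlPlugin : IDutRemoteControlPlugin
{
    private TcpClient client;
    private string errorMessage;

    public string Description
    {
        get { return "Example TCP/IP DUT remote control plugin (sketch)"; }
    }

    public string Resource { get; set; }

    public string ErrorMessage
    {
        get { return errorMessage; }
    }

    public string Read(string command)
    {
        try
        {
            NetworkStream stream = Connect();
            byte[] request = Encoding.ASCII.GetBytes(command + "\n");
            stream.Write(request, 0, request.Length);

            byte[] buffer = new byte[4096];
            int count = stream.Read(buffer, 0, buffer.Length);
            return Encoding.ASCII.GetString(buffer, 0, count).TrimEnd('\r', '\n');
        }
        catch (Exception ex)
        {
            errorMessage = ex.Message;
            return null;
        }
    }

    public void Write(string command)
    {
        try
        {
            NetworkStream stream = Connect();
            byte[] request = Encoding.ASCII.GetBytes(command + "\n");
            stream.Write(request, 0, request.Length);
        }
        catch (Exception ex)
        {
            errorMessage = ex.Message;
        }
    }

    public void Dispose()
    {
        if (client != null)
        {
            client.Close();
            client = null;
        }
    }

    // Parses a resource such as "TCPIP::192.168.52.20::5025::SOCKET" and connects once.
    private NetworkStream Connect()
    {
        if (client == null || !client.Connected)
        {
            string[] parts = Resource.Split(new[] { "::" }, StringSplitOptions.None);
            client = new TcpClient(parts[1], int.Parse(parts[2]));
        }
        return client.GetStream();
    }
}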

5.1.3 Configuring R&S CONTEST

In order to make the assembly of a custom DUT remote control plugin available to R&S
CONTEST, it must be copied to the following location:
c:\Program Files (x86)\Rohde-Schwarz\Contest\
<ContestMajorVersionNumber>\DutPlugins\.

In this path, <ContestMajorVersionNumber> stands for the actual major version
number of R&S CONTEST (e.g. 14).

The plugin can then be activated by means of the R&S CONTEST GUI. During the
configuration, the following parameters must be set:
Table 5-1: Parameters for Custom DUT Remote Control Plugins

Parameter Description

Enable Plugin Mode Enable / Disable

Plugin Resource Identification of the connection to the DUT.


If connected via TCP/IP:
TCPIP::<IP address of DUT>::<port>::SOCKET
Example: TCPIP::192.168.52.20::0::SOCKET
If connected via serial link (COM):
ASRL<port>[::INSTR]
Example: ASRL1

Plugin File-Path The file path of the plugin assembly (.dll)


This can be edited directly or chosen via "Browse".

Plugin Description Read only field of the description provided by the loaded plugin.

In order to configure R&S CONTEST to use a custom DUT remote control plugin, pro-
ceed as follows:
1. In R&S CONTEST, select "Test-Environment Configuration" → "DUT Configura-
tion".
The "DUT Configuration" dialog appears.

2. Select the DUT to be remotely controlled from the "Available DUTs" list.

3. Select the "Automation" tab of the selected DUT.

4. Scroll down to the "Remote Control Plugin" box.

Figure 5-1: Screenshot of the Remote Control Plugin box.

5. Activate the checkbox "Enable Plugin Mode".


The parameter fields become editable.

6. Enter the required information in the respective fields.


For more information refer to Table 5-1.

7. Click the "Save" button, then the "OK" button at the bottom of the dialog.
The specified DUT remote control plugin will from now on be used for the selected
DUT.

5.1.4 Example Visual Studio Solution

In order to facilitate the development of a custom DUT remote control plugin, an exem-
plary Microsoft Visual Studio 2013 solution is provided with the standard R&S CON-
TEST installation. The solution is located in
c:\Program Files (x86)\Rohde-Schwarz\Contest\
<ContestMajorVersionNumber>\DutPlugins\DutRemoteControlPlugin\
DutRemoteControlPluginExample.sln. The sample code is written in C#.

In this path, <ContestMajorVersionNumber> stands for the actual major version
number of R&S CONTEST (e.g. 14).

5.2 DUT Automation Applications


Required additional license: none

5.2.1 Introduction

This chapter describes the integration of external applications for DUT automation or
execution of customized tasks.
In order to provide an interface for DUT automation, the "Automation Manager Mode"
has been implemented in R&S CONTEST. This interface was originally developed for
the communication between R&S CONTEST and R&S Automation Manager (Option
R&S CMW-KT014) but can also be used by third party DUT automation applications.
R&S CONTEST can thus interact with any DUT automation application via TCP/IP
sockets. Both IPv4 and IPv6 addresses are supported. The desired R&S CONTEST
automation commands need to be interpreted by the DUT automation application and
mapped to the respective DUT-specific commands as illustrated by Figure 5-3 for
instance.

5.2.2 Interface Description

The following figure illustrates the interfaces between R&S CONTEST, a DUT automa-
tion application and the DUT.

Figure 5-2: Illustration of the interfaces between R&S CONTEST, a DUT automation application and
the DUT

The DUT automation application, acting as the server process, needs to bind itself to a
socket and provide unambiguous access via a well-known port number. Therefore, the DUT
automation application must be started before the execution of the test applications. The
R&S CONTEST test application connects to this socket each time an automation command
is sent to the DUT automation application. The DUT automation application then has to
connect to an interface provided by the DUT (e.g. a serial COM port).
The interface to the DUT depends on the device manufacturer and is out of scope for
this document. DUT automation applications might need to send AT commands or
binary messages or even start and stop DUT-specific applications to control the DUT.

Risk of Software Malfunction


If the DUT automation application is installed together with DUT drivers on the system
controller, driver conflicts may prevent the proper control of the instruments in the test
system. Therefore it is not recommended to install a DUT automation application
together with DUT drivers on the system controller running R&S CONTEST.

5.2.3 Automation Commands

R&S CONTEST supports the following automation commands:


● DUT_SWITCH_ON
● DUT_SWITCH_OFF
● RESET
In addition to these standard automation commands, some test applications such as
PQA or ITS support further command strings. For more information refer to the docu-
mentation of the respective test application.
Each time the positive response message "OK" is received, the test application disconnects
the socket. In the case of a test application-specific command string, the response message
could also be a specific DUT response. There is an overall timeout for the reception of the
response message, which can be configured in the "DUT Configuration" dialog. If the timeout
passes without a correct response message being received, the test application is aborted.
In this case, an error message is added to the test report.

The timeout for the reception of the response message can be set in "DUT Configura-
tion" → "Automation" tab → "Common" tab of the desired DUT. In the "Registration"
box, adjust the parameter "Maximum Registration Time" to your needs.

The following figures illustrate exemplary message sequences with the respective
automation commands.

DUT_SWITCH_ON Command

Figure 5-3: Illustration of a message sequence using a DUT_SWITCH_ON command

DUT_SWITCH_OFF Command

Figure 5-4: Illustration of a message sequence using a DUT_SWITCH_OFF command

RESET Command

Figure 5-5: Illustration of a message sequence using a RESET command

Test Application-Specific Command Strings

Figure 5-6: Illustration of a message sequence using a test application-specific command
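
The following minimal sketch (an illustration only, not part of R&S CONTEST or the R&S
Automation Manager) shows how a DUT automation application could serve the message sequences
illustrated above: it binds to a socket, accepts one connection per automation command, maps
the command to a DUT-specific action and acknowledges it with "OK". The port number and the
command handling are assumptions.

using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Minimal sketch of a DUT automation application (illustration only).
// R&S CONTEST connects for every automation command, sends the command string
// and expects the response "OK" (or a DUT-specific response) before it disconnects.
class DutAutomationServer
{
    static void Main()
    {
        var listener = new TcpListener(IPAddress.Any, 4754); // port is an assumption
        listener.Start();
        Console.WriteLine("Waiting for automation commands on port 4754 ...");

        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            using (NetworkStream stream = client.GetStream())
            {
                byte[] buffer = new byte[1024];
                int count = stream.Read(buffer, 0, buffer.Length);
                string command = Encoding.ASCII.GetString(buffer, 0, count).Trim();
                Console.WriteLine("Received: " + command);

                switch (command)
                {
                    case "DUT_SWITCH_ON":
                        // TODO: switch the DUT on, e.g. via AT commands or a vendor tool
                        break;
                    case "DUT_SWITCH_OFF":
                        // TODO: switch the DUT off
                        break;
                    case "RESET":
                        // TODO: reset the DUT
                        break;
                    default:
                        // test application-specific command strings could be handled here
                        break;
                }

                byte[] reply = Encoding.ASCII.GetBytes("OK");
                stream.Write(reply, 0, reply.Length);
            }
        }
    }
}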

5.2.4 Configuring R&S CONTEST

Before DUT automation applications can be used, R&S CONTEST needs to be config-
ured accordingly by means of the R&S CONTEST GUI.

In order to configure R&S CONTEST to interact with a DUT automation application, proceed
as follows:
1. In R&S CONTEST, select "Test-Environment Configuration" → "DUT Configura-
tion".
The "DUT Configuration" dialog appears.

2. Select the DUT to be automated from the "Available DUTs" list.

3. Select the "Automation" tab of the selected DUT.

4. Scroll down to the "Automation Manager Mode" box.

Figure 5-7: Screenshot of the Automation Manager Mode box.

5. Activate the checkbox "Enable Automation Manager Mode".


The parameter fields become editable.

6. In the "Resource" field, enter the TCP/IP string to the desired DUT automation
application.

The resource string needs to be structured as follows:


TCPIP::<IPAddressOfDUTAutomationApplicationComputer>::<Port>.
Example: TCPIP::192.168.52.20::4754.

7. Click the "Save" button, then the "OK" button at the bottom of the dialog.
R&S CONTEST will now interact with the DUT automation application at the speci-
fied TCP/IP socket.

For the combination of the "Automation Manager Mode" and the automatic operation of
power supplies please refer to the documentation of the "Automation" tab in the R&S
CONTEST Help.

5.3 IP Trigger Applications


Required additional license: R&S TS8-KT155

5.3.1 Introduction

This chapter describes the integration of an external application which listens to R&S
CONTEST IP trigger events.
IP triggers allow external applications to start tasks like batch jobs or to execute tools
upon certain events that occur during a test run. For example an IP trigger can start a
job that copies DUT debug files to a dedicated location, when the end of a test case is
reached.
Because IP triggers are evaluated analogue to break conditions, they follow the same
naming as the break conditions. For more information on break conditions please refer
to "Break Conditions tab" in the R&S CONTEST user manual.

5.3.2 Interface Description

The interface between R&S CONTEST and the IP trigger application is analogous to the
interface of a DUT automation application, as described in Chapter 5.2, "DUT Automa-
tion Applications", on page 81. Such an application needs to bind itself to a socket and
provide unambiguous access via a well-known port number. The R&S CONTEST test
application connects to this socket each time an IP trigger message is sent to the appli-
cation.

The external application as server process must be started before the execution of the
test application.

The following figure illustrates the interface between R&S CONTEST and an IP trigger
application.

Figure 5-8: Illustration of the interface between R&S CONTEST and an IP trigger application

5.3.2.1 Messages

When a trigger event occurs during test execution, a message is sent from the test
application to the connected IP trigger application. The latter must then acknowledge
each received message by sending the response message "OK". If no acknowledg-
ment is received, test plan execution is continued after the specified timeout.
The following figure shows a message sequence as an example to illustrate the con-
cept.

Figure 5-9: Illustration of an exemplary message sequence between R&S CONTEST and an IP trigger
application

The following messages are sent depending on the event that triggers the message:

Event                            Supported Application Type   Message
Test plan is finished            All                          AfterTestplan
Test case starts                 All                          AtTestcaseStart
    Additional info: TestcaseName: <TestcaseName>; TestcaseReportDirectory: <TestcaseReportFolder>;
    if the test case is run in repetition mode: Testcase repetition number: <RepetitionNumber>
Test case is finished            All                          AfterTestcase
    Additional info: Number: <Testcase Number>; Verdict: <Testcase Verdict>
Internal test step starts        All                          AtTeststepStart
Internal test step is finished   All                          AfterTeststep
Measurement starts               RF                           BeforeMeasurement
Measurement is finished          RF                           AfterMeasurement
Measurement is outside           RF                           AtMeasurementOutside
Device output will be enabled    RF                           BeforeDeviceOutput
Device output is enabled         RF                           AfterDeviceOutput
Device reports an error          RF                           AtDeviceProblemIndication

Note that each time a positive response message "OK" is received or the timeout is
reached, the test application disconnects the socket.
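
Analogously to a DUT automation application, an IP trigger application only has to accept a
connection, read the trigger message and acknowledge it with "OK". The following minimal
sketch (an illustration only; the port number is an assumption and must match the settings
described in Chapter 5.3.3) simply logs every received trigger message.

using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Minimal sketch of an IP trigger application (illustration only).
// Each trigger event arrives on a separate connection; the message (e.g.
// "AtTestcaseStart TestcaseName: ...") is logged and acknowledged with "OK".
class IpTriggerListener
{
    static void Main()
    {
        var listener = new TcpListener(IPAddress.Any, 5000); // port is an assumption
        listener.Start();

        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            using (NetworkStream stream = client.GetStream())
            {
                byte[] buffer = new byte[2048];
                int count = stream.Read(buffer, 0, buffer.Length);
                string message = Encoding.ASCII.GetString(buffer, 0, count);
                Console.WriteLine(DateTime.Now + "  " + message);

                // Example: start a batch job when a test case has finished
                // if (message.StartsWith("AfterTestcase")) { ... }

                byte[] reply = Encoding.ASCII.GetBytes("OK");
                stream.Write(reply, 0, reply.Length);
            }
        }
    }
}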

5.3.3 Configuring R&S CONTEST

Before IP trigger applications can be used, R&S CONTEST needs to be configured
accordingly by means of the R&S CONTEST GUI.

In order to configure R&S CONTEST to interact with an IP trigger application, proceed
as follows:
1. In R&S CONTEST, select "Settings" → "Application Debugging".
The "Application Debugging" dialog appears.

2. Select the "IP Triggers" tab.

Figure 5-10: Screenshot of the IP Triggers tab in the Application Debugging dialog

3. Enter "Client IP-Address", "Port" and "Connection Time-Out" in the respective


fields.

GUI Element Description

"Client IP-Address" The IP address of the IP trigger application to which the message should
be sent.

"Port" The port used for the message transfer.

"Connection Time-Out" Time until the test plan execution is resumed if no acknowledgement mes-
sage "OK" is received from the IP trigger application.

4. In the "Trigger Events" box, select the events to trigger a message.

5. Click the "OK" button to save the configuration.


R&S CONTEST is now configured to interact with the specified IP trigger applica-
tion.

6 Advanced Device Configuration

6.1 User Defined Fading Profiles


Required additional license: none
Applicable to: R&S AMU200A

6.1.1 Introduction

In addition to the fading profiles specified by 3GPP, it is possible to apply user-defined
fading profiles in conformance tests and R&D applications. A user-defined fading profile
is created on the R&S AMU200A in the "Fading Settings" menu and saved in a file
directly on the instrument. The file name and the corresponding fading insertion loss
are entered as parameters in the test step parameters of the R&S CONTEST software.
The following sections describe the procedure of applying user-defined fading profiles
in conformance tests and R&D applications:

1. Creating a fading settings file on the instrument.

2. Determining the insertion loss on the instrument

3. Configuring R&S CONTEST on the system controller

Familiarity with the graphical user interface of R&S AMU200A is assumed in this chap-
ter.

6.1.2 Creating a Fading Settings File on the Instrument

The "Fading" dialog is used to configure multipath fading signals. This dialog can be
accessed by either selecting the "Fader" block or by pressing the MENU key.

Figure 6-1: Screenshot of the Fading dialog on R&S AMU200A

In order to create a fading settings file on the instrument, proceed as follows:


1. Select a fading profile to start with from the "Standard" list box or click the "Set to
Default" button to start from scratch.

2. Use the "Path Table" submenus to edit the path settings.


Note: For details, refer to the operating manual "Fading Simulator R&S SMU200A
and R&S AMU200A".
3. Click the "Save/Recall" button.

4. Select "Save Fading Settings".


Tip: It is recommended to save the file in the folder D:\Contest\ (it may be nec-
essary to call the file manager and create this folder first).
The settings file is now created and located in the specified folder.

6.1.3 Determining the Insertion Loss on the Instrument

R&S CONTEST requires the insertion loss of the fading settings to calculate the cor-
rect power compensation value for the faded signal. The insertion loss is displayed in
the "Insertion Loss Configuration" submenu of the fading dialog.

In order to read out the insertion loss for the fading settings, proceed as follows:
1. Make sure the mode of the "BB Input Block" is set to "Digital Input":
a) Select "config...".
b) Select "BB Input Settings".
c) In the "Mode" field, select "Digital Input".

2. Set the fading signal routing to A → A || B → B.

Figure 6-2: Screenshot of the Fading screen on R&S AMU200A

3. Set the fading state "On".

4. Set either the "Baseband" block or the "BB Input" block "On".

5. Open the "Fading" dialog and click the "Insertion Loss Configuration" button.
The "Fading: Insertion Loss Configuration" dialog appears.

Figure 6-3: Screenshot of the Fading: Insertion Loss Configuration dialog

6. Make sure the "Mode" field is set to "Normal".

7. Read out the "Insertion Loss" field.


This value is required for the "Fading Insertion Loss" parameter during the configu-
ration of R&S CONTEST.

6.1.4 Configuring R&S CONTEST

User-defined fading profiles can be used in conformance test cases as well as in R&D
applications. The following parameters are applicable to both cases.

Table 6-1: Parameters for user-defined fading settings

"Fading Profile"
    "UserDefinedFile"

"Fading File Name" or "UserDefinedFadingFileName"
    Name of the fading settings file on the instrument. The file extension .fad may be omitted.
    If the file was stored in the default folder D:\Contest\, the file name is sufficient.
    Otherwise, the complete path must be specified.

"Fading Insertion Loss" or "UserDefinedFadingInsertionLoss"
    Insertion loss for this fading profile.

In order to apply user-defined fading settings to conformance test cases, proceed as
follows:
1. In the test plan editor of R&S CONTEST, right click on the desired test case.
A context menu appears.

2. Select "Properties".
The "Test Case Parameters" dialog appears.

3. Select the "Test Step Parameters" tab.

4. Enter the values described in Table 6-1 in the respective fields of the desired test
step.

5. Click the "OK" button and save the test plan.


The specified fading profile will from now on be used for the selected test step(s).

In order to apply user-defined fading settings in an R&D environment, proceed as
follows:
1. In R&S CONTEST, add the R&D test case "Set Fading AWGN AMU" to the desired
test plan.

2. Enter the values described in Table 6-1 in the respective fields.

3. Save the test plan.


The specified fading profile will from now on be used.

6.2 Climate Chamber Configuration


Required additional license: R&S TS8-KT120

6.2.1 Introduction

The basic settings of climate chambers, such as the device's resource string and the
device type, can be defined in the "Test System Configuration" dialog. A more
advanced configuration can be achieved by means of an XML configuration file that
contains additional parameters.
The following sections give a detailed overview of both the basic settings and the
parameters for the advanced configuration.

6.2.2 Basic Settings

Basic settings are configured in the "Test System Configuration" dialog. In the "Hard-
ware Configuration" tab, the "Resource String" and "Device Type" can be defined.

Figure 6-4: Screenshot of the Test System Configuration dialog

6.2.2.1 Resource String

The resource string consists of the interface type and addressing information.

ASRL<n>
    Serial Interface (RS232), where <n> is the interface number, i.e. ASRL2 means COM2.

GPIB::<n>
    GPIB interface, where <n> is the primary address of the device.

TCPIP::<ip-address>
    TCP/IP interface with VXI-11 protocol, where <ip-address> is the IPv4 address of the device,
    e.g. TCPIP::192.168.0.35

TCPIP::<ip-address>::<port>::SOCKET
    TCP/IP socket interface, where <ip-address> is the IPv4 address of the device and <port> is
    the port number used for the socket communication, e.g. TCPIP::192.168.0.121::2049::SOCKET

Please refer to the manual of the climate chamber to determine the correct interface
type and addressing information.

6.2.2.2 Device Type

The "Device Type" field is only editable for the device "ClimateChamber". This field is
used to select the appropriate device driver for the chamber. Drivers are available for
the following climate chamber models:

Device Type Supported climate chamber models

ClimateChamberEspec Espec SH-241

ClimateChamberTestEquity TestEquity 105A

ClimateChamberThermotron2800 Thermotron chambers with the 2800 controller

ClimateChamberThermotron3800 Thermotron chambers with the 3800 controller

ClimateChamberVoetsch Vötsch VT4002 and similar chambers

6.2.3 Advanced Settings

Advanced settings for climate chambers are defined in the XML configuration file
ClimateChamberConfigurationParameters.xml. The file is located in:
c:\ProgramData\Rohde-Schwarz\Contest\<CONTEST-Major-Version>\
ConfigurationParameters\

Please note that in this path, <CONTEST-Major-Version> is a placeholder for the
R&S CONTEST version number (e.g. 14).

Four sample configuration files are shipped with the R&S CONTEST-Base installation:
● ClimateChamberSuppressResetConfigurationParameters.xml
Using this configuration file, the climate chamber will not be switched off after the
test plan execution.
● ClimateChamberVoetschVT4002ConfigurationParameters.xml
Contains the (default) parameters for the model Vötsch VT4002. The driver uses
the same values if the configuration file is not present.
● ClimateChamberVoetschVTL4003ConfigurationParameters.xml
Contains the parameters for the model Vötsch VTL4003.
● ClimateChamberTestEquityConfigurationParameters.xml
Contains the parameters for the model Test Equity 105A.
These sample configuration files are located in:
c:\ProgramData\Rohde-Schwarz\Contest\<CONTEST-Major-Version>\
ConfigurationParameters\Templates\
The following parameters are applicable to all climate chambers:

Table 6-2: Advanced Climate Chamber Parameters

Parameter Description

SuppressReset If enabled the climate chamber will not be reset at test plan start and
finish.
Default value: false

6.2.3.1 Activating a Configuration File

In order to use an advanced climate chamber configuration file, proceed as follows:


1. In R&S CONTEST, open the "Test System Configuration" dialog.

2. Right click on the row of the climate chamber.


A context menu appears.

3. In the context menu, select "Create", then the desired configuration parameter file.
The paramters defined in the selected configuration file are copied to the climate
chamber configuration file (
ClimateChamberConfigurationParameters.xml) and will from now on be
used.

6.2.3.2 Editing a Configuration File

In order to edit an activated configuration file, proceed as follows:


1. In R&S CONTEST, open the "Test System Configuration" dialog.

2. Right click on the row of the climate chamber.


A context menu appears.

3. In the context menu, select "Edit".


The active configuration file
(ClimateChamberConfigurationParameters.xml) opens in the default XML
editor.

4. Make the desired changes, save, and close the file.


The changes become effective and will from now on be used.

6.2.3.3 Deactivating a Configuration File

In order to deactivate an advanced climate chamber configuration file, proceed as
follows:
1. In R&S CONTEST, open the "Test System Configuration" dialog.

2. Right click on the row of the climate chamber.

A context menu appears.

3. In the context menu, select "Remove".


The configuration file (ClimateChamberConfigurationParameters.xml) will
be deleted. From now on, only the basic settings will be used.

6.2.3.4 Vötsch Climate Chambers

The remote interface of Vötsch climate chambers requires additional configuration


parameters that can not all be shown in the user interface of the "Test System Configu-
ration" dialog. The command strings for the chamber may be slightly different for differ-
ent models and also depend on the installed hardware options of the climate chamber.

If the configuration file does not exist or if a parameter entry is not present in the file,
the driver uses the default value described in Table 6-3.

The following table lists the settings that cannot be configured within R&S CONTEST
and have to be set in the control panel of the chamber:

Parameter Value

Interface protocol ASCII-2

Operating Mode external

Baud Rate 9600 baud

Please refer to the operating manual of the chamber model before making changes to
the parameters.

In order to activate and edit advanced configuration parameters for Vötsch climate
chambers, follow the steps described in Chapter 6.2.3.1, "Activating a Configuration
File", on page 98 and Chapter 6.2.3.2, "Editing a Configuration File", on page 98.
In addition to the parameters described in Table 6-2, the following parameters are
applicable to Vötsch climate chambers:

The stated default values are those of the model Vötsch VT4002.

Table 6-3: Advanced Parameters for Vötsch Climate Chambers

Parameter Description

DeviceAddress The device address of the chamber, in the manuals also referred to as
"bus address". Note that leading zeroes must be specified.
Permissible values: 01 - 32.
Default value: 01

SetValuesCommand The command string pattern for the "set values" command $01E. The
command will be prefixed by a dollar sign and the DeviceAddress
parameter. Parameters in curly braces are dynamically replaced by
the driver with the specified format.
Parameter {0:0000.0} is the target temperature in the format 4.1
digits, e.g. a target temperature of 23.5 °C will be formatted as
0023.5.
Parameter {1:0000.0} is the target humidity in percent. If the cham-
ber does not support humidity control, this value is taken from the
reply to the $01I command. A target humidity of 50 % will be format-
ted as 0050.0.
Parameter {2:0000.0} is the target fan speed in percent. This value
is taken from the reply to the $01I command. A fan speed of 80% is
formatted as 0080.0.
Parameter {3} consists of the first 16 bits of the bit mask for the digi-
tal channels. It is replaced by either the BitmaskEnabled or
BitmaskDisabled parameter, depending on the climate chamber
state. The last 16 bits are set to zero.
Default value: E {0:0000.0} {1:0000.0} {2:0000.0} 0000.0
0000.0 0000.0 0000.0 {3}0000000000000000
Example: To use a fixed fan speed of 75% modify this entry as fol-
lows: E {0:0000.0} {1:0000.0} 0075.0 0000.0 0000.0
0000.0 0000.0 {3}0000000000000000

BitmaskEnabled The first 16 bits of the bit mask used to enable the climate chamber.
This string is inserted as parameter {3} into SetValuesCommand.
Default value: 0100000000000000

BitmaskDisabled The first 16 bits of the bit mask used to disable the climate chamber.
This string is inserted as parameter {3} into SetValuesCommand.
Default value: 0000000000000000

ModelName The model name entry is merely used for informational purpose. It will
appear in the R&S CONTEST reports and has the format <Ven-
dor>,<Model Name>.
Default value: Voetsch,VT4002
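
The placeholder syntax used in SetValuesCommand corresponds to .NET composite formatting.
The following minimal sketch (an illustration with hypothetical values, not part of the
driver) shows how the default pattern expands before the driver prefixes it with the dollar
sign and the DeviceAddress.

using System;
using System.Globalization;

// Illustration only: expanding the default SetValuesCommand pattern with example values.
class SetValuesCommandDemo
{
    static void Main()
    {
        string pattern = "E {0:0000.0} {1:0000.0} {2:0000.0} 0000.0 0000.0 0000.0 0000.0 {3}0000000000000000";
        double targetTemperature = 23.5;   // °C (hypothetical value)
        double targetHumidity = 50.0;      // %  (hypothetical value)
        double fanSpeed = 80.0;            // %  (hypothetical value)
        string bitmaskEnabled = "0100000000000000";

        // "$" + DeviceAddress (e.g. 01) is prefixed by the driver; shown here for illustration.
        string command = "$01" + string.Format(CultureInfo.InvariantCulture, pattern,
            targetTemperature, targetHumidity, fanSpeed, bitmaskEnabled);

        Console.WriteLine(command);
        // $01E 0023.5 0050.0 0080.0 0000.0 0000.0 0000.0 0000.0 01000000000000000000000000000000
    }
}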

6.2.3.5 Test Equity 105A Climate Chamber

The Test Equity 105A climate chamber is controlled by a set of registers that R&S
CONTEST reads and writes.

Example:
1 3 0 100 0 1 197 213
or
1 3 2 0 74 57 179

These registers can be split into functional entities:
[1] [3] [0 100] [0 1] [197 213]: At the controller address 1 ([1]), R&S CONTEST reads
([3]) the register 100 ([0 100]), that is, exactly one register ([0 1]). Additionally, the
message includes a 2-byte checksum ([197 213]).
[1] [3] [2] [0 74] [57 179]: At the controller address 1 ([1]), the climate chamber sends,
using the ReadCommand ([3]), two bytes ([2]) that contain the temperature information
([0 74]). Additionally, the message includes a 2-byte checksum ([57 179]). This exem-
plary temperature ([0 74]) translates to 7.4 °C as follows: (0 x 256 + 74 x 1) / 10 =
7.4 °C.
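
Expressed as code, the temperature decoding from the two data bytes looks as follows (a minimal
sketch using the byte values from the exemplary reply above; variable names are assumptions):

using System;

// Illustration only: decoding the temperature from the exemplary reply "1 3 2 0 74 57 179".
class TemperatureDecodeDemo
{
    static void Main()
    {
        byte[] reply = { 1, 3, 2, 0, 74, 57, 179 }; // address, command, byte count, data hi, data lo, checksum
        double temperatureCelsius = (reply[3] * 256 + reply[4]) / 10.0;
        Console.WriteLine(temperatureCelsius + " °C"); // 7.4 °C
    }
}
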
Some versions of the Test Equity 105A climate chamber use 3 as ReadCommand,
while others use 4. Which ReadCommand is to be used can be set in the advanced
configuration parameters file. In order to determine the adequate ReadCommand,
please refer to the manufacturer of the climate chamber.

If the configuration file does not exist or if a parameter entry is not present in the file,
the driver uses the default value described in Table 6-4.

In order to activate and edit advanced configuration parameters for the Test Equity
105A climate chamber, follow the steps described in Chapter 6.2.3.1, "Activating a
Configuration File", on page 98 and Chapter 6.2.3.2, "Editing a Configuration File",
on page 98.
In addition to the parameters described in Table 6-2, the following parameters are
applicable to the Test Equity 105A climate chamber:
Table 6-4: Advanced Parameters for the Test Equity 105A Climate Chamber

Parameter Description

ReadCommand Defines which ReadCommand is used for the communication between R&S CONTEST
            and the climate chamber.
            Permissible values: 3 or 4
            Default value: 4
