QTM User Manual
Version: 2025.1
www.qualisys.com
Manufactured by:
Qualisys AB
Kvarnbergsgatan 2
411 05 Göteborg
Sweden
Applications 915
Analysis Modules 915
PAF module installation 916
Downloading installation files 916
New installation 916
Upgrading an existing installation 919
PAF Project view 921
Calqulus 923
Qualisys Cloud ecosystem 924
Index 1049
Important information
Intended use
Qualisys cameras are high-performance motion capture cameras intended to
be used within optical motion capture systems that capture three-dimensional
trajectories of reflective markers or objects. Software is available to calculate
derived measures based on this data.
If the motion capture system is intended for clinical use, characterized by performing measurements on patients to benefit the clinical assessment of that individual, Qualisys recommends using the QCS (Qualisys Clinical System) setup. If you would like to make a transition to the QCS, please contact [email protected].
Safety notices
IR radiation notice
The Qualisys camera uses short but quite strong infrared flashes to illuminate
the markers. The flash is generated by LEDs on the front of the camera. The
Qualisys cameras belong to the exempt group according to IEC / SS-EN 62471,
which means that the LED radiation is not considered to be hazardous.
However, any light of high intensity might be harmful to your eyes. Because
infrared light is invisible to the human eye, you can be exposed to IR light
without noticing. Therefore we recommend that you do not stare directly at the
LEDs at a short distance for a prolonged time period when the camera is run-
ning.
User safety
Installation risks
- Secure cameras placed at heights over 3 meters, for example using a Kensington lock.
- Use high-quality tripod heads (ask for advice from Qualisys AB if needed).
- Make sure Windows Defender is activated and up-to-date. If your IT department requires third party firewalls or antivirus software, contact Qualisys support at [email protected] to make sure they will not interfere with the operation of the Qualisys system.
- To minimize the risk of external threats to the computer, it is recommended not to check email or receive messages through other channels on the QTM computer.
- Be sure to regularly, preferably every day, back up data/files to a server or an external hard drive.
- Software updates are available in the client login area of our website (https://www.qualisys.com).
EU customer information
Waste Electrical and Electronic Equipment (WEEE)
In the European Union (EU), waste from electrical and electronic equipment
(WEEE) is now subject to regulation designed to prevent the disposal of such
waste and to encourage prior treatment measures to minimize the amount of
waste ultimately disposed. In particular, the EU WEEE Directive 2002/96/EC
requires that producers of electronic equipment be responsible for the col-
lection, reuse, recycling and treatment of WEEE which the producer places on
the EU market after August 13, 2005. Qualisys is providing the following col-
lection process to comply with the WEEE Directive.
Qualisys WEEE Collection Process
If you have purchased Qualisys products in the EU on and after August 13,
2005, and are intending to discard these products at the end of their useful life,
please do not dispose of them in a landfill or with household or municipal
waste. Qualisys has labeled its electronic products with the WEEE label to alert
our customers that products bearing this label should not be disposed with
waste in the EU. Instead, Qualisys requests you to return those products using
the instructions provided here, so that the products can be collected, dis-
mantled for reuse and recycled, and properly disposed.
Qualisys will take back WEEE, i.e. all of the electrical equipment which is part of
Qualisys equipment, from its customers within the EU. Please visit the website
www.qualisys.com/weee or contact Qualisys AB at [email protected] for
information on how to return your WEEE.
Hazardous substances declaration
In accordance with the requirements of the Chinese electronic industry standard SJ/T 11364-2006, this document provides the hazardous substances declaration for the Oqus, Miqus and Arqus series manufactured by Qualisys AB.
Toxic or hazardous substances and elements: lead (Pb), mercury (Hg), cadmium (Cd), hexavalent chromium (Cr(VI)), polybrominated biphenyls (PBB) and polybrominated diphenyl ethers (PBDE).

Part name                    Pb   Hg   Cd   Cr(VI)   PBB   PBDE
Printed circuit assemblies   X    O    O    O        O     O
Display                      X    O    O    O        O     O
Buttons                      O    O    O    O        O     O
Internal wiring              O    O    O    O        O     O
Enclosure                    O    O    O    O        O     O
Lens                         X    O    O    O        O     O
External cables and ports    O    O    O    O        O     O
AC/DC power supply           O    O    O    O        O     O
Paper manual                 O    O    O    O        O     O
CD manual                    O    O    O    O        O     O

O: Indicates that the content of this toxic or hazardous substance in all homogeneous materials of the part is below the limit specified in SJ/T 11363-2006.
X: Indicates that the content of this toxic or hazardous substance in at least one homogeneous material of the part exceeds the limit specified in SJ/T 11363-2006.
System requirements
2. A calibration kit.
NOTE: If a problem with the graphic board is detected, QTM will start a wizard with fixes for the graphic board. Follow the instructions in the wizard to fix the problem.
e. A vertical resolution of at least 900 pixels is required to make sure that all of the camera settings are visible in the 2D view window.
f. There must be an Ethernet card that supports 1000Base-T.
g. For the built-in help to work you need a web browser that can
handle basic HTML5.
WARNING: QTM does not support the use of network drives or syn-
chronized drives, such as OneDrive. The use of such storage facilities may
lead to corrupted data files.
External devices
The following additional equipment can be used with Qualisys systems. For
Arqus or Miqus systems a Camera Sync Unit is required for synchronizing or
triggering external equipment. Alternatively, for Oqus systems without a Cam-
era Sync Unit a trigger/sync splitter cable or an Oqus Sync Unit connected to
the control port of one of the cameras can be used.
- Analog interfaces (Measurement Computing), which allow capture of up to 64 channels of analog data (e.g. force plate data, EMG sensor data or other user specific analog data).
- Blackmagic Intensity Pro and Decklink Mini Recorder cards for video capture, DirectShow compatible DV cameras or standard USB web cameras, with which QTM can record video sequences for documentation.
Hardware compatibility and version requirements
Cameras
5. For AM6500 and AM6800 amplifiers, the minimum required firmware ver-
sion is 1130 (shown as "46A" on the AM6800 LED display).
Other hardware
3. Requires a computer with a PCI Express slot and that the graphic board can
handle hardware acceleration. Contact Qualisys AB to make sure that it
works on your computer.
4. Requires Blackmagic drivers 9.7.2 or later and a computer with a PCI Express
slot and that the graphic board can handle hardware acceleration. Contact
Qualisys AB to make sure that it works on your computer.
Getting started
Qualisys offers “marker cameras” for marker detection and “video cameras”
for synchronized video recording. The camera system can be calibrated
through various methods, primarily using wand calibration with a “Qualisys
calibration kit.” To synchronize with external devices, a “Qualisys Camera
Sync Unit” must be included in the system.
For detailed information about setting up a Qualisys motion capture system,
please refer to the System setup chapter.
Qualisys software
QTM Connect
Real-time QTM clients for specific external programs, such as MATLAB, LabVIEW, MotionBuilder, etc.
Analysis modules
Predefined applications based on the Project Automation Framework
(PAF) in QTM, see chapter Applications.
QCloud
Online resources, including online processing and the web report center,
see chapter Applications.
Developers' resources
SDKs for building real-time clients for QTM, QTM scripting resources, and
Open Project Automation Framework (OpenPAF) resources, available via
the Qualisys GitHub page at https://github.com/qualisys.
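As an illustration of the kind of real-time client these SDKs enable, the following sketch uses the Qualisys Python SDK to connect to QTM and print streamed 3D marker data. This is a minimal, hedged example: it assumes QTM is running on the local computer with real-time output enabled and that the Python SDK from the Qualisys GitHub page is installed (the module is named qtm in older SDK versions and qtm_rt in newer ones); adjust the host address and requested components to your setup.

import asyncio
import qtm  # Qualisys Python SDK; in newer SDK versions: import qtm_rt as qtm

async def main():
    # Connect to QTM on the local computer (replace with the IP of the QTM computer if remote)
    connection = await qtm.connect("127.0.0.1")
    if connection is None:
        print("Failed to connect to QTM")
        return

    def on_packet(packet):
        # Each packet corresponds to one frame of real-time data
        header, markers = packet.get_3d_markers()
        print("Frame %d: %d markers" % (packet.framenumber, len(markers)))

    # Stream 3D marker data; other components (e.g. "6d", "analog") can be requested too
    await connection.stream_frames(components=["3d"], on_packet=on_packet)
    await asyncio.sleep(5)  # stream for a few seconds
    await connection.stream_frames_stop()
    connection.disconnect()

asyncio.run(main())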
The online resources for Qualisys users are available via the user dashboard at
https://www.qualisys.com/my/. Additional resources are available via
https://www.qualisys.com/downloads/.
For an overview of help and training resources, see chapter "Training
resources" below.
Qualisys user account
To access the Qualisys user dashboard, you need a Qualisys user account. To
create an account associated with your QTM license, follow the instructions
below:
1. Navigate in your browser to Qualisys.com, and hover over the lock icon to
log in or to sign up for an account.
2. When you log in for the first time, enter your QTM username and license
key so that you can access the relevant content. If you are a lab manager,
selecting the checkbox will let you add team members and customize the
online report center for your lab.
If you are new to a lab that already uses Qualisys, ask your lab manager to create a new account for you.
Training resources
Besides the QTM manual the following training resources are available.
QAcademy
QAcademy is the official Qualisys online training library, including video
tutorials, courses and guides covering a wide range of topics from basic
camera setup to advanced data processing. Most video tutorials also
include a written manual. A selection of basic tutorials is publicly avail-
able. For access to all QAcademy resources, an active support contract is
required. QAcademy can be accessed online via https://www.qualisys.com/my/qacademy/#!/.
Documentation
The installation of QTM contains a Documentation folder, including several
PDF documents, for example, the latest Getting Started guide, a keyboard
shortcut reference, and marker set guides for sports and animation. Docu-
mentation about the real time protocol of QTM is available in the
RT Protocol folder or can be accessed online via https://docs.qualisys.com/qtm-rt-protocol/.
Software installation
Make sure you are logged in with an administrator account before you start
installing QTM. To install the software, insert the USB installation stick, and locate and execute the QTM installer (QTM_yyyy_x_Build_xxxx_Setup_xxxxxxxx.exe). You can also download the QTM installer via your registered client account at http://www.qualisys.com/my/.
Follow the instructions given during the installation. In the installer you can
select the languages for the menus and dialogs in QTM. There are three avail-
able languages: English (default), Chinese and Japanese.
During the installation you can select the components that you want to include.
The following components can be selected:
- Instacal (A/D board driver).
Enter the user name and the license id that you have received from Qualisys AB, see chapter "QTM registration" on the next page.
If there is an internet connection, QTM will automatically check for updates
when it is started. You can also use the Check for updates option on the Help
menu. You can also find the latest software updates by logging in with your
registered client account at http://www.qualisys.com/my/.
QTM registration
The first time QTM is started you must enter a user name and a license key. These are provided on the front cover of the QTM installation USB.
NOTE: If the license is time limited you must check the Time limited
checkbox and enter the correct expiration date.
Once you have registered QTM you can proceed to create a project, see chapter
"Starting QTM" on page 58.
Adding licenses
For some analysis modules or plug-ins a license request will appear when you
start QTM. In those cases just enter the user name and license key in the dia-
log. However other plug-ins must be installed after QTM has started, e.g. the
MotionBuilder plug-in. To enter a plug-in license in QTM click on About
Qualisys Track Manager in the Help menu.
In the About Qualisys Track Manager dialog you can see information about
the current version of QTM. Click on Licenses to view the installed licenses and
add new licenses.
Enter the license key in the dialog and then click OK.
NOTE: If the license is time limited you must check the Time limited
checkbox and enter the correct expiration date.
Alternatively, the licenses can be imported from a text file (*.licenses). The information in the file is organized with one license per row (replace the text with your registration data, using the exact names for the QTM user name and the plug-ins, which you can find at http://www.qualisys.com/my/).
QTM user interface
Running QTM
Starting QTM
The first time you start QTM on a new computer, QTM will prompt you to create
a project.
Create project
This is the default option because you must have a project to capture data
in QTM, see chapter "Creating a new project" on page 69.
Open project...
Use this option to open a project folder that has been copied from
another computer.
No project
If you only want to open QTM files you can start QTM without a project,
but you will not be able to capture any data or change any project
options.
Once you have created one or more projects on the computer, QTM will open
by default with the Manage projects dialog to select a project. For more
information about the dialog see chapter "Manage projects" on page 72.
NOTE: QTM will use the latest calibration made on the computer, that
was made with the same cameras (placed in the same order), even if it is
not included in the current project.
You can now start managing your project and capturing data. The main func-
tions in QTM are:
2. Start a preview
Press the New measurement button (Ctrl + N) to start the cameras in
Preview (real-time) mode. This requires that a Qualisys camera system
is connected.
In Preview mode the motion capture data is displayed in real-time in the
2D and/or 3D View windows, see chapter "View windows" on page 84. The
data can also be accessed via a real time TCP/IP protocol, see chapter
"Real-time streaming" on page 590.
3. Start a calibration
Calibrate your camera system for capturing 3D motion data, see chapter
"Calibration of the camera system" on page 543.
4. Start a capture
Start a capture to record your motion capture data in a file, see chapter
"Capturing data" on page 566.
5. Open a file
Open an existing file with recorded motion capture data. QTM will display
the data in File mode in which you can process, manage and edit the
recorded motion capture data, see chapter Processing data.
Projects
QTM needs a project to capture measurement data. The project is a folder that
contains all the files and information needed for QTM to process the data. A project can therefore easily be transferred, for example to another computer, with all the settings and files needed for the processing. To create and use projects, follow the instructions in the chapters below.
Data
This is the default location for the captured QTM files. You can create sub-
folders in this folder if you want to sort the files, for example for different
subjects.
AIM models
This folder contains all of the AIM models created in the current project.
Calibrations
This folder contains all of the calibrations made in the current project.
Meshes
This folder contains all of the meshes associated with the current project.
Messages
This is the folder for the messages log files.
Settings
This folder contains the backups of project settings.
Settings.qtmproj
This file contains the current settings of the project.
NOTE: The project file may change format with a new version of QTM. A backup of the previous version of the file is saved; it can be named, for example, Settings.qtmproj.ver_100-101.
The project file before QTM 2.9 was called settings.qps. A backup (.bak) is saved of this file when it is converted to the qtmproj format.
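Taken together, a project folder created by QTM typically has a structure along the lines of the sketch below. The folder and file names are those described above; the project name, the subfolder and the capture files are hypothetical examples.

MyProject\
    Data\                  (captured QTM files, optionally sorted into subfolders)
        Subject01\
            walk_01.qtm
    AIM models\
    Calibrations\
    Meshes\
    Messages\
    Settings\              (backups of project settings)
    Settings.qtmproj       (current project settings)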
NOTE: The QTM program and other components installed by the QTM installer are placed in the Qualisys Track Manager folder under \Program Files\Qualisys.
WARNING: QTM does not support the use of network drives or syn-
chronized drives, such as OneDrive. The use of such storage facilities may
lead to corrupted data files.
Project view
The Project view window displays the data files included in the current project.
The view consists of two parts: Project data tree and Details. The name of the
current project is displayed at the top of the Project view.
To open the Project view window, go to the View menu and select Project view; the shortcut to toggle the window is Ctrl + R.
Project data tree
The Project data tree window is used to display and open the files in the Data
folder of the current project. This includes folders and QTM files, but also all
other files. You can drag and drop files to the Project data tree, for example from Windows Explorer. If the file is dragged from another folder on the same hard drive, it is moved to the Data folder. You can toggle whether to copy or move the file with the Ctrl key. If the file is on another hard drive or on the network, then a new copy is made in the Data folder.
Add
New folder
When adding a folder it is placed in the currently selected folder; if none is selected, it is placed in the root of the Data folder.
New measurement
Adding a new measurement is the same as starting a new capture with the Capture command on the Capture menu, with the folder to save the file in set to the currently selected folder in the project view.
Open
The Open command is only available when a file is selected and it will
then open that file. If it is a QTM file it is opened in the current QTM win-
dow or a new window, depending on the current settings on the GUI page
in Project options. Other files are opened with their corresponding Windows programs.
Find next
Use Find Next (or F3) to search for the next occurrence of the current
search term.
The QTM file that is open in the current QTM window is displayed in bold. If you
open a file that is already open in another QTM window you will switch to that
window.
Rename
Rename the selected file or folder.
Batch Process...
Batch process the currently selected files. It will open the Batch processing dialog so that you can select which processing steps you want to apply. For more information about batch processing, see chapter "Batch processing" on page 605.
Batch Export...
Batch export the currently selected files. It will open the Batch exporting
dialog for selecting the export formats and their options. For more inform-
ation about batch exporting, see chapter "Batch exporting" on page 710.
Delete
Delete the selected file or folder.
Add
New folder
Add a new folder in the currently selected folder. If no folder is selected, it is placed in the root of the Data folder.
New measurement
Add a new measurement, which is the same as Capture on the Capture menu, with the folder to save the file in set to the currently selected folder in the project view.
Refresh
Refresh the content in the Project data tree so that it matches the con-
tent of the project data folder. This is usually done automatically by QTM.
The Details window displays information about the currently selected file. The
displayed data is:
File name
File type
File size
File created
This is not the time when the data was captured, but when the file was created on the computer, which is not the same if the file has been copied.
File modified
Full path
Using projects
The following sections describe some typical use scenarios and recom-
mendations on how to use projects. However, projects can be used in many dif-
ferent ways, depending on how you want to organize and share your settings.
Think about the following when deciding how to use the projects.
For qualified users, projects are helpful to manage the settings of multiple pro-
jects or studies. Below follow some suggestions.
- Create a new project when there is a new study, for example with a specific marker set.
- Save all of the QTM files in the project data folder, as you can then browse the data in the Project view in QTM, see chapter "Project view" on page 62.
- Make a backup of the project settings if you want to be sure that you can always go back to the settings you know are correct, see chapter "Backup of project settings" on page 73.
- Pin projects that are often used in the Open project dialog, see chapter "Manage projects" on page 72.
- Create a Project preset if you create a lot of new projects and want to use the same settings when they are created, see chapter "Project presets" on page 74.
When working with students using QTM, projects can be used as follows.
- Create a Project preset so that the students can start with the same settings, see chapter "Project presets" on page 74.
- Tell the students to create a new project with the Project preset and save them at a specific place. The default folder is Documents in Windows, but it can be changed, see chapter "Folder options" on page 427.
- Make sure that the QTM files are saved in the project data folder. It is the default path if nothing has been changed in the project settings.
NOTE: It is a good idea to use a different Windows login for the students so that they cannot access other projects.
If the students can access the other projects, make sure that you back up the settings of those projects, see chapter "Backup of project settings" on page 73.
If you use a project on multiple computers, you can either synchronize the
whole project, including the project settings, or only selected folders, for
example the Data and AIM folders.
- Synchronizing the Settings.qtmproj file involves the risk of losing settings, for example recent changes of the camera settings.
- On the other hand, when not synchronizing the Settings.qtmproj file, you have to remember to change settings in both projects manually, if needed.
WARNING: QTM does not support the use of network drives or syn-
chronized drives, such as OneDrive. The use of such storage facilities may
lead to corrupted data files.
NOTE: To change the project name after it has been created use
the Rename Project option on the File menu.
NOTE: The Default path for new projects can be set on the
Folder options page in the Project options dialog.
NOTE: If you want several computer users to access the same pro-
ject, it must be saved in a folder which all users have access to, for
example in the C:\Users\Public folders on the computer.
Default settings
All settings are set to the default values and all of the current set-
tings are deleted.
NOTE: If you select any of the options other than the Current settings,
you will lose the current settings in Project options. If you haven't got
any projects in the recent projects list, then the settings are saved in a
backup in C:\ProgramData\Qualisys\Workspace backups.
NOTE: QTM files outside the project can still be opened and processed if
you like. QTM will then note that the file is not in the current project in
the title bar.
NOTE: QTM will use the latest calibration made on the computer, that
was made with the same cameras (placed in the same order). It means
that the calibration may be loaded from another project when switching
projects.
The calibration is also checked if the camera configuration has changed
at Locate or New. If there is a matching calibration for the new camera
configuration, it will be loaded in the project.
The list in the Manage projects dialog displays the 100 most recently used projects. Double-click on a project to open it. You can pin projects so that they are always at the top of the list; note that you have to click Open for the pinning to change.
The current project is shown with [Current project] next to the name. The current project is also displayed in the title bar of QTM and Project options.
Browse
Use this option to open a project that is not in the list above, e.g. one
copied from another computer.
New project
Create a new project, see chapter "Creating a new project" on page 69.
Open settings for the startup of QTM, see chapter "Startup" on page 429.
Open
Open the project selected in the list above.
Restore
The list displays all of the backups saved in the current project. Select one
and click Restore backup to copy the settings to Project options.
NOTE: QTM will use the latest calibration made on the computer,
that was made with the same cameras (placed in the same order). It
means that the calibration may be loaded from another project
when switching projects.
Project presets
The project presets can be used when creating projects to make sure that you
start with the same settings. The project preset contains all of the QTM settings (Project options and other settings such as those in the Start capture dialog). The preset also contains any AIM models that were in the project AIM folder when creating the preset.
1. Open a project with the settings and AIM models that you want to use.
2. Enter a Project name and select the settings to Base the new project on
from the drop-down list. The presets are listed as Custom preset: followed
by the name.
Maintenance of presets
If you want to change any settings in a preset you need to create a project with
the preset and then change the settings. Then create the preset again with the
same name.
A preset can be deleted from the Project presets dialog.
QTM windows
QTM can display various types of windows. Some windows are confined to the
QTM main window, whereas other windows can be either floating or docked.
Floating windows are not confined to the QTM main window and can be freely
positioned anywhere on the computer screen, even when using multiple dis-
plays. Floating windows can also be docked to specific docking locations in the
QTM main window or another floating window. The maximum number of win-
dows that can be simultaneously displayed in QTM is 30.
Title bar
Title bar of the QTM window displaying the current measurement or file,
the current project, and the current user in case a user has logged in to
QTM.
Menu bar
List of drop-down menus located at the top of the main window for access
to all QTM commands, see chapter "Menus" on page 184.
Toolbars
Collection of toolbars located below the menu bar for access to the most
important QTM commands, see chapter "Toolbars" on page 199.
The following windows are shown within the QTM main window:
Messages window
Log of events since the start of QTM, see chapter "Messages window" on
page 179. The Messages window is always displayed at the bottom of the
QTM main window when activated.
Floating windows
Plot windows
Windows containing graphs of data, see chapter "Plot window" on
page 179. The default docking location is on the left side of the primary
View window.
The main Status bar contains messages about what is going on in QTM, e.g.
when QTM is capturing or processing data. There can also be status messages
for the real time processing and the camera synchronization.
It also shows the latency and the different frequencies during real-time and
when capturing a measurement. The frequencies are updated continuously so
that if the computer cannot process the data fast enough the frequencies will
decrease. Next to the frequencies is a symbol that shows the status of the cal-
ibration, see chapter "Introduction to calibration" on page 543.
Time code
Displays current value of incoming time code (SMPTE, IRIG, Camera time).
GUI
This is the update frequency of the QTM GUI. It can be changed on the
GUI page in the Project options dialog.
RT
This is how fast the data is processed by QTM in real-time. The frequency
is set on the Camera system page in the Project options dialog or for a
measurement in the Start capture dialog.
The RT frequency can be lower than the camera frequency in two cases.
First in RT/preview and Capture mode if the camera frequency is too high
so that the computer cannot process all of the data. The second case is
during a measurement in capture mode if Reduced real time frequency
is selected.
This is how fast the data is captured by the cameras. In RT/preview mode
the frequency depends on the Real time frequency setting on the Cam-
era system page in the Project options dialog. When Reduced real
time frequency is selected the frequency will be displayed as reduced in
RT in the status bar. During a measurement the displayed frequency is always the same as the Marker capture frequency.
Window handling
Docking and floating
You can float docked windows by clicking and holding the title bar of the win-
dow and dragging them from their current dock. For tabbed windows, you can
click the tab and drag it from the dock.
To dock a window, click the title bar, drag it to the workspace where you want to dock it, and drop it onto one of the dock symbols that appear in the interface. When
hovering with the mouse over a dock symbol, the dock position of the window
is indicated by a colored area.
The dock locations are:
Arranging windows
Floating windows can be manually resized and freely arranged on the com-
puter screen. Specific window arrangements and customizations can be easily
restored by saving them as a Window layout.
NOTE: When using multiple displays, all displays must have the same
scaling in the Windows display settings.
Window layouts
With the window layouts you can save customized layouts, which include the
placements of all QTM windows, both docked and floating. The layouts are
saved in the project and can therefore be reused on any capture file.
To use window layouts click Window layouts on the Window menu. There are
5 shortcut layouts, which can also be applied with keyboard shortcuts (Ctrl + 1-
5), and two default layouts. The default layouts are for file and capture mode; they are used when opening a saved file and when opening a new capture file before a measurement, respectively.
To save the current layout click Save as and then the desired layout. The 5
shortcut layouts can also be saved with the keyboard shortcuts Ctrl + Shift + 1-
5.
3D view windows
- Zoom and orientation of the coordinate system and the trace range.
NOTE: If the selected data type is not available in the file the
2D data is shown instead.
Plot windows
- The analysis or data plot which was used when saving the layout. The measurement must have labeled trajectories with the same name as in the saved layout or the same data in the Data info window.
NOTE: If all labeled trajectories were selected for the plot, the labels of the trajectories are insignificant, and therefore the layout will work for any file with labeled trajectories.
Timeline
- The display settings for the Timeline, not for example the measurement range.
Toolbars
View windows
In a View window the motion capture data can be viewed in 2D or 3D. The video data of Oqus and DV/webcam is displayed in the 2D view window.
For each view there is a View window menu with settings. The menu is
accessed by right-clicking in a View window.
The Timeline control bar is common for all View windows and is placed at the bottom of the QTM window, see chapter "Timeline control bar" on page 133.
2D view window
Camera feeds
Camera information
Camera ID
The number in the lower left corner of the 2D view of a camera is the cam-
era id. A motion capture camera is displayed as for example #1, and a
DV/webcam camera is displayed as for example #1V.
Camera type
After the camera ID, the type of camera is indicated for Qualisys cameras.
Exposure group
When delayed exposure is enabled, the exposure group number is dis-
played after the camera type, e.g. (exp. group: 1) for cameras in expos-
ure group 1.
Image area
Image size
The current Image size of the camera is shown as a red square and the
part of the image that is outside the image size is greyed out. For most
types of Qualisys cameras the Image size can be changed with the Image
size tool on the 2D view toolbar, see chapter "2D view toolbar" on
page 89.
NOTE: In a file, only the active part of the sensor is displayed. I.e. if
the Image size has been reduced, then the aspect ratio of that 2D
view will match the reduced image size.
Marker masks
- Green squares indicate camera marker masks, see chapter "Marker masking" on page 536.
- Blue squares indicate software marker masks, see chapter "How to use software marker masks" on page 611.
Detected markers
Marker segments
The 2D markers are color coded when marker filtering is activated
(only available for Oqus cameras), see chapter "Marker circularity fil-
tering (Oqus only)" on page 541.
NOTE: If the markers are grey and it says Not used for tracking in the middle of the view, that camera has been deactivated in the Project options dialog.
Video feed
The image from Qualisys cameras in video mode is also shown in the 2D
view window, both in preview and in a file. This means that all of the
actions, like zoom and 3D overlay, can be performed both in preview and
in a file. How to capture video using Qualisys cameras is described in the
chapter "Qualisys video capture" on page 574.
NOTE: In a file, only the active part of the video image is displayed.
If the Image size has been reduced, then the aspect ratio of that
video view will match the reduced image size.
External video
The external video devices are displayed after the motion capture cam-
eras in the 2D view window. You can use zoom on the video in both pre-
view and file mode. The video cameras will appear in the same order as
they are on the Video devices page in the Project options dialog. For
more information about video devices see chapter "External video devices
in 2D view" on page 100.
The appearance of the 2D view can be modified with the following options.
NOTE: The DV/webcam cameras are called for example '1V' and are
always placed last in the list.
- Hold the Ctrl key and click on a camera button to display just that camera. When only one camera is displayed, hold the Ctrl key and click on the button for that camera to display all cameras.
- Double-click in the area of a camera to just display that camera in the 2D view window. Use the arrow buttons to step to the next and previous camera in the system.
- Use the mouse or the buttons on the 2D view toolbar to change the zoom and translation of a camera view. The 2D views can be zoomed and translated individually. The mouse wheel can also be used to zoom. Click on the Zoom button to use the left mouse button for zoom.
2D view toolbar
The 2D view toolbar contains settings for manipulating the 2D view of the dif-
ferent cameras and for switching between 2D and 3D view. From left to right
the icons have the following use.
3D View Button
Switch to 3D view.
Selection
Use the normal mouse behavior.
Translation
Use the left mouse button to translate the 2D view.
Zoom
Use the left mouse button to zoom the 2D view.
NOTE: When changing the capture rates from the Camera settings sidebar, the image size is reduced automatically if the frequency is higher than the max frequency at full image size. However, when changing the capture rate from the Project options dialog, the image size must be reduced first. For example, if you have one camera in video mode, you still have to reduce the image size in video mode for all of the cameras.
Reorder Tool
Use the left mouse button and drag and drop the whole camera view to
change the camera order in QTM. Use this cursor to change the number
on the camera display so that they come in the order that you want.
Auto Exposure Tool (only available for Miqus Video and Oqus 2c cam-
eras)
Draw the Auto exposure area with this tool. The area is displayed as a
gray rectangle when the tool is active. By default the area is maximized to
the current image size, but if for example there is a very bright part of the
image the auto exposure will work better if the area is reduced. For Miqus
Video Color the auto exposure area is also used to set the white balance.
Identification Tool
Click to switch on the green LED ring on the selected Arqus or Miqus cam-
eras.
3D Overlay
Click to turn on/off 3D overlay for the selected cameras.
The Camera settings sidebar contains the basic camera settings for the
Qualisys cameras. It is displayed at the right side of the 2D view window when
the camera system is live. Use these settings when setting up the camera sys-
tem to get the best data. The settings are also available on the Cameras page
in the Project options dialog, refer to chapter "Cameras" on page 225 for more
details about the settings.
The sidebar is pinned by default, so that it is always visible. When unpinned, it slides out when you move the mouse to the right edge of the 2D view window.
The sidebar will only display settings that are available for the currently visible (selected) cameras. For example, when all cameras are in marker mode the Video settings are hidden. All of the settings, except the Marker Capture Rate, apply only to the currently visible cameras.
NOTE: If the currently visible cameras have different settings, it will say Differs. When changing such a value, all the currently visible cameras will be set to the same setting.
The Marker and Video settings apply to all marker camera models, as well as Oqus high-speed video cameras. The settings of the streaming video cameras (Miqus Video and Oqus 2c) are available under Streaming Video.
The following settings are available on the sidebar.
Camera Mode
These settings change the mode of the camera and you can also activate some
other options.
3D Overlay
Toggle the 3D overlay on and off.
The 3D overlay can also be turned on individually for a camera from the
2D view window menu.
NOTE: Active filtering is not available for Oqus 3 and 5 series cam-
eras. If you have a system that partly consists of these camera
types, you can still turn on active filtering for the others.
Advanced...
Clicking the Advanced… link will open the Cameras page in the Project
Options. The currently selected cameras will be selected in the camera
list.
Marker settings
Capture Rate
The capture rate that is used by cameras measuring markers. The capture
rate applies to all cameras.
Marker Threshold
The intensity level in the image used to detect markers, where the default
value is 17. For example, a lower value means that areas of less bright pixels will become markers. For advice on this setting, see chapter "Tips on marker settings in QTM" on page 483.
Below the slider is the color scale which is used for color-coding the video image in Marker intensity mode. The image will be green at the marker threshold, blue below, and yellow to red above the threshold.
Auto-Mask
Create masks over all of the visible markers in the current cameras.
It is important to make sure that it is only unwanted reflections that
are visible when pressing the button. For more information about
marker masking see chapter "How to use auto marker masking" on
page 538.
Sensor Mode
Switch Sensor mode for the current cameras in marker mode. You can
select between a full size mode and high speed sensor modes. Use the
high speed modes for example if you need to capture at higher fre-
quencies and still want the full FOV, but you do not need the full res-
olution. For an overview of available sensor modes per camera type, refer
to the table in "Qualisys camera sensor specifications (marker mode)" on
page 926.
These settings change streaming video settings for the visible streaming video
cameras (Miqus Video or Oqus 2c).
Capture Rate
The capture rate that is used by cameras capturing video. The video cap-
ture rate can be set by pressing one of the available buttons. The buttons
show commonly used values for video capture rate. Integer divisions or
multiples of the current marker capture rate are indicated bold. The num-
ber of buttons and their exact values depend on the current maximum
video capture rate. The maximum capture rate for the current settings
(resolution and aspect ratio) is displayed above the buttons.
NOTE: The video capture rate can differ between cameras. When
changing the video capture rate, the change is applied to the cur-
rently selected cameras in the 2D View window.
Resolution
Set the resolution of the video image by pressing one of the four buttons.
The available values are 1080p, 720p, 540p and 480p (the values indicate
the vertical dimension of the image in pixels).
Auto Exposure
Check to use automatic exposure for streaming video cameras. When
activated the Exposure Time and Gain options are hidden and controlled
by the auto exposure. Optionally, use the Auto Exposure Tool to limit the
image area used to set the exposure (see chapter "2D view toolbar" on
page 89).
These settings change video settings for the visible cameras when in video
mode (uncompressed video).
Capture Rate
The capture rate that is used by cameras capturing video. The maximum
capture rate at full image size is displayed above the slider. If the camera
system includes different camera types, the maximum capture rate is
determined by the camera with the lowest maximum, also indicated by
the dark blue bar in the slider.
When setting a frequency beyond the maximum capture rate, the image size of the cameras for which the maximum is exceeded is automatically reduced. If the image size has been set manually for a camera, the reduced image size will keep the same relation between x and y.
NOTE: The video capture rate can differ between cameras. When
changing the video capture rate, the change is applied to the cur-
rently selected cameras in the 2D View window.
Exposure time
The exposure time used by the cameras in video mode. Set it to a value where the image is bright enough, for more information see chapter "Outline of how to capture high-speed video" on page 580. The current maximum exposure time is displayed with a dark blue bar.
Flash time
The time of the IR flash in video mode. This setting can be set to Off (0)
microseconds unless you have markers placed on the subject that you
want to be visible in the video. The current maximum Flash time is dis-
played with a dark blue bar.
Gain
Set the gain for the current cameras in video mode to get a brighter video
image. Depending on the camera type you can use gain values of 1, 2 or 4
and for some cameras also 8 and 16.
Compression
The Compression setting can be used to switch between None, In-cam-
era MJPEG and Software compression. The default for Oqus 2c, 5+, and
7+ is In-camera MJPEG, which is the recommended setting for those cam-
eras. For the other cameras the default is None, however, most of the
time it is recommended to select Software and a Codec to reduce the
video file size.
Sensor Mode
Switch sensor mode for the current cameras in video mode. You can
select between a full size mode and high speed sensor modes. Use the
modes for example if you need to capture at higher frequencies and still
want the full FOV, but you do not need the full resolution. For an overview
of available sensor modes per camera type, refer to the tables in
"Qualisys video sensor specifications (in-camera MJPEG)" on page 927 and
"High-speed video" on page 960.
The lens control settings are only available for cameras with a motorized lens.
The type of lens on the camera is shown above the settings.
Focus
Set the focus for the current cameras. The same setting is used for marker and video mode. The minimum distance is 1 m and the furthest distance is 20 m.
Aperture
Set the aperture for the current cameras. The same setting is used for
marker and video mode. The available aperture values are dependent on
the lens in the camera.
NOTE: Once focus and aperture have been set for the cameras, it is pos-
sible to disable lens control using the Qualisys Firmware Installer. This
way the current lens settings will be fixed. For more information, see
"How to use Qualisys Firmware Installer (QFI)" on page 471.
The external video devices, such as video from Blackmagic cards or web cam-
eras, are displayed last in the 2D view window. The video data can be used for
documentation purposes, but it is not used in the calculation of trajectories.
NOTE: You can have one video camera per 2D view window if you like.
Use the camera buttons at the bottom of the 2D view window to select
which cameras to view.
In preview mode the picture is the current view of the web camera. In file mode
the picture is taken from the saved video file and it shows the video frame that
corresponds to the capture frame. The video probably has fewer frames than
the motion capture and therefore the numbers will not be the same. For
information on how to record a video file see chapter "How to use external
video devices" on page 898.
The following options are different from the options on the regular 2D view
window menu. The available options depend on if QTM is in RT/preview or File
mode.
NOTE: Only try this alternative if the default for some reason
doesn't work.
The options for individual cameras are presented in the 2D view window
menu, which is opened by right-clicking in a 2D view. It can also be opened on
the View menu. The available options depend on if QTM is in RT/preview or File
mode.
The below options are available for Qualisys cameras. For external video cam-
eras (DV/webcam), additional options are available, see chapter "External video
devices in 2D view" on page 100.
Switch to 3D view
Switch the View window to 3D view.
Markers
Switch to the default Markers mode.
Marker intensity
Switch to the Marker intensity mode.
Video
Switch to the Video mode.
NOTE: The option is only available when the Exposure delay mode called Camera groups is selected on the Cameras page in Project options.
Rotate view
Change the rotation of the displayed 2D view so that it matches the cam-
era rotation. For example if the camera is placed upside down you can
use the 180 degree rotation.
The rotation is stored with the file, so you can rotate the cameras after
you have calibrated the camera system. The 2D view rotations will then be
stored in the QTM file. It is also possible to rotate the 2D views in a QTM
file, but to save it you must make sure to make another modification to
the file as well.
NOTE: To change multiple cameras at the same time you can use
the 2D view rotation setting on the Cameras page in Project
options.
3D data overlay
The 3D data can be overlaid on the 2D image area to show the 3D view from the camera viewpoint. This can for example be used for showing the force arrow in the video of someone stepping on a force plate, or to check which 2D markers are used for the 3D calculation.
Follow these steps to activate the 3D data overlay:
1. Calibrate the camera system, including the Qualisys cameras that are
used in video mode.
2. Open a 2D view window in RT/preview or a file.
3. Right-click on the camera where you want to turn on the overlay and
select Show 3D data overlay.
4. The 3D elements displayed in the overlay and the opacity can be changed
on the 2D view settings page in the Project options dialog, see chapter
"2D view settings" on page 417.
NOTE: The marker and video data display in the 2D view can be
switched between linearized and unlinearized on the 2D view set-
tings page. To match the 2D data with the 3D data the data must be
linearized.
To review the camera settings in a file, right-click on a camera 2D view and then select the Review camera settings option. This opens the Review settings dialog, which shows the settings used during the capture for the camera system and other input devices.
The settings are greyed out so that they cannot be changed, but otherwise you
can navigate in the same way as in the Project options dialog. Any settings that were not available in the QTM version in which the file was captured will be set to their default values. For more information about the settings, refer to the
chapters "Cameras" on page 225, "Synchronization" on page 266, and the
chapters of the respective input devices.
In the 3D view window the motion capture data is shown in 3D using the coordinate system that was defined in the calibration.
Axes
Axes of the coordinate system of the motion capture (X = red, Y = cyan
and Z = dark blue).
Grid
A grid showing the floor of the measurement volume (e.g. Z = 0, if Z is the
vertical axis of the measurement setup).
Volumes
Covered and calibrated volumes, see chapter "Volumes in 3D views" on
page 125.
Bounding box
A white outlined box for the bounding box used by the 3D tracker, see
chapter "Bounding box restricting 3D data" on page 329.
Force plates
Rectangular areas showing the location of the force plates.
Mesh objects
Static mesh objects as defined in the Static Mesh Objects settings page
or Rigid Body mesh objects, see chapters "Static mesh objects" on
page 424 and "Rigid body meshes" on page 667.
Cameras
Cameras
The position and orientation of each camera in this specific setup.
Rays
Rays showing which cameras contributed to the selected trajectories, see
chapter "Rays in 3D views" on page 130.
Markers
Markers for the current position of the trajectories, see chapter "Tra-
jectories in 3D views" on page 116.
Bones
Bones between markers, see chapter "Bones in 3D views" on page 119.
Rigid bodies
6DOF bodies, see chapter "6DOF bodies in 3D views" on page 121.
Skeletons
Skeleton segments and segment markers, see chapter "Skeletons in 3D
views" on page 123.
Force vectors
Force vectors (red) displaying the current forces on the force plates, see
chapter "Viewing force data" on page 704.
Gaze vectors
Gaze vectors (yellow) displaying the current gaze vectors of eye tracking
devices.
Other information
Text labels
Text labels can be enabled for some of the graphical elements.
Tooltips
Hover with the mouse on an object to display detailed information.
The 3D view can be rotated, translated or zoomed so that the data can be seen
from the viewpoint that you want. For most actions which change the view of
the 3D view a red crosshair is shown indicating the center of the 3D view. By
double clicking anywhere in the background, the center is automatically moved
to the geometrical center of all 3D points at the current frame.
The Selection cursor is the default cursor. With modifiers it can be used to
perform a wide range of tasks in the 3D view, e.g., selecting and identifying tra-
jectories or navigating in the 3D view.
The following actions can be performed with the mouse to navigate in the 3D
view:
NOTE: With the mouse buttons the zoom is continuous, while with
the mouse wheel it is done in steps.
The following mouse actions can be useful when manually checking and man-
aging trajectories in measurement files.
Scrubbing
Hold the Ctrl key, press left mouse button anywhere in an empty part of
the 3D view and drag sideways to move forward or backward through the
measurement's time line.
Projections
The default projection used for the 3D view is Perspective. You can change the
projection to specific orthogonal projections. To change the projection, right-click to open the 3D view window menu and select a projection under View. Alternatively, press the P key to toggle between Perspective and the closest orthogonal projection.
3D view toolbar
The 3D view toolbar contains tools for manipulating the 3D view and the tra-
jectories and for switching between 2D and 3D view. From top to bottom the
icons have the following use.
2D button
Switch to 2D.
3D button
Switch to 3D.
Selection cursor
Use the normal mouse behavior. It has the following keyboard modifiers:
- Shift - Add trajectories to the selection.
Rotate cursor
Use the left mouse button to rotate the 3D view. The difference to the Selection cursor is that you can no longer select anything with this cursor. It has the following keyboard modifier:
Translate cursor
Use the left mouse button to translate the 3D view. This is the same alternative as using the right mouse button with the Selection cursor.
Zoom cursor
Use the left mouse button to zoom the 3D view. This is the same alternative as using both mouse buttons or the mouse wheel with the Selection cursor. It has the following modifier:
- Ctrl - Use the alternative zoom method. For the default settings the alternative method is to zoom to the current position of the cursor.
NOTE: The gap between the two parts is there to visualize that there are two parts; it is not a missing frame.
Trajectories in 3D views
Select
Click on the marker or the trace of the trajectory to select it. Use the following keys to modify the action when you click on a trajectory.
- Hold down Alt to only select a part of the trajectory.
- Use Shift and Ctrl to add and delete (only Ctrl) trajectories to the selection.
- Hold down Shift and drag the mouse to use area selection.
Information
Place the cursor over a marker or a trace to see a tool-tip with inform-
ation about the trajectory.
Delete
Use the Delete key to delete selected trajectories and parts of trajectories
directly in the 3D view window.
Quick identification
The quick identification method provides an easy way to manually identify
the markers. First select an empty label. Then hold down Ctrl + Alt or
select the Quick identification cursor to activate this method. Then click
on a marker and it will be identified as the currently selected empty label
in the Labeled trajectories window. For more information about quick
identification, see section "Manual identification of trajectories" on
page 620.
Create bones
To create a bone, hold down Shift and select a pair of labeled trajectories by clicking in the 3D view window. Then press B and there will be a bone between the pair. Several bones can be created in succession with the Create bones sequence tool in the 3D view toolbar.
Center on trajectory
Select a trajectory and press C to change the viewpoint so that it centers
on that marker. You can also use the Center trajectory cursor and click
on the marker.
Bones in 3D views
Bones are used to visualize the connection between two markers in 3D views,
e.g. if the measurement is done on a leg, bones can connect hip to knee, knee
to foot and so on. The bones can have different colors and the bone colors that
are used are saved with the AIM model.
Create bones
To create bones you need to select at least two labeled trajectories, e.g. by hold-
ing Shift and clicking in the 3D view window. Then press B or click Create bone
in the Bone menu to create bones between the selected trajectories. Several
bones can be created in succession by using the Create bones sequence tool
in the 3D view toolbar. The bones will then be created between the trajectories
in the order that you click on them.
Modify bones
Right clicking on a bone opens the Bone menu. The bone menu can be used to
change the bone color or delete bones. The action can be applied to several
bones at once. Multiple bones can be selected as follows:
- Hold Ctrl and click on successive bones to add them to the selection.
- Select one bone, and then press Shift and drag the mouse over the area in which you want to select the bones.
- To deselect a bone, hold Ctrl and click on the selected bone.
To delete one or more bones, select the bones. Then press Delete or right-click on a bone and click Delete bone. To delete all bones, use Delete all bones on the Bone menu.
The bones visualization settings are set on the 3D view settings page in the
Project options dialog.
NOTE: You can use the Bones button on the GUI Control toolbar or Alt
+ 2 to show or hide the bones.
Bone menu
The Bone menu is opened when right-clicking on a bone, but it can also be
opened on the Edit menu and in the 3D view window menu.
Delete bone
Delete the selected bones.
NOTE: In a capture file, definition, name and color of a 6DOF body can
only be changed by reprocessing the file, see chapter "Reprocessing a
file" on page 601.
Segments
Cones representing the position and orientation of the rigid parts of the
skeleton, with the base indicating the proximal ends and the apex the
distal ends. Place the mouse over the segment to view the segment name,
the skeleton it belongs to, the segment markers used by the segment and
the DOFs used for the segment.
Skeleton labels
Text labels attached to skeletons showing their name.
Segment markers
Three dimensional cross-shaped markers indicating the position of the
markers according to the calibrated skeleton definition. This element is
only used for the marker-based skeletons. The segment markers can have
three different colors depending on their status in the current frame.
- White: Default color, indicating a good fit with the measured marker position.
- Yellow: Indication of a bad fit (deviation of 5 cm or more) with the corresponding measured marker position.
- Red: Corresponding marker is missing.
Place the mouse over the segment marker to view the segment marker
name and the skeleton and segment it belongs to.
QTM can help you see the volume in which you will be able to measure by cal-
culating the covered and the calibrated volumes. The view cones can also be
used for visualizing the FOV, see chapter "Camera view cones in 3D views" on
page 128.
The covered volume is displayed as light blue cubes. It is the volume that is
seen by a certain number of cameras, specified by the user with the Cameras
required to consider volume covered setting on the 3D view settings page
in the Project options dialog. The covered volume can be used to determine where 3D data can be measured. It is calculated by combining the view cones and is therefore affected by their length, i.e. the Smallest marker size visible setting.
The default marker size differs between the camera models, e.g. for the Oqus 3-series the default marker size is 12 mm. The covered volume is also cut by default at the floor level, but that can be changed by disabling the Cut covered volume at floor level option on the 3D view settings page in the Project options dialog.
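The principle behind the covered volume calculation can be illustrated with the following Python sketch. It is a simplified illustration, not QTM's actual implementation: each camera is reduced to a position, a viewing direction, a field-of-view half angle and a maximum distance (the distance at which the smallest visible marker size is reached), and a grid point is counted as covered when it lies inside the view cones of at least the required number of cameras.

    import numpy as np

    def point_in_cone(point, cam_pos, cam_dir, half_angle_deg, max_dist):
        # True if the point lies inside the camera's (simplified) view cone.
        v = np.asarray(point, float) - np.asarray(cam_pos, float)
        dist = np.linalg.norm(v)
        if dist == 0 or dist > max_dist:
            return False
        d = np.asarray(cam_dir, float)
        cos_angle = np.dot(v, d) / (dist * np.linalg.norm(d))
        return cos_angle >= np.cos(np.radians(half_angle_deg))

    def covered_volume(grid_points, cameras, cameras_required):
        # cameras: list of dicts with keys pos, dir, half_angle_deg, max_dist.
        # Returns one boolean per grid point: covered or not.
        covered = []
        for p in grid_points:
            n_seen = sum(point_in_cone(p, c["pos"], c["dir"],
                                       c["half_angle_deg"], c["max_dist"])
                         for c in cameras)
            covered.append(n_seen >= cameras_required)
        return np.array(covered)

A finer grid gives a smoother volume at the cost of longer computation time.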
The calibrated volume is displayed as light red cubes. It is the volume that the wand moved through when the camera system was calibrated. It therefore indicates where the most accurate 3D results can be expected and can be used to check that the calibration covers the intended measurement volume.
The volumes can be enabled from the Volumes menu on the 3D view toolbar.
Click on the Volumes button to open the dialog and enable the volumes
with the Show calibrated volume checkbox and the Show covered volume
checkbox. The features can also be enabled on the 3D view settings page in
the Project options dialog.
The camera view cones display the field of view of the camera, i.e. what a cam-
era will be able to measure. It can be used to evaluate how the cameras are
placed in the system so that the placement can be better optimized. By
enabling multiple cones, you can also study what is covered by a certain subset
of the camera system.
The view cones can be enabled per camera from the Camera view cones
menu on the 3D view toolbar. Click on the Camera view cones button to
open the dialog and enable the cones with the Show camera view cones
checkbox. The feature can also be enabled on the 3D view settings page in the
Project options dialog.
NOTE: You can also use the Camera view cones button on the
GUI Control toolbar to toggle the display of the cones.
Camera rays in the 3D view window show which cameras have contributed to
the selected trajectories. The colors of the rays correspond to the colors of the
selected trajectories.
The camera rays are based on a mapping between 2D data of the cameras and
the 3D trajectories. Rays are only shown for cameras that have actually con-
tributed to the 3D tracking. The rays may not perfectly intersect with the cal-
culated 3D position of the trajectories since they represent the actual projected
2D position on the sensor based on the used calibration. That means that the
rays can be used to visualize the residual of the trajectories.
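As a rough illustration of why the rays indicate the residual, the perpendicular distance from each contributing camera's ray to the calculated 3D position can be computed as in the Python sketch below. This is a hedged sketch of the geometry, not QTM's residual computation; each ray is assumed to be given by a camera position and a direction derived from the projected 2D observation.

    import numpy as np

    def point_to_ray_distance(point, ray_origin, ray_direction):
        # Perpendicular distance from a 3D point to a ray (origin + t * direction).
        d = np.asarray(ray_direction, float)
        d = d / np.linalg.norm(d)
        v = np.asarray(point, float) - np.asarray(ray_origin, float)
        perpendicular = v - np.dot(v, d) * d   # component of v orthogonal to the ray
        return np.linalg.norm(perpendicular)

    def average_ray_distance(point_3d, rays):
        # rays: list of (origin, direction) tuples for the contributing cameras.
        # The average distance indicates the residual of the 3D point.
        return float(np.mean([point_to_ray_distance(point_3d, o, d) for o, d in rays]))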
Switch to 2D View
Switch the View window to 2D view.
Reset Viewpoint
Reset the camera viewpoint to center of mass of the tracked markers
(same as double clicking in the 3D view window).
View
Toggle projection between Perspective and Orthogonal (keyboard short-
cut P), and choose viewpoint of orthogonal projection (orthogonal front,
back, etc.). When toggling the projection from Perspective to
Orthogonal, the viewpoint snaps to the closest orthogonal projection.
The Grid Rotation option rotates the grid in accordance with the selected
orthogonal view when checked (only available in orthogonal projection
mode).
Trajectory
Open the Trajectory info window menu for the selected markers, see
chapter "Trajectory info window menu" on page 144.
Bone
Create and delete bones and change the bone color. For more inform-
ation see chapters "Bone menu" on page 120 and "Bones in 3D views" on
page 119.
The Timeline control bar is shown at the bottom of the QTM window in file mode. It is used to indicate and select the current frame, trace range, measurement range and events.
The current frame is indicated by the top slider; the exact frame is shown in the text below the control bar. To go to a frame, left-click on the desired position on the timeline, or drag the top slider with the mouse. In the 3D view window you can also use the scrubbing feature (Ctrl + drag) to browse through the measurement.
The trace range is the amount of trace that is shown in the 3D view window and it is selected with the two bottom sliders. Drag the sliders to the desired positions for the trace range; the exact position of the sliders is shown in the Status bar. You can also use the trace range zooming feature (Shift + Mouse wheel) to increase/decrease the trace range.
The measurement range is the amount of the original measurement that is used in the analysis, i.e. when plotting, exporting or playing the data. It is selected with the two scroll boxes at the ends of the time range. Drag the boxes to the desired positions to set the measurement range.
No scale
Do not display any scale.
Show text
Toggle the display of information about Marker frames, Marker trace,
Video frames, Time and time stamp (SMPTE/IRIG/Camera time) in the
timeline.
Events
Open the Events menu, see chapter "Events menu" on the next page.
Timeline parameters
In the Timeline parameters dialog you can set the parameters of the Timeline
control bar.
The parameters in the dialog are the same as those that can be set manually in the Timeline control bar. The numbers inside the parentheses show the possible values of each parameter. When setting the parameters, you will be warned if a parameter is outside its possible range. The parameters are as follows:
Events menu
Edit event
Edit the current event. It will open the Edit Event dialog, where you can
edit Label, Time and Frame.
Go to event
Move the current frame of the measurement to the current event.
Remove event
Remove the current event.
The windows are tool windows and the Labeled and Unidentified trajectories
window can be opened in preview mode. By default the Trajectory info win-
dows are placed on the right side of the main window, but they can be floated
or docked in other locations, see chapter "Window handling" on page 81.
The Trajectory info windows display the data about each trajectory for the current frame. The data can be sorted by clicking on the column headers, see
chapter "Sort trajectories" on page 140.
The following data is listed in all of the Trajectory info windows:
Trajectory
The label of the trajectory and the color of its marker in 3D views. In addi-
tion to the color the symbol next to the label also shows if the trajectory
and its trace are displayed in 3D views. The following symbols are used:
Both trajectory and trace are displayed.
Just the trajectory is displayed.
Type
The type of trajectory, which can be one of the following types.
Measured
A trajectory or a part that has been tracked from the measurement
data.
Gap-filled
A trajectory or part that has been calculated with a gap fill function.
Virtual
A trajectory or part that has been calculated from a 6DOF body.
Edited
A trajectory or part that has been edited using a smoothing function.
Measured slave
A trajectory or part that has been imported from a Twin slave file.
Gap-filled slave
Gap-filled trajectory or part that has been imported from a Twin
slave file.
ID#
The sequential ID of the short range active marker.
Fill level
The percentage of the current measurement range in which the trajectory or part is visible. The measurement range is selected using the Timeline control bar.
Range
The range of frames, within the measurement range, with data for the tra-
jectory or part.
X, Y and Z
The position (in mm) of the trajectory in the current frame. The coordin-
ates use the coordinate system of the motion capture system set up when
the system was calibrated.
Residual
The average of the different residuals of the 3D point. This is a quality
check of the point’s measured position.
NOTE: The number after the window name is the number of trajectories
in the window.
Sort trajectories
The data in the Trajectory info windows can be sorted by clicking the column
headers. The trajectories are then sorted in ascending order for that column.
Click on the column header to toggle between ascending and descending
order.
NOTE: The trajectories can also be drag and dropped from the 3D view
window, see chapter "Trajectories in 3D views" on page 116.
Two trajectories cannot be joined as long as they have data in the same frames. Such a clash causes the Overlapping ranges dialog to be displayed. Click OK to open the Overlapping trajectory parts dialog, in which the correct data can be chosen for the joined trajectory.
The data of the trajectories are shown in the chart area of the Overlapping tra-
jectory parts dialog. This is used to visualize the data which will be included in
the joined trajectory. The data is shown for the X, Y and Z coordinates, where
the coordinate that is displayed is chosen with the Plot coordinate option. The
names and colors of the trajectories are shown in the legend box below the
plot area.
NOTE: The area between the range selectors must always include at
least two frames, which must be gap filled or deleted.
The Trajectory info window menu contains options for the tracked tra-
jectories in a file. Most of the options are not available in real-time. The menu
can be accessed by right-clicking in the following places:
l On a trajectory or a part in one of the Trajectory info windows.
When multiple trajectories are selected, the options on the menu are applied to all of the trajectories. The following options are available on the menu:
Show trace
Toggle the display of the trace of the selected trajectories.
Center trajectory in 3D
Center the 3D view on the selected trajectory or part. The time will be
moved to the first frame of the selected marker, if the trajectory is not vis-
ible in the current frame.
Jump to
Jump in time to the next unidentified trajectory or part (see illustration
below). The trajectory or part is selected and the 3D view is centered.
Next unidentified trajectory
Jump to the next unidentified trajectory (keyboard shortcut J).
Identify
Identify a selected trajectory by choosing an available label from the list. If
you have selected multiple trajectories, you can move them to another
Trajectory info window.
Rename
Rename the selected trajectory. This is the same as double-clicking on the
label.
Swap parts
Swap parts between two selected trajectories, see chapter "Swap parts"
on page 151.
NOTE: The parts that are swapped must not overlap any other
parts in the trajectories.
Delete
Delete the selected trajectories or parts. If you delete a trajectory in the Labeled trajectories window, it is moved to the Unidentified trajectories window. For more information, see chapter "Delete trajectories" on page 152.
Virtual (Static from current frame): Create one or more new static
virtual trajectories at the current frame position of the currently
selected trajectory or trajectories.
Gap-fill trajectory
Fill the gaps of the selected trajectories using one of the following gap-fill
methods.
Linear
Fill the gaps of the selected trajectories using linear gap filling.
Polynomial
Fill the gaps of the selected trajectories using polynomial gap filling.
The maximum gap size is the Max frame gap setting under Project Options, Trajectories (default 10 frames) that was used at the time the file was processed. For gap filling single trajectories with alternative gap fill settings or methods, see chapter "Filling of gaps" on page 642.
Relational
Apply gap filling of relational type according to the order of the selec-
tion. There is no maximum gap limitation. The selection order is as
follows:
1. Trajectory to be filled (required)
For more information about relational gap filling, see chapter "Filling
of gaps" on page 642.
Kinematic
Fill gaps of selected trajectories using kinematic gap filling based on
current skeleton data. The selected trajectories must be included in
a skeleton definition.
The 6DOF body will be added both to the 6DOF Tracking page in Project Options and to the current file. The origin of the rigid body will be the geometric average of the included marker positions. The orientation corresponds to the orientation of the rigid body at the first frame of the file.
NOTE: Make sure that the body includes at least three points.
NOTE: The trajectories will keep their label name unless they are
named 'New XXXX' or are unidentified.
Analyze…
Calculate parameters from the 3D data and/or filter the 3D data, see
chapter "Analyze" on page 153. Analysis can also be accessed from the
Analyze trajectory button on the AIM toolbar.
Plot 3D data
To plot 3D data select one or more trajectories, click Plot on the Trajectory
info window menu and then the type of data. The data that can be plotted are:
3D X-position, 3D Y-position, 3D Z-position and Residual. For information
about the Plot window see chapter "Plot window" on page 179.
The curve of each trajectory will have the same color as in the Trajectory info
windows. Trajectories in the Unidentified trajectories window can be sep-
arated by their sequential number.
If trajectories with the same or similar colors are plotted, the plot will automatically use different colors. This means that if you used Set different colors on a large number of trajectories, two trajectories that are close to each other in the list may change color in the plot.
Split part after current frame (shortcut X) on the Trajectory info window menu splits a trajectory or a part of a trajectory into new parts after the current frame. The first part will end at the current frame and the other part will start at the frame after the current frame. This means that there is no gap between the two parts, but in the 3D view the trace is visualized with a gap between the two parts to show the split.
You can also use the Cut trajectory trace tool in the 3D view window to split a
trajectory. Click on a trace with the tool to split the trajectory at that position.
Swap parts
The Swap parts function on the Trajectory info window menu makes it easy
to swap parts that have been added to the wrong trajectories. Follow these steps to swap the parts.
NOTE: If there is any overlap with parts outside the parts that are
swapped, then QTM will give you a warning and you may select to
swap all of the parts that are overlapped.
Delete trajectories
Calculations and filters can be applied to the trajectory data with Analyze on the Trajectory info window menu. Select the trajectories that you want to analyze and click on Analyze to open the Analyze dialog. The dialog can also be opened with the Analyze trajectory button on the AIM toolbar. To export the results to a TSV file, check the Export to TSV file option.
Filters can be applied both before and after the calculation, by selecting Use fil-
ter under the respective heading. Set the number of frames that are used for
each point in the filter with the Frames in filter window option. The number
of frames must have an odd value.
There are two available filters:
Fit to 2nd degree curve
This filter uses a 2nd degree curve when processing the data. For each
frame, the filter will first find the 2nd degree curve that best fits the data
in the filter window around the current frame. Then the data of the cur-
rent frame is set to the value of that curve at the current frame.
Moving average
For each frame, this filter first finds the average of the data in the filter
window around the current frame. Then the filter sets the data of the cur-
rent frame to the average found. (This can also be seen as fitting the data
of the filter window to a polynomial of degree zero.)
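The behavior of the two filters can be illustrated with the Python sketch below. It is a simplified stand-in for the description above, not QTM's implementation: for each frame, the data in the filter window is either fitted to a 2nd degree polynomial or averaged, and the frame value is replaced by the value of the fit at the current frame. The window is assumed to be truncated at the ends of the data.

    import numpy as np

    def fit_2nd_degree_filter(data, frames_in_window):
        # 'data' is one coordinate of a trajectory; the window size must be odd.
        assert frames_in_window % 2 == 1
        half = frames_in_window // 2
        data = np.asarray(data, float)
        out = np.empty_like(data)
        for i in range(len(data)):
            lo, hi = max(0, i - half), min(len(data), i + half + 1)
            x = np.arange(lo, hi)
            deg = min(2, hi - lo - 1)              # clamp near the data ends
            coeffs = np.polyfit(x, data[lo:hi], deg)
            out[i] = np.polyval(coeffs, i)         # value of the curve at the current frame
        return out

    def moving_average_filter(data, frames_in_window):
        # 'data' is one coordinate of a trajectory; the window size must be odd.
        assert frames_in_window % 2 == 1
        half = frames_in_window // 2
        data = np.asarray(data, float)
        out = np.empty_like(data)
        for i in range(len(data)):
            lo, hi = max(0, i - half), min(len(data), i + half + 1)
            out[i] = data[lo:hi].mean()            # average of the filter window
        return out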
Calculation
Under the Calculation heading, the following parameters can be calculated:
Position
No calculation is performed. Use this setting to filter the motion data,
together with selecting Use filter either before or after calculation.
Unless the data is filtered or you select the Magnitude radio button there
is no difference between the result of the analysis and the original data.
Velocity
Calculates the velocity of the trajectories. Select Magnitude to calculate
the speed.
Acceleration
Calculates the acceleration of the trajectories.
Distance
Calculates the distance between two trajectories over time.
Distance traveled
Calculates the distance that the trajectory has traveled in the measurement. The distance increases in every frame in which the marker position has changed by more than 0.2 mm compared with the last time the marker moved. The 0.2 mm hysteresis is there to remove the static noise; otherwise a static marker would still accumulate a distance traveled over time. If there is a gap in the marker data, the distance traveled restarts at 0 (illustrated in the sketch after this list).
Angle
Calculates the angle between two lines in space, using either two, three or
four trajectories. Select the trajectories that you want to match with the
markers in the diagram to calculate the angle between the lines. For three
and four markers the angles are calculated in the range 0 – 180 degrees.
The marker diagram in the dialog differs depending on whether two, three or four trajectories are selected.
For two markers the angle is calculated between the line and the planes of the coordinate system. Select which planes you want to use with the checkboxes XY, XZ and YZ. The angle is calculated between -90 and 90 degrees. The sign of the angle depends on how the line from the Center Trajectory to the Distal Trajectory is pointing. For example, if you are using the XY plane, the angle is positive when the line from the center is pointing in the positive Z direction (also illustrated in the sketch after this list).
The angle between the line and a coordinate axis is the complementary angle to the angle to the perpendicular plane, according to the formula: "angle to axis" = "angle to perpendicular plane" - 90 degrees.
Angular velocity
Calculates angular velocity for an angle defined in the same way as for the
angle calculations (see above). The angular velocity is the first derivative
of an angle, i.e. the rate of change of an angle, in degrees per second.
Angular acceleration
Calculates angular acceleration for an angle defined in the same way as for the angle calculations (see above). The angular acceleration is the second derivative of an angle, i.e. the rate of change of an angular velocity, in degrees per second squared.
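The Python sketch below illustrates two of the calculations described above: the distance traveled with its 0.2 mm hysteresis, and the two-marker angle relative to a coordinate plane. It is a simplified illustration under the stated assumptions, not QTM's exact implementation; for example, gaps are assumed to be marked with NaN values.

    import numpy as np

    def distance_traveled(positions, hysteresis=0.2):
        # positions: (n_frames, 3) array in mm; NaN rows are treated as gaps.
        positions = np.asarray(positions, float)
        traveled = np.zeros(len(positions))
        total, last_moved = 0.0, None
        for i, p in enumerate(positions):
            if np.any(np.isnan(p)):
                total, last_moved = 0.0, None      # gap: distance traveled restarts at 0
            else:
                if last_moved is None:
                    last_moved = p
                step = np.linalg.norm(p - last_moved)
                if step > hysteresis:              # ignore movements below 0.2 mm
                    total += step
                    last_moved = p
            traveled[i] = total
        return traveled

    def angle_to_plane(center, distal, plane="XY"):
        # Signed angle (degrees, -90 to 90) between the line center->distal and a plane.
        normal_axis = {"XY": 2, "XZ": 1, "YZ": 0}[plane]
        v = np.asarray(distal, float) - np.asarray(center, float)
        return float(np.degrees(np.arcsin(v[normal_axis] / np.linalg.norm(v))))

For the XY plane, angle_to_plane returns a positive angle when the line points in the positive Z direction, in line with the sign convention described above.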
For each of the calculations it is possible to choose whether the output will be
the Magnitude or the Components of the results.
l For Position, Velocity, Acceleration and Distance, the Components means
that the result is shown for the three coordinates (X, Y and Z) separately. The
components of Distance are the distance projected on the three axes (X, Y
and Z).
The Magnitude means that the result is shown as the length of the vector
made up of the three components. For Position this means the distance in
mm from the origin of the coordinate system and for Velocity this means
the speed (in mm/s) of the trajectories. For Acceleration it does not have a separate name; it is simply the magnitude of the acceleration. For Distance,
the magnitude is probably the result that is most appropriate, since it is the
actual distance between the two trajectories.
l For Angle and Angular velocity, the Components means that the angle is
projected onto the three perpendicular planes (YZ, XZ and XY respectively),
while the Magnitude is simply the angle between the two arms.
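As a small illustration of the difference between Components and Magnitude, the Python sketch below derives the velocity of a trajectory as a finite difference of the 3D positions (an assumed method; the derivative method used by QTM is not described here) and returns either the X, Y and Z components or the magnitude, i.e. the speed.

    import numpy as np

    def velocity(positions, capture_rate, output="magnitude"):
        # positions: (n_frames, 3) array in mm; capture_rate in Hz.
        positions = np.asarray(positions, float)
        components = np.diff(positions, axis=0) * capture_rate   # mm/frame -> mm/s
        if output == "components":
            return components                                    # X, Y and Z separately
        return np.linalg.norm(components, axis=1)                # speed in mm/s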
Output name
The name under the Output name heading is used in the Plot window and in
the TSV file. When the Add as suffix to the trajectory name setting is selec-
ted the trajectory name is added before the Output name.
The Add as suffix to the trajectory name setting can only be changed when
a single trajectory is selected. If more than one trajectory is selected and Pos-
ition, Velocity or Acceleration calculations are performed, the setting is
always selected. Otherwise all the results would have the same name. On the
other hand, when Angle, Angular Velocity or Distance calculation is per-
formed, the setting is always deselected, since there are two, three or four tra-
jectories selected but the result is a single value.
The labels of the trajectories in the Labeled trajectories window can be saved
in label lists. The label list contains information about the names and colors of
the trajectory labels and the bones between the trajectories in the 3D view.
The label lists are controlled from the Labels menu.
To save a label list from the current capture file, click Save on the Labels
menu. Enter the name of the list and click Save. By default, the label list
file is saved as an XML file in the project folder.
To load a label list, click Load on the Labels menu and open the label list
file. Any existing trajectories in the Labeled trajectories window will be
discarded when a label list is loaded.
Labels menu
The Labels menu can be opened from the Edit menu or by either right-clicking
in the empty area of the Labeled trajectories window or on the names of the
columns.
For an overview of keyboard shortcuts and mouse gestures, see chapter "Tra-
jectory Editor shortcuts" on page 211.
The plot area of the Trajectory Editor window shows the selected data series. The layout of the plot area depends on the view settings. The view can be modified via the buttons on the toolbar, see chapter "Trajectory Editor toolbar" on the next page. The plot area contains the following elements.
Data series
Red, green and blue lines for X, Y and Z, respectively.
Tool tip
Show data value at current mouse pointer position.
Current frame
Gray line and frame/time value.
Selection
Gray area, edge values and width.
Gap indicators
Amber and blue indicators below the time axis indicating unfilled and
filled gaps, respectively, and corresponding areas in the plot area.
Spike indicators
Red indicators below the time axis indicating detected spikes.
The toolbar of the Trajectory Editor window contains the following elements.
The left side of the toolbar shows the label of the currently selected trajectory.
The lock button to the left of the label can be used to lock the currently selec-
ted trajectory.
To the right of the label the action buttons are located. The action applies to
the current selection of the trajectory. The label of the selected trajectory is
shown on the right side of the action buttons. If needed, the latest actions can
be undone by pressing Undo in the Edit menu or Ctrl + Z.
Delete (Del)
Delete current selection. The selection will be permanently deleted.
Smooth (S)
Smooth current selection of trajectory using the smoothing method spe-
cified in the Trajectory Editor settings sidebar.
Fill (F)
Fill gaps within current selection using the fill method specified in the Tra-
jectory Editor settings sidebar. When expanding the button, you can
choose to fill all gaps (F) or only unfilled gaps (Shift + F).
The remaining buttons can be used to modify the view of the data series and the layout of the Trajectory Editor window.
Vertical Axis
Toggle visibility of vertical axes.
Series
Select data series to show in the chart (X, Y, Z, (R)esidual, (V)elocity, (A)cceleration).
View
The options are: (1) Combined view, (2) Component view, (3) Merged view.
Settings Sidebar
Show/hide Settings Sidebar.
The Points of Interest sidebar contains information about gaps and detected
spikes.
The Gaps pane shows the start and end frames of gaps contained in the selec-
ted trajectory as well as their fill status.
The Spikes pane shows the start and end frames of detected spikes, as well as
their width (number of frames). The detection is based on the acceleration
level, which can be set in the Trajectory Editor settings.
In the settings sidebar of the Trajectory Editor window you can select methods for gap filling and smoothing, and change their parameters. In addition, you can set the spike detection level.
The following gap fill types are available. For more information on how to use
the different gap fill types, see chapter "Filling of gaps" on page 642.
Static
Gap filling by adding a fixed 3D position (virtual point).
Linear
Gap filling by means of linear interpolation.
Polynomial (default)
Gap filling by means of a cubic polynomial interpolation.
Relational
Gap filling by interpolation based on the movement of surrounding mark-
ers selected by the user.
Virtual
Gap filling by adding a virtual trajectory based on the movement of sur-
rounding markers selected by the user and an optional offset.
Kinematic
Kinematic gap fill of markers associated with skeleton segments or rigid
bodies.
Butterworth
Smoothing by means of a Butterworth filter.
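As an illustration of the two simplest gap fill types, the Python sketch below fills a gap in one coordinate of a trajectory either by linear interpolation between the frames bordering the gap, or by evaluating a cubic polynomial fitted to a few frames on each side of the gap. The five-frame support window and the least-squares fit are assumptions made for the example; the exact windowing used by QTM is not specified here. Both functions assume that the gap is not at the very start or end of the data.

    import numpy as np

    def fill_gap_linear(data, gap_start, gap_end):
        # Fill frames gap_start..gap_end (inclusive) by linear interpolation.
        data = np.asarray(data, float).copy()
        x0, x1 = gap_start - 1, gap_end + 1        # valid frames bordering the gap
        for i in range(gap_start, gap_end + 1):
            t = (i - x0) / (x1 - x0)
            data[i] = (1 - t) * data[x0] + t * data[x1]
        return data

    def fill_gap_cubic(data, gap_start, gap_end, support=5):
        # Fill the gap with a cubic polynomial fitted to 'support' frames on each side.
        data = np.asarray(data, float).copy()
        left = np.arange(max(0, gap_start - support), gap_start)
        right = np.arange(gap_end + 1, min(len(data), gap_end + 1 + support))
        x = np.concatenate([left, right])
        coeffs = np.polyfit(x, data[x], deg=3)
        gap = np.arange(gap_start, gap_end + 1)
        data[gap] = np.polyval(coeffs, gap)
        return data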
The Trajectory Editor window menu contains options for the selected tra-
jectory. The menu can be accessed by right-clicking in the plot area of the Tra-
jectory Editor.
The following options are available:
Fill
Fill all gaps included in the selected frame range using the current gap fill
settings.
Fill Unfilled
Fill all unfilled gaps included in the selected frame range using the current
gap fill settings.
Smooth
Smooth the data included in the selected frame range using the current
smoothing settings.
Delete
Delete the data included in the selected frame range.
Delete Spikes
Delete spikes included in current selection. Only the frames with accel-
eration values above the current acceleration threshold will be deleted,
excluding the additional margins.
Next Gap
Select next gap.
Previous gap
Select previous gap.
Next Spike
Select next spike.
Previous Spike
Select previous spike.
Vertical Axis
Show/hide vertical axis information.
The Data info window menu is accessed by right-clicking in the Data info win-
dow. With the Data info window menu the data can be switched between the
data types and plotted. The following options are available on the menu:
Display 2D data
Current camera
Display the current camera, i.e. the last camera that you clicked on, in the 2D view window.
Camera …
Choose which camera’s 2D data that will be displayed.
Plot
Plot the selected data, see chapter "Plot window" on page 179.
Calculate
Calculate magnitude of distance from origin of a 6DOF body, see chapter
"Calculate" on page 178.
Data types
2D data information
Click Display 2D data in the Data info window menu and then click a camera
to show its 2D data in the Data info window. Data can only be shown for one
camera at a time. Use Current camera to display the data for the current cam-
era in the 2D view window.
x, y
The position of the marker on the sensor in subpixels. The 2D data in this
window is unlinearized.
xSize, ySize
The size of the marker in subpixels.
To plot the data select the data for one or more markers, click Plot or Plot
filtered on the Data info window menu and then the type of data. With Plot
filtered you can apply a Fit to 2nd degree curve or Moving average filter.
For information about the Plot window see chapter "Plot window" on page 179.
The data of the 6DOF bodies in the current frame can be viewed in the Data
info window. The bodies will be shown in the same order as on the 6DOF
Tracking page. The data is relative to the reference coordinate system as
defined for the respective rigid bodies (global coordinate system by default),
see chapter "Coordinate system for rigid body data" on page 354. The angles
are expressed in Euler angles according to the definition on the Euler angles
page.
Click Display 6DOF data in the Data info window menu to show the 6DOF
data in the following eight columns:
x, y and z
The position (in mm) of the origin of the measured rigid body’s local
coordinate system relative to its reference coordinate system.
If the 6DOF data cannot be calculated, the x column displays "Not found".
When the 6DOF body is disabled in the 6DOF Tracking settings, the x
column displays "Disabled".
Residual
The rigid body residual, calculated as the average of the errors (distances) in mm of the measured markers compared to the points in the rigid body definition.
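The rigid body residual described above can be sketched as follows: the points of the rigid body definition are transformed with the calculated rotation and translation of the body, and the residual is the mean distance between the transformed points and the measured markers. This is a simplified Python illustration of the description, assuming that the correspondence between definition points and measured markers is known.

    import numpy as np

    def rigid_body_residual(body_points, measured_markers, rotation, translation):
        # body_points:      (n, 3) points from the rigid body definition
        # measured_markers: (n, 3) measured marker positions, same order
        # rotation (3, 3) and translation (3,) describe the calculated pose
        body_points = np.asarray(body_points, float)
        measured = np.asarray(measured_markers, float)
        posed = body_points @ np.asarray(rotation, float).T + np.asarray(translation, float)
        errors = np.linalg.norm(posed - measured, axis=1)   # per-marker error in mm
        return float(errors.mean())                         # average error = residual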
To plot the data select the data for one or more 6DOF bodies, click Plot or Plot
filtered on the Data info window menu and then the type of data. With Plot
filtered you can apply a Fit to 2nd degree curve or Moving average filter.
You can also plot the velocity and acceleration in the three directions of the
coordinate system for rigid body data and the angular velocity and acceleration
for the three rotation angles. It is recommended to use Plot filtered and apply
the filter before the calculation for Velocity and Acceleration, because the
noise is amplified by the calculations.
The data of the skeletons in the current frame can be viewed in the Data info
window. The skeletons will be shown in the same order as on the Skeleton
solver page and the data will use the definitions for angles and local coordinate
system on the Euler angles page.
Click Display Skeleton data in the Data info window menu to show the skel-
eton data in the following eight columns:
Skeleton
Name of the skeleton.
Segment
Name of the skeleton segment.
X, Y, Z
Segment 3D positions.
The following menu options are available when right clicking in the Skeleton
data information window.
Local Coordinates
Display local coordinates of segments relative to their respective parent
segment when checked. Display global coordinates of segments when
unchecked.
NOTE: Local coordinates with the Qualisys Sports Marker set give you the joint angles in Roll, Pitch and Yaw.
To plot skeleton segment data, select one or more segments, click Plot or Plot
filtered in the Data info window menu and select the type of data to plot.
With Plot filtered you can apply a Fit to 2nd degree curve or Moving aver-
age filter.
For information about the Plot window, see chapter "Plot window" on
page 179.
Click Display Analog data in the Data info window menu to show the analog
data of the analog board in the Data info window. It will then show the data
for the current frame in the following two columns:
Channel
The name of the analog channel.
Value
The voltage input to the analog board.
NOTE: If the data is not in V, the unit is displayed after the data. If you have a sensor on the EMG that gives data other than voltage, the value is given in the SI unit of that type of data, except for accelerometers, which are given in g.
Board Name
The name of the analog board or of other analog device.
Channel No
The channel number on the analog device.
Right-click on an analog channel to open the Data info window menu with the following settings.
To plot the voltage, select one or more channels, click Plot or Plot
filtered on the Data info window menu. With Plot filtered you can
apply a Fit to 2nd degree curve or Moving average filter. For inform-
ation about the Plot window see chapter "Plot window" on page 179.
Click Display Force data in the Data info window menu, to show the force
data for all force plates in the Data info window.
The Data info window will then contain data for the force plate's Force,
Moment and COP (Center of Pressure) vectors. The data is for the actual force on the force plate. The displayed data is for the current frame in the following five columns:
Parameter
The name of the vector.
X, Y, Z
The size (in N, Nm and mm, respectively) of the vector in the internal coordinate system of the force plate, if you are using the Local (Force plate) setting. The Z direction of the force plate coordinate system is pointing down.
Force plate
The name of the force plate for the data in that row.
The coordinate system that is used (plate or lab) is displayed in the title of the window. You can switch between the coordinate systems on the Force data page in the Project options dialog, see chapter "General settings" on page 360.
To plot a vector select one parameter, click Plot on the Data info window
menu and then click Parameter. The data of the X, Y and Z directions are plot-
ted for the selected parameter in one window. It is not possible to plot more
than one parameter in one window. For information about the Plot window
see chapter "Plot window" on the next page.
The number of seconds in the force plot is specified on the GUI page in the Pro-
ject options dialog, see chapter "GUI" on page 415.
Calculate
The Calculate dialog is used to calculate the magnitude of the distance from the origin of the reference coordinate system to the current position of the body, under the Location of body heading, or the current rotation of the body, under the Angle of body heading. Click on Calculate under the respective heading to get the result; the value is only updated when you click Calculate.
The distance can be calculated for different planes; select the plane with the X, Y and Z options. The angle can only be calculated for one of the three rotations at a time. The data is calculated as the average of the data in the number of frames specified with the Calculate the mean of the last option.
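The calculation of the location magnitude can be sketched as follows: the distance is the length of the position vector restricted to the selected axes, averaged over the last N frames. This is an illustrative Python sketch of the description above, not the code behind the dialog.

    import numpy as np

    def location_magnitude(positions, axes=("X", "Y", "Z"), mean_of_last=10):
        # positions: (n_frames, 3) array of 6DOF body positions in mm.
        # axes selects the plane or axes used, e.g. ("X", "Y") for the XY plane.
        cols = [["X", "Y", "Z"].index(a) for a in axes]
        selected = np.asarray(positions, float)[-mean_of_last:, cols]
        return float(np.linalg.norm(selected, axis=1).mean())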
The Messages window contains the processing messages, camera error messages and some status information about what has happened in QTM since it was last started. The messages are displayed with the oldest first in the list; scroll down to the bottom to see the latest messages.
The following mouse actions are supported:
Use the scroll wheel to scroll up and down the message list.
Right click and select Clear list to clear the message list.
Plot window
The Plot window is used for displaying graphs of various types of data via the
Plot or Analyze Trajectory commands.
The Plot window can be docked into QTM or it can be used as a floating win-
dow, and the Window arrangements can be saved as Window layouts. For
detailed information, see chapter "Window handling" on page 81.
Axes
The axes showing the X and Y values of the data and their units. The X units can be Frame or Time, depending on the type of data displayed.
Events
Vertical lines representing events on the time line. The colors cor-
respond to the event colors.
Legend
The legend shows the names and colors of the plotted data series.
The legend box allows for the following interactions:
l Hover with the mouse over a label to highlight it in the graph.
Tool tip
When moving the mouse in the plot area a tool tip shows the Y values of
the displayed data series at the current X position of the mouse. The color
of the tool tip corresponds to that of the data series.
Mouse position
When moving the mouse in the plot area, the X and Y values of the mouse pointer position are shown in the lower-right corner.
Plot range
Areas outside the measurement range are grayed out. If the range filter in
the plot menu is disabled, data outside of the selected measurement
range is shown in the gray area. The plot range allows for the following
interactions:
l Click and drag the plot range edges to modify the measurement
range.
Plot menu
The Plot menu contains the settings and layout options of the plot. It can be
accessed by right-clicking anywhere in the Plot window.
The following options are available:
Edit Title
Open a dialog for modifying the plot title. Use the Reset button to revert to
the initial title of the plot.
Legend
Show or hide the legend.
Range Filter
Enable or disable the range filter. When the Range Filter is enabled, only
data within the selected measurement range is shown. When disabled, all
data within the measurement range is shown, and data outside of the
selected range is shown in the gray area.
Events
Show or hide events.
Style
Choose between dark or light mode.
When using Window layouts, Plot menu settings are stored as part of the Window layout.
Zooming, panning and other plot interactions
Panning
The panning behavior depends on the axis settings in the Plot menu.
When checking Min, Max or Auto-Fit for an axis, the panning behavior is
restricted. The following panning interactions are supported:
Right-click and drag (plot area)
Panning in both X and Y directions.
Legend
Hover
Hover with the mouse over a label to highlight the data series.
Selection
Click on a label to show or hide the data series.
Timeline
Click and drag (time cursor)
Click and drag the time cursor to move the current frame.
The Plot window can be used to graph data series from a file or during preview. When creating a new plot, the plot settings depend partly on whether the plot is created from a file or during a preview.
When including Plot windows as part of a saved Window layout, the current
plot settings are saved with the following exception:
l The X and Y Axis limit values (Min, Max) are not stored when the check
boxes are unticked.
As a result, the reset behavior of the axes may depend on whether the Window layout is applied to a file or a preview. Furthermore, the zoom and panning behavior
is dependent on the Axis settings. It is therefore recommended to use Window
layouts with plot settings that are optimal for either files or preview. For
example, when creating a Window layout to be used as the default capture lay-
out, it is good practice to create the plots while in preview, or to make sure the
Auto-Fit check box for the Y Axis settings is ticked.
Menus
The following chapters contain a short description of the menu items available
in the QTM menu bar.
For a description of the popup window menus, please refer to the chapters of
the respective windows, e.g. the Trajectory info window menu in the chapter
about the Trajectory info window.
File
The File menu contains the following items:
Open
Open an existing capture file. The following file types can be opened in
QTM:
QTM files: capture files (.qtm) and calibration files (.qca).
C3D files: import of C3D files (.c3d), see chapter "C3D import" on
page 187.
Batch Process...
Open configuration dialog for the batch processing function, see chapter
"Batch processing" on page 605.
Save
Save a new capture file or save an existing file that has been changed.
Save as...
Save the active file with a new name.
NOTE: When saving files you can always go to the Project data
folder with the link at the top of the list to the left in the Save dia-
log.
Close
Close the active capture file.
New project...
Create a new project, see chapter "Creating a new project" on page 69.
Save project
Save the current project settings to the Settings.qtmproj file.
Rename project
Rename the current project, this will rename the project folder.
Manage projects...
Open the Manage Projects dialog, see chapter "Manage projects" on
page 72.
Settings management
Project presets
Open a dialog to save project presets that can be used when cre-
ating projects, see chapter "Project presets" on page 74.
Export
Export the capture file to one of the following formats, see chapter "Data
export to other applications" on page 710.
Batch export
Select files and export formats in batch exporting dialog, see chapter
"Batch exporting" on page 710.
To TSV
To MAT
To AVI
To FBX
To JSON
To TRC
To STO
Import
Recent projects
List of recently opened projects
1-10...
The last ten opened QTM files.
Exit
Exit the application.
C3D import
When opening a C3D file, it will be imported in QTM. The import includes:
l 3D data
l Analog data
l Force data
You can display, manage and edit the imported data using the regular QTM
functionality. The imported data can be saved as a QTM file.
NOTE: When exporting the imported file as C3D, the exported C3D file
may have a different format than the original C3D file.
NOTE: Not all C3D files may be compatible with the import function. Contact Qualisys support at [email protected] if you have problems importing your C3D files.
TRB/TRC import
A TRB file (.trb) is a binary 3D data file format, including meta data about
the capture, as well as marker colors and bones.
It is assumed that the Y-axis is defined as the upward axis. When importing into
QTM, the data will be transformed so that the Z-axis is pointing upwards.
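As a hedged illustration of such a transformation (the exact rotation that QTM applies is not specified here), a rotation of +90 degrees about the X axis maps a Y-up coordinate system to a Z-up one, so that a point (x, y, z) becomes (x, -z, y):

    import numpy as np

    # +90 degree rotation about the X axis: the old Y axis becomes the new Z axis.
    R_Y_UP_TO_Z_UP = np.array([[1, 0,  0],
                               [0, 0, -1],
                               [0, 1,  0]], dtype=float)

    def y_up_to_z_up(points):
        # points: (n, 3) array in the Y-up system; returns them in the Z-up system.
        return np.asarray(points, float) @ R_Y_UP_TO_Z_UP.T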
Edit
The Edit menu contains the following items:
Undo
Undo the last edit action on trajectories.
Delete
Delete trajectories or Delete bones depending on what is selected in
the file.
Trim Measurement
Trim the measurement to the current selected range on the time line.
NOTE: When trimming the file, the time stamp of the file (corresponding to the start of the capture) and the event times are recalculated.
NOTE: The QTM file size will not change when trimming a file that is
included in a PAF project, but the data is trimmed to the selected
range.
Find
Search for the first occurrence of the search term in the Project data
tree, see chapter "Project view" on page 62.
Find Next
Search for the next occurrence of the search term in the Project data
tree.
Trajectory
The Trajectory info window menu for the selected trajectories, see
chapter "Trajectory info window menu" on page 144.
Bone
The Bone menu for creating and deleting bones, see chapter "Bone
menu" on page 120.
Events
The Events menu for creating and editing events, see chapter "Events
menu" on page 136.
Rigid Body
Change mesh settings or change color of selected rigid body. The changes
apply to the rigid body definition in the file, not the one in the project.
View
The View menu contains the following items:
Labeled
Toggle the display of the Labeled trajectories window, which shows
the trajectories that have been identified and labeled.
Unidentified
Toggle the display of the Unidentified trajectories window, which
shows the trajectories that are unidentified.
Discarded
Toggle the display of the Discarded trajectories window, which
shows all trajectories that have been discarded.
Toolbars
For information about toolbars, see chapter "Toolbars" on page 199.
Capture toolbar
Toggle the display of the Capture toolbar.
AIM toolbar
Toggle the display of the AIM toolbar.
Standard toolbar
Toggle the display of the Standard toolbar.
Trajectory toolbar
Toggle the display of the Trajectory toolbar.
Timeline
For information about the Timeline and the menu see chapter "Timeline
control bar" on page 133.
Data info 1 2 3
Toggle the display of the Data info windows (maximum 3 windows) for
showing and plotting 2D, 3D, 6DOF, analog and force data in both preview
and file mode. For detailed information see chapter "Data info window"
on page 167.
Project View
Toggle the display of the Project view window on the left side of the main window, see chapter "Project view" on page 62.
Messages
Toggle Messages window. For detailed information see chapter "Mes-
sages window" on page 179.
Status Bar
Toggle the display of the Status bar at the bottom of the main window.
Terminal
Open the terminal window for the QTM scripting API.
Trajectory Editor
Open the Trajectory Editor window.
Full Screen
Enable/disable full screen mode.
File Information
Opens a window showing information about the capture, see chapter "File
Information" below.
3D view/2D view
3D view window menu or 2D view window menu depending on the act-
ive View window, see chapter "3D View window menu" on page 131
respectively "2D view window menu" on page 103.
File Information
The File Information window shows the following information about the cap-
ture.
Analog devices used for the capture and their respective capture fre-
quencies.
Play
The Play menu contains the following items:
Play/Stop
Switch between play and stop. The file will be played in the direction it
was played the last time.
Play
Stop
Play backwards
Capture
The Capture menu contains the following items:
Capture
Start the motion capture, see chapter "Capturing data" on page 566. If used without preview, the cameras will start in the same mode as in the last preview or measurement.
Refine calibration
Open the Refine calibration dialog, see chapter "Refine calibration" on
page 550.
Reprocess
Reprocess the active capture file, see chapter "Reprocessing a file" on
page 601.
Recalculate forces
Recalculate the force data in the active capture file, see chapter "Cal-
culating force data" on page 703.
Linearize camera
Start linearization of a camera, see chapter "Linearization procedure and
instructions" on page 486.
Apply model
Apply the current AIM model to the active capture file.
NOTE: The current AIM model is found on the AIM page in the Pro-
ject options dialog.
Generate model
Generate an AIM model from the active capture file.
Skeleton
The Skeleton menu contains the following items. For more information on skel-
eton calibration and tracking, see chapter "Tracking of skeletons" on page 671.
Calibrate skeletons
Create skeletons from T-pose in current frame and solve. The skeleton
definition will be added to or updated in the Skeleton solver page in the
Project Options and applied to the current file or real time preview.
Solve Skeletons
Apply skeleton solver to the currently open file.
Tools
The Tools menu contains the following items:
Project options
Open a dialog with settings for the QTM software, see chapter "Project
options dialog" on page 217.
Analyze Trajectory
Open dialog to analyze trajectories, see chapter "Analyze" on page 153.
Window
The Window menu contains the following items:
2D
Open a new 2D view window.
3D
Open a new 3D view window.
Video
Open a new 2D view window displaying only the DV/webcam video
cameras.
Window layouts
See chapter "Window layouts" on page 82.
Save as
Help
The Help menu contains the following items:
Login
Log in to your Qualisys customer account for access to online processing.
Toolbars
Standard toolbar
The Standard toolbar contains the following icons, from left to right:
Undo
Redo
Open a new 2D view window displaying only the DV/webcam video cam-
eras
Playback toolbar
The Playback toolbar contains the following icons, from left to right:
Go to the first frame
Play reverse
Stop
Play forward
Playback speed
Capture toolbar
The Capture toolbar contains the following icons, from left to right:
Start capture
Linearize camera
Add event
The GUI Control toolbar contains icons for toggling GUI elements in the 3D view. To change the appearance of the GUI elements, open the 3D view settings page in the Project options dialog. The toolbar contains the following icons, from left to right:
Bones
Force Arrows
Force Plates
Grid
Covered Volume
Calibrated Volume
Bounding Box
Camera Rays
Skeletons
AIM toolbar
The AIM toolbar contains the following icons, from left to right:
Apply AIM model (in RT mode this button restarts AIM)
Calibrate skeletons
Analyze trajectory
Reload scripts
The Trajectory toolbar contains the following icons, from left to right:
Show/hide Data info window 1
Show/hide Terminal
Keyboard shortcuts
Menu shortcuts
The following menu commands can be accessed through keyboard shortcuts.
Workflow shortcuts
Ctrl + N
New capture file.
Ctrl + Shift + R
Open the File reprocessing dialog.
F9
Apply AIM model to a capture file or restart AIM and 6DOF calculation in
preview and capture mode.
F10
Calibrate skeletons.
Capture file shortcuts
Ctrl + O
Open capture file.
Ctrl + S
Save capture file.
Ctrl + F4
Close capture file.
Editing shortcuts
Ctrl + Z
Undo the last editing of the trajectories or bones.
Ctrl + Y
Redo the last action that was undone.
Ctrl + E
Add a manual event.
Del
Delete selected bones or "degrade" selected trajectories:
Ctrl + Shift + T
Trim a capture file to the current measurement range.
Display and window shortcuts
Ctrl + W or Ctrl + ,
Open the Project options dialog.
Ctrl + D
Toggle the display of the Data info window.
Ctrl + R
Toggle the display of the Project view window.
Ctrl + T
Toggle the display of the Trajectory Editor window.
F1
Open the QTM help window.
3
Switch to 3D view.
Shift + M
Activate the Marker mask tool.
2
Switch to 2D view.
P
Toggle 3D view projection between orthogonal and perspective.
C
Center on the selected marker in the 3D view window.
L
Move the selected trajectories/parts to the Labeled trajectory info win-
dow.
U
Move the selected trajectories/parts to the Unidentified trajectory info
window.
Del
Delete selected bones or "degrade" selected trajectories:
Labeled trajectories: move the selected trajectories/parts to Uniden-
tified trajectories window.
X
Split the selected trajectories after the current frame.
Shift + B
Activate the Create bones sequence tool.
Shift + C
Activate the Center trajectory tool.
Shift + X
Activate the Cut trajectory trace tool.
B
Create bones between all selected labeled trajectories.
Left-click
Select a trajectory.
Alt + Left-click
Select only the current part of the trajectory.
Shift + Left-click
Select multiple trajectories.
Ctrl + Left-click
Select multiple trajectories, possible to de-select trajectories.
Ctrl + drag
Scrubbing feature (3D view window). Hold control key, press left mouse
button anywhere in the empty 3D space and drag to scrub through the
measurement's time line.
J and Shift + J
Jump in time to the next unidentified trajectory or part, respectively, and
center on it in the 3D view window.
S
Swap the selected parts of two trajectories.
W
Swap the part of the current frame in two selected trajectories.
X
Split the selected trajectories after the current frame.
Shift + F8
Define a rigid body with the current frame of the selected trajectories.
F8
Define a rigid body with an average of data in all frames of the selected
trajectories.
L
Move the selected trajectories/parts to the Labeled trajectory info win-
dow.
U
Move the selected trajectories/parts to the Unidentified trajectory info
window.
D
Move the selected trajectories/parts to the Discarded trajectories win-
dow.
Ins
Add a new label in the Labeled trajectories window.
F2
Rename the selected trajectory.
Alt + Up
Move the trajectory up in the list.
Alt + Down
Move the trajectory down in the list.
Ctrl + A
Select all trajectories in the current Trajectory info window.
Ctrl + Space
Toggle the selection of a trajectory.
Playback keys
The following keys can be used to play the file:
Space
Play forward and Stop.
Ctrl + drag
Scrubbing feature (3D view window). Hold control key, press left mouse
button anywhere in the empty 3D space and drag to scrub through the
measurement's time line.
Ctrl + G
Go to frame.
Right Arrow
Go forward one step.
Left Arrow
Go back one step.
Home
Go to the first frame.
End
Go to the last frame.
Page Down
Go to next event.
Page Up
Go to previous event.
Space
Click the selected button or toggle the selected option.
Tab
Step through the settings. Use shift to step backwards.
Ctrl + Tab
Step out of the settings page to the options buttons at the bottom of the
dialog. Use shift to step backwards.
NOTE: There may be overlapping shortcuts for the 3D/2D view. Make
sure that the Trajectory Editor window has focus when using the below
shortcuts.
Keyboard shortcuts
F (fill gap(s))
Fill selected gap(s) using current fill type.
S (smooth)
Smooth the data in the selected frame range using the current smoothing
settings.
Z (zoom to selection)
Zoom to the currently selected range.
X (zoom to extents)
Zoom horizontal and vertical axis to data extents.
H (zoom horizontal)
Zoom horizontal axis to data extents.
V (zoom vertical)
Zoom vertical axis to data extents.
G (zoom to gap)
Toggle automatic zoom to selected gap.
T (time mode)
Toggle time units in seconds/frames.
Q, Shift + Q
Traverse the trajectory list down and up, respectively.
Mouse gestures
Mouse wheel
Horizontal zoom.
Ctrl + left mouse button - drag anywhere in the chart (time line scrub-
bing)
Move the Current Frame marker forwards or backwards.
Scrolling
Mouse wheel
Scroll the trajectory list vertically.
Left mouse button - click and drag the separator between the tra-
jectory labels column and the chart
Change the width of the trajectory labels column.
Left mouse button - double click on the separator between the tra-
jectory labels column and the chart
Reset the width of the trajectory labels column to match the longest
name.
Hover
Hover with the mouse over a label to highlight the data series.
Selection
Click on a label to show or hide the data series.
Time line
Dialogs
The following chapters contain a short description of the main dialogs in QTM.
QTM dialogs
The dialogs in QTM where different settings are specified are described in the chapters where they are used. The essential dialogs are:
Project options dialog
This is the main dialog in QTM with the settings for the hardware and the
processing of data, see chapter "Project options dialog" on the next page.
Calibration dialog
This dialog is used to start a calibration, see chapter "Calibration dialog"
on page 545.
Click Apply to save the settings and apply them to the current measurement without closing the dialog.
NOTE: Most of the settings will only affect a measurement if QTM is still
in the preview mode, i.e. before the measurement has started. To change
settings on a file you must use the reprocessing dialog instead, see
chapter "Reprocessing a file" on page 601.
Input Devices
The Input Devices page lists the available devices in QTM. Double-click on an
entry to go to the settings for that device. The list contains the following inform-
ation:
Enabled
The Enabled column contains the checkbox to enable the device and also a status light for the device. A green circle means that the device is connected, a yellow circle means that the device status is unknown and a red circle means that the device is disconnected.
Name
The Name of the device.
Analog Boards
Analog boards supported by QTM.
NOTE: The analog board settings are saved with the serial
number of the board. So the analog board will be visible in the
project even if you disconnect the board from the computer or
it is not active in Instacal.
AV Devices
Any video camera connected to the computer, for example via a
Blackmagic Design card or a webcam.
Force Plates
Digitally integrated force plates.
Instrumented Treadmills
Digitally integrated instrumented treadmills.
EMGs
Digitally integrated EMG systems, see "Wireless EMG systems" on page 804.
Eye Trackers
Digitally integrated eye trackers, see chapter "Eye tracking hardware
in QTM" on page 864.
Manufacturer
The Manufacturer column contains a link to the website of the device's
manufacturer. If the manufacturer is unknown there is a link to Google.
Refresh
Use the Refresh button to refresh the status of the devices.
NOTE: The refresh button does not work on analog boards when
they are added or changed in Instacal. In that case you need to
restart QTM.
Camera system
The Marker capture frequency is the capture rate that will be used in a
marker measurement. The frequency can be set to an integer value from 1 Hz (frames per second) up to the maximum frequency of the current camera type.
For an overview of maximum capture frequencies per camera, see chapter
"Qualisys camera sensor specifications (marker mode)" on page 926.
Real time frequency
The Real time frequency is the frequency that is used for the real-time pro-
cessing. It applies both in Preview mode and Capture mode. For more inform-
ation on real-time measurements see chapter "Real-time streaming" on
page 590. There are two options for the Real time frequency.
Marker capture frequency
With this option the real-time frequency will be the same as the Marker capture frequency setting, which means that the cameras will capture at that frequency and QTM will process as many frames as it can.
The reduced frequency cannot be set higher than the Marker capture
frequency setting. This is to ensure that you are not using an exposure
setting that is too high for the real-time frequency.
Under the Camera system settings heading, the settings for the current cam-
era system are displayed. Every entry is marked with a notification icon. There
are three different notification icons, which have the following meaning:
The specified setting is configured in a way that differs from the normal
setting. Make sure that it is correct.
The specified setting is configured in a way that can ruin the meas-
urement if it is not correct. Make absolutely sure that the setting is cor-
rect. Some settings may actually damage the camera system, see chapter
"External timebase" on page 278.
QTM will scan the Ethernet connections for Qualisys systems. When QTM has
found the camera systems that are connected to the Ethernet ports, it will
report the configuration of the camera systems in the Finding camera system
(s) dialog.
NOTE: If one or more devices have old firmware, a dialog appears for
applying a necessary firmware update for using the system with the
installed version of QTM. For more information see chapter "Firmware
update when locating system" on page 470.
NOTE: If more than one camera system is found, select the camera system that will be used from the list. Only one camera system can be used at a time.
Cameras
The Cameras page contains all of the camera settings for Qualisys cameras. It
is often easier to use the tools in the 2D view and the Camera settings sidebar
if you want to change settings that are used more often, see chapters "2D view
toolbar" on page 89 and "Camera settings sidebar" on page 91.
Type
The type of camera.
Serial
The serial number of the camera.
Ip-address
The IP address for the camera.
On the Cameras page you can set individual settings for the cameras. Select
the cameras that you want to change the settings for in the camera list. You can
use Ctrl and Shift to select multiple cameras. The settings list will be updated
depending on the cameras that are selected. If multiple cameras are selected and a setting has been set individually, its value is shown in red. Changing such a value will set all the selected cameras to the same setting. Use the Select all button to select all of the cameras. Only the global settings will be shown in the settings list if none of the cameras are selected.
The settings list contains the settings for the selected cameras. All of the set-
tings marked with * are global. The settings are described in the chapters
below.
Check the Show description option to get a short description of the selected
setting.
Use the Reset settings button to reset all of the camera settings to the default
values.
Camera settings
Camera Mode
The Camera Mode setting switches the camera mode for the selected cam-
eras. The available modes are Marker, Marker Intensity and Video. The
modes can also be changed during preview in the 2D view. Changing the
modes in the 2D view will update the Camera Mode setting.
When starting a measurement, the cameras will start in the mode selected with the Camera Mode setting. However, since changes made in the 2D view update the setting, the mode is usually the same as in the last preview.
Marker settings
Capture rate
The Capture rate is the frequency that is used during a marker measurement.
The setting is global for all cameras in Marker mode.
It is the same setting as Marker capture frequency on the Camera system
page, which means that if the setting is changed it is updated on both pages.
The possible capture range is shown to the left of the setting. The range changes
depending on the camera types within the system, as well as the size that is set
with Image size. For an overview of the maximum capture frequencies per cam-
era model at full size image, see "Qualisys camera sensor specifications
(marker mode)" on page 926.
The Exposure time setting changes the exposure time for marker mode. In marker mode the exposure time and flash time are the same, because there is no point in exposing the image longer than the flash. The setting can be set individually for each camera.
Increase this setting if the markers are not visible. Decrease the exposure if you
have extra reflections. The Range shown to the left of the setting is the range
that can be used with the current capture rate. The maximum value is a tenth
of the period (1/capture rate) or at most 1000 ms in marker mode.
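To illustrate how the exposure limit relates to the capture rate, here is a minimal sketch (Python, purely illustrative and not part of QTM) that computes the tenth-of-period limit. The absolute maximum mentioned above and the range displayed in the settings dialog always take precedence.

# Illustrative sketch only: the tenth-of-period exposure limit described above.
# The range displayed in QTM for your camera model is always authoritative.

def max_marker_exposure_us(capture_rate_hz: float) -> float:
    """Return one tenth of the period time (1/capture rate) in microseconds."""
    period_us = 1_000_000.0 / capture_rate_hz
    return period_us / 10.0

if __name__ == "__main__":
    for rate in (100, 200, 500):
        print(f"{rate} Hz -> max exposure ~{max_marker_exposure_us(rate):.0f} us")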
This setting is the main option for changing the amount of light in the image, since it lets you adjust the light input without touching the cameras. Especially for the Oqus 3- and 5-series it is good to have the aperture fully open and change the exposure time instead.
For tips on how to set the exposure see chapter "Tips on marker settings in
QTM" on page 483.
Marker threshold
The Marker threshold sets the threshold intensity level for markers in
Qualisys cameras. The setting can be set individually for each camera.
The level can be set between 5 and 90 % of the max intensity level; the default value is 17. Every pixel value that is brighter than the specified level will be calculated as a marker.
Marker mode
Under Marker mode, click the button for the Type setting to select the marker type to be used. The setting is global for all cameras. The options for this setting are:
Passive
Use this setting for any marker with reflective material. This is the default
option.
Active
Use this setting for Traqrs, the active 500 mm calibration wand, or short
range active markers (SRAM) with sequential coding.
Passive + Active
In this mode the camera will capture both passive and sequence coded
active markers. This mode can be used if you need to add some tem-
porary markers to the subject and do not want to add active markers.
However, if you mix the passive and active markers all the time you lose
some of the advantages of both types.
Marker ID range
When Type is set to Active or Passive + Active, the ID range setting becomes available. The ID range setting sets the sequence length used to define the unique blinking patterns that identify the markers. The options are:
Standard (1-170)
Standard ID range of 170 uniquely defined markers using a sequence
length of 21 frames. This is the default option.
Extended (1-740)
Extended ID range of 740 uniquely defined markers using a sequence
length of 41 frames. Use this range if you have more than 170 active mark-
ers. This option is only supported for the Active and the Naked Traqr in
combination with Arqus or Miqus cameras. The Traqr units need to be cor-
rectly configured using the Traqr configuration tool.
For information about the use of passive and active markers, see chapter
"Passive vs Active markers" on page 529. For information about Qualisys active
marker solutions, see chapter "Active marker types" on page 1000.
NOTE: The old type of close range active marker is not supported in QTM
2.5 or later. Please contact Qualisys AB if you use this type of active
marker.
Image size
NOTE: The red rectangle representing the image size is not drawn linearized, which means that with a wide-angle lens it is best to turn off the Show linearized data option on the 2D view settings page to see the true positions of the mask and image size.
Marker masking
With the Marker masking option, masks can be added to the camera image.
The masks will be shown as green areas in the Marker and Marker intensity
mode. Any detected markers within the masked areas are discarded from the
camera output. For information on how to use the marker masking, see
chapter "How to use marker masking" on page 537.
Use the button on the Auto marker masking option to automatically add marker masks to the selected cameras. You can select several cameras when applying the auto marker mask, but it is important to remove any real markers from the camera views before the masks are applied.
Right (x-max)
The end pixel of the masking area in X-direction in camera coordinates.
Top (y-min)
The start pixel of the masking area in Y-direction in camera coordinates.
Bottom (y-max)
The end pixel of the masking area in Y-direction in camera coordinates.
l Marker masks are not drawn linearized, which means that with a wide-angle lens it is best to turn off the Show linearized data option on the 2D view settings page to see the true positions of the mask and image size.
For Oqus cameras the option to use marker circularity filtering is available. The
Marker circularity filtering options are used to filter non-circular markers in
the 2D data. The default is that the filtering is not Enabled. For more inform-
ation about the filtering see chapter "Marker circularity filtering (Oqus only)" on
page 541.
The marker filtering is done in the camera according to the options below.
Enabled
To turn on the filtering, check the Enabled checkbox. Which markers are filtered out depends on the Circularity level option.
Circularity level
The Circularity level option defines which markers are filtered out as too
non-circular. These markers are then processed according to the Non-cir-
cularity marker settings on the 2D preprocessing and filtering page.
The option has five levels: All markers, Low, Medium, High and Very high.
The default value is Medium. When set to All markers, then all markers
Marker limits
The Marker limits settings decide which reflections are detected as markers by the cameras. When Use default settings is set to True no marker limits are used and QTM will detect all reflections as markers.
Set the Use default settings to False to set other Marker limits settings. The
setting can be set individually for each camera. The Marker limits settings are
applied on-camera, which means that discarded markers are not recorded in
QTM and cannot be restored.
There are three discrimination settings:
Marker size
Smallest
The Smallest parameter controls the smallest detectable size of a
marker (in subpixels) for each camera in every frame. Any marker
smaller than this will be discarded from the camera output.
This option might be useful to screen out tiny non-marker reflec-
tions or to assure a minimum size of the markers seen by the cam-
era.
The minimum value of this parameter is 128 and the maximum is
the value of the Largest parameter.
Exposure delay
The Exposure delay setting shifts the exposure of a camera compared to other
cameras in the system. This can be used when the flash of a camera disturbs
another camera in the system. However, because the time of the exposure will
be different in the camera system the 2D position that corresponds to the first
The recommended mode is Camera group because then the delay is cal-
culated automatically. First select the cameras that you want in the group from
the camera list to the left. Then select one of the five groups in the Camera
groups setting. For example, if you only have two groups you should use groups 1 and 2.
QTM will then automatically delay the exposure time of the groups so that the
delay for a group is the same as the longest exposure time in the group before.
This means that you can use any exposure time for the cameras and the delay
will always be correct.
The Advanced mode should only be used if you are completely sure about
what you are doing. The delay is then set with an absolute time in the Delay
time setting. The delay time will be the same even if you change the exposure
time of a camera in the system, which means that you must then manually
change the Delay time for the cameras to keep the correct delays to reduce
reflections.
Sensor mode
The Sensor mode option can be used to reduce the resolution to get a higher
capture rate while keeping the same field of view. The option is only available
for camera models that have more than one sensor mode. The setting can be
changed individually on each camera and also for Marker and Video mode sep-
arately. The sensor mode setting can also be changed on the Camera settings
sidebar. For an overview of available sensor modes per camera model, see
"Qualisys camera sensor specifications (marker mode)" on page 926.
Video settings
Capture rate
The Capture rate on the Video settings page is only used for video capture and can be set independently of the marker capture rate. The setting is also individual for each camera, so that different cameras can capture video at different frequencies.
The maximum capture rate can be increased by reducing the Image size. The
current capture range is shown next to the capture rate. In the Project options
dialog the Image size must be changed first to get a higher frequency.
However, in the Camera settings sidebar the Image size will be reduced auto-
matically when choosing a higher frequency.
Exposure time
The Exposure time setting changes the exposure time for video mode. The set-
ting can be set individually for each camera.
The setting sets the exposure time in microseconds. The maximum exposure
time is up to 60 microseconds less than the period time, i.e. 1/capture rate.
Flash time
The Flash time setting changes the flash time for video mode. The setting can
be set individually for each camera.
The setting sets the flash time in microseconds. The maximum flash time is always limited by the exposure time, but there are also other limitations. First of all, the flash time cannot be longer than 2 ms. In addition, it can be at most one tenth of the period time of the capture rate. For example, if the capture rate is 200 Hz the maximum flash time is 500 μs.
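As a purely illustrative sketch (Python, not part of QTM), the flash time limits described above can be combined as follows; the function name is made up for the example.

# Illustrative sketch only: combine the flash time limits described above.
def max_flash_time_us(capture_rate_hz: float, exposure_time_us: float) -> float:
    period_us = 1_000_000.0 / capture_rate_hz  # period time = 1 / capture rate
    return min(exposure_time_us, 2000.0, period_us / 10.0)

# 200 Hz: one tenth of the 5000 us period gives 500 us, as in the example above
# (assuming the exposure time is longer than 500 us).
print(max_flash_time_us(200, 4000))  # -> 500.0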
The IR flash does not contribute much when you capture video. Therefore, it is
often a good idea to set the Flash time to its minimum of 5 μs.
Gain
NOTE: The Gain option is available on the Camera settings sidebar for
all cameras except the Oqus 3- and 5-series.
Image size
With Image size it is possible to change which part of the image that is cap-
tured in video mode. The setting can be set individually for each camera. Use
the maximize button to reset the selected cameras to the maximum image
size.
The selected image size will be shown with a red square in the video preview
window. The frequency range will be updated automatically when you change
the size. The camera will still capture video outside the rectangle in preview,
but when you make a measurement the image is cropped.
The image size is specified in Left and Right, Top and Bottom, where the Left
and Right values can only be specified in certain steps depending on the cam-
era type. The step size for the current model is displayed next to the settings.
Image resolution
The image resolution will change the quality of the images that are captured by
Qualisys cameras according to the options below. The setting can be set indi-
vidually for each camera.
Resolution
Set the number of pixels that are captured, e.g. with 1/4 resolution every
other line and column is deleted. Choose the resolution for the camera
from the options below.
Video compression
The Video compression option can be used to reduce the video file size. By
default the Mode is set to In-camera MJPEG for all cameras that support that
option. You can set the Compression quality of the MJPEG compression to change the quality and file size of the video; the default is 50. Increasing the quality gives better images but larger video files.
NOTE: For Miqus Video, the Video compression setting is not available.
Miqus Video cameras always stream video using In-camera MJPEG.
The default mode on the other cameras is None, which gives uncompressed images at maximum quality. Note that an uncompressed file may not play at the correct speed in most external programs.
Switch the Mode setting to Software to specify a codec on the computer for
compression of the video.
Choose a codec from the list on the Compression codec line to activate it. The
codecs in the list are the ones that are installed on your computer.
The codecs are then grouped in Recommended codecs and Other codecs. For
more information about the recommended codecs, see chapter "Recom-
mended codecs" on page 583.
If available, the settings for a codec are opened with the button on the Configure codec line. The settings depend on the codec; please refer to the codec instructions for help on the settings.
Click on the button on the Codecs recommended by Qualisys line to go to
a web site with links to codecs.
Color
Use bilinear color interpolation to get a color image.
Auto exposure
Color temperature
The Color temperature setting is only available for the Oqus Color video (2c) camera. With it you can adapt the color temperature used for the white balance in the video. If you use the wrong option the colors can be significantly off, so it is important to test the different options to find the best one.
Daylight
This option uses a color temperature that is most suitable for daylight. It
works best outdoors, but also works well if you have large windows.
Sensor mode
The Sensor mode option can be used to reduce the resolution to get a higher
capture rate while keeping the same field of view. The option is only available
for camera models that have more than one sensor mode. The setting can be
changed individually on each camera and also for Marker and Video mode sep-
arately. The sensor mode setting can also be changed on the Camera settings
sidebar. For an overview of available sensor modes per camera model, see
"Qualisys video sensor specifications (in-camera MJPEG)" on page 927 and
"High-speed video" on page 960.
Active filtering
Single
With the Single setting the camera only captures an extra image before starting a measurement. As with the Continuous setting, the extra image uses no IR flash but is otherwise the same as the actual image. This setting can be used if you have a very static setup.
Lens aperture
The Lens aperture setting is only available for Qualisys cameras with a motorized lens. The limits of the aperture depend on the mounted lens. In most cases it is recommended to have the aperture around 4; for more tips on aperture and focus, see chapter "Tips on setting aperture and focus" on page 481.
It is best to change this setting in preview from the Camera settings sidebar in
the 2D view, so that you can check the brightness and focus of the image dir-
ectly.
NOTE: Once focus and aperture have been set for the cameras in a fixed
camera system it is possible to disable lens control using the Qualisys
Firmware Installer. This way the current lens settings will be fixed. For
more information, see "How to use Qualisys Firmware Installer (QFI)" on
page 471.
The Lens focus distance setting is only available for Qualisys cameras with a motorized lens. The limits of the setting are 0.5 m and 20 m; for more tips on aperture and focus, see chapter "Tips on setting aperture and focus" on page 481.
It is best to change this setting in preview from the Camera settings sidebar in
the 2D view, so that you can check the focus of the image directly.
NOTE: Once focus and aperture have been set for the cameras in a fixed
camera system it is possible to disable lens control using the Qualisys
Firmware Installer. This way the current lens settings will be fixed. For
more information, see "How to use Qualisys Firmware Installer (QFI)" on
page 471.
2D view rotation
The 2D View Rotation setting defines the rotation of the camera in the 2D
view window. The available options are 0, 90, 180 and 270. You can select mul-
tiple cameras in the camera list to change rotation for all of the selected cam-
eras at the same time.
NOTE: The setting does not apply to camera views in a QTM file. To
change the rotation in a file you must use the Rotate view option on the
2D view window menu.
Start delay
The Start delay option sets the delay in μs for the camera system. When
enabled the camera system will delay the start in relation to the time that the
master camera receives the start command from QTM. This delay is required
on the Main system in a multiple video system setup.
NOTE: The Start delay option is not in use if the cameras are syn-
chronized with an external timebase.
Linearization
Under the Camera linearization parameters heading there is a list of all the
linearization files of the connected cameras. In the list you can manage the lin-
earization files and select whether a camera will be used for tracking or not.
Each camera is delivered with its own linearization file (*.lin) stored in its
memory. The file name includes the serial number of the camera. When con-
necting a system for the first time in a project, the linearization files are loaded
into the project and downloaded to the C:\ProgramData\Qualisys\Linearization
folder of the computer.
The linearization files that are currently loaded in the project are used as the
intrinsic calibration parameters of the cameras.
Serial #
The serial number of the camera.
Lin-file
Name of the linearization file.
Focal Length
The focal length reported in the linearization file.
Created (date)
Date of creation for the linearization file.
Deviation (CU)
The deviation reported in the linearization file. This is usually a number between 2 and 4 subpixels (CU).
WARNING: With this method, there is no check that the file cor-
responds to the serial number of the camera, so make sure to select
a file that matches the serial number of the camera.
NOTE: The camera will still capture data during a measurement even if it
is deactivated. Therefore, it can be included again in the measurement by
reprocessing the file, see chapter "Reprocessing a file" on page 601.
However, if a camera has been deactivated during calibration, the cal-
ibration must be reprocessed first, see chapter "Recalibration" on
page 563.
Calibration type
Select the calibration type that will be used. The supported calibration types
are Wand calibration and Fixed camera calibration, see the chapters below.
Wand calibration
The wand calibration method requires two calibration objects to calibrate the
system. One is a stationary L-shaped reference structure with four markers
attached to it. The stationary L-structure (called reference object below) defines
the origin and orientation of the coordinate system that is to be used with the
camera system. The other calibration object is called calibration wand. It con-
sists of two markers located a fixed distance from each other. This object is
Calibration kit
Define the calibration kit you are using under the Calibration kit heading. The
calibration kit is used for scaling and locating the coordinate system in the
measurement volume. Two objects are needed to calibrate the system: a ref-
erence structure and a wand.
NOTE: The default origin of the coordinate system for the 300 mm
and 600 mm carbon fiber kits, as well as the 1011 active wand kit, is
at the corner where the frame rests on the floor. For the other kits
the default origin is at the center of the corner marker, for inform-
ation on how to translate the origin see chapter "Translating origin
to the floor" on page 556.
With Apply coordinate transformation you can translate and rotate the
global coordinate system to any desired position. Select the checkbox and then
click Define to set the coordinate transformations on the Transformation
page, see chapter "Transformation" on page 259.
On the Calibration page for Fixed camera calibration you should enter the
data from the survey measurement. If you cannot see this Calibration page
change the Calibration type option to Fixed camera calibration. For more
detailed information on fixed camera systems, contact Qualisys AB about the QTM - Marine manual, which includes detailed descriptions of the camera installation, survey measurement, fixed camera calibration, validation and the use of 6DOF bodies in marine applications.
Use the Save definition and Load definition options to save and load, respectively, the data for the Fixed camera calibration. The default folder is the project
folder.
NOTE: The first time you enter the survey data it must be entered manu-
ally.
Under the Reference marker locations heading you should enter the survey
data of the reference marker positions. Use the Add marker and Remove
marker options to add or delete reference marker locations. Add the markers
Camera locations and markers seen by each camera in order from left
to right
Under the Camera locations and markers seen by each camera in order
from left to right heading you should enter the survey data of the camera pos-
itions. Use the Add camera option to add a new camera at the end of the list. The cameras must be entered in the same order as in the camera system. It is not possible to rearrange the cameras after they have been added; you can only remove a camera with Remove camera. Double-click a column to enter the following data:
Location X, Location Y and Location Z
The survey measurement data of the camera.
Markers seen (l to r)
The markers seen by the camera. Enter them in order from left to right as
seen by the camera and separate them with commas (the numbers refer
to the first column in the Reference marker locations list).
NOTE: QTM uses the top markers in the 2D view window as ref-
erence markers.
With Apply coordinate transformation you can translate and rotate the
global coordinate system to any desired position. Select the checkbox and then
click Define to set the coordinate transformations on the Transformation
page, see chapter "Transformation" below.
Transformation
The Transformation page contains the settings for defining a new global
coordinate system. The two changes that can be made to the coordinate system are Translate origin and Rotate coordinate system. By changing these
parameters you can in fact move and turn the coordinate system to any pos-
ition and orientation. The change is always related to the original position and
orientation of the calibration coordinate system. In the case of the Fixed cam-
era calibration this means the origin of the survey measurement.
To activate the change you must select the checkbox of the respective setting.
Translate origin (X(mm), Y(mm), Z(mm))
Enter the new position of the origin of the coordinate system (in mm). The
direction of the parameters (X, Y and Z) is always related to the original
coordinate system of the calibration.
Use the Rotate axis to line or Fetch rigid body buttons to define the
rotation from a measured line or a rigid body, see chapters "Rotate axis to
line" on the next page and "Transform coordinate system to rigid body
(floor calibration)" on page 262.
With the Rotate axis to line function a line can be used to define the direction
of one of the axes. This can for example be useful to define a vertical or hori-
zontal axis if the floor is not level enough. Follow this procedure to use the func-
tion:
1. Make a measurement with two static markers that define the line that you want. It is important that the markers are as static as possible, because an average is used to define the line. It is also important that the file uses the current calibration.
2. Keep the file open and go to the Transform page in the Project options dia-
log. If the Transform page is not active go to the Calibration page and check
the Apply coordinate transformation box.
4. Click on the Rotate axis to line button to open the Rotate axis to line
dialog.
5. Select the axis that you want to define the rotation for.
6. Then select the trajectory that the Axis is pointing from and the trajectory that the Axis is pointing to.
7. Click OK to calculate the rotation and show the result on the Trans-
formation page.
8. A new calibration file will be saved in the Calibration folder after you click
OK in the Project options dialog.
The position and rotation of a rigid body can be used to define the global
coordinate system. This can for example be used as a floor calibration. Follow
this procedure to use the function:
c. Use the Translate and Rotate options on the 6DOF Tracking page to
achieve the desired definition. Use the Align the body using its points
option to define a plane, see chapter "Rotate body" on page 352.
d. Check that the new rigid body definition is correct. It is usually easiest
to do this in preview mode.
2. Have preview open and make sure that the rigid body is tracked.
6. Click on the Fetch rigid body button to update the values under Translate
origin and/or Rotate coordinate system. If the coordinate system already
included an earlier transformation, the values will be relative to the untrans-
formed coordinate system.
Current calibration
The Current calibration page displays the calibration that is used by QTM.
Open the current calibration with the Open button. With the Load other
option you can open a dialog and load another calibration file. A calibration file
can only be loaded when it includes all cameras that are currently included in
the project.
The Load other option can be used to merge calibration files, see chapter
"Merge calibration files" on page 565.
Calibration quality
The Calibration quality page contains settings for how to detect if the cameras need to be calibrated. The check can be performed in two different ways.
Residual settings
This test checks in the 3D tracker whether any camera has a Residual (shown in the File information on the View menu) that is higher than the residual limit. The default value for the residual test is 3 mm. It also tests whether too few of the captured 2D markers are used by the 3D tracker, i.e. the number of Points. If a camera is considered to be uncalibrated, a warning is shown after the tracking of the file.
Time settings
This test only checks for how long a calibration is considered to be new. The calibration will still be used in a measurement; it is just a visual warning. When the time set with A calibration is new for has passed, the triangle at the bottom right corner turns yellow. When the time set with A new calibration is recommended after has passed, the triangle changes to orange.
Synchronization
Type
The type of the camera or device.
Serial
The serial number of the camera or device.
Ip-address
The IP address for the camera or device.
NOTE: When a Camera Sync Unit is included in the system, it will be the
only device in the list. The synchronization settings of any Oqus cameras
included in the system are not displayed.
The settings list contains the settings for the selected devices. You can use Ctrl and Shift to select multiple devices, or the Select all button to select all devices. All of the settings marked with * are global. If multiple devices are selected and a setting has been set individually, its value will say Differs. Changing such a value will set all the selected devices to the same setting. If there is no synchronization device connected, only the Wireless/software trigger option is present.
Check the Show description option to get a short description of the selected
setting.
Wireless/software Trigger
The Wireless/software trigger settings contain all the settings for triggering
the Qualisys system using a wireless trigger, the keyboard or RT client applic-
ations. Possible input:
Keyboard
Use space for starting captures. The space key cannot be used for stop-
ping captures or creating events. In addition, the PageDown key can be
used for starting and stopping measurements, and the PageUp key for setting events.
RT Client application
For example mobile apps and plugins for Matlab and Labview.
Start capture
The start of a capture is delayed until a trigger event is received.
Listener enabled
Enable starting or stopping a capture when receiving a UDP start/stop packet.
Event color
Color associated with this type of event. Click in the value field to pick a
color.
UDP start/stop
QTM supports starting and stopping captures via a UDP start/stop protocol.
QTM also broadcasts UDP start and stop messages for every capture, which can be used to start and stop recordings on external devices that support the UDP start/stop protocol.
To enable external devices to control captures in QTM via UDP start/stop, fol-
low these steps:
1. Make sure the computer running QTM and the controlling device are on
the same local area network.
2. Make sure that the QTM and the controlling device use the same port for
UDP communication. In QTM the Capture Broadcast Port can be set on
the Real-Time output page under the Project Options.
3. In the Wireless/Software trigger settings, set Function to the desired
action, for example Start and Stop capture if you want to start and stop
the capture using UDP start and stop.
4. Enable the Start/Stop on UDP packet option.
The next time you start a capture, QTM will wait for a trigger or stop the cap-
ture depending on the chosen trigger function. If you want QTM to do an auto-
matic series of captures controlled via UDP start/stop, make sure the Batch
capture option is checked in the Start Capture dialog.
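As a minimal sketch of how an external application could monitor these messages, the following Python example listens on the Capture Broadcast Port and prints the received XML. The port number is an assumption and must match the value configured on the Real-Time output page; the packet contents follow the field descriptions below.

# Minimal sketch: print the UDP start/stop packets broadcast by QTM.
# BROADCAST_PORT is an assumed value; set it to the Capture Broadcast Port
# configured on the Real-Time output page in Project options.
import socket

BROADCAST_PORT = 8989  # assumption, use your configured port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", BROADCAST_PORT))  # listen on all interfaces

print(f"Listening for QTM UDP start/stop packets on port {BROADCAST_PORT}...")
while True:
    data, addr = sock.recvfrom(65535)
    # QTM sends XML formatted packets; print them for inspection.
    print(f"From {addr[0]}: {data.decode('utf-8', errors='replace')}")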
The UDP start/stop packets sent by QTM have the following XML format.
Name
Filename without the .qtm ending
DatabasePath
Name of the folder if automatic saving is enabled in the Start capture dia-
log
Delay
Not in use
PacketID
Unique ID
ProcessID
PID for QTM process
RESULT
l SUCCESS
The capture is ended without any issues
l FAIL
The capture ended with an error
l CANCEL
The capture has been canceled
Name
Filename without the .qtm ending
DatabasePath
Name of the folder if automatic saving is enabled in the Start capture dia-
log
Delay
Not in use
PacketID
Unique ID
HostName
Computer name
ProcessID
PID for QTM process
QTM receives start packet
Name
Sets the QTM file name used for the capture
ProcessID
Used in combination with HostName to ignore messages from the local
QTM
Name
Sets the QTM file name used for the capture
HostName
Used in combination with ProcessID to ignore messages from the local
QTM
ProcessID
Used in combination with HostName to ignore messages from the local
QTM
Trigger ports
The Trigger port(s) settings contain all the settings for triggering the Qualisys
system using one or more external trigger devices.
For Oqus systems the use of an external trigger device requires a splitter cable
connected to the control port of one or more of the cameras. The Trigger port
settings apply globally to the control ports of all cameras within the system.
Start capture
The start of a capture is delayed until an external trigger event is
received.
Stop capture
An incoming trigger event will stop the capture.
TTL signal edge / Trig NO: TTL signal edge / Trig NC: TTL signal edge
Select the trigger edge to Positive (rising edge), Negative (falling edge) or
Any edge (rising or falling). The setting applies to all start, stop and event
signals arriving at this trigger port.
Generate events
Activate this option to generate events with the external trigger. This
option applies to all trigger ports, and it is activated by default.
The Event port settings contain all the settings for creating events using an
external trigger device connected to the event input of the Camera Sync Unit.
The following settings are available:
TTL signal edge
Select the trigger edge to Positive (rising edge), Negative (falling edge) or
Any edge (rising or falling).
NOTE: The Qualisys trigger button connected to the event input will
produce a negative signal edge when pressing the button, and a pos-
itive signal edge when releasing the button. The button is not
debounced though, so positive edges may be produced even when
pressing the button.
Event text
Text label associated with this type of event. Click in the value field to edit.
NOTE: When using IRIG as external timebase source, the Event port set-
tings are ignored, since the IRIG time code signal needs to be connected
to the Event input on the Camera Sync Unit.
Pretrigger
For using pretrigger in combination with the USB-2533 analog board the use of
an external trigger is required, see chapter "Measurement with analog capture
while using pretrigger" on page 493 for details on how to connect the external
trigger device. The pretrigger option is not supported in combination with any other external devices, such as digital force plates, eye trackers, EMG devices and A/V devices.
WARNING: Make sure that the time between the start of the capture
and the trigger event is long enough to capture all the pretrigger frames,
i.e. check that the Status bar says ”Waiting for trigger”. Otherwise there
will not be enough pretrigger frames to collect from the buffer.
External timebase
GENLOCK
Use a video signal (black burst) to synchronize the system.
IRIG
Use an IRIG signal to synchronize the system.
NOTE: IRIG cannot be used when there are any Oqus cameras
included in the system.
Internal 100 Hz
The system is synchronized with an internal 100 Hz signal. This
option will be automatically selected if 100 Hz continuous has been
selected as Synchronization output mode on one of the syn-
chronization output ports, see chapter "Synchronization output" on
page 285.
IR receiver
Send an IR signal which is received by the IR receiver on the front of
the camera.
SMPTE
Use a SMPTE signal to synchronize the system. You need to use the Oqus Sync Unit to convert the signal, see chapter "Using Oqus sync unit for synchronization" on page 509.
Video Sync
Use a video signal (black burst) to synchronize. You need to use the
Oqus Sync Unit to convert the signal, see chapter "Using Oqus sync
unit for synchronization" on page 509.
Internal 100 Hz
The system is synchronized with an internal 100 Hz signal. This
option can only be selected if 100 Hz continuous has been selected
as Synchronization output on one of the cameras, see chapter
"Synchronization output" on page 285.
NOTE: This option is needed for the Twin system frame syn-
chronization option, in which case it is selected automatically.
Signal mode
Select the way to synchronize the camera frame to the signal.
Periodic
In this mode the system locks on to the signal of the external timebase. The capture frequency can be set as a multiplier/divisor. This is the recommended setting for periodic signals, see chapter "Using External timebase for synchronization to a periodic TTL signal" on page 494.
Non-periodic
In this mode every single frame that is captured must be triggered by a signal from an external timebase source.
Frequency multiplier/divisor
Set the multiplier and divisor, respectively, to multiply or divide the frequency of the incoming sync signal.
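As a small worked example (illustrative Python, not part of QTM), the resulting capture frequency is the incoming sync frequency multiplied by the multiplier and divided by the divisor:

# Illustrative sketch only: relation between the external sync signal and
# the camera capture frequency in Periodic mode.
def capture_frequency(sync_freq_hz: float, multiplier: int, divisor: int) -> float:
    return sync_freq_hz * multiplier / divisor

# Example: a 300 Hz sync signal with multiplier 1 and divisor 3 -> 100 Hz capture rate.
print(capture_frequency(300, 1, 3))  # -> 100.0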
Timestamp
QTM has the option to add timestamps to camera frames for synchronization
with external signals or devices. The timestamp is displayed in the Timeline control bar, see chapter "Timeline control bar" on page 133.
The following settings are available:
Use timestamp
Check to add a timestamp to the camera frames.
Timestamp type
Select type of timestamp from the drop down menu. The options are:
SMPTE
Time code used for audio and video synchronization. This requires
an Oqus or Camera Sync Unit to convert the SMPTE signal. For more
information about using SMPTE, see chapters "Using Oqus sync unit
for synchronization" on page 509 and "Using SMPTE for syn-
chronization with audio recordings" on page 512.
IRIG
Time code standards by the Inter-Range Instrumentation Group.
This requires a Camera Sync Unit. The IRIG standards currently
Camera time
Time of the exposure in seconds.nanoseconds. When used without
external clock master, the reference time is the time at which the
master camera was started. When using PTP synchronization with a
GPS-based external clock master the reference time is 1 January
1970. For more information about the use of an external clock mas-
ter, see chapter "How to use PTP sync with an external clock master
(Camera Sync Unit)" on page 501.
Timestamp frequency
Select one of the supported SMPTE or IRIG frequencies from the drop
down menu.
Synchronization output
The Camera Sync Unit (CSU) has three synchronization outputs, Out 1, Out 2
and Measurement time. The outputs Out 1 and Out 2 can be configured to get a customized synchronization output.
For Oqus systems, each camera has a synchronization output, which can be
accessed via a splitter cable connected to the control port. The Syn-
chronization output can be controlled to get a customized synchronization
output from the cameras. Additional modes for the Synchronization output for
Oqus cameras are Camera frequency – Shutter out, Measurement time and
Wired synchronization of active markers. The settings apply to individual
cameras so that there can be different synchronization outputs within the
same system. Select the cameras in the device list to the left. The synchronization signal is sent as soon as the camera is measuring, i.e. both during preview and capture, with the exception of measurement time, which is only output during captures.
NOTE: When a Camera Sync Unit is included in the system, QTM only dis-
plays the Synchronization settings of the Camera Sync Unit. The syn-
chronization settings of Oqus cameras are not displayed.
NOTE: If you want to use a sync output signal that is faster than 5000 Hz
with an Oqus system, you must use the master camera as sync device.
The master camera is displayed with an M next to the ID in the camera
list to the left.
There are four different sync modes which are described below. The image
below is a description of the different sync output settings.
The following modes are available. The settings depend on the chosen mode.
Use this mode to set the frequency as a multiplier/divisor of the camera cap-
ture frequency. The multiplication factor/divisor is controlled with the Mul-
tiplier/Divisor setting. The current Sync output properties (Frequency [Hz],
Period time [µs] and Pulse duration [µs]) are shown below the setting both
for marker mode and video mode (if applicable), because the marker and video
capture rates can be different. The Sync output frequency will change if any
of the capture rates are changed. The displayed period time is rounded to the
nearest microsecond, but the number of pulses will always be correct com-
pared to the camera frequency.
NOTE: The maximum Multiplier is 1000 and the maximum Output fre-
quency is 100000 Hz.
NOTE: The video capture rate is individual per camera, which is indicated by a red number if it differs. Select only the camera that Sync out is connected to, to find out the frequency in video mode.
NOTE: The pulse starts at the start of each period, i.e. you cannot apply
an offset to the pulse. However, by changing the Duty cycle and the TTL
signal polarity you can get an edge at any time between two frames.
Set the TTL signal polarity to change the polarity of the signal. Negative
means that the sync out signal is high until the start of the first frame when it
will go low. Each signal will then be synchronized with its corresponding cap-
ture frame which depends on the Multiplier/Divisor setting, see image above.
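As an illustrative sketch (Python, not part of QTM) of how the displayed Sync output properties relate to the camera frequency, multiplier/divisor and duty cycle:

# Illustrative sketch only: Sync output properties in multiplier/divisor mode.
def sync_output_properties(capture_rate_hz, multiplier=1, divisor=1, duty_cycle_pct=50.0):
    frequency_hz = capture_rate_hz * multiplier / divisor
    period_us = 1_000_000.0 / frequency_hz
    pulse_duration_us = period_us * duty_cycle_pct / 100.0
    return frequency_hz, period_us, pulse_duration_us

# Example: 150 Hz marker capture, multiplier 2 -> 300 Hz output,
# ~3333 us period and ~1667 us pulse at 50 % duty cycle.
print(sync_output_properties(150, multiplier=2))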
Independent frequency
Use this mode to set the frequency independently of the camera capture fre-
quency. The Output frequency can be set between 1 and 100000 Hz. When
changing the frequency the Period time [µs] will then update to its cor-
responding value.
The Duty cycle setting controls how long the pulse will be, in percent of the current Period time. The actual Pulse duration (in μs) is displayed below.
Set the TTL signal polarity to change the polarity of the signal. Negative
means that the sync out signal is high until the start of the first frame when it
will go low. The first pulse will always be synchronized with the camera capture.
The following pulses will be dependent on the relation between the camera cap-
ture rate and the Output frequency.
NOTE: For the Camera Sync Unit, measurement time is not available as a mode for the Out 1 and Out 2 ports. Instead, it has a separate Measurement time output with the same options.
100 Hz continuous
Use this mode to output a continuous 100 Hz TTL signal. It is sent even when
the system is not measuring so that external equipment can lock on to this sig-
nal. In Oqus systems, only one camera can have this option activated.
Use this mode to get a pulse that lasts for the whole preview or capture time.
The output will go low/high at the start of a measurement and not go high/low
until the end of the last frame. System live time applies to both preview and
captures.
Use this mode to synchronize the active markers via a wire connected from
Sync out on the camera to the Sync in connector on the driver. The signal is
pulsed so it cannot be used for any other synchronization.
NOTE: This mode is only available if you are using active markers.
Use the Meas. time output on the Camera Sync Unit to get a pulse that lasts for the whole capture time. The output will go low/high at the start of a capture and not go high/low until the end of the last frame. Measurement time applies only to captures, not to preview.
Set the TTL signal polarity to change the polarity of the signal. Negative
means that the sync out signal is high until the start of the first frame when it
will go low.
The analog boards that have been installed in the measurement computer are listed on the Analog boards page. For information about analog boards and instructions on how to install them, see chapter "How to use analog boards" on page 747.
NOTE: If more than one board is installed make sure that the syn-
chronization cable is connected to all of the analog boards.
The list of the analog boards contains six columns: #, Board type, Channels,
Used, Samples/s and Range. The list shows an overview of analog boards that
are checked as Enabled on the Input Devices page.
The # column contains the number of the analog board in Instacal. The Board
type column shows the type of the analog board, for a list of boards that are
compatible with QTM see "How to use analog boards" on page 747.
Sample rate
The Sample rate options control the frequency of the analog board. The
sample rate is most often set to a multiple of the camera frequency, because it is easier to handle the data that way and it is, for example, a requirement for the C3D file format. The default setting is one sample per camera frame, i.e. the same sample rate as for the camera system. Which sample rate to use depends on what you want to measure; for force data, for example, 500 to 1000 Hz is often enough.
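As a minimal sketch (Python, illustrative only) of the multiple-of-camera-frequency rule described above:

# Illustrative sketch only: verify that the analog sample rate is an integer
# multiple of the camera capture rate (recommended, and required for C3D export).
def samples_per_frame(analog_rate_hz: int, camera_rate_hz: int) -> int:
    if analog_rate_hz % camera_rate_hz != 0:
        raise ValueError("analog rate should be a multiple of the camera rate")
    return analog_rate_hz // camera_rate_hz

print(samples_per_frame(1000, 200))  # -> 5 analog samples per camera frame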
There are then two options for synchronizing the analog board to the camera
system: External sync and Trigger start.
External sync (frame sync) (default if available)
The analog board is frame synchronized so that the drift between the cam-
era system and the analog data is eliminated. This is the recommended
setting for a Qualisys system with a USB-2533 analog board. You can use
any frequency for the analog board. For instructions on how to connect
the sync out signal from an Oqus camera or a Camera Sync Unit to the
analog board see chapter "Connection of analog board" on page 752.
Simultaneous start
The analog board is just started on the first frame of the camera system,
which means that there can be a small drift between the cameras and ana-
log data.
Board settings
Under the Board settings heading on the page for the analog board, there are
settings for the range of the analog board.
Use the Range options to select Bipolar range (both negative and positive
voltage) or Unipolar range (only positive voltage) and to select the voltage
range that is used for the analog capture. The default settings are Bipolar and
± 10 Volts.
Under the Force plate control heading there is information about the setup of
the control of the force plate. Click More to go to the Force plate control set-
tings page.
Under the Compensate for analog offset and drift heading there are set-
tings for a filter which removes offset and drift. Activate the filter with the
Remove offset and Remove drift options. The current compensation is dis-
played in the Channels list, see chapter "Channels" on page 297.
Remove offset
Removes an offset from the analog data. This setting can be used to remove the voltage level of an unloaded force plate. The offset compensation can either be calculated from the measurement or from an acquired list. The offset can be removed in RT/preview and in a file. When removed in a file, the offset value cannot be changed, only turned on and off, so it is important to set the correct settings before capturing the file. However, if none of the options are selected you can still turn on the compensation in a file from the Data info window; the compensation will then be calculated from the start of the measurement.
Calculate offset from the measurement
Use the Remove offset from and Number of samples to use
options to define the analog samples that are used to calculate the
offset. Activate offset compensation in real-time with Remove off-
set in real-time.
Remove offset from
Set which part of the measurement to use when calculating the offset. The default is Start of the measurement, but it can be changed to End of the measurement if there is noise at the beginning of the measurement.
Remove drift
Calculates a slope from the first and the last samples of the meas-
urement. The number of samples used to calculate how much to remove
from the analog data is defined in the Number of samples to use option.
This slope is then deleted from the analog data. This setting can be used
to remove an output drift of a force plate that slowly changes the voltage
level during the measurement.
The drift compensation is only applied on a file and cannot be used with the options Read offset from end of measurement and Use values in list below, because those imply that the analog data is not correct at the beginning of the measurement.
Channels
The channels of the analog boards are listed under the Channels heading.
NOTE: This option applies to all selected channels, i.e. the line is
colored blue.
Select all
Use all of the channels
Unselect all
Use none of the channels. Sometimes the fastest way to select a new
range of channels.
NOTE: This option applies to all selected channels, i.e. that line is
colored blue.
The current settings under the Compensate for analog offset and drift heading are displayed in the last three columns. The different settings are described below.
If the compensation is activated with the option Use the first and last, the Offset and Drift columns say Calculated. The offset compensation will be applied both in RT/preview and in a file and is calculated from the first samples, which means that it is important that there is no signal on the analog channel at the start of RT/preview or at the beginning of a file.
The drift compensation, however, is only applied in the file, since it is not calculated for RT/preview. For the drift compensation it is important that there is no analog signal at the beginning and the end of the file.
If the compensation is activated with the option Use values in list below,
then the last read value is displayed in the Offset column. The offset com-
pensation will be applied both in RT/preview and in a file. However, since the offset values have already been stored, there can be an analog signal at the beginning of RT/preview and a file. Start RT/preview and use the Read current values button to update the offset values for the channels where the compensation is activated.
The Drift column displays ---, because the drift cannot be compensated
for in this mode.
The Force plate control settings page contains the following settings for the
control of Kistler force plates.
NOTE: These settings are only valid for the Kistler force plates. The Kist-
ler force plates can, however, be externally controlled regarding Oper-
ate/Reset and charge amplifier range settings.
The Analog board type setting selects the board type used for force plate control. Currently, only Qualisys analog interface can be selected.
Qualisys analog interface
This is the default option that is used with all of the analog boards com-
patible with QTM.
Set the number of force plates and their ranges with the settings described in
chapter "Force plate control list" on the next page.
Use the Force plate auto-zero settings to control when to auto-zero/reset the
force plates connected to the analog board.
There are two options to control the auto-zeroing.
On measurement start
When enabled the force plates are auto-zeroed at two operations in QTM:
just before the Start capture dialog is opened before a capture and in a
batch capture just before QTM starts Waiting for next meas-
urement/trigger.
On preview start
When enabled the force plates are auto-zeroed at two operations in QTM: when opening a new file and when changing a setting in Project options during preview.
There is also an option to auto-zero the plates while QTM is in preview mode. Just right-click in a Data info window that displays force data and select Zero all force plates. It is important to use this zeroing at least every hour if both auto-zeroing options have been disabled.
The list of force plate controls sets the controls and ranges of Kistler force
plates connected via the USB-2533 analog board. Up to four force plates can be
controlled via the USB-2533 analog board.
When connecting the force plates it is important to connect them in the same
order to the analog channels as they are specified in the list. This is because
the order is used to match the force plate numbers on the Force plate list on
the Force data page, see chapter "Force data" on page 360. For more information on how to connect the force plates to the board, see chapter "Connecting Kistler force plates" on page 790.
A new Kistler plate is added with the Add option. The force plate control settings can then be edited or removed by selecting the plate and using the Edit or Remove option, respectively. Adding a plate also means that an Operate/Reset signal will be sent to the force plate.
Amplifier
Select the type of amplifier used for the force plate, Internal or External.
Time-constant filter
The Time-constant filter is useful for long measurements of static forces; for regular measurements the Time-constant filter should not be used.
For further information about the range settings and the time-constant filter,
see the manual for the Kistler force plate.
The external video devices that are selected under Input Devices are listed
under the AV Devices node. This includes video captured with Blackmagic Design cards and DV/webcam cameras; for more information on how to use them, see chapters "Video capture with Blackmagic Design cards" on page 899 and "External video devices in 2D view" on page 100.
To use a video device go to the Input Devices page and select the check box in
the Enabled column. Then the video will be displayed in the 2D view window.
To open a 2D view window with just the video cameras you can click on the
Video button in the toolbar.
When DirectShow video cameras are connected (for example web cameras),
the AV Devices node can be expanded, giving access to a list of available video
properties (resolution, color space and frequency).
AV device settings
In the AV devices properties tab the video properties (resolution, color space
and frequency) can be selected for DirectShow video cameras, for example web
cameras.
Force plates
The available force plates are listed in the settings tree to the left under the
Force plates heading.
For detailed information about the settings of the respective force plate types,
see the following chapters.
The settings for AMTI Gen5, OPT-SC (Optima Signal Conditioner) and AccuGait
Optimized are divided between the capture settings and the processing set-
tings. The settings are the same for Gen5, OPT-SC and AccuGait Optimized so in
the following descriptions we only refer to Gen5. On the AMTI Gen5 page there
are only settings for the capture. There is one page for each connected AMTI Gen5 amplifier; the name of the page will be the same as specified for the plate on the Force data page. The default name includes the model and serial number of the connected force plate.
For AMTI force plates with the new calibration chip the only processing setting
that is needed is the location, see chapter "Force plate location" on page 382. If
you have an old AMTI force plate where you have to input the calibration mat-
rix manually then that needs to be added in the AMTI-Netforce program.
QTM will then automatically read the settings file for that program. For a description of how to connect the force plates, see chapter "Connecting AMTI Digital force plates" on page 756.
The Amti Gen5 heading contains information about the force plate that is con-
nected to the amplifier. The information includes the Model and the Serial
numbers.
NOTE: If there is an analog board in the system, that is also frame syn-
chronized, then it is recommended to use the same camera as syn-
chronization source for the analog board and the AMTI Gen5. Then you
can use the Sample rate option on the Analog board page to control the
frequency on both of the boards.
NOTE: The buffer of the AMTI Gen5 is 16 samples. Therefore, if the number of analog samples is not a multiple of 16 there will be some empty force plate samples at the end of the measurement.
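As a rough illustration of the note above (assuming that only complete 16-sample blocks are delivered, which is an interpretation and not an official specification), the number of empty trailing samples could be estimated as:

# Illustrative sketch only, based on the assumption that the Gen5 delivers
# data in complete 16-sample blocks; not an official specification.
def empty_trailing_samples(total_samples: int, buffer_size: int = 16) -> int:
    return total_samples % buffer_size

print(empty_trailing_samples(10010))  # -> 10 empty samples at the end
print(empty_trailing_samples(10000))  # -> 0, since 10000 is a multiple of 16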
Use the Force plate auto-zero settings to control when to auto-zero/reset the
force plate. There are two options to control the auto-zeroing.
On preview start
When enabled the force plates are auto-zeroed at two operations in QTM: when opening a new file and when changing a setting in Project options during preview.
Arsalis
The Arsalis device settings are managed via the Arsalis settings page. Once the
Arsalis force plates are set up correctly, the force calculation can be defined
under the force plate settings, see chapter "Force plate settings" on page 362.
For information about how to connect and set up Arsalis force plates for use
with QTM, see chapter "Connecting Arsalis force plates" on page 759.
The settings list contains a top section with common settings and a section with
individual settings for each force plate.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.
Integration version
The version number for the integration.
IP Address
Enter the IP address for the computer running 3D-Forceplate soft-
ware.
Port Number
Enter the port number used by the Local server in the 3D-Forceplate
software.
Frequency
Enter the frequency for the force plates. The plates support the following frequencies: 100, 200, 250, 400, 500, 1000 and 2000 Hz. Make sure that you select one that can be evenly divided by the camera frame rate.
Trigger Mode
Set the trigger mode for the force plates. The default value is start
only. This is the recommended mode since you then get a
Channels
The channels captured from the force plates with their respective unit and frequency. Note that the last four channels are either 1 or 0.
Force X, Force Y, Force Z (N, frequency)
Trigger (frequency)
Zero (frequency)
Sync (frequency)
The Bertec device settings are managed via the Bertec Corporation Device set-
tings page. Once the Bertec force plates are set up correctly, the force cal-
culation can be defined under the force plate settings, see chapter "Force plate
settings" on page 362.
For information about how to connect and set up Bertec force plates for use
with QTM, see chapter "Connecting Digital Bertec force plates" on page 764.
The Bertec page contains the following buttons to communicate with the force
plates and a list with settings for the located force plates.
Restore Default Settings
Restore settings to their default values.
Synchronize Settings
Synchronize changed settings to the Bertec device. Synchronize Settings
should be used when changing the Frequency setting.
Zero Plates
Zero the connected Bertec devices.
The settings list contains a top section with common settings and a section with
individual settings for each Bertec device.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.
Integration version
The version number for the integration.
Frequency
Enter the frequency for the force plates. Make sure that it matches
the frequency for Sync out on the Synchronization page.
Autozero
Check to automatically re-zero the force plates at the start of pre-
view or when opening the capture dialog. Autozero applies only
when the measured vertical force is below 20-40 N.
Channels
The channels captured from the Bertec device with their respective
unit and frequency.
Force X, Force Y, Force Z (N, frequency)
Kistler
The Kistler device settings are managed via the Kistler Force Plates settings
page. Once the Kistler devices are set up correctly, the force calculation can be
defined under the force plate settings, see chapter "Force plate settings" on
page 362.
For information about how to connect and set up Kistler digital force plates for
use with QTM, see chapter "Connecting Kistler digital force plates" on page 768.
The Kistler Force Plates page contains the following buttons to communicate
with the force plates and a list with settings for the force plates included in the
configuration.
Sync Settings
Synchronize changed settings to the Kistler device. Sync Settings should
be used when changing the Frequency setting.
Zero Offsets
Manually zero the connected Kistler force plates.
The settings list contains a top section with common settings and a section with
individual settings for each force plate.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.
Integration version
The version number for the integration.
DataServer version
The version number of DataServer.
Frequency
Enter the frequency for the force plates.
Shear/Vertical ranges
Select the force ranges for Kistler digital force plates.
Channels
The channels captured from the Kistler force plate with their respect-
ive unit and frequency.
Force X, Force Y, Force Z (N, frequency)
Instrumented treadmills
The treadmills that are selected as input device are listed under the Instru-
mented Treadmills category. Currently, the h/p/cosmos-Arsalis Gaitway-3D
treadmill is supported in QTM. Use the Add Device button to add the Gaitway-
3D treadmill to the Device list. The settings are managed via the Gaitway-3D set-
tings page, see "Gaitway-3D" below.
Gaitway-3D
The Gaitway-3D device settings are managed via the Gaitway-3D settings page.
Once the Gaitway-3D device is set up correctly, the force calculation can be
defined under the force plate settings, see chapter "Force plate settings" on
page 362.
For information about how to connect and set up a Gaitway-3D treadmill for
use with QTM, see chapter "Connecting a Gaitway-3D instrumented treadmill"
on page 797.
Sample rate
Select the sample rate in Hz of the Gaitway-3D data stream from the Frequency drop-down menu.
Sync settings
Check the Simultaneous start option for synchronizing the Gaitway-3D
data with QTM captures.
Gloves
The available motion glove devices are listed in the settings tree to the left
under the Gloves heading.
Manus Gloves
The Manus Gloves settings are managed via the Manus Gloves settings page.
For information about how to connect and set up MANUS gloves for use with
QTM, see chapter "Connecting Manus gloves" on page 889.
The Manus Gloves page contains the following buttons and a list with settings
for the gloves.
Restore Default Settings
Restore the settings to their default values.
Synchronize Settings
Synchronize changed settings with the device.
Integration version
The version number for the integration.
IP
Enter the IP address for the Manus device.
Model Type
Skeleton type used for the glove bindings.
Gloves
Gloves are indicated with their serial number and contain information
about the glove type and data channels.
Generics
The available integrated generic devices are listed in the settings tree to the left
under the Generics heading.
h/p/cosmos treadmill
The h/p/cosmos treadmill settings are managed via the hpcosmos treadmill
settings page. For information about how to connect and set up an h/p/cosmos
treadmill for use with QTM, see chapter "Connecting the h/p/cosmos treadmill"
on page 912.
Synchronize Settings
No action for the h/p/cosmos treadmill integration.
The settings list contains a top section with common settings and a section with
the channels for the treadmill.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.
Integration version
The version number for the integration.
IP
Enter the IP address for the treadmill.
elevation
The elevation of the treadmill in degrees.
heart rate
The heart rate in beats per minute.
Processing
The Processing branch of the options tree contains options for actions that can be performed on the data. The options have an effect in real-time (preview), during a measurement and after the measurement, for example:
l Solve skeletons
l Apply SAL
l Calculate 6DOF
The actions can be defined separately for Real-time and Capture. Select each
action that will be used from the processing steps list. The settings for each
option can be reached by clicking on the corresponding page in the tree to the
left.
The following processing pages are not associated with actions, but the options
affect real-time output and the representation of rotations:
l Real-Time output
l Euler Angles
The 2D Preprocessing and filtering page contains settings for the pre-
processing of 2D data before 3D tracking.
Non-circular marker settings (Oqus)
Use the Non-circular marker settings to define how to handle the segment
data that is sent from the Oqus cameras. The cameras will only send segment
data when the Marker circularity filtering option is enabled on the Cameras
page, "Marker circularity filtering (Oqus)" on page 234. How many markers that
have segment data depends on the Circularity threshold on that page.
The options for handling non-circular markers with segment data is:
For more information about how to use the non-circular marker settings see
chapter "Marker circularity filtering (Oqus only)" on page 541 and "How to use
circularity filter (Oqus only)" on page 610.
Filtering
The Filtering option applies software marker masks and size filtering to the 2D
data. The option is used on all of the cameras in the measurement. Markers
that have been filtered are indicated by a rectangular frame in the camera view
in the 2D view window.
Software marker masks (only available when reprocessing a file)
Enable software marker masks in reprocessing, see chapter "How to use
software marker masks" on page 611.
3D Tracking
The Tracking page contains settings for the 3D tracking of the motion capture
data. The 3D tracker uses the 2D data of the cameras in the system to calculate trajectories in a 3D view, see chapter "3D tracking measurements" on page 614.
3D Tracker parameters
The Prediction error parameter specifies the maximum distance (in mm)
between a predicted position of a trajectory and a captured point that is
allowed for it to be assigned to that trajectory. The parameter therefore
provides a margin of error with which a 3D point may deviate from the
mathematically calculated next position. Since real-world data cannot be
expected to exactly fit an equation, this provides a mechanism for dealing
with the unpredictability of the real changes in a marker’s trajectory. The
example above shows the 3D point (the red ball) and its predicted next
position (the black cross). The blue sphere is the volume in which the next
3D point is accepted as part of this trajectory.
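The acceptance test can be pictured with the following Python sketch (an illustration only, not QTM's tracker code; the positions and the 30 mm value are made-up examples):

import numpy as np

# Minimal sketch of the Prediction error criterion: a captured 3D point is
# only assigned to a trajectory if it lies within the Prediction error
# radius of the trajectory's predicted next position.
PREDICTION_ERROR_MM = 30.0  # example value

def accept_point(predicted_xyz, captured_xyz, max_error_mm=PREDICTION_ERROR_MM):
    distance = np.linalg.norm(np.asarray(captured_xyz) - np.asarray(predicted_xyz))
    return distance <= max_error_mm

print(accept_point([0, 0, 1000], [5, -4, 1010]))  # True, about 12 mm away
print(accept_point([0, 0, 1000], [40, 0, 1000]))  # False, 40 mm away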
Maximum residual
The default value of the Maximum residual is 6 mm. The value for the
parameter can usually be set to 2 - 5 times the average residual values in
the calibration result for the camera system. Too large a value will slow down the calculation and tend to produce merged 3D points, resulting in defective trajectories. Too small a value, on the other hand, will probably cause ghost markers and high segmentation, resulting in many more trajectories than markers.
Increase the number of frames if you have a lot of short extra trajectories.
When those are removed the AIM process will also work better. However, it is a good idea to check that the extra trajectories are not caused by something else, e.g. a bad calibration or something reflective.
The Minimum ray count per marker parameter defines the minimum
number of cameras required for the 3D tracking of a marker. The default
value is 2, which imposes no limitation on the 3D tracking.
The ray length limits represent the minimum and maximum distance
between a marker and a camera to be used for 3D tracking. The units are
in meters.
Rays
Camera tracking rays are a mapping between 2D data of the cameras and the
3D trajectories, which can be shown in the 3D view window, see chapter "Rays
in 3D views" on page 130.
To show the rays, they need to be stored during the 3D tracking stage of the processing. To store the rays, make sure that the option Store is enabled.
NOTE: When the Store option is enabled, the processing will take longer and the file size of the QTM file will be larger.
Auto join
NOTE: Auto join only joins two trajectories so that they are one in the tra-
jectory info windows. If you want to fill the gap between the two tra-
jectories with data you need to use the gap-fill function, see chapter
"Trajectories" on page 338.
The bounding box is the volume in which the trajectories will be calculated. The
use of a bounding box can be helpful to reduce the amount of unneeded data
by discarding 3D data which falls outside the volume of interest. The bounding
box is displayed as a white outlined box in the 3D view; the display is controlled on the 3D view settings page.
When enabling the bounding box, the 3D data is restricted to the volume
spanned by the bounding box. The limits of the bounding box can be
defined in the text edit fields as distances (in mm) in each direction to the ori-
gin of the global coordinate system. The default behavior is that 3D points that
fall outside the bounding box are discarded, i.e. the option Reevaluate track-
ing solution if 3D point is outside bounding box is unchecked. By checking
the option Reevaluate tracking solution if 3D point is outside bounding
box the 3D tracker will try to find alternative solutions for 3D points if the initial
solution lies outside the bounding box. This can be useful for small systems, for
example with two cameras.
Auto range
The Auto range option automatically sets the measurement range so that it
only includes frames that have 3D data. The empty frames are not deleted, just placed outside the measurement range. Increase the measurement range on
the Timeline control bar if you want to use more frames, see chapter "Timeline
control bar" on page 133.
2D tracking
The 2D tracking page contains settings for the 2D tracker. The 2D tracker uses
the data of just one camera to calculate trajectories in a plane in the 3D view
window, see chapter "2D tracking of data" on page 618.
Under the Tracking settings heading you can select the camera that will be
tracked with the Camera to track option. Choose the camera from the drop-
down list. You can only 2D track one camera at a time.
The option Turn on track filtering and remove unidentified tracks is very
useful to limit the number of trajectories in the 2D tracking output. It filters the
data so that short identified tracks and the 2D markers that have not been iden-
tified are not displayed as trajectories. Use the option Minimum length of
identified tracks to set the minimum number of frames for an identified track
in the 2D data to be transferred to a 3D trajectory.
IMPORTANT: If you turn off the filter, each unidentified marker in the 2D data will be transferred to a 1 frame long trajectory. This means that the data will be very fragmented if the 2D tracker has left a lot of markers unidentified.
Auto join
2D to 3D settings
The 2D to 3D settings heading contains settings for how the 2D data is dis-
played in 3D view window.
Distance to measurement plane
Set the distance from the camera sensor to the measurement plane in
mm. The measurement plane will be the same as the green grid in the 3D
view window. With this setting you can get the distances in the 3D view
window to match the real distances. You must measure the distance yourself, but for most applications the accuracy of the distance does not have to be better than on the centimeter level.
The settings on the Twin System page are used to enable and control the Twin
system feature. This feature enables a Twin master system to control a Twin
slave system on another computer. It can for example be used to combine an
underwater system with one above water. For more information about the
Twin system feature, see chapter "Twin systems" on page 514.
To enable the Twin master functionality select the Enable Twin Master option
on the system that you want to use as Twin master. The Twin System settings
are then used to control the Twin slave computer and the merging of data.
NOTE: You do not need to change any settings related to Twin systems
on the Twin slave system.
To enable the automatic transfer of the Twin slave file you must select the
Merge with Twin Slave option on the Processing page.
The two systems must also be calibrated with a twin calibration, for inform-
ation about the settings see chapter "Twin System Calibration" on page 336.
Twin Slave System
The Twin Slave System settings control which slave system is controlled by the Twin master system.
IMPORTANT: If you do not use the same frequencies for the two
systems then the data of the slave system is interpolated. When you
use a divisor of the twin master frequency, then the interpolated
data is just filled between the actual 3D data. If the twin slave fre-
quency is not a divisor of the twin master frequency, then QTM will
interpolate all of the slave data so that it matches the twin master
frequency.
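The behavior described in the note can be summarized with the following Python sketch (an illustration only; the frequencies are made-up examples):

# Minimal sketch of the merge behavior in the note above: if the slave
# frequency is a divisor of the master frequency the slave frames are just
# filled in between, otherwise all slave data is interpolated.
def slave_merge_strategy(master_hz, slave_hz):
    if master_hz % slave_hz == 0:
        return "fill between actual 3D frames"
    return "interpolate all slave data to the master frequency"

print(slave_merge_strategy(200, 100))  # fill between actual 3D frames
print(slave_merge_strategy(200, 150))  # interpolate all slave data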
Frame Sync Master Source (only available with the Frame sync option)
Select the device that you want to use as frame sync master. This can be a
sync output on the Camera Sync Unit or an Oqus camera. Devices with
multiple sync out ports (e.g. Camera Sync Unit) will additionally let you
specify which port to use (currently, only the Out 1 port on the Camera
Sync Unit is supported for Twin). The Frame sync master source can be in
either the Twin master or the Twin slave system.
Trajectory Conflicts
If there are two labeled markers with the same names in the twin master and
twin slave file, then QTM will have to know how to handle the trajectories. QTM
can either Merge trajectories or Rename trajectories with extension. If the
trajectories are merged then the twin slave trajectory will be added to the twin
master trajectory and, if there are any overlaps, the twin master data is used. When renaming the trajectories, the extension _slave is added to all of the labeled twin slave trajectories.
NOTE: This option will only be used if the Twin slave file contains labeled
trajectories. Most of the time it is recommended to not identify the data
on the slave file and instead apply AIM to the merged data.
The Twin System Calibration page contains settings for the twin calibration of
the two systems and describes the Translation and Rotation that is used for
transforming the twin slave data to the twin master coordinate system. For
information on how to perform a twin system calibration see chapter "Per-
forming a Twin calibration" on page 519.
The result of the last twin calibration is displayed below the transformations:
Calibration time
The processing time of the current twin calibration. Since only the processing time is displayed, you need to check which files are used to be sure exactly what is used, see chapter "Twin System Calibration dialog" on the next page.
The Twin system calibration dialog is opened with the Calibrate button on
the Twin system calibration page. From the dialog you can control and
change the current twin calibration. It can either be changed by updating the
measurements used for twin calibration or manually.
Manual calibration
Enter the translation and rotation of the twin calibration. If you have made a twin calibration from two files, the result of that calibration is displayed. You can also enter the numbers manually, but it is not recommended if you want the best accuracy. Click on Calibrate to update the twin calibration with the manually entered data.
Under the Gap Fill Settings heading there are options for the gap fill func-
tionality.
Set the Max frame gap option to select how many frames can be gap filled without preview. The trajectories will only be gap filled if the gap between the two parts is less than the Max frame gap. The matching number of milliseconds at the current frequency is displayed next to the option.
Select the Default interpolation type to decide whether Polynomial or Lin-
ear gap fill is used by default. The Polynomial gap fill uses two frames before
and after the gap to calculate a third degree polynomial. If the gap starts or
ends with a trajectory part which consists of one frame, then the polynomial
gap fill will use the next available frame to calculate the polynomial. If there is
no other trajectory part, then the polynomial gap fill is calculated using just that one frame.
The Linear gap fill calculates the line between the start and end frame of the
gap.
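The two interpolation types can be pictured with the following Python sketch (an illustration only, not QTM's implementation; the frame numbers and coordinates are made-up examples):

import numpy as np

# Minimal sketch of the two gap-fill types: a third degree polynomial fitted
# to two frames on each side of the gap, and a straight line between the
# last frame before and the first frame after the gap.
def polynomial_gap_fill(t_known, y_known, t_gap):
    coeffs = np.polyfit(t_known, y_known, deg=3)   # third degree polynomial
    return np.polyval(coeffs, t_gap)

def linear_gap_fill(t_start, y_start, t_end, y_end, t_gap):
    return np.interp(t_gap, [t_start, t_end], [y_start, y_end])

t_known = [98, 99, 103, 104]        # frames surrounding a gap at frames 100-102
y_known = [10.0, 10.5, 12.8, 13.6]  # one coordinate of the trajectory (mm)
print(polynomial_gap_fill(t_known, y_known, [100, 101, 102]))
print(linear_gap_fill(99, 10.5, 103, 12.8, [100, 101, 102]))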
AIM
The AIM page contains settings for the Automatic Identification of Markers
(AIM) function, for information about AIM see chapter "Automatic Identification
of Markers (AIM)" on page 624.
AIM models
Under the AIM models heading there are two lists with AIM models. The Applied models list contains the models that are currently in use by QTM. There can be several models in this list, in which case QTM will try to apply them all. Remember to Remove (moves the model to the Previously used models list) the unnecessary models from this list, otherwise the AIM application might fail.
Add a saved model to the list with Add model. If you want to apply the same
AIM model to multiple actors, you can set the number of actors in the Nr To
Apply column.
Under the AIM model application parameters heading you can set the fol-
lowing settings that adjust the application of the AIM model.
Relative marker to marker distance tolerance
Change the permitted marker to marker distance relative to the model's marker to marker distance. By default, any marker that is within ±30 % of the model's marker to marker distance will be tried in the model application, see the sketch after this list.
Use the physical IDs of active markers (if present in both file and
model)
This setting only applies to files with active and passive markers in the
same file. When activated AIM will use the active marker IDs to match the
markers with the AIM model, which means that you will aid the AIM model
when identifying the rest of the markers. The AIM model must have been
created with active markers placed on the correct positions in the
AIM model.
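The tolerance check for the Relative marker to marker distance tolerance can be pictured with the following Python sketch (an illustration only, not the AIM implementation; the distances are made-up examples):

# Minimal sketch of the relative distance tolerance: with the default
# +/-30 % tolerance a measured marker-to-marker distance is a candidate
# match if it is within 30 % of the corresponding model distance.
def within_tolerance(measured_mm, model_mm, tolerance=0.30):
    return abs(measured_mm - model_mm) <= tolerance * model_mm

print(within_tolerance(115.0, 100.0))  # True, 15 % from the model distance
print(within_tolerance(140.0, 100.0))  # False, 40 % from the model distance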
Skeleton solver
The Skeleton Solver page is used to manage the skeleton definitions used in
the project. For more information on skeleton tracking, see chapter "Tracking
of skeletons" on page 671.
Marker Label Mapping
Mapping of markers associated with a skeleton. See chapter "Skeleton
marker label mapping" on page 681 for how to use a custom marker map-
ping.
Skeleton template
Select template for global parameters used by the skeleton calibration.
See chapter "Skeleton template" on page 694 for how to use the Skeleton
template.
Skeletons
List of skeletons loaded in the project. The columns are:
Save: Save selected skeleton to QTM skeleton file. If you select mul-
tiple skeletons, each skeleton will be saved in a separate file.
Skeleton Assisted Labeling (SAL) uses the skeleton segment markers to identify
unlabeled trajectories or parts that can be associated with them. It requires a
solved skeleton to be able to identify trajectories. The unidentified trajectory
part that is closest to a missing segment marker and fulfills the below criteria is
added to the corresponding labeled trajectory.
SAL uses the following marker to segment marker distance criteria for labeling
unidentified trajectory parts.
Claim threshold
Required closeness of a marker to a segment marker. At least one frame
of an unidentified part must be within the claim threshold of a segment
marker in order to be labeled as the corresponding trajectory. The default
value is 20 mm. Use a lower value when markers can be close to each
other for example when solving fingers.
Disqualification threshold
The maximum tolerated distance of a marker to a segment marker. If any
frame of a claimed part is beyond the disqualification threshold, it will be
disqualified as a solution. This setting prevents wrong unidentified markers that happen to be close to a missing segment marker at some instant from being accepted as a solution. The default value is 200 mm.
For more information about SAL see chapter "How to use SAL" on page 700.
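The two distance criteria can be pictured with the following Python sketch (an illustration only, not QTM's SAL implementation; the positions are made-up examples and the segment marker is assumed to be static for simplicity):

import numpy as np

# Minimal sketch of the SAL criteria: an unidentified part is claimed if at
# least one frame is within the claim threshold of the segment marker, and
# it is disqualified if any frame is beyond the disqualification threshold.
CLAIM_MM = 20.0
DISQUALIFY_MM = 200.0

def sal_accepts(part_xyz, segment_marker_xyz):
    distances = np.linalg.norm(np.asarray(part_xyz) - np.asarray(segment_marker_xyz), axis=1)
    claimed = np.any(distances <= CLAIM_MM)
    disqualified = np.any(distances > DISQUALIFY_MM)
    return bool(claimed and not disqualified)

part = [[12, 0, 0], [18, 5, 0], [40, 0, 0]]  # unidentified part, one frame per row (mm)
segment_marker = [0, 0, 0]                   # missing segment marker position (mm)
print(sal_accepts(part, segment_marker))     # True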
The glove processing step allows data from motion gloves to be applied to skel-
etons within QTM. For more information about currently supported motion
gloves, see chapter "How to use motion gloves" on page 889.
The glove processing step settings dialog is used to create bindings which associate a glove with the skeleton its data will be applied to. To create a new binding,
select an available glove in the bottom row of the bindings grid and select the
associated skeleton.
When running this processing step, all segments in the skeleton with names
that match the data sent from the glove will have the rotation data from the
glove applied to them. This requires that the target skeletons have hand hier-
archies which match that of the glove.
The use of motion gloves is natively supported by the Qualisys Animation skel-
eton. For more detailed instructions, see the chapters for the respective motion
gloves under "How to use motion gloves" on page 889.
The 6DOF Tracking page contains the 6DOF Tracker parameters and the list
of Rigid bodies. The 6DOF tracker uses this information to calculate the pos-
ition and rotation from the 3D data, see chapter "6DOF tracking of rigid bodies"
on page 649.
It is also possible to export the 6DOF tracking output to another computer and
as an analog signal, see chapters "Real-Time output" on page 387 and "6DOF
analog export" on page 388, respectively.
6DOF Tracker parameters
Specify the global tracker parameter for the 6DOF tracking under the 6DOF
Tracker parameters heading. More 6DOF tracker parameters, in particular
Min. markers, Max. residual and Bone tolerance, are available for the indi-
vidual rigid body definitions, see chapter "Rigid bodies" on the next page.
Reidentify all body markers
Enable this option to reidentify the markers for all rigid bodies using the rigid body definitions and settings for reprocessing. Only available
when reprocessing a file, for more information see chapter "Calculating
6DOF data" on page 660.
NOTE: Rigid bodies that are parts of AIM models are not affected
by this option.
For information on 6DOF tracking see chapter "6DOF tracking of rigid bodies"
on page 649.
Rigid bodies
The Rigid bodies list contains the definition of the 6DOF bodies. The bodies are
used by the 6DOF tracking to find the measured rigid bodies in a motion cap-
ture. The list consists of the following columns. In case a column refers to sep-
arate items for the rigid body definition or its points, respectively, this is
indicated by the / separator.
Rigid bodies:
Label
The Label column contains the name of the rigid body and its points.
Double-click on the name of the rigid body or the points to edit them. The
points can have any name, however if the same name is used in another
6DOF body or an AIM model then you need to follow the instructions in
chapter "How to use 6DOF bodies in an AIM model" on page 669.
Enabled
Enable or disable calculation of 6DOF data for rigid bodies with the check
box in the Enabled column. Disabled rigid bodies will appear as "Dis-
abled" in the Data Info window and count for the indexing of rigid bodies
in the real-time stream.
Color
The color of the rigid body is displayed in the X / Color column on the
same row as the name of the rigid body. Double-click on the color to open
the Color dialog where any color can be selected. The color is used in the
3D view window for the markers of the rigid body and for its name.
Min. markers
Specify the minimum number of markers required for 6DOF tracking of
the rigid body.
Max. residual
Specify the maximum residual accepted for 6DOF tracking of the rigid
body.
Bone tolerance
The Bone tolerance (in mm) is the maximum separation between the
lengths of the corresponding bones in a rigid body definition and a meas-
ured rigid body. E.g. if the Bone tolerance is specified to 5.0 mm and the
current bone in the rigid body definition is 100.0 mm, then the measured
separation between two markers must be in the range of 95.0 - 105.0 mm
for the tracker to accept the possibility that the calculated markers may
be the pair specified for the rigid body.
The default value of the Bone tolerance is 5 mm. Increase the value of
the parameter if the 6DOF tracking cannot find the body. Decrease the
value of the parameter if a body is found but the orientation or something
else is wrong.
The effect of the Bone tolerance differs slightly between RT and files. In RT, the marker that is outside the tolerance will be unidentified and the 6DOF body will be calculated from the remaining markers. In a file, the automatic 6DOF tracker will discard the whole trajectory that is wrong and then calculate the 6DOF body from the other trajectories. A sketch of the tolerance check is shown after the column descriptions.
Filter
Select the filter for smoothing 6DOF data. The default is No filter. The fil-
ter is applied both in real time and in a capture. For more information
about smoothing 6DOF data, see chapter "Smoothing 6DOF data" on
page 356.
Mesh
Object file of 3D mesh associated with rigid body. Double-click on the
mesh setting of the rigid body to open the Mesh Settings dialog, see
chapter "Rigid body Mesh Settings dialog" on page 358.
Points:
X, Y, Z
The X, Y and Z columns contain the coordinates of the points in reference to the local origin. Double-click on the coordinates to edit them.
Virtual
Select this option to make a point in the 6DOF body virtual, see chapter
"Virtual markers calculated from 6DOF data" on page 662.
Id
The ID of the trajectory in case the point is associated with a sequentially
coded active marker.
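The Bone tolerance check mentioned above can be pictured with the following Python sketch (an illustration only, not QTM's 6DOF tracker; the marker positions are made-up examples):

import numpy as np

# Minimal sketch of the Bone tolerance check: a measured marker pair is
# accepted as a bone of the rigid body definition if its length is within
# the Bone tolerance of the bone length in the definition.
def bone_accepted(marker_a, marker_b, definition_length_mm, tolerance_mm=5.0):
    measured = np.linalg.norm(np.asarray(marker_a) - np.asarray(marker_b))
    return abs(measured - definition_length_mm) <= tolerance_mm

# With the default 5.0 mm tolerance, a 100.0 mm bone accepts 95.0 - 105.0 mm.
print(bone_accepted([0, 0, 0], [103.0, 0, 0], 100.0))  # True
print(bone_accepted([0, 0, 0], [108.0, 0, 0], 100.0))  # False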
The options that are used to edit the rigid bodies or their points are described
below:
Translate
With Translate the local origin of the 6DOF body definition can be moved
to any place in reference to the points of the body, which means that the
rotation center of the body is changed. The local origin is also the origin of
the coordinate system that represents the 6DOF body in the 3D view.
Click Translate to open the Translate body dialog, see chapter "Trans-
late body" on page 350.
Rotate
With Rotate the pitch, roll and yaw of the local coordinate system are changed. This will change the orientation of the local coordinate system in reference to the points of the body.
Edit color
Open the Color dialog where any color can be selected. The color is used
in the 3D view window for the markers of the rigid body and for its name.
Coordinate system
Change the definition of the local coordinate system, see chapter
"Coordinate system for rigid body data" on page 354.
Reset rotation
This will reset the orientation of all the rigid bodies in the list. Reset
means that the local coordinate systems will be aligned to the global
coordinate system and all the angles will therefore be zeroed.
NOTE: The angles may differ from zero after reset if another ref-
erence system than the global coordinate system is defined in the
Coordinate system for rigid body data dialog.
Add body
Add a new body to the Rigid bodies list. The new body will be empty and
called ’New Body #1’, ’New Body #2’ and so on.
Remove body
Remove the selected body from the Rigid bodies list.
Add point
Add a point to the selected rigid body.
Remove point
Remove the selected point.
Edit label
Edit the selected label (rigid body or point).
Acquire body
Acquire the rigid body definition from the current marker positions in RT/preview mode, see chapter "Acquire body" on page 356.
Load bodies
Loads bodies to the Rigid bodies list from an XML file.
NOTE: Load bodies will overwrite any other bodies in the list.
Save bodies
Save the bodies and all of the individual options in the Rigid bodies list to
an XML file. Specify the name and the folder and click Save. The file can
be edited in a text editor, e.g. Notepad. The xml format is the same as
used for the RT protocol, see 6DOF xml parameters.
NOTE: Make sure that all of the bodies for the measurement are in
the same file, since Load bodies overwrites the bodies in the list. If
you want to combine the rigid bodies from two or more different files, you can copy-paste them into a single file.
Translate body
So that point ... in the body has local coordinates (in mm)
Move the local origin so that one of the points in the rigid body definition
has a desired position. Enter the number of the point and the position in
X, Y and Z direction (local coordinate system).
The Rotate body dialog contains the following ways to rotate the local coordin-
ate system:
Rotate the system
Rotate the local coordinate system clockwise around one of the axes, when looking in the positive direction. Choose the angle of rotation either in Degrees or in Radians. Then select which axis to rotate around: X, Y or Z.
NOTE: The rotation is not affected by the Euler angles definition but will always be the same.
Follow these steps to set the rotation with the "Align the body using its points"
method:
1. First define one of the axes: choose which one from the drop-down list and make it parallel to a line from one point to another by entering the numbers of the points in the body definition. The direction depends on the order of the points, so that the axis will always point in the direction from the first point to the other.
NOTE: The first point does not need to coincide with the origin of
the rigid body.
2. Then define the direction of a second axis; choose which one from the second drop-down list. The following options are available for defining the second axis:
a. Intersect point
The intersect option means that the second axis will point in the direction of the specified point. However, since the axes must be orthogonal, the second axis will actually intersect the projection of the point on the coordinate plane orthogonal to the first axis. The name of the orthogonal plane is displayed in the dialog.
The example below displays the second axis defined by the projection of point 3 on the orthogonal plane defined by the line from point 1 to point 2.
To describe the position and orientation of the 6DOF body its data must be
referred to another coordinate system. By default the data is referred to the
position and orientation of the global coordinate system. However, with the set-
tings on the Coordinate system for rigid body data dialog you can refer the
local coordinate system to the following alternatives of coordinate system ori-
gin and orientation.
NOTE: Roll, pitch and yaw are the Qualisys standard, but if the Euler angles definition is changed on the Euler angles page the new settings will be used in this dialog.
With Get position and Get orientation the current position or orientation is acquired, which means that the data will be zeroed for the current position of the rigid body.
With Acquire body a rigid body definition can be acquired from preview mode.
Place the rigid body with the markers in the measurement volume and open a
new file with New on the File menu. Open the 6DOF Tracking page in the Pro-
ject options dialog and click Acquire body to open the Acquire body dialog.
Specify the number of frames to collect with the Frames to acquire setting.
Click Acquire to start the acquisition. The points of the rigid body definition are
then calculated from the average of each marker’s position in these frames.
The Stop option can be used to cancel the collection before all frames have
been captured.
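Conceptually, the averaging can be pictured with the following Python sketch (an illustration only; the marker positions are made-up examples):

import numpy as np

# Minimal sketch of what Acquire body does conceptually: each point of the
# rigid body definition is the average of that marker's position over the
# acquired frames.
frames = np.array([   # hypothetical data: frames x markers x XYZ (mm)
    [[0.2, 0.1, 0.0], [100.1, 0.0, 0.1], [0.0, 49.9, 0.2]],
    [[-0.1, 0.0, 0.1], [99.9, 0.2, -0.1], [0.1, 50.1, 0.0]],
    [[0.0, -0.1, 0.0], [100.0, 0.1, 0.0], [-0.1, 50.0, 0.1]],
])

body_points = frames.mean(axis=0)  # average per marker over all frames
print(body_points)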
To see that the 6DOF tracking can find the body, change to 6DOF tracking on
the Processing page and click Apply. The body should appear in the 3D view.
NOTE: It is a good idea to place the body so that the orientation of the
desired local coordinate system is aligned with the global coordinate sys-
tem. It is also a good idea to place the desired origin of the local coordin-
ate system in the origin of the global coordinate system. Another way to
easily define the local origin of the body is to use an extra marker placed
at the desired location of the local origin. After acquiring the body
coordinates, use the Translate body dialog to translate the local origin to
the location of the extra marker. Then delete the extra marker from the
body definition with Remove point.
It is possible to smooth 6DOF data in QTM. The smoothing can be applied both
in real time and in a capture. Smoothing of 6DOF data can be useful if you need
to stabilize noisy data. The smoothing is applied to both position and ori-
entation.
Multi-purpose
Light smoothing and jitter reduction. This preset is suitable for most situ-
ations which require effective smoothing of light noise, while keeping side
effects to a minimum. The filter parameters are: s=0.25, c=0.25, p=0, rp=5,
ro=5.
Static pose
Effective jitter reduction for immobile objects. Application of this preset to
moving rigid bodies may lead to noticeable lag for small displacements
and rotations within the specified radius. The filter parameters are: s=0,
c=0, p=0, rp=10, ro=10.
The selected presets in the rigid body definitions in the Project Options will be
applied in real time and in subsequent captures when Calculate 6DOF is
checked in the Processing options. It is also possible to reprocess captures
with different presets.
The effect of the smoothing filter can be visually inspected in the 3D view win-
dow by comparing the markers of the rigid body overlay in the 3D view window
with the measured marker positions. For more detailed comparison of the
6DOF data, you can define the same rigid body twice, one with filter and the
other one without.
Rigid body Mesh Settings dialog
The Rigid Body Mesh Settings dialog is used to associate a mesh with a rigid
body and to inspect and modify the mesh object settings. For more information
about how to use Rigid body meshes see chapter "Rigid body meshes" on
page 667.
The dialog contains the following settings:
Position
The 3D translation of the object relative to the global coordinate system.
The values of X, Y and Z are specified in mm units.
Rotation
The 3D rotation of the object relative to the global coordinate system. The
values for Roll, Pitch and Yaw are specified in degrees around the X, Y
and Z axes, respectively.
Scale
The scale factor of the mesh object.
Opacity
The opacity of the mesh object from 0 (transparent) to 1 (opaque). An .obj file can include an opacity option for individual faces; these will be rendered as transparent even if the Opacity option is set to 1.
The buttons:
Reset
Remove the link to the file and reset the Position, Rotation and Scale values.
OK
Accept the settings and close the dialog.
Cancel
Discard the settings and close the dialog.
Apply
Review the changes directly in the 3D View window while in Preview
mode.
The Force data branch of the options tree contains settings for the force data
calculation on the installed force plates. For a correct force calculation the fol-
lowing settings for the force plate must be specified: the dimensions, the cal-
ibration factors, the gain and the location in the coordinate system of the
motion capture.
For information about how to use the force data see chapter "Force data cal-
culation" on page 703.
General settings
The setting Coordinate system for force/COP display and export defines in
what coordinate system the force data is displayed in the Data info window
and exported.
NOTE: The C3D export only includes analog data and force plate parameters, so it is not affected by this setting.
The default value is Local (Force plate), which means the local coordinate sys-
tem of the force plate with the origin in the middle of the force plate.
Under the Force plates heading on the Force data page you manage the force plates for which the force data will be calculated. This applies to all types of integrated force plates, namely digitally integrated force plates and instrumented treadmills, and force plates connected via an analog board.
Use the Add plate option to add a new force plate to the list. Right-click on the force plate to open a menu where you can Change name and Remove plate. The AMTI digital plates are created automatically and cannot be removed. They can, however, be renamed; the same name is then used on the Input Devices page.
To enable force plates in QTM, select the check box next to the force plate
name in the Calculate force column.
Select a force plate and click Edit plate to open the settings for that plate, or
double-click on it. The available settings depend on the force plate type, see
chapter "Force plate settings" on the next page. For more information about
force plates see chapter "How to use force plates" on page 756.
To remove a force plate from the list, select it and click the Remove Plate but-
ton.
The Define Plates button is used to automatically define force plates for spe-
cific digital integrations, e.g., Arsalis, Gaitway-3D, Bertec, Kistler, or any other
custom QDevice integrations.
NOTE: The force plates that are activated will be shown in the 3D view
window even if there is no analog data.
The settings for each force plate are found under the corresponding page. For
example, settings for force plate 1 on the Force plate 1 page.
Under the Force plate type heading there is a drop-down box for the force plate type. The available force plate types depend on which force plate integrations you have installed.
The Calibration and Settings options depend on the force plate type. For the
digital integrated force plates the type cannot be changed and the Calibration
and Settings options are controlled automatically. For more information about
the digital integrations see chapter "Digital force plate integrations" on
page 756.
For analog integrated force plates, the Calibration and Settings options
depend on the force plate type. In addition, generic settings are available for
connecting custom analog force plates or for force data imported from C3D
files. For a detailed description, see the below chapters.
The following settings apply to AMTI analog force plates with 6 analog output
channels.
There are four settings on the dialog: Analog board, Channel, Excitation
Voltage and Gain, see example in image above.
Select the analog board where the force plate is connected from the Analog
board drop-down list.
With Channel each signal is combined with its respective analog channel.
NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.
The Gain is set to the gain of each channel in the AMTI amplifier.
NOTE: The drop-down list gives you standard values, but you can
also type any value for the setting.
For more information about these settings see the manual of the AMTI force
plate.
The following settings apply to AMTI portable analog force plates with 8 analog
output channels.
Calibration matrix
The Calibration matrix is used to calibrate the force plate. Enter the values of
the calibration matrix (in SI-units) under the Calibration matrix heading. The
values can be found in the manual of the AMTI force plate or can be loaded
with Load from file from the diskette, which is attached to the manual of the
AMTI portable force plate.
NOTE: The file contains the Sensitivity matrix, which is then converted to its inverse when imported to QTM.
Then associate each signal from the force plate with its respective analog
channel with the Channel settings.
NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.
For more information about the signals see the manual of the AMTI portable
force plate.
Dimensions
Under the Force plate dimensions heading you should enter the dimensions
parameters for the Bertec force plate. The parameters can be found in the user
manual of the Bertec force plate.
Calibration matrix
The Calibration matrix is used to calibrate the force plate. Enter the values of
the calibration matrix, which is found in the manual of the Bertec force plate,
under the Calibration matrix heading. Bertec often just supplies the six values
of the diagonal in the matrix. The values can also be loaded with Load from
file from the diskette, which is attached to the manual of the Bertec force plate.
There are three settings on the dialog: Analog board, Channel and Gain, see
example in image above.
Select the analog board where the force plate is connected from the Ana-
log board drop-down list.
With Channel each signal is combined with its respective analog channel.
NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.
The Gain is set to the gain of each channel as selected on the Bertec amp-
lifier.
For more information about these settings see the manual of the Bertec force
plate.
The following settings apply to Kistler analog force plates, and the legacy Kistler
DAQ Type 5698A/B integration (deprecated in QTM 2024.1).
For information on how to connect a Kistler force plate see chapter "Con-
necting Kistler force plates" on page 790.
Select the Kistler force plate type and click Calibration under the Force plate
type heading to go to the Kistler force plate calibration parameters dialog.
NOTE: Enter the absolute values of the parameters, that is for the h
value write 45 when the Kistler manual says -45.
NOTE: The coefficients are force plate model (type number) specific. If in
doubt, please contact Kistler.
NOTE: The coefficients apply only if the force plate is mounted on a rigid
foundation according to Kistler specifications.
NOTE: If you update from a version earlier than QTM 2.7 all of the
ranges will get the value of the currently entered range. Enter the correct
values for all of them so that you can switch ranges more easily.
l How to find the scaling factors differs between internal and external amp-
lifiers.
Internal amplifier
For a Kistler force plate with internal amplifier enter all of the scaling
factors that are found in the calibration certificate matrix of each
force plate.
Range 4 - Range 1
You can select the range manually if you don't use the analog board
to control the force plate ranges. Range 4 will give you the maximum
force range and range 1 will give you the highest sensitivity, i.e. minimum force range. It is important to use the same range as is used by the force plate to get correct forces; you can use different ranges for the XY and Z settings.
First you must select the analog board where the force plate is connected
from the Analog board drop-down list.
Then associate each signal from the force plate with its respective analog
channel with the Channel settings.
For more information about these settings see the manual of the Kistler force
plate.
Select the Generic 6 ch (c3d type-1) force plate type and click Calibration under
the Force plate type heading to go to the Force plate calibration para-
meters dialog. It contains the settings for dimensions of the Generic 6 ch (c3d
type-1) force plate.
NOTE: The main use of the Generic 6 ch (c3d type-1) force plate is for
handling of imported c3d files. It is recommended to use the vendor spe-
cific force plate types when capturing data.
NOTE: For information about c3d type-1 plates refer to the c3d.org web-
site.
First you must select the analog board where the force plate is connected
from the Analog board drop-down list.
Then associate each signal from the force plate with its respective analog
channel with the Channel settings.
NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.
Select the Generic 6 ch (c3d type-2) force plate type and click Calibration under
the Force plate type heading to go to the Force plate calibration para-
meters dialog. It contains the settings for dimensions of the Generic 6 ch (c3d
type-2) force plate.
NOTE: The main use of the Generic 6 ch (c3d type-2) force plate is for
handling of imported c3d files. It is recommended to use the vendor spe-
cific force plate types when capturing data.
NOTE: For information about c3d type-2 plates refer to the c3d.org web-
site.
First you must select the analog board where the force plate is connected
from the Analog board drop-down list.
Then associate each signal from the force plate with its respective analog
channel with the Channel settings.
NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.
For more information about the signals see the manual of the force plate.
Select the Generic 8 ch (c3d type-3) force plate type and click Calibration under
the Force plate type heading to go to the Type 3 Force plate calibration
parameters dialog. It contains the settings for dimensions of the Generic 8 ch
(c3d type-3) force plate.
NOTE: The main use of the Generic 8 ch (c3d type-3) force plate is for
handling of imported c3d files. It is recommended to use the vendor spe-
cific force plate types when capturing data.
NOTE: For information about c3d type-3 plates refer to the c3d.org web-
site.
First you must select the analog board where the force plate is connected
from the Analog board drop-down list.
Then associate each signal from the force plate with its respective analog
channel with the Channel settings.
For more information about the signals see the manual of the force plate.
Select the Generic 6 ch with matrix (c3d type-4) force plate type and click Cal-
ibration under the Force plate type heading to go to the Force plate cal-
ibration parameters dialog. It contains the settings for dimensions of the
Generic 6 ch with matrix (c3d type-4) force plate.
NOTE: The main use of the Generic 6 ch with matrix (c3d type-4) force
plate is for handling of imported c3d files. It is recommended to use the
vendor specific force plate types when capturing data.
NOTE: For information about c3d type-4 plates refer to the c3d.org web-
site.
Calibration matrix
The Calibration matrix is used to calibrate the force plate. Enter the values of
the calibration matrix (M) (in SI-units) under the Calibration matrix heading.
The values can be found in the manual of the force plate. The calibration matrix must fit the formula F = M * A.
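The formula can be pictured with the following Python sketch (an illustration only; the matrix and analog values are made-up examples, and a real calibration matrix is generally not diagonal):

import numpy as np

# Minimal sketch of the type-4 calibration formula F = M * A: M is the 6x6
# calibration matrix in SI units and A is the vector of the six analog
# signals, giving the forces and moments F.
M = np.diag([500.0, 500.0, 1000.0, 300.0, 300.0, 150.0])  # example matrix
A = np.array([0.10, -0.05, 0.80, 0.01, 0.02, -0.01])      # example analog sample (V)

F = M @ A
print(F)  # [Fx, Fy, Fz, Mx, My, Mz]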
First you must select the analog board where the force plate is connected
from the Analog board drop-down list.
NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.
For more information about the signals see the manual of the force plate.
To be able to view the force vectors in the same coordinate system as the
motion capture, the location of the force plate must be specified. The settings
are the same for all force plate types and are done under the Force plate loc-
ation heading on the Force plate page. The force plate location will be visu-
alized as a purple square in the 3D view window.
The three buttons have the following functions:
Generate
Automatically generate the force plate location from a capture of mark-
ers at the corners of the force plate, see chapter "Generate force plate
location from a capture" below.
Use default
Use default force plate location. This method is only available for the
Gaitway-3D instrumented treadmill.
View/Edit
Manual review and possibility to edit force plate corner positions, see
chapter "Manual revision and specification of force plate location" on
page 386.
For automatic generation of the force plate location, place a marker on top of
the four corners of the force plate. The markers do not need to be placed
exactly on top of the corners, however, it is important that the markers are
placed symmetrically for a correct estimation of the center of the force plate.
Follow these steps for automatic generation of the force plate location:
1. Make a capture with the markers placed on the four corners of the force plate.
2. Identify the markers, you can give them any label, and keep the capture file open.
3. Open the Force plate page and click Generate. QTM tries to identify the
corners of the force plate by comparing them to the width and length of
the plate (as entered in the Force plate calibration parameters dialog).
A dialog box like the one below is displayed:
4. Click OK to open Load measured force plate location dialog, see below,
and select one of the solutions found by QTM. Click Cancel to select the
markers’ locations manually, see further down. Try to make sure that the
orientation is correct. It is recommended to make a test measurement
after the location process to see that the orientation is correct. If the force
arrow is pointing downwards, you can use the Rotate 180 degrees option
in the Force plate location dialog, shown below.
To manually specify the location click View/Edit. Enter the X, Y and Z coordin-
ate (in mm) of the four corners of the force plate. The coordinates should be in
the coordinate system of the motion capture (lab coordinates).
Make sure that the orientation is correct. Use the internal coordinate system of
the force plate, shown in the dialog, to determine the orientation of the force
plate corners. Most force plates have the positive y axis in the direction away
from the connectors of the force plate.
The Center Of Pressure (COP) is the position of the center of the pressure on
the force plate. It is calculated by QTM from the data of the pressure sensors in
the corners of the force plate.
The COP (Center Of Pressure) threshold heading on the Force plate page
contains settings for the COP threshold. When the Z component of the force is
below the threshold level, the force vector will not be shown in QTM. The COP will still be calculated, though, and is accessible in the file.
Select the Activate check box to activate the COP threshold filter. Enter the Z
axis COP threshold level in Newton to disable the visualization of the force
vector and COP. The Z axis is in the force plate coordinate system, which means
that it is always the vertical force which is used in the filter. This is because the
horizontal forces can be very small but still correct.
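The threshold behavior can be pictured with the following Python sketch (an illustration only; the threshold and forces are made-up examples):

# Minimal sketch of the COP threshold: the force vector and COP are hidden
# in the views when the vertical (Z) force in the force plate coordinate
# system is below the threshold, but the COP is still calculated and stored.
COP_THRESHOLD_N = 20.0  # example threshold

def show_force_vector(force_z_newton):
    return force_z_newton >= COP_THRESHOLD_N

print(show_force_vector(350.0))  # True, subject standing on the plate
print(show_force_vector(5.0))    # False, plate essentially unloaded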
At the bottom of the Force plate page there is a Settings status window,
which shows the status of the settings of the force plate. It uses the same noti-
fication icons as under the Camera system settings heading on the Camera
system page. The settings that are shown are Dimensions, Scaling factors
(Calibration for the AMTI force plates), Location and Channels.
Real-Time output
With the real-time output function any tracked data (3D or 6DOF) and also ana-
log and force data can be sent to another computer via TCP/IP. The RT server is
always running so that a program can acquire the RT data.
The settings that can be changed for the RT output are the TCP/IP and OSC port numbers. Uncheck the option Use default port numbers to set other port numbers. The first four ports are grouped together, so that their port numbers are changed with the Base Port number, which by default is 22222.
The Capture Broadcast Port (default 8989) is used for receiving and broad-
casting UDP start and stop packets. For more information, see chapter "Wire-
less/software Trigger" on page 267.
The 6DOF analog export page is only available if an analog output board (PCI-
DAC6703) is installed in the measurement computer. With the analog export
the information about 6DOF bodies’ positions can be used in feedback to an
analog control system. Select the Enable analog 6DOF output option to
enable the output of the board. The output will continue as long as the option
is selected and the 6DOF body is being tracked.
When clicking on Add value or Edit value the Analog channels settings dia-
log is displayed. In the dialog the following settings can be set:
Signal
The data in QTM that is used for the output signal. For each 6DOF body on
the 6DOF bodies page there are seven available outputs: X, Y, Z, Roll,
Pitch, Yaw and Data available. Data available shows whether the 6DOF
body is visible or not.
NOTE: The rotation angles will change if you change the Euler
angles definitions.
Channel
The channel on the analog output board that will be used for the signal.
Each channel can only have one signal assigned to it.
NOTE: For the three rotation angles the maximum input ranges
depend on the ranges of the respective angle.
NOTE: Data available has two positions, Available and Not available, instead of the input and output settings. Set the value in V which will be output on the channel depending on whether the 6DOF body is seen or not.
Test output
In the dialog four tests can be performed to test the output of the channels:
Voltage
The outputs of all channels are set to the specified voltage.
In the Analog output range calibration dialog the range of the specific board
is entered to calibrate the output of the channels. The maximum and minimum
values can be measured with the Test output option.
On the Euler angles page you can change the definition of the Euler angles
used in QTM. The definition applies to all places in QTM where Euler angles are
used, displayed or exported, including:
l Rotation of the global coordinate system
By default QTM uses the Qualisys standard, which is described in the chapter
"Rotation angles in QTM" on page 663. The definition can also be seen when
Qualisys standard is selected as the grayed settings under the Definition of
rotation axes heading, see screen dump above.
Use the Custom setting if you want another definition of the rotation angles.
Then, define the rotation angles by choosing the type of Euler axes and set the
rotation order and angle conventions, as described below.
Definition of custom rotation axes
NOTE: For rigid bodies, the reference system may be different from the
global coordinate system, dependent on the chosen reference coordinate
system, see chapter "Coordinate system for rigid body data" on page 354.
Define the rotation order and angle conventions for the custom Euler angle
definition.
First rotation axis
Define the first rotation axis. This can be any of the three axes.
Angle range
Select the angle range of the first axis. It can be either -180° to 180°
or 0° to 360°.
Positive rotation
For each axis you can define the direction of Positive rotation. It can be
either Clockwise or Counterclockwise when seen along the positive dir-
ection of the axis.
Name
For each axis you can set a new name. This name will then be used every-
where in QTM.
1. The rotation angles are always applied with the first rotation first and
so on.
2. The rotation angles have limitations to the ranges of the angles. The first and third angles are always defined between -180° and 180° or 0° to 360°. The range of the middle angle depends on the Euler angle definition: if all three rotation axes are different, the middle angle is defined between -90° and 90°, and if the first and third axes are the same, the middle angle is defined between 0° and 180°.
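The effect of the rotation order can be pictured with the following Python sketch (an illustration only; SciPy is not part of QTM and the angles are made-up examples):

from scipy.spatial.transform import Rotation

# Minimal sketch: the first rotation is applied first, so the same three
# angles composed in different orders give different orientations.
angles = [30.0, 20.0, 10.0]  # degrees

r_xyz = Rotation.from_euler("xyz", angles, degrees=True)  # about X, then Y, then Z
r_zyx = Rotation.from_euler("zyx", angles, degrees=True)  # about Z, then Y, then X

print(r_xyz.as_matrix())
print(r_zyx.as_matrix())  # a different matrix, i.e. a different Euler definition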
On the TSV export page there are settings for the TSV export. For information
about the TSV export see chapter "Export to TSV format" on page 711.
Data type
Under the Data type to export headings you can select which data types to
export by checking the items. Each data type is exported as a separate file with
a suffix added to the file name. The following types of motion data can be selec-
ted:
2D data
Export the 2D data of the capture file.
3D data
Export the 3D data of the capture file.
6D data
Export the 6DOF data of the capture file.
Analog data
Export the analog data of the capture file.
Skeleton data
Export skeleton data of the capture file.
General settings
The settings under the General export settings heading are applied to all TSV
exports. The settings are:
Include TSV header
Include the TSV header with information about the capture file in the
exported file.
Include events
Include the events in the 2D, 3D or 6DOF TSV file.
The setting under the 2D Settings heading is only applied to 2D data. The
Export linearized 2D data setting is enabled by default. Disable the setting to
export raw unlinearized 2D data.
3D Settings
The settings under the 3D Settings heading are only applied to 3D data. The
settings are:
Include type information per frame
Include a column for each trajectory with the trajectory type per frame.
Exclude non-full frames from beginning and end where any of the
labeled trajectories are not found
Exclude completely empty frames of the labeled trajectories, in the begin-
ning and the end of the measurement.
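As a hedged example, an exported 3D TSV file can be read in Python as below. The file name, the number of header lines and the pandas dependency are assumptions; the actual header length depends on the Include TSV header setting and the file content.

# Sketch: load an exported 3D TSV file with pandas (assumed installed).
import pandas as pd

# 'skiprows' is a placeholder; adjust it to the number of header lines in your file.
data = pd.read_csv("capture_3d.tsv", sep="\t", skiprows=10)
print(data.columns[:6])  # first few column names, e.g. the X/Y/Z columns of a marker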
On the C3D export page there are settings for the C3D export. For information
about C3D export, see chapter "Export to C3D format" on page 727.
3D Data
The settings under the 3D Data heading are only applied to 3D data. The set-
tings are:
Exclude unidentified trajectories
Exclude unidentified trajectories from the exported file. If unidentified tra-
jectories are included they will not have a name in the C3D file.
Exclude non-full frames from beginning and end where any of the
labeled trajectories are not found
Exclude completely empty frames of the labeled trajectories, in the begin-
ning and the end of the measurement. This setting overrides the selected
measurement range.
Under the Label Format heading you can change the format of the C3D file
with the following two settings:
Following the C3D.org specification (short label)
Use the C3D.org specification, which uses short labels.
Under the Event Output Format heading you can change the format of the
C3D file with the following two settings:
Following the C3D.org specification (use original start time)
Use the C3D.org specification, which uses original start time. Default
value and required if using the C3D file in Visual3D 2020.8.3 or later.
1. Trajectory prefixes (text before first underscore) are collected for all non-
rigid-body markers. Each unique prefix is associated with a SUBJECT (e.g.
skeletons).
2. For each enabled rigid body, the longest common prefix of its point
labels is extracted and associated with a SUBJECT.
The option is disabled by default to allow for using underscore in labels. When
the option is enabled, make sure that all labels start with an actual prefix.
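The grouping rule described above can be illustrated with the following Python sketch. The labels are hypothetical and the code is only an illustration of the rule, not the actual export implementation.

# Sketch: derive SUBJECT prefixes as described above.
import os

labels = ["Anna_LANK", "Anna_RANK", "Bertil_LANK"]       # hypothetical trajectory labels
subjects = {label.split("_", 1)[0] for label in labels}  # text before the first underscore
print(subjects)                                          # {'Anna', 'Bertil'}

rigid_body_labels = ["Table - 1", "Table - 2", "Table - 3"]  # hypothetical rigid body point labels
print(os.path.commonprefix(rigid_body_labels))               # longest common prefix: 'Table - '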
Units
Specify the length unit used for the C3D export. The alternatives are meters
(m), centimeters (cm) and millimeters (mm, default). This setting applies to all
length data, including 3D trajectories and force plate positions.
The Matlab file export page contains settings for the export to MAT files. For
information about MAT file export see chapter "Export to MAT format" on
page 729. Select which data types to export with the Data type to export
options.
6D data
Include 6DOF data.
Skeleton data
Include skeleton data. Use the Skeleton Data Reference Frame option
to specify if the skeleton data is exported in global or local coordinates.
Analog data
All of the analog data will be included, both data from analog boards and
EMG.
Force data
Include force data.
Eye Tracker
Include eye tracker data.
Timecodes
Include timestamps for each camera frame, according to Timestamp set-
tings.
Events
Include events.
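A hedged sketch of inspecting the exported MAT file in Python is shown below. The file name is a placeholder and the exact variable and field names depend on the selected data types.

# Sketch: list the variables in an exported MAT file with SciPy (assumed installed).
from scipy.io import loadmat

mat = loadmat("capture.mat")  # hypothetical file name
print([key for key in mat.keys() if not key.startswith("__")])  # top-level variables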
The AVI Export page contains settings for the export of 3D or 2D views to AVI
files. The export can be done either as a processing step or manually from the
File/Export menu. For more information see chapter "Export to AVI file" on
page 739.
NOTE: If you export an AVI file manually the current setting of the pro-
cessing step will also change. Therefore it is recommended that you save
the View that you want to use in a processing step so that you can return
to that setting.
Window settings
The Window settings control the input to the AVI file, i.e. the view that is used
for the export. The settings consist of a list of active and saved views (3D or
2D). Select the view that you want to use in an export, then you set the output
with the Video settings below. That view will then be saved as Previous set-
tings in the list and used in the export until you select another view. It is important to notice that if you make an AVI export from the File menu, the new settings will also be used by the AVI export processing step.
Type
The type is either 3D or 2D. A 2D view that only displays video images is still a 2D type of view. Then the type can be either Active or Saved.
NOTE: If you are saving an Oqus video to an AVI file, e.g. with a 3D overlay, then the video image will be linearized, i.e. the same parameters that are used to correct the marker data are applied to the video data to correct for the lens distortion. Therefore the pixels will not match exactly with the original video image. The linearization can be turned off with the Show linearized data option on the 2D view settings page in Project options.
Size
The size is the x and y size of the view in pixels.
View count
The View count displays the number of cameras displayed in a 2D view. A number within parentheses means that that number of views have a 3D overlay.
2D view borders
The 2D view borders option toggles whether the borders around a camera view are included in the AVI export.
Save View
The Save View option saves the selected view so that you can use it later.
The view is then copied and the type changed to Saved. The name can be
changed directly in the Name column.
Video settings
The Video settings control the output of the AVI export. The following options
are available:
Width
The width of the video in number of pixels. The width can also be changed
by resizing the Preview window.
Height
The height of the video in number of pixels. The height can also be
changed by resizing the Preview window.
Frame rate
The frame rate in Hz of the video file. The rate is down-sampled if you use
a frame rate lower than the marker or video capture in the view. I.e. if you
enter 30 Hz and the file is captured at 120 Hz, then the video file will con-
tain every fourth frame.
Playback speed
The Playback speed option controls the speed of the file in % of the ori-
ginal speed, so that the file can for example be played in slow motion. E.g.
if you have a file captured at 120 Hz and a Frame rate for the AVI export
of 30 Hz, then you can use a Playback speed of 25% to get all of the cap-
tured frames in the video file at a quarter of the original speed.
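The relation used in the example above can be written as a small calculation. This is only an illustration of the arithmetic, not a QTM function.

# Sketch: playback speed (%) needed to keep all captured frames
# when exporting at a lower frame rate than the capture rate.
capture_rate = 120.0  # Hz, capture frequency of the file
export_rate = 30.0    # Hz, Frame rate setting of the AVI export
playback_speed = 100.0 * export_rate / capture_rate
print(playback_speed)  # 25.0 -> play the file at 25% of the original speed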
Preset
The Preset option is for using standard video output in the 16:9 format.
The available presets are:
Custom
The Preset option is set to Custom as soon as any of the Video set-
tings does not match the other three presets.
480p - 30 Hz
The video output is set to 480p at 30 Hz. This means a size of
854*480 pixels and using progressive scanning.
720p - 30 Hz
The video output is set to 720p at 30 Hz. This means a size of
1280*720 pixels and using progressive scanning.
1080p - 30 Hz
The video output is set to 1080p at 30 Hz. This means a size of
1920*1080 pixels and using progressive scanning.
Codec
Use a codec to reduce the file size of the video. For information about
recommended codecs, see chapter "Recommended codecs" on page 583.
Quality
For some codecs you can set the Quality directly without opening
the settings.
Preview
Display a preview of how the exported window will look, showing the
data of the current frame. The size of the video window can be changed
by resizing the preview window. However, if the Use window dimensions
setting is active then the size is locked.
Marker/Video frequency
The marker and video frequency of the current camera settings are dis-
played at the bottom of the page. The video frequency that is displayed is
only the highest of the used frequencies.
If you are exporting an AVI file from a QTM file then the marker/video fre-
quency displays the frequencies of that file.
The settings of the previous export are always saved, which means that if you
change settings for an export from the File menu, those settings will also be
used for an export directly after a file capture. The settings are not saved if you
use the Export view to AVI option on the 3D and 2D view window menus.
On the FBX export page there are settings for the FBX export. For information
about the FBX export, see chapter "Export to FBX file" on page 742.
The following settings are available for FBX export.
File type
File type
Choose if the exported data is in ASCII format or in binary format.
Export a separate file for each skeleton and rigid body
Check this option to export a separate file for each skeleton and
rigid body. The name of the skeleton or rigid body is added to the
file name as a suffix, separated by an underscore.
Exported data
Opticals
Export labeled trajectory data.
Actors (MotionBuilder)
Export Actor(s) that can be used for IK solving in MotionBuilder. See
MotionBuilder documentation for more information. Requires that
Rigid bodies
Export 6DOF data from rigid bodies as single segment models.
End bones
Enable or disable the export of an end bone for the exported rigid
bodies (checked by default). The end bone can be included for better
visualization in animation software.
Skeletons
Export skeleton data.
The data is scaled according to the Scale factor used in the Skeleton
solver settings of the file. To change the Scale factor, you need to repro-
cess the file with the modified skeleton solver settings, see chapter
"Reprocessing a file" on page 601.
Characters
Export Character(s) that can be used for retargeting in third party
software. A character provides a mapping from QTM's skeleton struc-
ture to a predefined structure, defining which segment is the head,
the legs, the arms and so on. This in turn can be used to drive 3D
models with a slightly different skeleton structure. Requires that the
Skeletons option is enabled.
Root naming
Select the root name used for skeletons. The options are Reference
(default) and root. Alternatively, the user can specify a custom name.
Cameras
Export poses of calibrated cameras.
Timecode
If enabled, SMPTE timecodes (when available) will be used as timestamps
for any exported data.
Naming convention
Select the naming convention of labels for the export. The options are:
Maya: labels are converted if needed so that they do not contain spaces
or other symbols that are not supported in Maya.
JSON export
On the JSON export page there are settings for the JSON export. For more
information about the JSON export, see chapter "Export to JSON file" on
page 743.
The following data types can be exported.
3D data
All of the 3D data will be included, both labeled and unidentified.
6D data
Include 6DOF data.
Analog data
All of the analog data will be included, both data from analog boards and
EMG.
Timestamps
Include timestamps for each camera frame, according to Timestamp set-
tings.
Skeleton data
Include skeleton data. Use the Reference Frame option to specify if the
skeleton data is exported in global or local coordinates.
Events
Include events.
Camera information
Include information about the cameras.
TRC export
STO export
On the STO export page there are settings for the STO export. For more inform-
ation about the STO export, see chapter "Export to STO file" on page 745.
Start an external program using the command "Action Argument(s)". This can
be useful for further automatic processing of an exported file.
Action
Specify the program to be started.
Argument(s)
Specify additional arguments to be added to the command. The following
parameters are available:
$p: Path of the current capture
TIP: To open a capture in Excel, make sure that Export to TSV file is
selected and use the Action and Argument(s) fields as below.
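For example, the fields could be filled in as follows. The path to the Excel executable is hypothetical and depends on your Office installation.

Action: C:\Program Files\Microsoft Office\root\Office16\EXCEL.EXE
Argument(s): "$p"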
The GUI page contains display related settings. All of the GUI settings are
applied both in RT and on files. On the GUI page there are settings for the
Screen update and for Close previous measurement file. Under the GUI
main page there are pages for 2D View Settings, 3D View Settings and Static
Mesh Objects.
The Screen update options can be used to optimize the performance of the
computer. Most of the time 15 Hz is enough for the screen update as the eye
cannot register movements faster than 25 Hz.
Real-Time mode
Set the frequency for the screen update during real time (preview).
Uncheck the check-box next to the setting to disable the GUI during pre-
view. The screen update rate for Video can be set independently.
Capturing
Set the frequency for the screen update during capturing. Uncheck the check-box next to the setting to disable the GUI during capture.
File playback
Set the frequency for the screen update during file playback.
Status bar
Set the frequency for the update of the numbers in the status bar.
The 2D/3D view presets options are used to control and save presets for the
2D and 3D GUI.
Preset drop-box
Select the preset from the Preset drop-box. There are two standard pre-
sets: QTM default and QTM High Contrast. These can be used to restore the default settings or to change the colors to improve the visibility in sunlight, respectively.
Then there are 8 user presets that can be saved with any 2D view and 3D
view settings.
Save
To save a preset select one of the 8 user presets in the Preset drop-box
and then click on Save. All of the settings on the 2D view settings and 3D
view settings pages are saved to the preset.
Rename
Rename the currently selected preset.
The Plot options are used to control some plot window settings.
Default Real-Time Plot Size
Default data buffer duration when plotting data in Preview mode, used
when creating a new plot or applying a Windows layout while in Preview.
The Measurement file close options are used to control how QTM closes files
automatically.
When opening another file
The current file is closed when opening another file (Open in the File
menu).
Use the Reset hidden message boxes button to display all message boxes
that have been hidden. E.g. the message box about the force plate position
after a calibration.
2D view settings
The 2D view settings page contains settings for objects that are shown in the 2D view window. The settings are saved with the project and are applied both in the RT/preview and on an opened file. You can use the Reset settings button if
you want to reset all of the settings to default.
Show linearized data
Toggle whether linearization is applied to the marker and video data in
the 2D views. The unlinearized data is the original data, while the linearized data is corrected for the lens distortion.
NOTE: Marker masks and the red rectangle representing the image
size are not drawn linearized. This means that with a wide angle lens it
is best to turn off the Show linearized data option to see the true
positions of the mask and image size.
Background color
Select the Background color of the marker views.
Marker color
Select the Marker color in the 2D views.
Marker display
All markers have the same size
Select whether the 2D markers are displayed with their actual size or
with the same size.
Marker size
Select the marker size in subpixels when All markers have the
same size is set to Yes.
3D overlay elements
Select the 3D elements that will be displayed in 3D overlay. The following
elements can be controlled in the 3D overlay.
Markers, Marker traces, Bones, Grid, Axes, Cameras, Force
plates, Force arrow, Force trace, Rigid bodies, Volumes, Bound-
ing box, Gaze vector, Gaze vector trace, Skeletons, Static mesh
objects.
3D view settings
The 3D view settings page contains settings for objects that are shown in the 3D view window and also for how to operate the 3D view. The settings are saved with the project and are applied both in the RT/preview and on an opened file.
You can use the Reset settings button if you want to reset all of the settings to
default. Only the more complex settings are explained below, for explanation
of the other settings please check the description in the Project options dia-
log.
Axes
Display settings for the global axes in the 3D view.
Show axes, Length of the axis [mm]
Background
Display setting for the background in the 3D view.
Background color
Bones
Display settings for the bones in the 3D view.
Show bones, Show AIM structure bones, Bone thickness [mm],
Bone default color
The Show AIM structure bones option toggles the display of the
AIM structure bones used to create an AIM model.
Cameras
Display settings for the cameras in the 3D view.
Show cameras, Show camera IDs
Force vector
Display settings for the force vector in the 3D view.
Show force vector, Force vector color, Scale factor [mm/N],
Show force trace
The Scale factor option sets the size of the force arrow in relation
to the force in N.
Activate the force trace, also called force butterfly or Pedotti diagram, with the Show force trace option.
Force plates
Display settings for the force plate in the 3D view.
Show force plates, Force plate color,
Gaze vector
Display settings for the Gaze vector of eye-trackers.
Show gaze vector, Gaze vector color, Gaze vector length [mm],
Show gaze vector trace and Gaze vector trace color.
Grid
Display settings for the grid in the 3D view.
Show grid, Show grid measure, Grid color, Automatic size, Grid
length [m], Grid width [m], Vertical offset [mm], Distance
between each gridline [mm], Number of subdivisions, Thicker
lines on outside and center
The Automatic size option will make the grid approximately the
same size as the lab area, i.e. the size is set by the camera positions.
Markers
Display settings for the markers in the 3D view.
Default unidentified marker color, Default labeled marker
color, Use global marker size setting, Marker size (mm), Show
marker labels, Show marker traces, Traces line style, Show tra-
jectory count, Show labeled trajectory information, Show
unidentified markers
The Use global marker size setting option will decide whether to
use the same marker size on all files or individual settings for each
file. Disable the option to use the marker size that was used when a
file was saved.
The Show trajectory count option will display the number of selec-
ted trajectories and the total number of trajectories in the current
frame at the bottom right corner of the 3D view window.
OpenGL
OpenGL format selection mode
The default values will give the best quality of graphics. However if
there are any problems with the graphics first try to disable the anti-
aliasing, then try one of the other modes: Use options below or
Use explicit index.
Rays
Display settings for camera tracking rays in the 3D view.
Enable camera rays
Show/hide camera rays
Rigid bodies
Display settings for the rigid bodies in the 3D view.
Skeletons
Display settings for skeletons in the 3D view.
Show skeletons, Show skeleton labels, Skeleton color, Skeleton
thickness, Show segment labels, Show segment coordination
axes, Show segment markers, Show segment constraints, Con-
straint tolerance range [%], Segment constraint color
Volumes
Display settings for the display of volumes in the 3D view, for more inform-
ation see chapter "Volumes in 3D views" on page 125 and "Camera view
cones in 3D views" on page 128.
Show covered volume, Cut covered volume at floor level, Cam-
eras required to consider volume covered, Show calibrated
volume, Enable camera view cones, Length of camera view
cones [m], Show bounding box
The Static Mesh Objects page contains a list of static 3D mesh objects that are
shown in the 3D view window in the project. Mesh objects are Wavefront 3D
object files.
Static meshes in this list are associated with and saved in the project. The 3D
meshes are not stored in QTM files, which means that the whole project must
be shared to include the meshes when opening a file.
The static meshes for a project will be shown in the 3D view window inde-
pendent of the file or measurement being displayed. Use the Show static
mesh objects option on the 3D View Settings page to disable the display.
NOTE: Using really large meshes can slow down the rendering in QTM.
The buttons can be used to manage the list or access and edit the settings for
the objects:
NOTE: When adding a mesh, you may need to change the scale
factor since there is no standard for how to represent the size in the
.obj file.
Edit
Open the Static Mesh Settings dialog for inspecting or modifying the
Mesh Object settings.
Remove
Remove the selected mesh object from the list.
Static Mesh Settings dialog
The Static Mesh Settings dialog is used to inspect and modify the static mesh
object settings.
The dialog contains the following settings:
Filename
A drop-down list of mesh objects in the Meshes folder of the project. Copy
the .obj file, and any associated .mtl or image file, manually to the Meshes
folder to use the mesh in the project. The path to the Meshes folder can
be set on the Folder options page, see chapter "Folder options" on
page 427. For information about obj files and the features supported, see
chapter "Compatibility of meshes" on the next page.
Rotation
The 3D rotation of the object relative to the global coordinate system. The
values for Roll, Pitch and Yaw are specified in degrees around the X, Y
and Z axes, respectively.
Scale
The scale factor of the mesh object.
Opacity
The opacity of the mesh object from 0 (transparent) to 1 (opaque). An .obj
file can include an opacity option for individual faces; these will be
rendered as transparent even if the Opacity option is set to 1.
The buttons:
Reset
Remove the link to the file and reset the Position, Rotation and Scale
values.
OK
Accept the settings and close the dialog.
Cancel
Discard the settings and close the dialog.
Compatibility of meshes
A mesh file consists of many faces, and each individual face can be associated with materials and textures, but not all of the features applied to a face are rendered in QTM. Complex obj files may not be supported, in which case those faces and textures are not rendered. Some known limitations are:
l The material is only rendered for the front side of a face. If a material doesn't show, the reason may be that it is applied to the backside of the face.
l Only four faces can be included on one row in the .obj file. If there are more than four faces on a row, none of them are rendered.
Miscellaneous
Folder options
The Folder options page contains the settings for the locations of the following
file types saved by QTM.
The Project files locations are different for each project.
Calibration folder
The Calibrations folder is by default set to a folder called Calibrations in
the Project folder. It can be changed to another folder with the Browse
option, for example if you want all of the calibrations in different projects
to be saved in the same folder. However, then you must change it manu-
ally in each project.
Meshes folder
The Meshes folder is by default set to a folder called Meshes in the Pro-
ject folder. It can be changed to another folder with the Browse option,
for example if you want to use the same meshes in different projects.
However, then you must change it manually in each project.
The options on the Startup page change how projects are loaded when QTM
starts. The default is that no project is loaded automatically and that you have
to select a project from the Open project dialog, see chapter "Manage pro-
jects" on page 72.
However, if you want to load a project automatically, then activate the option
Automatically load a project when QTM starts and select one of the fol-
lowing two options.
The most recent project
Loads the project that was last opened in QTM.
This project
Loads a selected project when QTM starts. Select the project with the
Browse option.
The Events page contains settings for Event shortcuts. For information on
how to set and edit events see chapter "How to use events" on page 706.
Event shortcuts
The Event shortcuts options are for creating default event names that can be
easily accessed in the Add events dialog when creating an event, see chapter
"Adding events" on page 706. The shortcuts are enabled in the dialog with the
option Show shortcut list when creating events.
Use the Add Shortcut and Remove Shortcut buttons to add/remove shortcuts
to/from the list. The Event name and Color can be edited in the list.
The Scripting page contains the settings for the terminal and the use of script
files in QTM. For more information about the QTM Scripting Interface, see
chapter "QTM Scripting Interface" on page 1017.
Terminal settings
Language
Choose the language used in the terminal window. The choices are
Python or Lua.
Text color
Define the color of the text used in the terminal window.
Script files
The script files section contains a list of script files that are loaded when
starting the project. Use the check box to activate or deactivate the script
files.
Add
Open a file dialog for adding script files to the list. You can use mul-
tiple select for adding multiple scripts at once.
Remove
Remove a script file from the list.
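A minimal script file could look like the sketch below. It only uses standard Python; how scripts access QTM menus and data is described in the QTM Scripting Interface chapter and is not shown here.

# hello_script.py - minimal sketch of a script file loaded with the project.
def on_load():
    print("Script loaded into the QTM terminal.")

on_load()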
System hardware
Qualisys offers a variety of marker cameras for Arqus, Miqus and Oqus sys-
tems. For an overview and specifications of the various types and models, see
"Qualisys camera sensor specifications (marker mode)" on page 926.
The marker cameras calculate the positions and sizes of the detected markers
on the camera, which then are sent to the computer over the camera network.
This leads to a great reduction of information, which allows the cameras to be
connected in a daisy chain. The edges of the markers are detected by using an
intensity threshold. The center and size of the markers is calculated in sub-
pixels, a subdivision of each pixel in 64 units per dimension.
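For example, a coordinate reported in subpixels corresponds to the value divided by 64 in pixels. The values below are hypothetical.

# Sketch: convert a marker center from subpixels to pixels (64 units per pixel).
subpixel_x, subpixel_y = 32768, 16384
print(subpixel_x / 64, subpixel_y / 64)  # 512.0 256.0 pixels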
The following features are available for Qualisys marker cameras:
Sensor mode
Most marker cameras have the possibility to use different sensor modes.
For an overview of available sensor modes per camera model, see
"Qualisys camera sensor specifications (marker mode)" on page 926. The
available sensor modes are:
High-resolution mode: in high-resolution mode the full resolution
of the sensor is used. This is the default sensor mode.
Marker masking
Marker masking can be used to block marker detection at selected areas
of the sensor. For more information, see chapter "Marker masking" on
page 536.
Exposure delay
Exposure delay can be used to improve marker detection in setups with
opposing cameras by shifting the exposure of one or more cameras rel-
ative to other cameras in the system. For more information, see chapter
"Delayed exposure to reduce reflections from other cameras" on
page 534.
Active filtering
Active filtering can be used to improve marker detection in environments
with high background illumination. Active filtering works by capturing
each image twice to remove the background light. This significantly
increases the ability to capture passive markers in an outdoor envir-
onment. Active filtering is supported by all camera types, except the Oqus
3 and 5-series cameras. For more information, see chapter "Active fil-
tering for capturing outdoors" on page 539.
Marker types
All marker cameras support the use of different types of markers, namely
passive and several types of active markers. For more information, see
chapter "Choice of markers" on page 529.
Video mode
All marker cameras can be used in video mode. This can be helpful, for
example for pointing the cameras when setting up the system. Most
l A hybrid camera that can be used both as a marker camera and a color video
camera, see chapter "Miqus Hybrid" on the next page.
For more information on how to use Qualisys video cameras, see chapter
"Qualisys video capture" on page 574.
Streaming video
Streaming video can be recorded with Miqus Video, Miqus Video Plus, or the
Oqus color video camera (Oqus 2c). When using streaming video, compressed
video data is sent to QTM during the capture, allowing for long video captures.
For an overview of the sensor specifications of all cameras supporting stream-
ing video, see chapter "Qualisys video sensor specifications (in-camera MJPEG)"
on page 927.
For more information on capturing streaming video, see chapter "Capture
streaming video" on page 576.
Miqus Video
The Miqus Video series is a dedicated video camera for capturing of MJPEG com-
pressed video. The following types of video cameras are available: Miqus VM
(monochrome), Miqus VC and Miqus VC+ (color). The Miqus Hybrid camera can
also be used as a color video camera. The Miqus Video series is configured with
a white strobe and a built-in filter that blocks the IR light for a better image
quality. The camera is always configured for video using In-camera MJPEG com-
pression for effective streaming. For information about maximum video
NOTE: On newer Miqus Video cameras, one of the LEDs in the strobe is
infrared, so that it can be used to trigger the active calibration kit in a
video-only system.
The Oqus color video camera (2c-series) is a dedicated video camera for cap-
turing of MJPEG compressed video. It is configured with a clear glass to let in
the visible light. There is a filter on the lens that filters out the IR light for a better image, and the strobe is white.
The camera is always configured for capturing streaming video (MJPEG) by
default. When switching off the In-camera MJPEG compression, it can also be
used as a high-speed video camera. For more information, see chapter "Cap-
ture high-speed video" on page 579.
Miqus Hybrid
The Miqus Hybrid camera is a two-in-one camera which can be used for both
marker tracking and color video recording. The dual functionality of the camera
makes it especially useful for the exploration of markerless applications.
In Marker mode, the camera uses the near IR strobe (850 nm) for the illu-
mination of markers. The resolution and frame rate are similar to the Miqus M3
camera.
In Video mode, the specifications are similar to the Miqus Color Video camera.
Qualisys provides weather protected cameras that are specially adapted for out-
door use or use in industrial environments. The housing, including connectors
and cabling, is IP67/NEMA 6 classified, making the whole system water- and
dust resistant.
Underwater systems
Qualisys provides the possibility to measure under water (e.g. in indoor ocean basins used for ship scale model testing or ordinary basins used for water
sports) by using specially modified cameras. The camera housing is IP68 clas-
sified and pressure tested to 5 bar (40 m depth) and corrosion-protected for
use in salt water tanks or chlorinated swimming pools. Weight and volume are
balanced to give the camera neutral buoyancy for easy handling in water.
Qualisys underwater cameras are equipped with a special strobe with high
power cyan LEDs. These LEDs are not limited to a flash time of 10% of the period time like the regular strobe. Therefore the exposure time can be set to almost the full period time. The long exposure times are needed to get enough light in the water. Because the water absorbs more light than air, the measurement distance is also more dependent on the exposure time.
The cameras are connected with underwater cables to a connector unit on
land. One connector unit can be used to connect up to three cameras. For
more than three cameras the respective connector boxes can be chained
together or connected via an Ethernet switch.
For the currently available models, see the Qualisys website:
https://www.qualisys.com/cameras/underwater/.
Qualisys provides cameras, including cabling and accessories adapted for use
inside MRI rooms. A minimal camera system of only three cameras can be suf-
ficient to track hand and finger movements outside the scanner bore. A larger
six camera system can provide the possibility to look into the bore, for example
for monitoring the movements of the head during a scan.
The housing is made of die-cast aluminium, steel fasteners are replaced by brass screws, and the calibration L-frame and wand are made from glass fiber. The camera and cables are completely shielded to keep electromagnetic emissions to a minimum, enabling cameras to operate just a meter away from
the scanner without causing artifacts in the MRI image. The camera system is
connected with the computer in the control room via optical Ethernet com-
munication. The power supply is outside the room and connected via a filter
through the MRI room panel.
For the currently available models, see the Qualisys website:
https://www.qualisys.com/cameras/arqus-mri/
Qualisys accessories
Qualisys offers a range of accessories for motion capture applications, includ-
ing:
l Calibration kits
l Mounting equipment
l Computers
For an overview of the currently available accessories, refer to the Qualisys web-
site: https://www.qualisys.com/accessories/
For technical specifications of selected products (calibration kits, active mark-
ers), see chapter "Qualisys accessories specifications and features" on
page 995.
Traqr Configuration Tool
The Traqr Configuration Tool is used for the configuration of the Active Traqr and
the Naked Traqr.
The Traqr Configuration Tool is downloaded from your Dashboard on the
Qualisys website. For more information about how to use the tool see its
manual, which can be found on the Help page in the tool.
Camera positioning
Cameras must be mounted firmly on tripods or other stable structures, which
isolate the camera from movements or vibrations of any sort.
To capture 3D data the camera system must consist of at least 2 cameras. The
guidelines below can be used to set up a camera system.
l The best possible setup for a 3D motion capture system is to position it so
that all cameras can see the L-shaped reference structure during the cal-
ibration, see chapter "Calibration of the camera system" on page 543.
NOTE: The cameras can be positioned so that just two of the cam-
eras are able to see the calibration reference object. The rest of the cameras must then overlap each other's fields of view (FOV) to be
able to calibrate the system. For this setup QTM will automatically
use the Extended calibration method, see chapter "Extended cal-
ibration" on page 549.
l To reconstruct 3D data, at least two cameras must see each marker dur-
ing the measurement. Therefore, it is best to position the cameras so that
as many cameras as possible see each marker during the measurement.
l The angle of incidence between any two cameras should ideally be more
than 60 degrees, but at least 30 degrees. The accuracy of the 3D data cal-
culated from only two cameras placed at less than 30 degrees can
degrade below usable levels.
l In order to avoid unwanted reflections, position the cameras so that every
camera’s view of flashes from other cameras is minimized. E.g. put the
cameras above the measurement volume so that the cameras have an
angle of about 20 degrees in relation to the floor.
l Obviously the cameras must also be positioned so that they view the
volume where the motion of the measurement subject will occur. Mark
the volume by putting markers in the corners of the base of the meas-
urement volume. Then make sure that all cameras can see all of the mark-
ers by looking at a 2D view window in preview mode. Preferably, the
cameras should not see much more than the measurement volume.
2D motion capture
Inter-chain mixing
Cameras using the same cable type can be mixed together in a chain,
without the need of a network switch. To mix cameras of different cable
types you need to use a switch. For an overview of cable types, see the
Table below.
NOTE: Older Miqus cameras are not compatible with the R2 power
supply. Please contact Qualisys support for more information.
Camera type       Cable type   Max cameras per power kit   Max cameras per chain   Max cable length [m]
Arqus             A            5                           20                      75
Arqus Protected   B            5                           20                      75
Miqus             A            10                          20                      140
Miqus Hybrid      A            10 (1)                      3 (2)                   140
Oqus              C            5                           15                      -

(1) Even though a power supply can power up to 10 Miqus Video/Hybrid cameras, it is recommended to use one power/data chain per three cameras.
(2) When combined with marker cameras, the maximum number of Miqus Video cameras is 2 per chain.
NOTE: You can add a Camera Sync Unit to the chain beyond the
maximum power kit capacity or maximum chain size.
Arqus and Miqus cameras can be mixed in the same chain. Due to differences
in power consumption, one Arqus camera can be interchanged with two Miqus
cameras and vice versa. Supported combinations per power kit:
l 4x Arqus + 2x Miqus
l 3x Arqus + 4x Miqus
l 2x Arqus + 6x Miqus
l 1x Arqus + 8x Miqus
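The combinations above correspond to a power budget where one Arqus counts as two Miqus cameras, which can be checked with the following Python sketch. The budget of 10 Miqus units is an assumption derived from the listed combinations.

# Sketch: check whether a mix of Arqus and Miqus marker cameras fits on one power kit.
def fits_power_kit(arqus, miqus, budget=10):
    return 2 * arqus + miqus <= budget  # one Arqus counts as two Miqus

print(fits_power_kit(3, 4))  # True  (3x Arqus + 4x Miqus)
print(fits_power_kit(4, 3))  # False (exceeds the budget)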
Power consumption wise, a Miqus marker camera can be swapped out for a
Miqus Hybrid or Miqus Video camera. However, due to bandwidth require-
ments it is recommended to limit the number of video cameras in a chain to a
maximum of two video cameras.
If more video cameras are needed, they should be added in a separate chain
and joined through a network switch.
1. Check that the computer has an Ethernet card with at least the same
bandwidth as the switch.
2. Connect the Ethernet card with the switch. You can use any of the ports
on the switch.
3. Use the standard network card settings as described in the chapter "Net-
work card setup" on page 461.
4. Connect the respective daisy chains of Qualisys devices to one port each.
Systems with more than three Miqus Video or Miqus Hybrid cam-
eras
This type of setup requires a 10 Gigabit switch with a maximum of three
cameras per port.
Connecting an Arqus system
Camera cables
The number of camera cables is the same as the number of cameras.
The Arqus system is easy to connect. The backside of each camera contains two
Data/Power ports, see chapter "Arqus camera: back side" on page 934. In a
basic setup with up to 5 cameras, the cameras are connected to each other by
camera cables in a daisy chain configuration. For the maximum number of cam-
eras and the maximum cable length per power supply, see chapter "Power and
camera cable requirements" on page 441.
The first camera or Camera Sync Unit is connected via a Power injector to the
power supply and the computer as follows:
1. Connect the power supply to the Power port on the power injector.
2. Connect one end of the host cable to the Data port on the power
injector and the other end to the Ethernet port of the computer.
3. Connect one end of the camera cable to the Camera port on the power
injector and the other end to a Power/Data port of the first camera or
the Camera Sync Unit.
For larger systems, start a new power chain when the maximum of 5 cameras
per power chain is reached. Connect the first camera of the new power chain
(camera 6, 11, 16) via a Power injector as follows:
1. Connect the power supply to the Power port on the power injector.
2. Connect one end of the host cable to the Data port on the power
injector and the other end to a Power/Data connector of the previous
camera.
3. Connect one end of the camera cable to the Camera port on the power
injector and the other end to a Power/Data connector of the first camera
in the next power chain.
For systems with more than 20 cameras, the use of a switch is required. The
system can then be subdivided in chains with up to 20 cameras. The chains are
connected with their respective host cables to a switch, which in its turn is con-
nected to the computer.
Before you connect the Arqus camera system, make sure that the QDS
(Qualisys DHCP server) is running and that the network interface settings are
correct. This is needed for the cameras to receive an IP address from QDS to
communicate with other Qualisys cameras and the host computer. For more
information, see "QDS" on page 462 and "Network card setup" on page 461.
The general Arqus startup sequence is as follows (total duration about 50 s):
2. Booting of the cameras. The amber LED ring is lit during booting.
The Arqus A12 with standard lens option has a motorized lens, which can be
controlled from QTM via the Lens Control interface in the Camera Settings
sidebar in the 2D View window, see "Camera settings sidebar" on page 91.
For Arqus cameras with a manual lens, you need to extend the strobe mech-
anics to get access to the focus and aperture rings of the lens. The strobe mech-
anics can be shifted as follows:
1. Press and hold the Strobe unlock button at the backside of the Arqus
camera.
2. Shift the strobe mechanics outwards to expose the lens for adjustment.
3. Shift the strobe mechanics inwards again when done. You may have to
fine adjust the position of the strobe mechanics so that it locks into the
dimples on the strobe rails.
Camera cables
The number of camera cables is the same as the number of cameras.
1. Connect the power supply to the Power port on the power injector.
2. Connect one end of the host cable to the Data port on the power
injector and the other end to the Ethernet port of the computer.
3. Connect one end of the camera cable to the Camera port on the power
injector and the other end to a Power/Data port of the first camera or
the Camera Sync Unit.
For larger systems, start a new power chain when the maximum of 10 cameras
per power chain is reached. Connect the first camera of the new power chain
(camera 11) via a Power injector as follows:
1. Connect the power supply to the Power port on the power injector.
2. Connect one end of the host cable to the Data port on the power
injector and the other end to a Power/Data connector of the previous
camera.
3. Connect one end of the camera cable to the Camera port on the power
injector and the other end to a Power/Data connector of the first camera
in the next power chain.
For systems with more than 20 cameras, the use of a switch is required. The
system can then be subdivided in chains with up to 20 cameras. The chains are
connected with their respective host cables to a switch, which in its turn is con-
nected to the computer.
The Camera Sync Unit can be placed anywhere in the chain, but in most cases
it will be practical to have it close to the computer.
When the cables have been connected correctly, the indicator LEDs at the
Miqus data/power ports will indicate the status of the power and data con-
nection. For more information, see chapter "Miqus camera: back side" on
page 945.
Before you connect the Miqus camera system, make sure that the QDS
(Qualisys DHCP server) is running and that the network interface settings are
correct. This is needed for the cameras to receive an IP address from QDS to
communicate with other Qualisys cameras and the host computer. For more
information, see "QDS" on page 462 and "Network card setup" on page 461.
The general Miqus startup sequence is as follows (total duration about 50 s):
2. Booting of the cameras. The amber LED ring is lit during booting.
All Miqus cameras are equipped with a manual lens. You need to extend the
strobe mechanics to get access to the focus and aperture rings of the lens. The
strobe mechanics can be shifted as follows:
2. Shift the strobe mechanics outwards to expose the lens for adjustment.
Camera cables
The number of camera cables is the same as the number of cameras.
A system with Miqus Hybrid or Miqus Video is connected in the same way as a
Miqus system. However, the maximum chain size is limited to three cameras.
The chains must be connected via a 10 Gigabit switch to the Desktop Mark-
erless Computer.
The maximum number of cameras in a system that can be used at Full FOV and
full capture rate depends on the computer processor. For specifications, see
the below table.
Data cables
Cables carrying data between cameras.
Host cable
Cable carrying data between first camera in the chain and computer or
switch.
The Oqus system is easy to connect. The connectors are unique and cannot be
connected to the wrong ports. Further, the connector color matches that of the
port. The DATA connector can be connected to any of the two DATA ports, and
the POWER connector can be connected to any of the two POWER ports, so it
does not matter on which side you put the connector. For more information on
the connectors, see "Oqus camera connectors" on page 965.
NOTE: When the cables have been connected correctly the LEDs on the
back of the Oqus will be lit. The EXT LED will be lit green and the ACT
LEDs will be blinking.
For larger systems, start a new power chain when the maximum of 5 cameras
per power chain is reached. One of the POWER ports of the first camera of the
new chain should be connected to a power supply. The DATA ports of the last
camera of the previous chain and the first camera of the new chain are con-
nected with a Data cable.
NOTE: For Oqus systems larger than 15 cameras and for systems with
many high-speed cameras, the performance can sometimes be improved
with a Gigabit Ethernet switch and then connect the cameras in shorter
daisy-chains, see "Connecting a Qualisys system through an Ethernet
switch" on page 444.
Before you connect the camera system, make sure that the QDS (Qualisys
DHCP server) is running and that the network interface settings are correct.
This is needed for the cameras to receive an IP address from QDS to com-
municate with other Qualisys cameras and the host computer. For more inform-
ation, see "QDS" on page 462 and "Network card setup" on page 461.
The general Oqus startup sequence is as follows.
1. Connect the power supply and the green LED on the front will blink twice.
If the bar stops at two-thirds then the Oqus is waiting for an IP-address.
The reason is probably either a missing connection to the computer or
that QDS is not running. For instructions on how to search for the error, see "Troubleshooting connection" on page 1022.
3. When the camera has an IP-address the display will show an image similar
to one below. The Oqus will first synchronize to other cameras; during that process the clock is blinking and there is a spinning plus sign instead of the letter M or S. Wait until the clock has stopped blinking and the display
shows M or S.
The M or S on the display stands for Master and Slave, respectively. This only shows which camera is sending the synchronization pulse to the other cameras.
The Oqus 7+ with standard lens option has a motorized lens, which can be con-
trolled from QTM via the Lens Control interface in the Camera Settings side-
bar in the 2D View window, see "Camera settings sidebar" on page 91.
For Oqus cameras with a manual lens, you need to extend the strobe mech-
anics to get access to the focus and aperture rings of the lens.
1. Turn the strobe mechanics counterclockwise to expose the lens for adjust-
ment.
The Oqus system can run with a wireless communication from the camera sys-
tem to the computer. The camera uses the 802.11b/g standard at 54 Mbps. However, the communication speed can be reduced depending on the signal
strength or if there are many other wireless networks.
All Qualisys devices are compatible and can be combined in a single system.
When mixing camera types, the global camera settings will default to the lowest
of the camera limits. This means for example that the capture rate will be lim-
ited to 183 Hz at full field of view when Miqus M5 is combined with Miqus M3.
However, individual settings can always be set within the limit of the camera
type.
Devices with the same cable type can be connected in a daisy chain, for
example Arqus and Miqus cameras with cables of type A, see "Power and cam-
era cable requirements" on page 441. The total number of cameras that can be
connected to a power supply depends on the number of Arqus and Miqus cam-
eras, see chapter "Mixing Arqus and Miqus" on page 443.
1. Before changing the network configuration you can save the current con-
figuration in QDS, so that you can easily restore the current configuration.
This is recommended when you only have one network card.
2. Make sure to set the Address type to Static Address.
4. Make sure that the Enable QDS operation for this network option in
the QDS Advanced dialog is checked.
QDS
QTM comes with a DHCP server called QDS (Qualisys DHCP Server), which dis-
tributes IP addresses to the Qualisys cameras. An IP address for each camera is
required to be able to communicate with them over the Ethernet network. QDS
will be installed automatically with QTM and it must be running at all times, to
provide the cameras with IP addresses at startup. The DHCP server will only
give IP addresses to Qualisys cameras so it will not disturb your computer net-
work.
QDS menu
To open the QDS menu, right-click on the QDS icon in the status toolbar.
Advanced
Advanced configuration of network interfaces on the computer, see
"Advanced" on page 467.
Network configurations
Using this sub-menu you can save or load network configurations. Click on Save to save the current configuration and click on Load to load a saved configuration.
Camera utilities
QDS can control the Qualisys cameras with the following commands. The
commands will be sent to all cameras, but only acted upon by the camera
models that support the command.
Green on
Switch on green LEDs on the LED ring.
Green off
Switch off green LEDs on the LED ring.
Green pulsing
Pulse green LEDs on the LED ring.
Amber off
Switch off amber LEDs on the LED ring.
Amber pulsing
Pulse amber LEDs on the LED ring.
Display on
This is the default mode for the Arqus and Oqus cameras. The dis-
play is on when you use the camera, but goes into sleep mode if not
used for 2 hours.
Display off
In this mode the camera display is turned off for Arqus and Oqus
cameras. The front LED is also turned off for all Qualisys cameras.
Display and LEDs do not turn on unless you enable them with Dis-
play on or reboot the cameras.
Camera blocklist
The camera blocklist is used to block QDS from distributing
IP addresses to specific cameras, see "Camera blocklist" on
page 469.
Enabled
Enable the camera blocklist.
Edit
Open the mac_block.txt file in the default text editor.
About QDS
Information about QDS.
The QDS network configuration wizard will guide you through the different
steps to setup the network for Qualisys cameras. If you run the wizard you do
not need to follow the instructions for network card setup in chapter "Network
card setup" on page 461. Follow the steps below.
1. Double click on the QDS icon in the Windows task bar or click on Con-
figuration wizard in the QDS menu to start the wizard. This will open the
Wizard with the Choose network connection page.
2. The list will show all enabled network interfaces on the computer. Select
the interface that you want to use with Qualisys cameras and click Next.
a. If there are more than 4 network interfaces available, use the Prev 4
conn and Next 4 conn buttons to navigate through the list.
b. Click on More to see information about the network interface. The
More info window shows the current settings of the network inter-
face. This is the same information as is shown in Advanced, see "Advanced" on page 467.
NOTE: The wizard will not configure all interfaces, for example a
network interface that is already connected to an internal network
and has received an IP address will not be configured because it is
considered to have a running DHCP server. However, any dis-
connected interfaces can be configured by the wizard.
3. The wizard shows how it will change the selected interface. You can save
the current network setup with the Save button for backup. Click Next to
continue.
The Advanced settings dialog is opened with Advanced... in the QDS menu.
The QDS dialog contains settings for the enabled network interfaces on the
computer. These settings can be used instead of the QDS wizard or the Win-
dows network settings.
Select network connection to view/edit
Select the network that you want to edit in the dialog. The list is in the
same order as in Windows and if a network is disabled in Windows net-
work connections it will not be shown in the list.
Name
Current name of the network.
Status
Current status of the connection: Connected or Not connected.
Address type
Select the wanted address type between these two types:
Received through DHCP
The network will receive its IP address from a DHCP server. The
standard setting for many networks.
Static address
The network is set to a static IP address with the settings below. This
setting must be used for QDS to give IP addresses to Qualisys cam-
eras.
IP address
Current IP address of the network. The address can be changed when
Address type is set to Static address.
IP address mask
Current IP address mask of the network. The mask can be changed when
Address type is set to Static address.
QDS started
Time since QDS started.
Autoconfig
Use this button to configure the network interface for Qualisys cameras.
Camera blocklist
The camera blocklist is used for blocking QDS from distributing IP addresses to
specific cameras. Follow these steps to enable the blocklist.
1. Find the MAC addresses of the cameras that you want to block. For
example locate the system in QTM and use the System info option to get
information about all the cameras in the system. The MAC address is also
written on the label of all Qualisys cameras.
2. Open the mac_block.txt file with Camera utilities -> Camera Blocklist -
> Edit on the QDS menu.
3. Add the MAC addresses that you want to block in the list. '#' can be used
to comment a line in the file so that it is not used by the blocklist. There
can also be text after the MAC address to describe which camera it is. For
example:
C4:19:EC:00:0C:14 S/N 12345
C4:19:EC:00:0C:15 S/N 12346
# C4:19:EC:00:0C:16 S/N 12347
4. Enable the blocklist with Camera utilities -> Camera Blocklist - >
Enabled on the QDS menu.
5. Reboot the cameras. The cameras in the blocklist will not receive any IP
address.
There will be a conflict for the IP addresses when two or more computers with
QDS are connected to the same camera system. In those cases the first QDS
that replies after the startup of a camera will give the IP address to the camera.
On the other computers the Qualisys DHCP server message below will be
shown and QDS operation will be disabled on that network. The QDS operation
can be turned on again with the Advanced option on the QDS menu. However,
make sure that the other computers are disconnected from the camera system
otherwise QDS operation will be turned off again at the camera startup.
Firmware update
QTM will detect automatically if the camera firmware needs to be updated or
downgraded in the following cases. The firmware update is done via the
Qualisys Firmware Installer (QFI). For more information about QFI, see
chapter "How to use Qualisys Firmware Installer (QFI)" on the next page.
Firmware update when locating system
The following dialog will appear when QTM has detected an old firmware when
you are locating the system on the Camera System page in the Project
options dialog.
Detailed info...
Open a dialog with information on why QTM needs to update the firm-
ware.
OK
Start the firmware update program Qualisys Firmware Installer (QFI).
Firmware update when starting preview
The following dialog will appear when QTM has detected an old firmware when
you start a preview with New on the File menu.
Detailed info...
Open a dialog with information on why QTM needs to update firmware.
Cancel
Cancel the update. However, you cannot open a new file until you have
updated the firmware.
OK
Start the firmware update program Qualisys Firmware Installer (QFI).
How to use Qualisys Firmware Installer (QFI)
1. Start QFI. Usually this is done via the firmware upgrade dialog, see
chapter "Firmware update" on the previous page. Alternatively, locate the
QFI.exe program in QTM installation folder and double-click on it to start.
2. Click Next to start looking for Qualisys cameras connected to the com-
puter. The cameras must have started completely before you start loc-
ating the cameras.
5. Wait until all the steps (Uploading files, Programming camera(s), Waiting for
camera(s) to reboot) have finished.
CAUTION: Do not use these settings unless you are absolutely sure.
Upgrade firmware
Uncheck to not download the new firmware, which is useful if you only
need to modify PTP Mode or Lens Control.
PTP Mode
The PTP mode options are:
Use Qualisys PTP: This is the default PTP mode used for syn-
chronization of the Qualisys devices.
Use standard PTP: This PTP mode needs to be selected for PTP syn-
chronization of the Qualisys devices with an external clock master.
For more information, see chapter "How to use PTP sync with an
external clock master (Camera Sync Unit)" on page 501.
Lens Control
No change: Keep the current lens control mode. This is the default
choice.
Unlock settings: Enable focus and aperture control from QTM for
cameras with a motorized lens. This is the default mode.
Lock settings: Disable focus and aperture control from QTM. In this
mode, the communication with the lens is disabled and the lens con-
trol parameters are no longer shown in QTM. This setting can be use-
ful to fix the focus and aperture settings once they have been set to
their optimal values in a fixed camera setup.
TCP Keep Alive
Enable: Enable TCP Keep Alive mode. When enabled, the cameras regularly check whether the command channel is still open when there is no activity.
Below these settings is a list of all cameras in the system. Check the cam-
eras that you do not want to upgrade.
NOTE: When locating the system QTM will detect automatically if the
camera system has old firmware. The firmware must then be updated
before the system can be used. For more information see chapter "Firm-
ware update when locating system" on page 470.
The steps below are just an outline of what should be done to automatically
connect the camera system to QTM.
Follow these steps to connect the camera system to QTM:
1. Switch on the camera system, wait for the cameras to start up properly
and start QTM.
2. Open the Project options dialog and go to the Camera System page.
3. Click Locate System. This will open the Finding camera system dialog.
5. For Arqus and Miqus cameras there is the option to automatically order
the cameras using the Auto Order button.
Starting a preview
When opening a new file in QTM, aka starting a new measurement, the cam-
eras are starting in Preview mode. This is done by pressing the New button in
the QTM toolbar ribbon or the File menu (keyboard shortcut Ctrl + N).
Once the cameras are in preview mode, QTM can stream data in real
time. Therefore, Preview mode is also referred to as real-time mode, RT/Pre-
view modeor live preview mode.
If the camera system has not been located yet, starting a preview will auto-
matically locate the camera system.
When starting the cameras for the first time after booting the cameras, it may
take some time before the cameras are ready. The status is indicated by the
Waiting for cameras dialog.
If no camera system is found, for example when the cameras are still booting,
QTM will wait for the cameras to start up.
NOTE: If not all cameras have finished booting, QTM may not find all of them. It is recommended to locate the system first, see chapter "Locate System" on page 222.
2. Make sure that all cameras are selected in the Camera selection toolbar.
4. You can now drag and drop individual camera image areas to their
desired positions. Repeat this until you are satisfied with the order of the
cameras.
Identifying the cameras with the identification tool
3. Select one or more cameras using the Camera selection bar. The LED
ring of the selected camera(s) will light green.
TIP: If you have multiple cameras selected, you can select a single
camera by double clicking on its image area. When you double click
again, the previous selection is restored.
Arqus and Miqus cameras can be automatically ordered in QTM when locating the camera system. To automatically order the cameras, follow these steps:
1. Locate the camera system (Locate System button under Project options
> Input devices > Camera System > Connection).
2. When all cameras have been detected, press the Auto Order button in
the Finding Camera System dialog.
3. When the auto ordering is finished, the Auto Order button changes name
to Reverse Order. The green LED rings of the cameras will flash in the
order of the found sequence. The first camera will light continuously. By
pressing Reverse Order you can change the order of the cameras, i.e.,
the last camera of the sequence becomes the first, etc.
NOTE: For cameras with motorized lenses, you may consider locking the aperture and focus to the set values. This can be done using the Qualisys Firmware Installer (QFI) by setting the Lens control option under Advanced settings to Lock settings, see "How to use Qualisys Firmware Installer (QFI)" on page 471. By locking lens control, aperture and focus are locked to the current settings and the Lens control interface will no longer be available in QTM. This can be helpful in fixed camera setups in which focus and aperture need to be constant.
1. Start a new measurement and stay in preview mode. Use the Camera set-
tings sidebar to change the settings.
The linearization file in the camera is compared to the one used in QTM every
time that you start a new measurement. If the files do not match, the following
warning is displayed.
If you want to download the file from the camera, click Yes, which is recommended mainly if the file in the camera is more recent than the one in QTM. Otherwise, click No. Optionally, check the box Do not show this message again to avoid the warning for this file the next time you start a measurement.
l The angle of the plate to the plane of the sensor must be at least 10 degrees
to be accepted as a valid orientation.
The data for the calculations is automatically selected based on these criteria.
The user is guided by the feedback provided by QTM to facilitate the data col-
lection.
Feedback during the linearization procedure
QTM provides the following feedback to guide the user during the linearization
procedure:
l At the start of the linearization, the 2D view turns red. The image is mirrored to make it easier for the user to move the plate across the area.
l The markers of the plate are colored when the plate is identified.
l If the markers are white, it may help to briefly hide the plate from the camera and show it again.
l If QTM has difficulty identifying the plate, make sure to remove any extra reflections and redo the linearization.
TIP: Set a fixed, large marker size under Marker display in the 2D view settings if you have difficulty seeing the position of the linearization plate during the linearization procedure.
Linearization instructions
Preparations
1. Place the camera on a tripod. It is a good idea to place the camera side by
side with the computer screen so that you can easily follow the feedback
during the linearization procedure.
2. Connect the camera to the computer and start a preview in QTM. You can
have several cameras connected while linearizing, but you can only lin-
earize one camera at a time.
2. Select the number of the camera that you want to linearize with Camera
to linearize.
3. Make sure that you are using the correct plate type. The standard plate
consists of 6x5 markers.
4. Enter the focal length of the current lens in Approximate focal length.
This ensures that the distance intervals are scaled correctly when per-
forming the linearization.
5. Click OK to start the linearization procedure.
NOTE: During the linearization procedure, the lens must not be touched.
However, the camera can be moved if needed.
When you are finished with the data collection, QTM will calculate the lin-
earization parameters. The results are presented in the Linearization results dia-
log.
Focal length
The calculated focal length of the lens in mm. Make sure that the value
corresponds to the lens specifications.
Residual
The residual represents the remaining error, in subpixels, of the collected data after the linearization has been applied. The residual should be lower than 5 subpixels for standard camera-lens combinations. For wide-angle lenses, higher values may be acceptable.
Furthermore, the dialog shows information about the linearization file that has been generated. The file name consists of the serial number of the camera, followed by the date and time the linearization was performed. The linearization procedure outputs the following files:
.lin - The linearization file that is used by QTM.
.stat - This file contains the settings of the linearization and also some
statistics.
The files are saved in the Linearization folder under the Qualisys program data,
typically C:\ProgramData\Qualisys\Linearization. The .lin file is also uploaded to
the camera.
Click OK to close the Linearization results dialog.
Synchronization
Timing hardware
How to use external trigger
An external trigger can be used to trigger the start of the motion capture. For a
system with a Camera Sync Unit the trigger is connected to any of the trigger
ports, see chapter "Trigger ports" on page 273.
The delay between the trigger and the capture start can be set; however, the minimum is 20 ms. Therefore it is recommended to use the Sync out signal when synchronizing with other equipment. The delay for stopping on trigger is twice as long as the configured start delay.
For Oqus the trigger is connected to a splitter cable on any Oqus camera in the
system. If the movement that is captured is very short the external trigger can
be used together with pretrigger so that the whole movement is captured.
On the Oqus camera there can be errors if you send a trigger signal while the camera is setting up the measurement, i.e. before it says Waiting for trigger in the Status bar. The problem can be solved by moving the trigger button to the master camera.
Possible external trigger devices are for example:
l Trigger button
l Photo cell
l Other systems
The pretrigger is used to collect frames before the arrival of a trigger event. The pretrigger frames are saved in the camera buffer before the trigger event, and after the event the frames are sent to the measurement computer. When using pretrigger it is impossible to know the delay between the trigger signal and the first frame after the pretrigger frames. This is because the cameras are already measuring and the trigger event can come at any time relative to the frame rate, so the delay is anywhere between zero and one frame period (for example, up to 10 ms at a capture rate of 100 Hz).
The trigger event can be sent to the camera either with an external trigger or
from a dialog in QTM if there is no external trigger.
When using analog boards with pretrigger the connection differs between the
boards, see below.
The pretrigger settings can be found on the Synchronization page in the Pro-
ject options dialog, see chapter "Pretrigger" on page 277.
When analog data is collected while using pretrigger, an external trigger must
be used to start the capture. Activate the external trigger setting on the Syn-
chronization page in the Project options dialog. The external trigger is con-
nected to different connectors on the analog board depending on the analog
board type and camera type.
USB-2533
Use the normal setup with a Sync out signal from one of the Oqus cameras connected to the Sync input of the analog device, see chapter "Connection of analog board" on page 752. In addition, the external trigger signal should be connected to the first analog channel on the board. This can be done by splitting the trigger input at the camera or sync unit with a T-connector and connecting one end to the trigger button and the other end via a BNC cable to the first analog channel on the board.
The external timebase is used to control the frame rate of the motion capture from an external device.
The Qualisys camera system can be synchronized with an external timebase for drift-free synchronization with external devices. When synchronizing to a periodic TTL signal the Qualisys system will lock the capture frequency to that of the external signal source, see chapter "Using External timebase for synchronization to a periodic TTL signal" below for detailed information.
It is also possible to use a time code signal as an external timebase source. Cur-
rently, two standards are supported, SMPTE (requires an Oqus or Camera Sync
Unit) and IRIG (requires a Camera Sync Unit). For more information, see
chapter "Using External timebase for synchronization to a time code signal" on
page 497.
An external timebase can also be used to trigger individual camera frames,
using non-periodic signal mode. For an example of how to synchronize the cam-
era system with burst signals, see chapter "External timebase with bursts of sig-
nals with constant period (cycle) time and with delays between the bursts" on
page 499.
The settings for external timebase are on the Synchronization page in the Project options dialog, see chapter "External timebase" on page 278.
The external timebase is connected to the Sync input port of a Camera Sync
Unit. For an Oqus system, you can also use the Sync in connector of a Trig-
ger/Sync splitter cable connected to the control port of one of the cameras.
Follow these instructions to use external timebase with a periodic signal.
Real-time
1. Make sure that the settings for the External timebase on the Syn-
chronization page in the Project options dialog are correct for your
setup.
2. The first time the periodic sync is activated, you must wait for the camera to synchronize after clicking New on the File menu. QTM will report EXT ? Hz in the status bar for the camera frequency and the dialog below will appear. The following times a measurement is started the camera system
3. When the camera is synchronized the current camera frequency will be repor-
ted in the right corner of the status bar. The frequency will also be fixed for
the camera system in the Camera settings sidebar and in the Project
options dialog, so that the exposure time settings are limited to the correct
values.
4. The synchronized samples can now be sent with the RT protocol to another
collection system.
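As an illustration of step 4, the sketch below shows how another application could receive the synchronized frames over the RT protocol. It assumes the open-source Qualisys Python SDK (package qtm-rt); the IP address and component selection are example values, and the exact API may differ between SDK versions.

# Minimal sketch: receive synchronized real-time frames from QTM.
# Assumes the Qualisys Python SDK (pip install qtm-rt); the host IP is an example.
import asyncio
import qtm_rt

def on_packet(packet):
    # The frame number and timestamp follow the camera frequency, which is
    # locked to the external timebase once the system is synchronized.
    header, markers = packet.get_3d_markers()
    print(f"frame {packet.framenumber}: {len(markers)} markers")

async def main():
    connection = await qtm_rt.connect("192.168.254.1")  # QTM computer (example)
    if connection is None:
        return
    await connection.stream_frames(components=["3d"], on_packet=on_packet)

if __name__ == "__main__":
    asyncio.ensure_future(main())
    asyncio.get_event_loop().run_forever()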
Capture
The start of the capture needs to be known if there is an external system that
collects data independently of QTM. There are two ways to know the time of
the start.
1. Record the Sync out signal from the camera on the external system. If you use an external trigger on both systems, the start of the capture is synchronized with the first pulse in the pulse train after the external trigger signal is sent. Otherwise, if the camera system is started without a trigger, the start of the capture is after a very short pause in the pulse train, between when you click on Start in the Start capture dialog and when the camera starts capturing.
2. Start the camera system and the external system with the external trigger signal. Then the start of the camera system is, in the default mode, delayed 20 ms from the trigger pulse, which means that the camera system starts on the following Sync in pulse. 20 ms is the minimum delay, but the delay can be set to a higher value on the Synchronization page.
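As a sketch of one way to reason about the timing in the second alternative (the timebase frequency is an assumption and the numbers are only illustrative):

# Illustrative arithmetic only: the capture starts on the first Sync in pulse
# that arrives after the configured start delay. All values are examples.
import math

timebase_hz = 100.0                  # external timebase frequency (example)
period_ms = 1000.0 / timebase_hz     # 10 ms between Sync in pulses
start_delay_ms = 20.0                # default minimum start delay

pulses_to_wait = math.ceil(start_delay_ms / period_ms)
start_offset_ms = pulses_to_wait * period_ms
print(f"Capture starts {start_offset_ms:.0f} ms after the trigger "
      f"(on Sync in pulse {pulses_to_wait})")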
The Qualisys system can be synchronized to an external time code. The fol-
lowing standards are supported:
l SMPTE at frame rates of 24 Hz, 25 Hz, and 30 Hz, without dropped frames.
SMPTE
NOTE: IRIG cannot be used when there are any Oqus cameras included
in the system.
Timestamps
When using time code as external timebase source, time stamps are recorded
by default, so that each camera frame includes a time stamp. When the capture
rate is set to a multiple or divisor of the time code frequency, the first camera
frame of a capture may not correspond to the start of a time code period or
frame. For synchronization to an external signal you will need to look up a cam-
era frame at which the time code increased and take the offset into account.
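A small illustration of this offset bookkeeping, with an assumed capture rate and frame index: if the time code value first increases at camera frame k, the time code boundary occurred at most one camera frame period earlier.

# Illustrative only: bound the offset between a time code boundary and the
# camera frames. The capture rate and frame index are example values.
capture_rate_hz = 150.0
camera_period_s = 1.0 / capture_rate_hz

k = 37  # first camera frame at which the time code value has increased
earliest_s = (k - 1) * camera_period_s   # boundary cannot be before frame k-1
latest_s = k * camera_period_s           # ...and not after frame k
print(f"Time code boundary between {earliest_s:.4f} s and {latest_s:.4f} s "
      f"(uncertainty {camera_period_s * 1000:.1f} ms)")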
All Qualisys video cameras can be used with External timebase, but only if the video capture rate is a divisor of the marker capture rate. The video capture rate also needs to be an integer, so you cannot use just any divisor of the marker capture rate. The presets that can be used are activated automatically for the
To use an external timebase with bursts of signals with constant period time
and with delays between the bursts follow the steps below:
1. First the possible time periods of the measurement system must be cal-
culated.
a. Calculate the smallest time interval t1 between two frames that can
occur during the measurement (i.e. the smallest time interval
between two pulses from the external timebase source that can
occur during the measurement).
b. Calculate the longest interval between two pulses from the external
timebase source that can occur during the measurement, multiply it
by 1.5 and call it t2.
c. Calculate the maximum frequency f as 1/t1.
CAUTION: Make sure that the capture rate and exposure time stay
within the limits. Not doing so may lead to damage of the cameras, and in
that case the guarantee will be void.
l DO NOT overclock the cameras by setting the external timebase
rate higher than the camera specification. The maximum frame rate
should not exceed 0.975 times the maximum frame rate of a cam-
era at full FOV.
l The exposure must not exceed 1/10 of the period time of the
external timebase frequency.
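A small sketch of the calculations in step 1 together with the limits from the caution above; all numbers are assumed example values, and the camera limit must be taken from the specification of your camera.

# Illustrative only: derive t1, t2 and f for a bursted external timebase and
# check them against the limits above. All numbers are example values.
t1 = 0.005                     # smallest interval between two pulses (s)
longest_gap = 0.040            # longest interval between two pulses (s)
t2 = 1.5 * longest_gap         # step 1b
f = 1.0 / t1                   # step 1c, maximum frequency (200 Hz here)

camera_max_fps_full_fov = 300.0    # example camera specification
exposure_s = 0.0003                # example exposure time (300 microseconds)

assert f <= 0.975 * camera_max_fps_full_fov, "external timebase rate too high"
assert exposure_s <= t1 / 10.0, "exposure exceeds 1/10 of the shortest period"
print(f"t1 = {t1*1000:.1f} ms, t2 = {t2*1000:.1f} ms, f = {f:.0f} Hz: within limits")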
NOTE: The tracking can become significantly poorer if you use a timebase with a large difference between the smallest and the largest delay between two frames, since the tracking assumes a constant time period between two frames.
The Camera Sync Unit (CSU) can be used with any Qualisys camera system and
configured to synchronize with an external clock master using the standard PTP
protocol. Follow the steps outlined below to set up the camera system.
1. Start the correct version of the Qualisys Firmware Installer (QFI.exe), see
"How to use Qualisys Firmware Installer (QFI)" on page 471 for more inform-
ation.
3. Under Select PTP mode, select Use standard PTP. The Firmware
upgrade option can be deselected to speed up the process.
5. Reboot the cameras manually to finish the change to standard PTP mode.
The external clock master should be connected to the same Local Area Network
as the cameras. The following requirements apply to the external clock master:
l Standard: IEEE 1588:2008 (PTPv2)
l ipv4/udp
l Two-step clock
To add the timestamp to the motion capture frames, activate the timestamp option on the Synchronization page under Project Options and set the type to Camera time, see chapter "Timestamp" on page 284.
NOTE: Oqus systems can use PTP sync without a Camera Sync unit, but it
requires extra configuration, see chapter "How to use PTP sync with an
external clock master (Oqus)" below.
1. Start the correct version of the Qualisys Firmware Installer (QFI.exe), see
"How to use Qualisys Firmware Installer (QFI)" on page 471 for more
information.
2. Click on the Advanced button.
3. Under Select PTP mode, select Use standard PTP. The Firmware
upgrade option can be deselected to speed up the process.
For Oqus systems without a Camera Sync Unit, one of the cameras needs to be manually set as system master. To manually set the system master, follow these steps.
1. Start a telnet client and log in to the selected camera (login: oqus; password:
oqus)
l A camera that has been set to system master with the forcetosystemmaster
command will always be system master.
The external clock master should be connected to the same Local Area Network
as the cameras. The following requirements apply to the external clock master:
l ipv4/udp
l Two-step clock
To add the timestamp to the motion capture frames, activate the timestamp option on the Synchronization page under Project Options and set the type to Camera time, see chapter "Timestamp" on page 284.
NOTE: For Oqus systems, the camera timestamp in exported files needs to be converted, as the number of bits in which the timestamp is stored is limited to 48. It is recommended to add a Camera Sync Unit to the system to avoid the need to convert the timestamps. Contact Qualisys support if you need more information about this conversion.
It is possible to synchronize other external hardware that is not directly integrated in QTM. This can be achieved either by triggering the external hardware from the cameras or by sending a signal from the external hardware to the cameras. Which option to use depends on the specification of the external hardware; read the following chapters and contact Qualisys AB if you have any questions.
The external hardware must be able to trigger on a TTL pulse or send out a TTL pulse according to one of the following alternatives.
The recommended option is to use the Sync out signal from the cameras. This
is the same signal that is used for synchronizing the analog boards. The Sync
out signal is a pulse train with a TTL pulse that is only active when the camera
is measuring.
In a measurement the pulse train for the default mode with the setting Shutter out will look like the figure below.
1. Preview
During preview a pulse is sent on each preview frame.
2. Capture
Click on Capture in the Capture menu.
3. Start capture dialog open
When the Start capture dialog is open the pulse continues because the
preview is still being updated.
4. Start
Click on Start in the Start capture dialog.
5. Waiting for measurement
When the camera waits for the start of the measurement the sync output
signal is stopped. How long this period is depends mostly on two things.
l With an external trigger this period continues until you press the button. Therefore we recommend that you use an external trigger so that you have time to initialize the measurement on the external device.
l Without external trigger the period is less than a second.
Another way to synchronize is to use the External trigger input signal. This signal must be sent to the camera either by a button or by other hardware. However, with this method there is a longer delay from the trigger event to the start of the measurement. The signal that is used will look like the figure below.
2. Start
Click on start in the Start capture dialog.
3. Waiting for measurement
The camera waits for the trigger event on the External trigger signal.
6. Measurement stop
The measurement can end in three different ways. Only one of them will
have a trigger signal on the External trigger input signal.
l The measurement comes to the end of the specified measurement
time.
In this case there is no pulse on the External trigger signal. The external hardware measurement must be set up to measure for the same time as QTM.
l The measurement is stopped manually in QTM.
In this case there is no pulse on the External trigger signal. The
external hardware measurement must be stopped manually as well,
the stop will therefore not be synchronized.
l The measurement is stopped with external trigger.
In this case there is a pulse on the External trigger signal and the
measurement will stop on the next frame. Because the stop pulse
can come at any time it is impossible to tell the delay until the meas-
urement stops.
By connecting the Oqus sync unit to the master camera in the Oqus system,
you can use an SMPTE signal to timestamp and synchronize the measurements.
The sync unit can also handle a video signal (black burst or tri-level) for syn-
chronization (also called genlock). For more information about how the syn-
chronization works in sound applications see chapter "Using SMPTE for
synchronization with audio recordings" on page 512.
NOTE: For the best detection of the SMPTE timecode use one of the
Analog outputs on the Motu device so that you can increase the
amplitude of the signal. Move the dial in the SMPTE program so that
it is pointing right to get a high enough amplitude. The SMPTE time-
code output on the MOTU device is often too low for the Sync unit
to get a stable detection of the SMPTE timecode.
Timestamp
To activate the SMPTE timestamp, open the Synchronization page in the Project options dialog. Then select Use SMPTE timestamp below the SMPTE heading, and also make sure that you select the correct SMPTE frequency, see chapter "Timestamp" on page 284.
Sync in/Video in
The connector can be used either for Sync in or Video in synchronization. The LEDs next to the connector will indicate which input you are using in QTM.
Sync in
The Sync in connection is the standard connector for an external timebase connection, see chapter "External timebase" on page 278 and "Using External timebase for synchronization to a periodic TTL signal" on page 494. For example, a word clock of up to 48 kHz from a sound sampling board can be connected to this connection and then divided down to the desired frequency on the Synchronization page in the Project options dialog. Make sure that you have set a divisor that gives a camera frequency that works with the Oqus camera.
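For example (illustrative values only), a divisor entered on the Synchronization page reduces the 48 kHz word clock to a frequency the camera can follow:

# Illustrative only: divide a 48 kHz word clock down to a usable camera frequency.
word_clock_hz = 48000
divisor = 480                        # example divisor set on the Synchronization page
camera_hz = word_clock_hz / divisor  # 100 Hz, a rate the Oqus camera can handle
print(f"Camera frequency: {camera_hz:.0f} Hz")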
Video in
To activate the video synchronization, open the Synchronization page and select Video sync on the Control port setting, see chapter "External timebase" on page 278. When the Video in option is selected, the sync unit will decode a black burst or tri-level video signal so that the Oqus camera can lock in on the signal. Just as for the standard sync-in, you need to have a time reference to know when the measurement starts; for example, you can use the SMPTE signal.
The Qualisys system can be synchronized with audio recordings by using the
SMPTE timestamp via the Oqus or Camera Sync Unit. QTM has been tested with
the MOTU 828mk3, but in theory any equipment with an SMPTE signal can be
used. Check the following information to synchronize QTM with audio record-
ings. For information about the settings for external timebase see chapters
"External timebase" on page 278 and "Timestamp" on page 284.
To be able to synchronize the QTM capture with the audio data it is important
that the SMPTE time code is associated with the audio recording. This can be
done in the following three ways with the MOTU 828mk3.
1. When using the MOTU device on a Mac you can use a special setting
called Generate from sequencer in the MOTU SMPTE console. With this
option the SMPTE signal is generated based on the time position in the
MOTU Digital Audio Workstation (DAW) software used for recording
audio. However, this requires that you start the recording in the MOTU
DAW software before you can start preview in QTM and that the recording
must keep running while the cameras are running. Then it is important to
select the BWF format for the files in the DAW software so that the start of
the SMPTE time can be read from the WAV file header. For detailed inform-
ation about the BWF standard, see
Twin systems
The Twin system feature enables QTM on one computer to control a camera
system that is connected to another computer. With this feature you can cap-
ture data from two systems and then process them in the same QTM file. It is
for example useful if you have an underwater and above water system or if you
want to capture a small volume within the larger volume at a higher frequency.
For information about the settings for twin systems see chapter "Twin System"
on page 333.
How to use frame synchronized twin systems with separate volumes
Follow this procedure to use frame synchronized twin systems with separate
volumes, for example a system above water and a system underwater.
Requirements
NOTE: For Oqus systems at least one camera must be from the 3+-,
4-, 5+-, 6+ or 7+-series if you want to use frame synchronization.
Procedure
1. Set up the two camera systems and connect them to two separate com-
puters.
l It is important that there is a large plane that connects the two
volumes so that you can move the wand in the Twin calibration.
2. Calibrate the two systems separately.
3. Connect the free Ethernet ports of the two computers so that they can
communicate with each other. Use for example one of the alternatives
below:
a. Connect them both to the same internal network.
8. The systems are now connected. You do not need to change anything on the Twin slave system; all of the necessary settings are controlled by the Twin master system.
b. Enter the Calibration capture time you want to use for the cal-
ibration. It must be long enough to move the wand twice in the
whole plane between the two systems.
c. Enter the Wand length that is used for the twin calibration. If you enter 0, the wand length is auto-calculated, but then you must make sure that both markers are in one of the camera systems at the beginning of the measurement.
d. Click OK. The twin calibration is done at the frequency of the Twin
slave system, therefore the Twin master system might need to syn-
chronize to the 100 Hz signal again.
e. Start the twin calibration with the trigger button when you are
ready.
f. Then move the wand with one marker in each camera system. It is
extremely important to move the wand so that you cover as much as
possible of the plane between the two systems.
g. When finished you will get a result where it is important to check
that the wand length is close to what you expect.
11. Activate the processing steps that you want to use.
a. Make sure to always activate 3D tracking on both systems. On the
Twin slave system it is actually not necessary to activate anything
else.
c. Gap-fill, AIM and 6DOF are best activated only on the Twin master. They can be activated on the Twin slave, but this can result in unnecessary dialogs that interrupt the merging process.
d. The export is best done from the Twin master system.
12. You can now start doing measurements controlled by the Twin master sys-
tem.
a. The saving settings in the Start capture dialog control the behavior of both the Twin master and the Twin slave system. This means that it is recommended to save the measurements automatically, because then the Twin slave file is merged automatically with the Twin master file. The Twin slave file is then saved both on the Twin master computer and on the Twin slave computer.
b. The Twin master file will contain the 3D data from both systems and
can be processed just as any QTM file. The data from the Twin slave
system will be labeled with the type Measured slave in the Tra-
jectory info windows. For information on how to work with the twin
files see chapter "Working with QTM twin files" on page 522.
c. Any video on the Twin slave system is not transferred to the Twin
master computer. Video files recorded with the slave system can be
added to the merged capture later by importing the video link via
the menu File > Import > Link to Video File.
When using Twin systems it is necessary to specify the relation between the coordinate systems of the Twin master and Twin slave systems. Therefore the two systems must first be calibrated separately with wand calibration. The twin calibration is then used to transform the Twin slave data to the coordinate system of the Twin master. The relation is usually established in a Twin calibration procedure. However, it can also be entered manually if the exact relation is not critical.
Preparations
To perform a Twin calibration you need to first make sure that you have made
the following preparations.
1. Set up the two systems so that the Twin master controls the Twin slave,
see chapter "How to use frame synchronized twin systems with separate
volumes" on page 514. And make sure that both systems are calibrated.
2. Make sure that you have a wand that reaches between the two volumes. It is recommended to use one where you know the length, but you can also use one of the systems to measure the wand during the twin calibration.
3. Enter the Wand length and the number of seconds needed to move the wand across the whole plane at least twice.
NOTE: If you don't have a wand length you can enter 0 and then
make sure that both markers are clearly visible in one of the sys-
tems when you start the twin calibration.
4. Click on Start and wait for the systems to synchronize. The twin cal-
ibration is always done at Twin slave frequency, so the Twin master sys-
tem might need to synchronize before you can start.
5. Press the trigger button to start the twin calibration and start moving the
wand.
6. Move the wand so that you have one marker in each volume. The twin cal-
ibration will be better if you fulfill these requirements:
a. Try to move it so that you cover the whole plane between the two
volumes.
c. Make sure that the angle of the wand differs when you move it
around in the volumes.
7. When finished, the Twin slave data is transferred automatically and then you get the result of the calibration. Make sure that the wand length is close to what you have entered and that the standard deviation is not much higher than the deviations of the regular calibrations.
NOTE: The twin calibration fails if the wand length differs by more than 1 % from the entered value. The reason could be that other markers alter the twin calibration. Open the files that are saved in the calibration folder of the project and check the 3D data. Delete any 3D data that is not the wand markers. Then reprocess the Twin calibration according to the instructions below.
8. If you want to check the actual translation and rotation of the slave data, open the Twin system page on the Twin master system and then click on Calibration.
It is possible to reprocess the Twin calibration from the Project options dialog
or in the reprocessing dialogs. Follow these steps to change the twin cal-
ibration:
1. Click on Calibration on the Twin system page to open the Twin System
Calibration page.
2. Then click on Calibrate to open the Twin system calibration dialog, see
chapter "Twin System Calibration dialog" on page 337.
NOTE: You can also enter a new translation and rotation manually
under the Manual Calibration heading, but it is not recommended
if you want the best accuracy.
The complete data from a QTM twin system measurement is always stored in two separate files. If you have used the Merge with Twin slave option in the processing, the slave file automatically gets the same name as the master with the ending _slave. When the twin slave file is merged with the twin master file, only the 3D data of the twin slave is merged into the file. The twin slave cameras are also displayed as greyed-out cameras in the 3D view.
l Calculate 6DOF
l Calculate force
l Export data
The data of two QTM files can always be merged in reprocessing if you have not
been using the Merge with Twin slave option or if you need to redo it because
the tracking of the Twin slave file has changed.
NOTE: All of the previous Twin slave data is deleted when you merge
twin slave data in reprocessing.
It is recommended to use the same frequency in the two systems to avoid any interpolation. If you have to use a lower frequency for the slave system, it is recommended that you use a divisor of the twin master frequency to minimize the interpolation.
The interpolation is a linear interpolation, meaning that a line is drawn between the twin slave 3D data points. At any time where twin slave data does not exist, the interpolated data is used instead. The interpolated data will have a zero residual so that you can distinguish it in post processing.
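A minimal sketch of this kind of linear interpolation, using NumPy and assumed example capture rates; it only illustrates the principle and is not QTM's internal implementation.

# Illustrative only: resample twin slave data onto the master frame times with
# linear interpolation. Capture rates and marker data are example values.
import numpy as np

master_rate, slave_rate = 200.0, 100.0             # example capture rates (Hz)
slave_t = np.arange(0.0, 1.0, 1.0 / slave_rate)    # slave frame times (s)
slave_x = np.sin(2 * np.pi * slave_t)              # one coordinate of a slave marker

master_t = np.arange(0.0, 1.0, 1.0 / master_rate)  # master frame times (s)
interp_x = np.interp(master_t, slave_t, slave_x)   # line drawn between slave samples

# With the master rate an exact multiple of the slave rate, every other master
# frame falls between two slave samples; those frames are the interpolated ones
# that get a zero residual so they can be distinguished in post processing.
step = int(master_rate // slave_rate)
interpolated = np.arange(master_t.size) % step != 0
print(f"{interpolated.sum()} of {master_t.size} master frames are interpolated")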
Twin system with a shared volume
It is possible to use a twin system with a shared volume, if you want to capture
at a higher frequency in parts of the volume. For example if you want to cap-
ture the impact of a golf swing.
2. Test each video system separately, before you connect the systems together.
4. Decide which system is going to be the Main system and connect the Camera Sync Unit in that system.
5. The other systems will be Agent systems and must not have any Camera Sync Unit connected.
6. Make sure that the computers are connected to the same local network.
1. Start the Qualisys Firmware Installer (QFI.exe) and enable Standard PTP in all
of the systems. QFI.exe is located in the Camera_Firmware folder in the
Qualisys Track Manager folder.
a. Start QFI.exe.
c. Under Select PTP mode, select Use standard PTP. The Firmware
upgrade option can be deselected to speed up the process.
2. Use QDS to set different subnets for the camera interface on each computer,
e.g. 192.168.11.1, 192.168.12.1 and 192.168.13.1.
3. On each computer enable the QDS blocklist feature, see chapter "Camera
blocklist" on page 469. Use the MAC addresses saved earlier and block the
MAC addresses of the cameras in the other systems. This means that when
the cameras start they will only get an IP address from the subnet they are
supposed to be on.
Daisy-chain the Ethernet switches so that all of the camera systems are connected. The systems must be connected for the cameras to be synchronized to the same PTP master clock, which is the Camera Sync Unit in the Main system. Below is an example of how the systems are connected.
1. For the Main system, set the Start delay to 1 second on the Cameras page
in Project options, see chapter "Start delay" on page 248. This setting is to
ensure that there is enough time to send the capture start time from the
Main system to the Agent systems.
2. For each of the Agent systems, set the Wireless/software Trigger function to
Start capture and then set Start/stop on UDP packet to Listen for Main
QTM instance, see chapter "Wireless/software Trigger" on page 267. This set-
ting configures QTM so that it only listens for the UDP packet with the cam-
era start time and not the generic UDP start/stop packet.
3. If there are other computers with camera systems on the local network, then the port used for the UDP packet must be configured so that the ports are different. For both Main and Agent systems, change the port to e.g. 8990 in the Capture Broadcast Port option on the Real-Time Output page in Project options, see chapter "Real-Time output" on page 387. If the port isn't changed, there is always a risk that a system which isn't the Main system triggers the start of the Agent systems.
Preparations
Choice of markers
Motion capture measurements require the use of markers. The markers either reflect or emit near-infrared light. The former are referred to as passive markers, and the latter as active markers.
Passive vs Active markers
Passive markers can be made very lightweight and in many different sizes, and are therefore usually the best option. However, in some setups active markers can be a better choice, for example because of unwanted reflections or at very long distances.
Qualisys offers several types of active markers for different applications. For
more information about Qualisys active marker solutions, see chapter "Active
marker types" on page 1000.
You can select the Untriggered active marker option if you are using generic, constantly lit active markers. When using this setting, the strobes of the cameras are deactivated to minimize the amount of unwanted reflections.
For more information about how to use active markers, see chapter "How to
use active markers" on page 531.
Marker size
It is important to choose the correct marker size, since a larger marker gives a larger area from which to determine the central point of the marker. The central point data is the input to the tracking of the markers' movement and is therefore very important.
Rules to determine which marker size to use are:
l Use markers that are as large as possible without being in the way or restraining normal movement.
The markers should be placed in a way so that they are visible during as much
of the measurement time as possible. Check that clothes or rotating objects,
such as body parts, do not move in a way that hides the markers from the cam-
era.
There is a wide range of marker sets for tracking humans for biomechanical
and animation applications. For detailed information, refer to the following
resources:
l Marker guides for animation, available via the Skeleton menu.
l Marker guides for analysis modules, available via the Show guide button
in the PAF project view.
l QAcademy tutorials. There are several QAcademy tutorials available on how to apply specific marker sets, see https://www.qualisys.com/qacademy.
Active markers can be used in situations when it is hard to use passive markers,
for example, when measuring outdoors at long distances, or when there are
many unwanted reflections from the measured object or the capture volume.
In addition, the Active/Naked Traqr and the Short Range Active Marker use
sequential coding for automatic identification of markers or rigid bodies.
When using active markers you need to specify the correct marker type in the
Project Options under Cameras > Marker Mode. The following active marker
options are available:
Active
Use this mode for the sequentially coded active markers: Active or Naked
Traqr or the Short Range Active Marker. In this mode the camera strobes
are used to send a pulsed signal ahead of the exposure to trigger the act-
ive markers. Only active markers are visible as the camera strobe is inact-
ive during exposure.
Active + Passive
In this mode the camera will capture both passive and sequentially coded
active markers. This mode can be used if you need to add some tem-
porary markers to the subject and do not want to add active markers. If
you mix the passive and active markers all the time you lose some of the
advantages of both types, see chapter "How to use Active + Passive mode"
on page 533.
Sequential coding is implemented in the Active and Naked Traqr and the Short
Range Active Marker. By using sequential coding, the trajectories are auto-
matically identified in QTM without the need of an AIM model or a rigid body
definition.
The sequential coding configuration and supported options depend on the type
of active marker and the camera system.
Active/Naked Traqr
The Active and Naked Traqr is configured using the Traqr Configuration
Tool. The Traqr Configuration Tool can be used to set the ID range and
the marker IDs of the individual markers on the Traqr. The Traqr supports
the use of the extended ID range option. The ID range is a global setting
that applies to the whole system. The options are:
Standard (1-170)
Standard ID range of 170 uniquely defined markers. This is the
default option.
Extended (1-740)
Extended ID range of 740 uniquely defined markers. Use this option
if you have more than 170 markers. This option is only supported
with Arqus or Miqus cameras. The extended range option cannot be
used if there are any Oqus cameras included in the system.
NOTE: In the Active + Passive mode the AIM model does not work the same way as when you only have active markers.
The Active + Passive mode can for example be used in setups where you need to add some temporary markers for a static model. In that case it is often easier to add passive markers than to add more sequentially coded active markers. There are, however, disadvantages to the mixed mode, and it is therefore not recommended for all of your measurements. The disadvantages compared to just using active markers are the following.
l If the AIM model includes both passive and active markers it has to be cre-
ated from a file with the correct marker placement.
l It can be harder to identify the correct IDs for the active markers, since a
passive marker that is badly tracked can be mistaken for an active marker.
Exposure delay can be used in environments where the strobe light from one
set of cameras disturbs marker capture in other cameras. It is good to first con-
sider the following changes to the setup.
1. To use the delayed exposure, you must first find out which cameras are
causing reflections in other cameras. It is usually best to use the marker
intensity mode to see where the reflection comes from.
2. When you know the cause of the reflections you place the cameras in dif-
ferent groups on the Cameras page in the Project options dialog, see
chapter "Exposure delay" on page 236.
a. Activate the Exposure delay mode called Camera groups, which
will calculate the delay for each group automatically by comparing it
to the longest exposure time of the cameras in the previous group.
Do not use the Advanced mode unless you are absolutely sure of
what you are doing.
b. Select the cameras that you want in the group from the list to the
right.
c. Then select the group on the Camera groups setting. Make sure
that you always start with group 1 and continue with 2 and so on.
d. Repeat steps b and c for each group that you want to create. Usually
for a setup with cameras on two sides of the volume it is enough
with two groups, one for each side. It is also possible to change the
exposure group of a camera by right-clicking on the camera live feed
in the 2D view window and selecting the exposure group via the con-
text menu.
e. When the exposure delay is activated Using delayed exposure is
displayed in the status bar.
3. To check which groups the cameras are in, go to the 2D view and check
the delayed exposure setting next to the camera number.
By delaying the exposure between camera groups, the markers may have
moved between the respective group exposures. The 3D tracking algorithm
applies a compensation for this displacement of the 2D positions for the
delayed cameras based on the measured movement of the markers. The com-
pensation is effective for most capture applications of human movement, such
2. Make sure that most cameras are assigned to the lowest possible expos-
ure groups, so that most cameras are in exposure group 1, followed by
group 2, etc.
3. Use short exposure times to minimize the delay.
NOTE: The time of exposure will return to the default value if the camera
is in video mode.
NOTE: Reflections that come from the flash of the camera itself cannot be removed with a delay. In that case you have to either cover the reflective material, optimize the exposure settings or use marker masks.
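As a purely conceptual sketch of the displacement compensation for delayed exposure groups described earlier in this chapter, assuming a simple constant-velocity prediction (this is only an illustration of the idea, not QTM's actual algorithm):

# Illustrative only: shift a delayed camera's 2D observation back to the
# reference exposure time using a constant-velocity estimate of the marker.
def compensate_2d(p_prev, p_curr, frame_period_s, exposure_delay_s):
    """p_prev, p_curr: (x, y) positions of the marker in two consecutive frames."""
    vx = (p_curr[0] - p_prev[0]) / frame_period_s
    vy = (p_curr[1] - p_prev[1]) / frame_period_s
    return (p_curr[0] - vx * exposure_delay_s,
            p_curr[1] - vy * exposure_delay_s)

# Example: marker moving 2 pixels per frame at 100 Hz, group delayed by 500 us.
print(compensate_2d((100.0, 50.0), (102.0, 50.0), 0.01, 0.0005))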
Marker masking
The marker masking is a tool to delete unwanted markers in the 2D data. The
masking areas are defined per camera and can be drawn manually in the 2D
view window, or created using the Auto mask function. The masks can also be
viewed and managed on the Cameras page in the Project options dialog. For
explanation of the settings, see chapter "Marker masking" on page 232.
Camera masks are applied during the measurement. The masked data is not
transmitted to QTM, and it will not be possible to restore any masked markers.
2. Open a 2D view window so that you can see the 2D markers in preview.
4. Select the Marker mask tool in the 2D view toolbar and draw a mask over the area with the unwanted marker. The mask is indicated as a dark green area in the 2D view of the camera.
a. The mask can be resized or moved by placing the cursor on the edges of the mask or on the mask itself and holding down the mouse button. To delete a mask or all masks for the current camera, right-click on it and select Delete this mask, Delete all masks or Delete all masks from all cameras.
NOTE: Marker masks are not drawn linearized. To see the true pos-
itions of the masks, you need to turn off the Show linearized data
option on the 2D view settings page, especially for wide-angle
lenses.
5. Keep adding masks until all of the unwanted markers are covered. There
can be up to 20 masks per camera for Arqus and Miqus cameras and 5
The auto marker masking can be used to remove unwanted static reflections
automatically. Follow this procedure to apply the auto marker masking.
2. Check if you have any unwanted markers in the cameras. Before using
auto marker masking try to remove the physical cause of the marker. Also
try changing the Exposure and Marker threshold settings to remove the
markers.
3. If the unwanted markers cannot be removed, make sure that you are in
preview mode and then open a 2D view window.
4. Select the cameras where you want to add marker masking with the cam-
era buttons. You can select all of the cameras or just some of the cam-
eras.
5. Click the Auto-create button on the Camera settings sidebar to open
the Create marker masks dialog.
6. Click Start to get the positions of the unwanted markers. The auto mask function will identify the largest reflections and mask them. If there are reflections close to each other, the auto mask function will try to join the masks.
NOTE: Marker masks are not drawn linearized. To see the true pos-
itions of the masks, you need to turn off the Show linearized data
option on the 2D view settings page, especially for wide-angle
lenses.
7. To check that the masks really cover a marker, you can deselect the Enable marker masks checkbox. The masks will then be deactivated and the markers below will appear.
Active filtering for capturing outdoors
The active filtering mode is most useful in daylight conditions, because it filters out background light. This is done by capturing extra images without the IR flash; these images are then used to remove the background light from the images used by the camera. Active filtering is available on all Qualisys camera types, except the Oqus 3- and 5-series. The Miqus Video and Miqus Hybrid cameras also support active filtering during calibration and in marker mode.
The standard and recommended setting is the Continuous mode, in which an extra image without IR flash is captured before each actual image. Therefore the maximum frequency is reduced by about 50% compared to the normal maximum frequency.
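A conceptual sketch of the background subtraction idea behind active filtering, using NumPy arrays as stand-in images (the actual processing happens inside the camera; this only illustrates the principle):

# Illustrative only: remove static background light by subtracting an image
# captured without the IR flash from the image captured with the flash.
import numpy as np

rng = np.random.default_rng(0)
background = rng.integers(0, 120, size=(8, 8)).astype(np.int16)  # ambient light
markers = np.zeros((8, 8), dtype=np.int16)
markers[3, 4] = 200                                              # one lit marker

with_flash = np.clip(background + markers, 0, 255)   # image used for marker detection
without_flash = background                           # extra image, IR flash off

filtered = np.clip(with_flash - without_flash, 0, 255)
print(filtered[3, 4], filtered[0, 0])                 # marker kept, background removed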
NOTE: Active filtering does not help if you have problems with extra reflections from the IR strobe of the cameras. In that case you must try to remove the reflective material or use the delayed exposure setting, see chapter "Delayed exposure to reduce reflections from other cameras" on page 534.
2. Turn off the Active filtering option in the Camera settings sidebar and
look at the marker intensity mode.
l Use a quite low Exposure time, around 200 µs. Then change the exposure time so that the markers are just at the maximum end of the red color.
3. Turn on the Active filtering option and check that the markers are clearly
distinguishable in preview. It is best to do this check on the subject that
you are going to measure and not on static markers.
l If the markers are not sufficiently distinct from the background, try
to increase the Marker threshold setting until it is better.
4. Continue to calibrate the system as usual.
Discarded markers are not sent to QTM, which means that the on-camera marker discrimination is irreversible and cannot be undone by reprocessing a file. The advantage of on-camera marker discrimination is that it relieves QTM from processing obsolete markers, which can lead to increased processing speed. This can be important, especially for real-time applications.
If you want to keep all data, filtering can also be applied post-hoc as a pro-
cessing step by preprocessing the 2D data in QTM. The alternative methods
are:
The non-circularity marker settings can be used to correct or delete bad or partially hidden markers. The quality of the 3D data and how to use the filter depend a lot on the number of cameras in the system. If you have three or more cameras covering the same markers, it is usually better to try to remove the non-circular markers, because then you have enough data to create the 3D data anyway. However, if there are usually just two cameras viewing the same markers, the data can become more complete by correcting the non-circular markers. This will result in more 3D data, which otherwise could not be calculated.
The settings for the Oqus camera are activated on the Cameras page in the Project options dialog, see chapter "Marker circularity filtering (Oqus)" on page 234. The markers used by the filter are filtered out by the camera depending on the Circularity level setting.
How to handle the non-circular markers is then set on the 2D Preprocessing and filtering page by the two options: Correct center point of non-circular markers and Discard non-circular markers, see chapter "How to use circularity filter (Oqus only)" on page 610.
When you turn on the filtering, the markers will be color coded in the 2D view window according to this list:
White - Circular markers
These markers are below the Circularity level and are therefore not
handled by the filter.
Red outlines
These are the outlines created from the segments sent from the camera.
If there is no marker inside the outline, then it has been discarded. It can
only happen with the Discard non-circular markers option.
When to use the marker filtering and its effectiveness depend a lot on the camera setup. Check these points to figure out if it will be useful in your setup:
Marker size
One thing that must be considered is the 2D size of the markers. To be used by the filter, the marker has to be larger than 320 in marker size. Therefore all markers below this size are considered to be circular markers. Also, markers that are too large (>2000) will make it hard for the camera to have time to calculate whether the marker is circular or not.
Number of markers
The Qualisys camera needs to store the segments in a temporary memory
when trying to find the markers that are non-circular. This memory can be
full if there are too many markers that are on the same horizontal level in
Introduction to calibration
The QTM software must have information about the orientation and position of each camera in order to track the markers and calculate 3D data from the 2D data. The calibration is done by a well-defined measurement procedure in QTM. Calibration is, however, not needed for a 2D recording with only one camera.
There are two methods that can be used to calibrate a camera system:
Wand calibration
The most common type of calibration, see chapter "Wand calibration
method" on page 547.
The following items are important to think about before the calibration:
Calibration dialog
Calibration time
Set the duration of the calibration. Make sure that the calibration time is
sufficient to cover the whole volume that needs to be calibrated without
rushing.
For Fixed camera calibration the length of the calibration is not especially
important, since the data of the frames is averaged. Therefore, the min-
imum calibration time of 10 seconds can be used.
Linearization parameters
The linearization parameters tab shows the status of the linearization.
The linearization files must be specified, otherwise, you cannot start the
calibration. Click Load if the files have not been specified.
Options
Open the Calibration settings to modify the settings in case they are not
correct.
Cancel
Quit the Calibration dialog.
OK
Start the calibration.
The steps below are just an outline of what should be done to calibrate the camera system with the Wand calibration method.
Follow these steps to calibrate the camera system:
4. Set the settings on the Calibration page in the Project options dialog,
see chapter "Calibration" on page 253.
5. Click OK.
NOTE: If you have force plates, there will be a warning reminding you to measure the force plate position again, since it has most probably changed somewhat with the new calibration.
NOTE: If any problems with the calibration process occur, check the set-
tings on the Calibration page in the Project options. If that does not
help, check the troubleshooting list in chapter "Troubleshooting cal-
ibration" on page 1024.
Calibration tips
In the pictures above the reference structure is not indicated, to make the pictures more distinct. The reference structure must of course always be present during the calibration. The box in the figure represents the measurement volume.
It is not necessary to hold the wand in the distinct directions described above; the wand can also be moved more freely. The most important thing is that the measurement volume is well covered and that the wand orientation is varied. The best moving method may vary depending on the application. It is recommended that the moving method is systematic and easy to repeat for consistent results.
Extended calibration
The advanced calibration can be used to simultaneously optimize both the cam-
era linearizations (intrinsic calibration) and the capture volume calibration
(extrinsic calibration). In many cases, this will lead to decreased 3D residuals of
the measured trajectories and improved 3D tracking.
The advanced calibration can be especially beneficial in the following cases:
l Systems with wide-angle lenses
For the advanced calibration, it is important that all cameras are sufficiently
covered by the movements of the calibration wand. Mostly, this can be
achieved by optimizing the camera setup for the used capture volume accord-
ing to the following guidelines.
l Make sure that the cameras are pointed in a way so that the central part
of the sensor is used.
l Make sure that for each camera a significant part of the sensor is covered
by the movements of the wand. The recommended sensor coverage is at
least 75%.
l Make sure that there is sufficient depth coverage for all the cameras.
The advanced calibration uses the same calibration settings as the standard
wand calibration, and is performed in the same way.
Follow these steps to perform an advanced calibration:
1. Make sure that the calibration options are correct under Project Options
> Input Devices > Camera System > Calibration.
3. Start a preview.
4. Open the Advanced calibration dialog via the Capture menu: Capture >
Advanced calibration (beta)....
5. Set the duration of the calibration and press OK to start.
6. During the calibration, move the wand through the whole capture volume,
varying the orientation of the wand (see chapter "How to move the wand"
on page 548).
7. When done, inspect the calibration results.
1. Open the original calibration file from the Calibrations folder in the pro-
ject.
2. In the Capture menu, click on the Advanced calibration (beta)... button
to open the recalibration dialog.
3. Click OK to start the recalibration.
The advanced calibration always takes the factory linearizations that are stored
on the cameras as a starting point for the optimization process. After suc-
cessful advanced calibration, the linearization files on the Linearization page
in the project will be replaced with the new files, which are stored in the lin-
earization files folder (for the folder location, see "Folder options" on page 427).
The advanced calibration will not replace the factory linearizations that are
stored on the cameras.
Once the new linearization files are loaded in the project, they will be used for
subsequent standard calibrations. If you want to perform a standard cal-
ibration based on the factory linearizations, you will need to restore the lin-
earization files in the project. The easiest way to achieve this is when the
camera system is available.
If the cameras are not connected or available, the factory linearization files can
be restored by loading a standard calibration that was based on factory lin-
earizations, or by loading them from a folder on the computer in which they
are stored. Once the cameras have been connected to the computer, the factory linearization files are stored in the linearization files folder.
Use scenarios
For the new wand kits (carbon fiber 300 and 600, 120, active 500) the origin is
automatically translated to the corner point of the L-frame at floor level.
On custom or legacy calibration kits, the origin is by default placed in the center
of the corner marker on the L-frame. If you want to move the origin to another
position you can use the settings on the Transformation page, see chapter
"Transformation" on page 259. Use the following distances (in mm) if you want
to translate the origin to the bottom corner of the L-frame. The distances are
for the Z-axis pointing upwards, the X-axis pointing along the long arm and the Y-axis pointing along the short arm.
Wand 300
x: -8.5, y: -8.5, z: -20.5
Wand 110
x: -10.0, y: -10.0, z: -7.5
NOTE: For the 300 and 750 reference, the bottom corner of the L-shape
is 1 mm outside of the U-profile. The figures above are from the corner of
the bottom plate, if you want the U-profile you should subtract 1 mm
from x and y.
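As an illustration of how these figures combine, the short Python sketch below computes the translation values to enter on the Transformation page. The helper function and its names are not part of QTM; the offsets are simply the figures listed above.

    # Offsets (in mm) for translating the origin to the bottom corner of the L-frame,
    # taken from the figures above (Z up, X along the long arm, Y along the short arm).
    L_FRAME_OFFSETS_MM = {
        "Wand 300": (-8.5, -8.5, -20.5),
        "Wand 110": (-10.0, -10.0, -7.5),
    }

    def origin_translation(kit, to_u_profile=False):
        """Return the (x, y, z) values to enter on the Transformation page."""
        x, y, z = L_FRAME_OFFSETS_MM[kit]
        if to_u_profile:
            # For the 300 and 750 reference structures the bottom corner of the
            # L-shape is 1 mm outside of the U-profile, so subtract 1 mm from x and y.
            x -= 1.0
            y -= 1.0
        return x, y, z

    print(origin_translation("Wand 300", to_u_profile=True))  # (-9.5, -9.5, -20.5)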
NOTE: For more information about how to install and use a fixed camera system, contact Qualisys AB about the QTM - Marine manual. It includes detailed descriptions of the camera installation, survey measurement, fixed camera calibration, validation and the use of 6DOF bodies in marine applications.
At the top there is a message stating whether the calibration passed or not. There is also a warning if the cameras are using Exposure delay.
There are four buttons at the bottom of the dialog; clicking them has the following effects.
OK
Close the dialog and the calibration file.
New measurement
Close the dialog and the calibration file and open a new capture file in pre-
view mode.
View Calibration
Track the calibration file and close the dialog.
Export
Export the calibration results to an XML file. The exported file also includes the rotation matrix for the cameras. The default folder to save in is the Calibrations folder in the project folder, see chapter "Project folder" on page 61.
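If you want to inspect an exported calibration programmatically, the XML file can be read with a few lines of Python. The sketch below only illustrates the idea; the element name used here is an assumption and should be checked against an actual exported file.

    import xml.etree.ElementTree as ET

    # Minimal sketch: list per-camera data from an exported calibration XML file.
    # The element name "camera" is an assumption; open a real export from the
    # Calibrations folder to see the actual structure and attribute names.
    tree = ET.parse("calibration_export.xml")
    for camera in tree.getroot().iter("camera"):
        print(camera.attrib)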
Quality results
The quality results under the Camera results heading are camera specific. For
each camera ID there are the following five results:
X (mm), Y (mm) and Z (mm)
The distance (in mm) from the origin of the motion capture coordinate system to the optical center of the camera, in the X, Y and Z direction respectively.
Points
Number of points used in the calculation of the distance above. The number should be as high as possible, but without large differences between the cameras. The maximum number of points for a Wand calibration depends on the calibration time and the number of frames used in the calibration. If the camera has more than 500 points it is usually enough for a normal measurement volume. For the other methods it depends on the number of markers seen by the camera.
Finally, the time when the calibration was performed is displayed at the end of the results.
View Calibration
With View Calibration the calibration is tracked and opened in a 3D view win-
dow. For a Wand calibration the movements of the wand are shown and the
measurement volume can be confirmed. For the other two methods the pos-
itions of the markers can be confirmed in the 3D view.
Calibration failed
If the calibration fails the calibration result will say Calibration failed and an
error message is displayed after each camera ID.
The error messages are as follows:
General calibration failure
Something is wrong in the calibration. Check the calibration settings.
The calibration is older than the first time limit, or you have had a warning from the residual check after the last measurement.
The calibration is obsolete, i.e. it has passed the second time limit. It is recommended to calibrate again; however, the calibration can still be used.
For residuals that are just slightly over the limit the data will still be OK, especially if you have more than two cameras that can see each marker. However, it is recommended that you check the setup to see if it can be improved and then calibrate the system again.
l First check that the 2D size of the markers in the camera is not much smaller than in the other cameras. This can cause a higher residual than normal.
l Then you should check if the camera might have moved. For example, if a screw is not tight enough in the mounting, the camera might move slightly by its own weight, causing a residual that slowly increases. This can be checked by making a measurement of just static markers 1-2 hours after the calibration. If you then get the error message, it is very likely that the camera has moved slightly.
If the warning instead is about the camera having too few contributions, then
you get the warning below. In this case the data is not used in the 3D tracking
so the 3D data you have has not been degraded. However it is recommended
to check the 2D data to see what the reason is and then calibrate the system.
l Turn on the 3D overlay and check if the 2D markers match the 3D mark-
ers. It could be that the camera has moved so that it no longer con-
tributes to the 3D data.
Recalibration
An existing calibration can be recalibrated to improve it or solve problems, for
example if a calibration failed. The following problems can be corrected:
l Wrong calibration kit chosen
l Deactivated cameras
1. Open the calibration file of the wanted capture file. The calibration file is
in the Calibrations folder in the project folder, see chapter "Project
folder" on page 61.
5. The Calibration results dialog is shown. Click Use if you want to use the
reprocessed calibration as the current calibration. OK will only close the
Calibration results dialog.
IMPORTANT: All the capture files that have used this calibration must
then be reprocessed, see chapter "Reprocessing a file" on page 601.
1. Disable the cameras that can't be used for a certain volume on the Lin-
earization page in Project options.
2. Calibrate with the remaining cameras in the system. Remember the name
of the calibration file.
3. Enable the other cameras on the Linearization page and disable the first set of cameras. It is possible to merge any number of calibration files, so the system can be divided into as many camera groups as needed.
4. Calibrate the system again and remember the name of the calibration
files if there is more than one.
5. When all calibrations are finished, open the Current calibration page in
Project options and click on Load other.
6. Select all of the calibration files made in the previous steps and click on
Open.
7. The calibrations are merged and the result of the merge is displayed in
the Calibration results. The merged calibration is saved as a QCA file but
it only includes the calibration results and not the individual 2D data. If
any reprocessing is needed for the calibration use the original files and
repeat the merge.
Capturing data
Introduction to capture
To capture data you need to have a project in QTM. It is recommended to make
a new project if you for example change the marker setups or if you want to
work with specific camera settings. For more information about projects see
chapter "Projects" on page 60.
Usually before you start a measurement it is best to open a new empty capture
file with New on the File menu. The file will be opened in preview mode where
you can check measurement volume and settings before starting the capture.
NOTE: When starting a preview, QTM will detect automatically if the cam-
era system has old firmware. The firmware must then be updated before
the system can be used. For more information see chapter "Firmware
update when starting preview" on page 471.
3. Open a new file by clicking the New file icon . The cameras will start in
the same mode as last preview or measurement.
NOTE: If the system has been calibrated and is ready for a meas-
urement you do not have to open a new file. Click the Capture icon
and go directly to step 6.
4. Check that the markers are visible, otherwise change the camera settings
see chapter "Tips on marker settings in QTM" on page 483.
5. Calibrate the system, see chapter "Calibration of the camera system" on
page 543.
6. Go to the Processing page in the Project options dialog to activate any pro-
cessing steps you want to apply directly after a capture.
9. Check that all of the settings under the Camera system settings heading
are correct.
10. Click Start to start the capture. When the capture is finished, all of the pro-
cessing steps will be performed and then the new motion capture file is
displayed in QTM, unless batch capture is activated. In batch capture,
QTM will immediately start waiting for the next capture as soon as the pro-
cessing of the previous capture is finished. An ongoing capture can always
be stopped with Stop capture on the Capture menu or by clicking the
Stop capture icon .
NOTE: If any problems occur during capture, check the settings in the
Project options dialog and in the Start capture dialog. If that does not
help, check the troubleshooting list in chapter "Troubleshooting capture"
on page 1025.
The Start capture dialog appears before the start of every capture or batch
capture.
Click Start to start a capture or click Options to change the settings in the Pro-
ject options dialog.
Capture period
Under the Capture period heading, the capture period is specified in seconds
or in number of frames. Fractions of a second can be specified, which will be
rounded off to the nearest number of frames. If a Qualisys camera is in Video
mode the number of Video frames, the corresponding video capture rate and
the maximum number of video frames are also displayed.
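For example, assuming the period is simply rounded to the nearest whole number of frames, the relation between the capture period and the number of frames can be illustrated as follows (the numbers are only an example):

    # Example: capture period specified in seconds at a given marker capture rate.
    capture_rate_hz = 100
    capture_period_s = 2.513

    number_of_frames = round(capture_period_s * capture_rate_hz)
    print(number_of_frames)  # 251 frames (2.513 s corresponds to 251.3 frames)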
Under the Capture delay and notification heading there are options for delay-
ing the start of the capture (Use capture delay) and for sound notification on
start and stop of the capture (Use sound notification on start and stop).
When the Use capture delay option is used the delay is specified in seconds in
the text box next to the option.
Automatic capture control
Under the Automatic capture control heading there are options for auto-
matic saving of measurement files and whether to use batch capture.
Select the Save captured and processed measurement automatically
option to automatically save the measurement file. Enter the folder in which
the files will be saved in the Folder option. You can Browse for a folder or use
the Reset to project folder option to reset the folder to Data folder in the cur-
rent project. Set the name of the files in the Name option; an automatic counter can also be assigned to the filename.
When Batch capture is selected, QTM will make several consecutive meas-
urements. When batch capturing, the Save captured and processed measurement automatically option and the automatic counter must be selected. When
Wait between captures is checked the user will be prompted to start each
new capture. If unchecked, the next capture will start as soon as QTM is ready
after a previous capture. For information on batch capture, see chapter "Batch
capture" on the next page.
Under the Camera system settings heading the measurement settings are dis-
played. The settings can be changed by right-clicking on the entry and then clicking Change or Reset to default value. The Project options dialog can also be
reached by clicking Options.
Batch capture
With batch capture, QTM will capture several consecutive measurements. Batch
capture is activated with the Batch capture option on the Start capture dia-
log, before the start of the measurements. In this dialog, the options Save cap-
tured and processed measurement automatically and Add counter must
also be selected so that each measurement is saved in a separate file.
Before each new measurement in a batch capture QTM will wait for a start sig-
nal and the whole batch capture is stopped by a stop signal. These signals are
given in different ways depending on whether external trigger is used or not.
During the measurement the border of the view window will indicate the
status.
External trigger
If the external trigger is used to start each measurement, to stop the
batch capture you need to press Esc or click Close on the File menu. Stop
capture on the Capture menu can be used during an individual capture
to stop just that capture.
No external trigger
Start each measurement by clicking Yes in the Next measurement dia-
log. Stop the batch capture by clicking No in the dialog. Stop capture on
the Capture menu can be used during an individual capture to stop just
that capture.
Auto backup
The measurement can be saved automatically directly after the data is fetched
with the Auto backup option on the Processing page in the Project options
dialog. When activated a temporary file with the 2D data is saved before the
other processing steps. If the file is very large the auto backup may take several
minutes. Then if QTM crashes during the processing of the data the file can be
retrieved at the next startup of QTM.
NOTE: When the capture rate is too high for the real-time processing, the real-time frequency is reduced. When storing the real-time data, this means that 3D data is only processed for part of the frames.
Eyetracker
Reprocessing is recommended, because the delay between the eyetracker and 6DOF data is not compensated for in real-time. Gaze data is calculated either for all of the frames that include 6DOF data or for all of the samples, depending on how the data is processed.
External video
All video frames are included and the time offset is applied to the video.
Qualisys offers a range of camera models that can be used as dedicated video
cameras. Such cameras feature a clear front glass and a white strobe. Oqus high-speed cameras also have a larger memory buffer for storing video frames. Dedicated video cameras are the Miqus Video and Video+ series, and
Oqus 2c. Furthermore, several Oqus types are available in a high-speed video
configuration (monochrome image). For more information about the various
types of Qualisys video cameras, see chapter "Video cameras" on page 434.
1. Make sure that the cameras that will be used for recording video are in
video mode.
2. Choose the video settings (capture rate, exposure, compression, etc.). The available video settings and the interface for changing them in QTM depend on the camera model. The video capture rate can be set indi-
vidually per camera and may be different from the capture rate of the
cameras in marker mode. Some camera models have the option to use in-
camera MJPEG compression. This allows a longer capture, however, at a
reduced maximum capture rate, see chapter "Qualisys video sensor spe-
cifications (in-camera MJPEG)" on page 927.
3. Calibration of Qualisys video cameras is optional, but required for 3D
overlay. Qualisys video cameras can be calibrated together with the
marker cameras in the system. During the calibration, the Qualisys video
camera will be automatically switched to marker mode, so it is important
to make sure that the marker settings are correct. If you do not want to
include a video camera in the calibration, you can uncheck it in the lin-
earization options, see chapter "Linearization" on page 249.
4. Start a new capture. Depending on the camera model and settings, video
is streamed to QTM during the measurement or buffered on the camera
and fetched after the capture is finished.
5. The video from each camera will be stored in a separate AVI file and can
be played back in QTM, see chapter "Qualisys video files" on page 586.
You can then for example activate the 3D overlay on the video, see
chapter "3D data overlay on video" on page 587, and export the view with
the Export to AVI feature on the File menu.
Calibrating Qualisys video cameras
Qualisys video cameras are calibrated in the same way as Qualisys marker cam-
eras. During the calibration, the video cameras will be automatically switched
to marker mode. Follow these steps to calibrate Qualisys video cameras.
1. First, you need to set the marker settings. Switch to Marker intensity
mode and adjust the exposure and threshold so that the markers are red,
and the background is dark blue.
2. Make sure that the cameras have enough overlap with other cameras
within the system. Preferably, the L-frame is visible, but this is not
required.
3. Start a calibration. The video cameras will be calibrated together with the
marker cameras in the system.
The settings of streaming video cameras (Miqus Video, Miqus Video Plus and
Oqus 2c) are available under the Streaming Video settings in the Camera set-
tings sidebar. The Camera settings sidebar allows you to adjust basic settings, see
chapter "Camera settings sidebar" on page 91. Advanced video settings are
available on the Cameras page in the Project options, see chapter "Video set-
tings" on page 238.
The following steps show how you can change the settings for streaming video
cameras.
1. Choose the video capture rate. The buttons show a selection of common
video capture rates. Integer divisions or multiples of the current marker capture rate are indicated in bold. Note that the maximum frequency, as
well as the frequency values of the buttons, may change depending on
the selected resolution and aspect ratio (steps 2 and 3).
NOTE: The file size of the recorded videos can become very large when
using a high resolution and a high video capture frequency. If you need
the video for documentation purposes, you can reduce the file size con-
siderably by using a lower resolution and a lower capture frequency (e.g.
540p @25 Hz).
The maximum capture rate for streaming video cameras depends on the selec-
ted resolution and aspect ratio. The available capture frequency range for the
current combination of settings is shown under Video settings on the Cam-
eras page in Project options. The tables below give an overview of the maximum frequency values for Miqus Video and Oqus 2c.
Oqus 2c
NOTE: If you have mounted the external IR filter on the lens, it must be removed so that the visible light is recorded.
Data storage
High-speed video generates a large amount of data. During a capture, the
data is buffered in the camera. The maximum amount of data that can be
captured is 1.1 GByte. After the capture, the video data is downloaded
from the camera, which can take up to about 5 minutes.
The number of frames that can be captured depends on the frame rate
and the image size, see chapter "High-speed video" on page 960. For
another resolution, multiply the amount of data by the ratio of the reduced resolution to the full resolution.
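The scaling rule above can be illustrated with a short calculation. In the sketch below, the frame count at full resolution is a placeholder value; take the real figure from the high-speed video specifications for your camera and frame rate.

    # Data per frame scales with the resolution, so the number of frames that fit
    # in the 1.1 GByte camera buffer scales with full resolution / reduced resolution.
    def frames_at_reduced_resolution(frames_at_full_res, full_res_pixels, reduced_res_pixels):
        return int(frames_at_full_res * full_res_pixels / reduced_res_pixels)

    # Placeholder example: if 1000 frames fit at full resolution, roughly 2000 frames
    # fit when the image is reduced to half the number of pixels.
    print(frames_at_reduced_resolution(1000, 1_000_000, 500_000))  # 2000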
Outline of how to capture high-speed video
The following outline describes how to capture high-speed video with an Oqus
camera.
2. Open a new file by clicking the New file icon . If you want to use the 3D
overlay functionality it is important to check that the camera system has
been calibrated.
3. Switch to 2D view if it is not visible, i.e. right-click in the View window and
select Switch to 2D view from the menu.
4. Right-click on the 2D view for the camera that will capture high-speed
video. Select Mode/Video to switch to Video mode. The 2D view for that
camera will switch to a video image.
5. Open the aperture to at least 4 and set the focus.
6. Change the settings for Video capture in the Camera settings sidebar.
a. Set the video Capture rate. This can be set independently of the
marker image rate.
b. Set the Exposure time to a value that makes the image bright
enough, test until you are satisfied.
If you have no extra light the exposure time needs to be quite high,
at least 16000 microseconds or even up to 40000. This limits the cap-
ture rate that can be used. It also means that fast movements will be
blurred.
For high capture rates and measurements with fast movement,
extra lighting is needed because the exposure time must be
extremely short, sometimes as short as 100 microseconds.
Use a Codec if you want to reduce the size of the AVI files. If the file will be used for analysis, it must not be compressed too much as this can influence the analysis.
NOTE: The Image format will reduce the image directly in the
camera, but then you will lose pixels.
A codec can be used to significantly reduce the file size of the high-speed video
files. Any codec that can run on Microsoft DirectShow can be used in QTM to
compress Oqus high-speed video files. There are many codecs available. For a
selection of recommended codecs that have been tested with QTM, see the list
below.
Recommended codecs
BlackMagic/Decklink MJPEG
l A lossy codec using JPEG compression of each video frame.
l The codecs are included with BlackMagic Desktop Video software. The
software can be installed, even when you do not have a BlackMagic
design video interface.
l For the latest version tested with QTM, use the download link provided at https://www.qualisys.com/info/recommended-codecs/.
Video
In the Video mode the cameras use the exposure and flash time
from the Video settings on the Cameras page. This is the mode
that you use when you want to capture video with the Qualisys cam-
era.
4. To optimize the preview frequency, the number of pixels in the image is automatically changed depending on the 2D view size and the zoom.
5. Use the Video mode and set the following settings in the Camera settings sidebar to get a bright image in preview with non-high-speed Qualisys cameras.
Capture rate
Set the Capture rate to 10 Hz.
Flash time
The flash does not change the brightness of the image much at long distances. Therefore the Flash time can be set to 0. If you want to
see the markers better, e.g. when setting the focus on a marker, you
can set the flash time to about 400 microseconds.
6. Then, to see more of the video image, double-click on the camera so that only
that camera is shown in the 2D view window. It is also possible to zoom in
the image, see chapter "Selecting cameras, zooming and panning" on
page 87.
7. The following are some of the things that the Video preview can be used for.
Check the cause of extra reflections
Zoom in on the extra reflection in Marker intensity mode if neces-
sary.
NOTE: If you have compressed the video file with a codec the computer
where you play the file must have that codec installed.
The AVI files from Qualisys cameras contain meta information about the QTM version, the capture time and the SMPTE time code (if used), according to the AVI standard (the ISFT, IDIT and ISMP tags).
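For reference, these metadata tags can be read directly from an AVI file with standard-library Python. The sketch below is a naive byte scan rather than a full RIFF parser, and the file name is just a placeholder; it assumes the usual chunk layout of a four-character tag followed by a 32-bit little-endian size and the data.

    import struct

    def read_avi_info(path, tags=(b"ISFT", b"IDIT", b"ISMP")):
        """Naive scan for AVI INFO chunks (software, capture time, SMPTE time code)."""
        data = open(path, "rb").read()
        info = {}
        for tag in tags:
            pos = data.find(tag)
            if pos == -1:
                continue  # tag not present in this file
            size = struct.unpack_from("<I", data, pos + 4)[0]
            raw = data[pos + 8:pos + 8 + size]
            info[tag.decode()] = raw.rstrip(b"\x00").decode("ascii", "replace")
        return info

    print(read_avi_info("example_capture_Miqus_1.avi"))  # placeholder file name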
The 3D data can be overlayed on the Qualisys video data to show the 3D view
from that camera's viewpoint. This can for example be used for showing the
force arrow in the video of someone stepping on a force plate. Which 3D objects are displayed in the 3D overlay is configurable.
1. Calibrate the camera system, including the Qualisys cameras that are
used in video mode.
2. Open a 2D view window in RT/preview mode or in a file.
3. Right-click on the camera where you want to turn on the overlay and
select Show 3D data overlay.
4. The 3D elements displayed in the overlay and the opacity can be changed
on the 2D view settings page in the Project options dialog, see chapter
"2D view settings" on page 417.
5. The video data display in the 2D view can be switched between linearized
and unlinearized on the 2D view settings page. To match the video with the 3D data, the video must be linearized.
First of all the camera systems must be calibrated in one calibration file so that
the video can be processed in Theia. Follow these steps to calibrate the system:
2. Restart the cameras on the Agent systems and shut down QDS before they
receive the IP address from the Agent system. The cameras will now get
an IP address from the Main system.
3. Calibrate the system with all video cameras, for general advice on cal-
ibration of video cameras, see chapter "Calibrating Qualisys video cam-
eras" on page 576.
4. Enable the blocklist in QDS on the Main system.
6. Restart all cameras, so that they get the IP address from the correct sys-
tem.
Setting up synchronization
When the system is calibrated follow these steps to capture synchronized video
in all of the systems.
1. Start the capture on all of the Agent systems so that QTM is in the Waiting
for trigger mode. The cameras are waiting for the UDP packet with the
camera start time from the Main system, for configuration see chapter
"Setting up multiple video systems" on page 525.
2. Start the capture on the Main system. The Main system will trigger the
Agent systems via the UDP packet so that the video capture starts at the
exact same time.
3. Once the capture is finished the QTM and video files are saved on each
separate computer. The UDP packet includes the file name used on the
Main system so that the QTM and video files on the Agent systems auto-
matically use the same name.
Processing with Theia 3D markerless mocap software
1. Copy the calibration file from the Main system to the computer with
Theia.
2. Copy all of the video files for one capture from Main and Agent systems to
a new folder on the computer with Theia. It is important that the folder
only includes the videos from one capture for the Theia processing to
work.
3. Optionally copy the QTM files to the same folder as the video files. The
QTM files are not needed for the Theia processing, but are good to save
to be able to read the settings used for the capture. It is important to
rename the QTM files from the Agent systems since otherwise the files all
have the same name.
4. Start Theia and import the video files and calibration manually. Please
refer to the Theia manual for instructions on how to perform the processing.
NOTE: Theia will rename and move the video files, so you need to
have a copy of the video files if you want to be able to open the
QTM file with videos.
Real-time streaming
The QTM real time process enables QTM to send data to any program that can
receive data on TCP/IP or UDP/IP.
The protocols are described in the QTM Real-Time Server protocol documentation QTM RT protocol.pdf that is included with the QTM installer in the RT Protocol subfolder ("C:\Program Files\Qualisys\Qualisys Track Manager\RT Protocol" when QTM is installed in the default location). The RT Protocol folder also contains a compiled version of the RTClientExample, which can be used for testing and troubleshooting real-time streaming. The RT protocol documentation is also available online via https://docs.qualisys.com/qtm-rt-protocol/.
QTM can also be controlled through its REST API. For more information, see
the RESTful_QTM_API PDF included with the QTM installer in the REST sub-
folder ("C:\Program Files\Qualisys\Qualisys Track Manager\REST").
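As a rough illustration of controlling QTM over REST, the snippet below sends a simple GET request using the requests package. The port and endpoint path shown here are placeholders only; the actual base URL, endpoints and payloads must be taken from the RESTful_QTM_API documentation.

    import requests

    # Placeholder base URL and endpoint; consult the RESTful_QTM_API PDF for the
    # real port, paths and request bodies.
    BASE_URL = "http://localhost:7979/api/v1"

    response = requests.get(f"{BASE_URL}/version", timeout=5)
    print(response.status_code, response.text)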
The real time performance depends on the computer and graphics card specifications, the number of cameras, the number of markers and the complexity
of the used AIM model(s), amongst others. For more information and tips for
improving real-time performance, see chapter "How real time works in QTM"
on the next page.
Resources
Qualisys offers a variety of resources that can be used for real-time streaming,
including:
QTM Connect
Plugins/clients for streaming data from QTM into third party software,
such as Matlab, LabVIEW, or game engines, such as Unreal Engine or
Unity.
SDKs
Open source SDKs for developing custom QTM clients for C++, C#, Python,
etc.
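As an example of a small custom client, the sketch below streams 3D marker data using the open-source Qualisys Python SDK (the qtm-rt package on PyPI). It is a minimal sketch under the assumption that QTM is running with real-time output enabled; verify the exact function names and component strings against the SDK documentation.

    import asyncio
    import qtm_rt  # Qualisys open-source Python SDK (pip install qtm-rt)

    def on_packet(packet):
        # Print the frame number and the number of 3D markers in each streamed frame.
        header, markers = packet.get_3d_markers()
        print(f"frame {packet.framenumber}: {len(markers)} markers")

    async def main():
        connection = await qtm_rt.connect("127.0.0.1")  # IP address of the QTM computer
        if connection is None:
            raise RuntimeError("Could not connect to QTM")
        await connection.stream_frames(components=["3d"], on_packet=on_packet)
        await asyncio.sleep(5)  # stream for a few seconds
        await connection.stream_frames_stop()
        connection.disconnect()

    asyncio.run(main())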
The data that is processed in real time can be viewed in QTM and streamed via
TCP/IP through the supported protocols (RT protocol and OSC). The TCP/IP
server is always active and waits for a connection from a client program.
Almost all data that is acquired and processed in QTM can be streamed, includ-
ing:
l 2D data (unlinearized, linearized)
l Images (from cameras in Marker intensity and Video mode; only sup-
ported in preview mode, not when streaming data from file)
Controlling QTM
QTM can also be controlled via the RT protocol, for example for:
l Starting/stopping of a calibration or a capture
Real-time frequency
The real-time marker frequency is set on the Camera system page in the Pro-
ject options dialog. You can use the Reduced real time frequency option if a
high capture rate is required but you still want real time data during the cap-
ture. It is recommended that the reduced frequency is an integer divisor of the
capture rate.
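To illustrate the recommendation, the reduced frequencies that divide the capture rate evenly can be listed as in the sketch below; the helper is only an illustration.

    def reduced_rt_frequencies(capture_rate_hz):
        # All frequencies for which every n:th camera frame is processed in real time.
        return sorted({capture_rate_hz // n for n in range(1, capture_rate_hz + 1)
                       if capture_rate_hz % n == 0})

    # Example: for a 300 Hz capture rate, 30 Hz (every 10th frame) and 25 Hz
    # (every 12th frame) are suitable reduced real-time frequencies.
    print(reduced_rt_frequencies(300))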
These are the different steps in the real-time process:
l Set the GUI update to 15 Hz on the GUI page, or shut it off completely to
get the maximum performance.
l Check that the RT frequency is stable during the RT measurement. Lower the rate if it deviates too much from the specified capture rate.
Real-time streaming of data from external input devices
All external equipment samples, such as analog, EMG, force and eye-tracker
samples, are sent with each camera frame. E.g. if the analog capture rate is three times the camera capture rate, there will be on average three analog samples sent with each frame. However, because of the buffering in the
external equipment the number of samples sent with each marker frame can
differ, but the total number of samples will be correct.
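As a simple worked example of the ratio described above (the numbers are only an example):

    # Average number of analog samples accompanying each marker frame.
    analog_rate_hz = 3000
    marker_rate_hz = 1000

    print(analog_rate_hz / marker_rate_hz)  # 3.0 samples per marker frame on average
    # Because of buffering in the external equipment, the per-frame count may vary,
    # but the total number of samples over the capture will be correct.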
The analog and EMG capture is started in synchronization with the markers. The analog
boards are always started in synchronization with the sync out signal. The EMG
on the other hand needs to be started with the trigger button to be syn-
chronized, see respective EMG system in chapter "Wireless EMG systems" on
page 804.
Real time latency is the time it takes from the marker exposure to the time
when the TCP/IP signal can be received on another computer. The latency can
be divided into the parts in the image below. The size of the different parts shows approximately how long the different steps take.
Computer
The computer performance will influence the latency and because QTM
runs on Windows the latency may also differ depending on which other programs are running.
1. Before you start the real-time, make sure that you have an AIM model or
6DOF bodies for the movement that you are going to capture, see chapter
"Automatic Identification of Markers (AIM)" on page 624 and "6DOF track-
ing of rigid bodies" on page 649, respectively.
NOTE: It is best if the AIM model is specific for the subject that will
be captured.
5. Activate the Real-time actions that you want to use. For example, for
Visual3D typically the following actions are used: Pre-process 2D data,
Track each frame: 3D, Apply the current AIM models and Calculate
force data.
6. Check the settings for the actions that have been selected.
7. Go to the Camera system page and set the capture rate. The maximum
capture rate depends mainly on the computer specifications, as well as
the complexity of the AIM model.
8. Go to the GUI page and set the Real time mode screen update to 15 Hz.
This step is not necessary, but it will give the computer more time to pro-
cess the marker data.
9. Test the real-time with the motion that you want to capture. Look especially at how the AIM model is applied and whether the RT frequency shown in the Status bar is close to the capture rate. If it differs too much, lower the Real time frequency; it might also help to create a new AIM model or to change the tracking parameters.
l If the real-time is slow, close all windows, including the Data info win-
dow and the Trajectory info windows, except for a 3D view win-
dow.
l When the real-time is working fine you can even turn off the GUI on
the GUI page in the Project options dialog. This will reduce the
processing capacity needed by QTM.
l AIM and 6DOF data can be restarted with the F9 shortcut or the
Apply AIM model button .
10. When you are satisfied with the real-time in QTM, you can connect to QTM
from the other program or client.
Server mode
QTM can be started in server mode to minimize the interference by QTM when
running an RT client. To start QTM in server mode use the command-line argu-
ment /server when starting QTM. The modified behavior for the server mode is
as follows:
l Merge with Twin slave, see chapter "Working with QTM twin files" on
page 522.
l Calculate 6DOF, see chapter "6DOF tracking of rigid bodies" on page 649.
l Calculate force data, see chapter "Force data calculation" on page 703.
l Export to C3D format, see chapter "Export to C3D format" on page 727.
l Export to Matlab file, see chapter "Export to MAT format" on page 729.
l Export to FBX file, see chapter "Export to FBX file" on page 742.
l Export to JSON file, see chapter "Export to JSON file" on page 743.
The settings of each processing step can be found in the tree to the left in the
dialog. For information about the settings, see chapter Project options.
NOTE: The two tracking options under Track the measurement are
mutually exclusive and can therefore not be used at the same time.
Reprocessing data
Reprocessing a file
Files can be reprocessed at any time. Reprocessing can be useful in the fol-
lowing scenarios:
l In case not all processing actions were applied during the capture,
l In case you used the Store real-time data option for a quick review during
the capture session,
l In case you want to modify processing options to optimize the results, for
example
For reprocessing the currently open file, click the Reprocessing icon or
Reprocess in the Capture menu to open the File reprocessing dialog. If you
want to process multiple files at the same time, you can use batch processing
instead, see chapter "Batch processing" on page 605.
The File reprocessing dialog contains all of the settings from the Project
options dialog that can be applied to a file in post-processing, see chapter "Pro-
cessing" on page 320. Follow these steps to set the reprocessing steps.
2. Choose the source for the settings from the drop-down lists to the right.
There are two options:
Measurement
Choose the measurement settings for reprocessing the file based on
the previously used settings. This is the default choice. You can
modify the settings by editing them in the dialog. The edits will only
apply to the current file.
Project
Choose the project settings if you want to use the current settings
used in the project. Any edits will update the current project settings
as well.
If you need to reprocess multiple files with the same settings, you
can use the project settings and modify them for this purpose if
needed.
NOTE: The Calibration and Linearization pages are connected for files
that have been 3D tracked. This means that you have to change the cal-
ibration to change the linearization. The Linearization page is only dis-
played so that you can check that the linearization files are correct.
However, for a 2D tracked file there is no Calibration page and you can
change the linearization on the Linearization page, see chapter "Lin-
earization" on page 249.
The calibration of a file is changed on the Calibration page in the File repro-
cessing dialog. The file's current calibration is shown under the Calibration
file heading and the results are displayed under the Calibration results head-
ing. Replace the current file with the wanted calibration file by clicking Load
other and locating the file. The calibration files are located in the Calibrations folder in the project folder and the name of a calibration file includes the date
when the calibration was performed.
If the current calibration file has been recalibrated, it must be loaded again to
reload the parameters, otherwise the file is reprocessed with the old cal-
ibration results that are stored in the file.
To start reprocessing the file click OK. The new settings in the File repro-
cessing dialog are only valid for the active file. To keep the changes the file
must be saved.
Batch processing
With batch processing, several capture files can be processed with the same set-
tings. The same actions can be performed in batch processing as in the pro-
cessing that is performed after a measurement has been captured.
1. Click Batch process on the File menu. Select the files and click Open to
open the Batch processing dialog.
3. Choose the source for the settings from the drop-down lists to the right.
For some settings the only possible source is the project, for the other set-
tings there can be three possible options:
Processed file
The settings used are the ones present in each processed file and
cannot be edited. This option is always selected by default because
the other options replace the original settings from each file.
Project
The settings are copied from the current project settings to each pro-
cessed file. The settings can be edited in the tree-view to the left.
Editing the settings will change the current project settings as well.
This option is often the best to use when you want to change the set-
tings of the processed files.
Present file
The settings are copied from the present file (the file open in the
QTM window where you opened the Batch processing dialog) and
can be edited in the tree-view to the left.
4. Click OK. The actions are performed on the files and then the files are
saved and closed.
Processing 2D data
The 2D data is processed when the Pre-process 2D data option is enabled on
the Processing page in Project Options. When reprocessing 2D data in a file,
the 3D, 6DOF and skeleton data must also be reprocessed for the changes to
have any effect.
The following pre-processing and filtering methods for 2D data are available:
l Circularity filter (a legacy filtering method, only supported by Oqus cam-
eras)
l Software marker masks
Effect on 3D data
This mode will give you more exact 3D data than the unfiltered data,
because the corrected markers have more exact center points than
the camera data. It has most effect if only two cameras can see the
marker. For example you can use this mode to improve the accuracy
of the calibration, since the wand markers are partially hidden by
the stick.
Effect on 3D data
This mode will give you less 2D data than the unfiltered data and
actually most of the time it is better to try to correct the markers.
The biggest advantage with this mode is that it is faster than cor-
recting the markers. So if you have many cameras that can see each marker you will not lose much 3D data and it will also be more exact.
IMPORTANT: If you change this option when processing a file, you must
also reprocess the 3D, 6DOF and skeleton data for the change to have
any effect.
Add software marker masks to a file with the Marker mask tool in the 2D view.
The mask is shown as a blue rectangular area. Use the Selection tool in the 2D
view to resize and move the mask with the pointer. There can be up to 100
To apply the software masks, the QTM file must be reprocessed with the Pre-
process 2D data step and the Software marker mask option enabled. After
processing, the masks are surrounded by blue edges, indicating that they have
been applied. The markers that are filtered are surrounded by white rect-
angles.
IMPORTANT: The 3D, 6DOF and skeleton data must also be reprocessed
for the change to have any effect.
To undo the filtering by software masks, the QTM file must be reprocessed with
the Pre-process 2D data step enabled and the Software marker mask option
disabled. The masks will be preserved, but the filtering is undone. The masks
are shown as blue rectangular areas without the surrounding edge, the same
as when they were just created.
To delete masks, right-click on a mask and select one of the available options:
l Delete this mask to delete the currently selected mask,
l Delete all masks to delete all masks for the current camera,
l Delete all masks for all cameras to delete all masks for all cameras.
To undo the filtering, the file must be reprocessed with the Pre-process 2D
data step enabled (the Software marker mask option can be either enabled
or disabled).
IMPORTANT: The 3D, 6DOF and skeleton data must also be reprocessed
for the change to have any effect.
To undo the size filtering, reprocess the file with the Pre-process 2D data step
enabled and the Minimum marker size and Maximum marker size options
disabled.
3D tracking measurements
3D tracking is the process of constructing 3D points from the 2D rays from the
cameras and sorting them into trajectories. It is activated with Track the meas-
urement and the 3D option on the Processing page and is controlled by the
tracking parameters, see chapter "3D Tracker parameters" on page 325.
How to set the 3D tracker parameters depends a lot on the measurement setup; for some advice, see chapter "Advice on how to set 3D Tracker para-
meters" below. For an example of how to evaluate if the tracking is working
properly, see chapter "3D tracking evaluation" on page 616.
Track the measurement with the 3D option is used by default while capturing
data. Files can also be reprocessed in case tracking was skipped during the
recording, or to optimize tracking results, see chapter "Reprocessing a file" on
page 601. For information about how to optimize tracking results in case of
problems, see chapter "Troubleshooting tracking" on page 1028.
Maximum residual
The Maximum residual is by default set to 6 mm. For very large tracking
volumes it may help to increase the value for more robust tracking. Redu-
cing the value may help to avoid switching artifacts, but may lead to more
fragmented trajectories. A good way to estimate the Maximum residual
is to track the file and then plot the residual of all of the trajectories. Then
you can set the parameter two or three mm higher than the maximum
residual of the trajectories.
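As a worked example of this rule of thumb, the snippet below suggests a Maximum residual value from a list of per-trajectory maximum residuals (for instance read off from a residual plot). The margin of two to three mm comes from the text above; the helper itself is only an illustration.

    def suggested_maximum_residual(trajectory_max_residuals_mm, margin_mm=2.5):
        # Set the tracker parameter a couple of millimeters above the largest
        # residual observed in the tracked test file.
        return max(trajectory_max_residuals_mm) + margin_mm

    # Example: per-trajectory maximum residuals (mm) from a tracked test file.
    print(suggested_maximum_residual([1.8, 2.4, 3.1, 2.9]))  # 5.6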
3D tracking evaluation
Before data collection, it is a good idea to test that the tracking is working prop-
erly for the current camera setup. Place all markers required on the meas-
urement subject and also one marker in each corner of the measurement
volume. Then perform the following test:
1. Open a new QTM file and perform a capture. The movements should be
as similar to the planned recordings as possible.
2. Go to the View menu and open the File information dialog. In the dialog
there is a list of the tracking residuals per camera. There are two numbers
per camera. Points are the number of 2D points from the camera that are
used to calculate 3D in the measurement. Avg res is the average residual
for the 2D rays from the camera at the average marker distance in the
whole measurement.
2D tracking of data
With the 2D tracker you can track the 2D data of one camera and present the
markers as trajectories in the 3D view window. Because just one camera is
used to track the trajectories, they will all be in a plane in front of the camera.
This means that you can only measure the distances in two dimensions.
As the 2D data is tracked only in a plane, no calibration is used to track the data. The only setting that is needed is the distance between the plane and the camera. That setting and other 2D tracking settings are set on the 2D tracking
page in the Project options dialog, see chapter "2D tracking" on page 330.
NOTE: The 2D tracker can only track one camera at a time and it can only
be used on a captured file and not in real-time.
1. Before you start the capture you should activate 2D tracking with Track the
measurement and 2D on the Processing page in the Project options dia-
log. Then, set the settings on the 2D tracking page, see chapter "2D tracking"
on page 330.
b. It is a good idea to use a filter so that you do not get many short trajectories.
c. Measure the distance between the measurement plane and the camera
sensor and enter the Distance to measurement plane setting.
d. Define the orientation of the coordinate system with the three settings
for the axes.
2. Start a capture with Start capture on the Capture menu. 2D tracking cannot
be applied in real-time so you have to make a measurement to see whether
your settings are correct.
3. When the 2D tracker is finished the result is trajectories that can be pro-
cessed and handled exactly as trajectories created by 3D tracking. The only
difference is that the trajectories will all be in the same plane.
Manual identification is best done with the quick identification method. Follow
these steps to use this method:
1. To use the quick identification method you need to have a label list in the
Labeled trajectories window. Load a label list or enter the labels manu-
ally with Add new label on the Trajectory info window menu.
2. Select the first label in the Labeled trajectories window that you want to
identify.
3. Hold down Ctrl + Alt or select the Quick identification cursor and
click on the marker or trace in the 3D view window that matches the
selected label.
l When you click on the marker or trace, it will be added to the selec-
ted label. If you have not selected any label, the marker you click on
will be added to the first empty label in the list. If there is no empty
label in the list, the marker you click on will be added as a new label
at the end of the list, and you can edit its name at once.
It is also possible to use the Identify option on the Trajectory info window
menu, which appears whenever you right-click a trajectory, either in a 3D view
window or in a Trajectory info window. For information about functions in the
Trajectory info windows, see chapter "Trajectory info window menu" on
page 144.
The following features are useful when identifying trajectories.
l The keyboard shortcut C can be used to center on the selected marker in a
3D view window.
l The Trajectory info window menu option Center trajectory in 3D will also center on the selected trajectory or part. However, if the trajectory is not visible at the current frame it will also move the current frame to the first frame of the trajectory.
l The keyboard shortcuts J and Shift + J can be used to jump to the next
unidentified trajectory or part.
l View the trace of the trajectories to check which trajectories match.
TIP: Use the Lock key to keep the focus on the trajectory you are
working on, and uncheck the horizontal auto zoom to keep the time
line at a fixed interval.
Identification methods
l Use Quick identification to identify markers in the 3D view.
l Drag and drop traces or markers from the 3D view to the Labeled tra-
jectories list or onto a trace in the 3D view.
l If it is a long measurement start with identifying just a part of the meas-
urement and then generate an AIM model and apply that to the whole
measurement.
Other tips for navigation and trajectory management
l Use scrubbing (Ctrl + drag) and trace range zoom (Shift + Mouse wheel)
features for browsing through the measurement.
AIM identifies trajectories from angles and distances between markers. In the AIM model QTM has saved the angle and distance ranges of added measurements.
l AIM needs to have movement in the first file used to create the model. Other-
wise AIM will not identify the connections between markers in the AIM
model, since they are based on the shortest possible distances between
markers.
l AIM looks for the best solution in each frame and then joins the solutions.
This means that AIM will join and split trajectories depending on the AIM
model so that you get the best match. The measurement subject can leave
and enter the volume and AIM will continue to join the trajectories.
l Because AIM is working with relations between markers, all of the markers in
the model file must be included in the measurement to ensure that all of the
trajectories can be identified. E.g. if a marker on the shoulder disappears for
a long time it is difficult for AIM to find the arm.
General instructions
l Make sure that the trajectories have the correct identity throughout
the file. You can select a smaller measurement range to delete the
incorrect parts if you do not want to identify the whole file.
l In the first file used to generate an AIM model, it is recommended not to have any static markers together with the moving subject, because they will make it very hard for AIM to find a solution. Only use an AIM model with static markers if the markers are always static.
2. Select the measurement range of frames that will be used in the model.
Use the scroll boxes on the Timeline control bar to select the range.
Choose a range where the motion is typical for what you are going to cap-
ture, most of the time it is best to use the whole measurement range
unless there are large gaps or incorrect data.
3. Click the Generate model icon on the AIM toolbar or click Generate
model on the AIM menu.
4. Choose one of the options Create new model, Create new model
based on marker connections from existing AIM model or Add to
existing model, see the below chapters for a detailed description. Click
Next to start the model generation.
Use this option to create a new AIM model from scratch. When creating a new
model, it is very important that the bones are connected in the correct way to
reflect the hierarchy of the model. You can use the Delete bones tool in the
toolbar to remove the bones that are wrong. For more information about the
AIM bones, see chapter "How to verify and edit AIM bones" on page 632.
Use this option to create a new AIM model based on the AIM bones (hierarchy)
and the visual properties (colors and visual bones) of an existing AIM model.
The data of the existing AIM model is ignored, so the new AIM model is based
on the movements of the current measurement only. The labels of the meas-
urement should correspond to the labels of the existing AIM model. This option
consists of the following steps:
1. Select an existing AIM model from the file dialog. If the labels of the selec-
ted model do not correspond to those of the measurement, AIM gen-
eration will fail.
2. Review the AIM model in the AIM model visualization window. Note that
the AIM bones cannot be edited.
3. Specify the file name of the new AIM model.
This option is used to extend the motion of an existing model. It can be used to
generalize the model for application to a wider range of subjects or move-
ments.
Select one or more models from the list to which you want to add the move-
ment in the current file.
l The models in the list are the models that are available on the AIM page.
By default all the models in the Applied models list are selected. Click
Add model to browse for another model.
l The file must include all the labeled trajectories that are included in the
model. It can however have other labeled trajectories as well, only those
with the same name as in the model will be used to update the model.
This means that you can update several models at the same time. E.g. if the file includes three persons with different names on their respective
labeled trajectories, then you can select all three AIM models in the list at
once and all three models will be updated.
l If you only add to one AIM model you will get a dialog that displays the AIM bones of the AIM model that is added to. If the AIM bones are placed incorrectly in the model it is best to make a new AIM model and start over.
Follow these guidelines when you check the data before adding it to or creating
an AIM model.
l Make sure that the subject in the file is moving. The movement does not
have to be exactly the same as the rest of the captures, just that it
includes most of the movement. When you add more measurements to
an AIM model, it becomes less important that the added files include a lot
of movement. You can verify that the movement is large enough by look-
ing at the AIM bones, see section "How to verify and edit AIM bones" on
the next page.
NOTE: Even if you want to make an AIM model for a static meas-
urement it is important to have some movement in the first file. This
is to make sure that the internal bone definitions are correct.
However, the subsequent files can be static because then the defin-
ition is already correct.
l Make sure that the trajectories have the correct identity throughout the
file. You can select a smaller measurement range to delete the incorrect
parts if you do not want to identify the whole file.
l The colors of the trajectories and any bones between them are also saved
in the model. For example to make it easier to verify the identification
after AIM has been applied, the colors of the trajectories can be set with
Set different colors on the Trajectory info window menu before cre-
ating the AIM model.
l Repeat these steps for all of the frames where you can find erratic
data and then gap-fill the trajectories.
When an AIM model is created, AIM bones are created automatically between the markers that have the least movement between them throughout the
file. Because these AIM bones are then kept even if you add more meas-
urements it is important that they are correct. Therefore there is a step when
creating a new AIM model where you can verify and edit the AIM bones.
l There is no relationship between the AIM bones and bones created by the
user.
l The number of AIM bones is kept as low as possible.
l You can rotate, translate and zoom the 3D image of the AIM bones.
l You can delete AIM bones with the Delete bone tool, see below.
l The frame used for displaying the AIM model is the first frame that includes all, or as many as possible, of the markers.
To verify that the AIM bones are correct follow these instructions:
1. Check that all of the markers are displayed. If there are missing markers it
means that there is no frame with all of the markers. It is recommended
to use another file where all of the markers are available in at least one
frame.
3. The AIM bones may not look as good as your user-created bones, but that does not necessarily mean that they are wrong. For example, the AIM bone
between a head and the body can look strange if the marker nearest to
the head is on the shoulder.
Applying an AIM model
The AIM model can be applied to files with captured motions that are similar to
any part of the motion in the model. I.e. if your model includes a lot of different
motions made by a human, then the captured trajectories can be identified if
another human makes one or more of these motions. An AIM model can be
applied either as a processing step or manually.
Below follows a description of how to apply an AIM model manually on a cap-
ture file.
2. Make sure that all of the trajectories are in the Unidentified trajectories
window or the Identified trajectories window. Discarded trajectories are
not used by AIM. It is also important that all required parts of the measurement are included in the selected measurement range, since the AIM
model is only applied to trajectories with parts within the selected meas-
urement range.
3. Click the Apply model icon on the AIM toolbar or click Apply model on
the AIM menu. The AIM application settings dialog is displayed.
NOTE: The AIM settings are copied from Project options, if you
want to use the AIM settings and models saved in the file you need
to reprocess the file and select from measurement as source, see
chapter "Reprocessing a file" on page 601.
4. Check that the Applied models are correct. It is possible to apply several
AIM models to the same file, or to apply the same AIM model to multiple
actors. In the latter case, set Nr To Apply to the number of actors. For
more information, see chapter "AIM models for multiple subjects in the
same measurement" on page 637 .
If any of the models cannot be applied to the trajectories, the dialog below will appear, showing how many of the bodies (models) were applied.
The AIM results dialog will display the result of all of the applied AIM
models. The Partial results are AIM models where not all of the markers
have been identified. For the Failed models none of the markers have
been identified.
The most likely reason for the Partial result is that the model doesn't com-
pletely match the captured motion. Then it is recommended to manually
identify the file and then add it to the existing AIM model, see chapter
"Generating an AIM model" on page 625.
It can also help to reduce the selected measurement range so that the
AIM model is applied only on a smaller part of the measurement. For
example, if the subject walks in and out of the volume it can help to
reduce the selected measurement range to where the subject is inside the
volume.
7. When the AIM model has been successfully applied, the capture file must
be saved to keep the changes.
When you apply a model as a processing step, either directly after a capture or
in a batch process, it works exactly like when applying it manually. The model is
set on the AIM page in the Project options dialog.
AIM models for multiple subjects in the same measurement
You can apply AIM for automatic labeling of multiple subjects at the same time.
The following scenarios can be distinguished:
l Tracking of subjects with different marker configurations and labels. This
requires a unique AIM model for each subject, which should be present in
the Applied models list in the AIM dialog or settings page.
l Tracking of subjects with similar marker configurations (e.g. multiple act-
ors with same marker set). In this situation the same AIM model can be
applied multiple times by specifying Nr to Apply in the AIM dialog or set-
tings page.
l A combination of the above.
For more information about how to create and apply AIM models when cap-
turing multiple subjects, see the chapters below.
1. Make sure that each subject has a unique marker configuration. If you
have similar types of subjects (e.g. multiple actors), you can use a dif-
ferent marker pattern to distinguish the subjects, for example by placing
four markers on the chest in different patterns.
2. Make sure that the label names of the subjects are different so that
QTM can identify the labels when you add measurements to the
AIM models.
3. Create an AIM model for each subject. This can be done per subject as
described in "Generating an AIM model" on page 625. Alternatively, you
can generate the AIM models using a single measurement with multiple
subjects as follows.
l Make a Range Of Motion (ROM) measurement with all subjects sim-
ultaneously.
l Label the markers per subject as described in "Manual identification
of trajectories" on page 620.
l Select the trajectories of one subject in the 3D view or the trajectory
list, right-click on the selection and click Generate AIM model from
selected trajectories....
l Generate the AIM model as described in chapter "Generating an AIM
model" on page 625.
l Repeat steps b to d for every subject.
5. You can add data to multiple AIM models from a single measurement by
selecting the applicable AIM models under the Add to existing model(s)
option. The data will then be added to the AIM model(s) with matching
label names.
Similar subjects
When tracking similar subjects with the same marker configuration, you can
use one generic AIM model and apply it multiple times. This is done as follows:
1. Add the AIM model to the Applied models list on the AIM page in the Pro-
ject options dialog.
2. Set Nr To Apply to the number of subjects to be tracked simultaneously.
3. Optionally, enable Use random trajectory color for each AIM file. This
will help to distinguish the subjects from one another.
4. Start tracking.
If you are measuring multiple subjects with the same marker configuration you
can create specific AIM models for each subject based on a generic AIM model.
The advantage is that this allows for identification of individual subjects.
2. Select the trajectories of one subject in the 3D view or the trajectory list.
7. Start tracking.
Editing of trajectories
The Trajectory Editor can be used to view and edit trajectory data. Manual
editing of trajectory data should be the last step in the processing of trajectories.
Before editing trajectory data, it is important to make sure that the quality of
the tracking is optimal, and that the identification of trajectories is correct and
as complete as possible. Remaining irregularities of trajectories can be edited
with the Trajectory Editor.
The main functions of the Trajectory Editor are:
l Locating and filling gaps,
You can open the Trajectory Editor window in the following ways:
The functions and layout of the Trajectory Editor are described in chapter "Tra-
jectory Editor window" on page 159.
The following chapters describe in more detail how to use the Trajectory
Editor.
Gaps
Gaps are to be understood as missing parts within a trajectory. Two common
types of gaps are:
The first type of gaps can generally be kept to a minimum by improving the AIM
model. In some cases, manual identification may be needed to add unidentified
parts to the trajectory. Remaining gaps of the second type can be detected and
filled in the Trajectory Editor.
Identification and selection of gaps
Gaps are indicated in the plot area by a brown area and an amber indicator
below the time axis at the frames where data is missing. The gaps are also lis-
ted in the Gaps panel in the Points of Interest sidebar.
1. Select the trajectory you want to edit. You can select the trajectory in one
of the Trajectory info windows, or by clicking on the marker in the 3D
view window.
2. Select a gap or a frame range that includes the gaps you want to fill.
The gaps included in the selected range will then be filled using the Type spe-
cified in the Fill settings. The filled parts are indicated in the plot area by a
dark blue area and a blue line below the time axis at the gap-filled frames, and
the data series is shown as a dashed line.
The available fill types can be divided into two categories. Linear, Polynomial,
Relational and Kinematic are gap fill types that interpolate the data between
the respective edges of the trajectory. Polynomial gap fill can only be applied
when there is trajectory data on both sides of the gap. Linear and relational
gap fill can also be used for extrapolation of gaps at the start or end of the
capture. The filled parts of the trajectory are indicated as type Gap-filled in
the Trajectory info window. Static and Virtual are virtual fill types that are
independent of the data at the edges of the gap. These types can also be used
to create virtual trajectories in the Trajectory Editor.
Linear
Gaps are filled by a linear interpolation between the respective edges of
the trajectory. If the gap is at the beginning or the end of a capture, the
gap will be filled with a constant value (first or last data value after or
before the gap, respectively). Options:
Max Length: Check to apply a maximum length for linear gap filling.
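The following minimal Python sketch illustrates the linear fill behavior
described above. It is not QTM's implementation; the function name and the
use of NaN for missing frames are illustrative assumptions, and the Max
Length option is omitted.

import numpy as np

def fill_linear(coord):
    # Linear gap fill for one coordinate of a trajectory, with NaN marking
    # missing frames. Interior gaps are interpolated between their edges;
    # gaps at the start or end are filled with the nearest data value.
    frames = np.arange(len(coord))
    known = ~np.isnan(coord)
    # np.interp holds the first/last known value constant outside the data
    return np.interp(frames, frames[known], coord[known])

print(fill_linear(np.array([1.0, np.nan, np.nan, 4.0, np.nan])))
# [1. 2. 3. 4. 4.]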
Polynomial
Gaps are filled by a cubic polynomial interpolation between the respective
edges of the trajectory. Polynomial gap fill uses the data from two frames
before and after the gap for the calculation. If the gap starts or ends with
a trajectory part which consists of one frame, then the polynomial gap fill
will use the next available frame to calculate the polynomial. If there is no
other trajectory part, then the polynomial gap fill is calculated using just
that one frame. Options:
Max Length: Check to apply a maximum length for polynomial gap
filling.
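As an illustration of the interpolation described above, the sketch below fits
a cubic polynomial to the two frames on each side of a gap. It is a
hypothetical helper, not QTM's implementation; it assumes the gap is given as
a half-open frame range and that there is data on both sides.

import numpy as np

def fill_polynomial(coord, gap_start, gap_stop):
    # Fit a cubic polynomial to the two frames before and after the gap
    # [gap_start, gap_stop) and evaluate it over the missing frames.
    support = [gap_start - 2, gap_start - 1, gap_stop, gap_stop + 1]
    poly = np.polyfit(support, coord[support], deg=3)
    filled = coord.copy()
    gap_frames = np.arange(gap_start, gap_stop)
    filled[gap_frames] = np.polyval(poly, gap_frames)
    return filled

y = np.array([0.0, 1.0, np.nan, np.nan, 4.0, 5.0])
print(fill_polynomial(y, gap_start=2, gap_stop=4))  # fills frames 2 and 3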
Relational
Gaps are filled based on the movement of surrounding markers. The user
specifies one, two or three context markers, which are used to define a
local coordinate system (LCS). The filled trajectory consists of a linear
interpolation in the LCS, which is then transformed to the global coordin-
ate system. If the gap is at the beginning or end of a capture, the filled
part will be extrapolated. The options can be selected from a drop down
list. Alternatively, you can drag and drop trajectories on the field if the tra-
jectory is locked. The following options are available:
X Axis: Marker defining the primary axis of the LCS. Preferably, the
movement of the primary axis should be strongly correlated with
that of the target marker. If the target marker is on the same line as
the Origin and the X Axis markers, it should be sufficient to specify
only these two context markers.
XY Plane: Marker defining the secondary axis of the LCS, fixing its
full orientation.
Rigid body: Check this option in case the three context markers are
part of a rigid structure. If checked, the pose of the LCS will be cal-
culated by means of a rigid body fit. The definition of the rigid body
will be based on the average of the relative configuration of the con-
text markers across the gap. This option can only be checked if all
three context markers are specified.
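The sketch below illustrates the principle of relational gap fill described
above: the target marker is expressed in a local coordinate system built from
the context markers, interpolated linearly in that system, and transformed
back to global coordinates. It is an illustrative sketch only (the function
names, the array layout and the exact axis construction are assumptions), not
QTM's implementation.

import numpy as np

def make_lcs(origin, x_marker, xy_marker):
    # Rotation matrix (columns = axes) and origin of the LCS defined by the
    # Origin, X Axis and XY Plane context markers.
    x = x_marker - origin
    x = x / np.linalg.norm(x)
    z = np.cross(x, xy_marker - origin)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack((x, y, z)), origin

def fill_relational(target, origin, x_axis, xy_plane, gap_start, gap_stop):
    # All inputs are (N x 3) arrays; the gap covers frames [gap_start, gap_stop).
    edges = (gap_start - 1, gap_stop)
    local = []
    for f in edges:
        R, o = make_lcs(origin[f], x_axis[f], xy_plane[f])
        local.append(R.T @ (target[f] - o))   # target expressed in the LCS
    filled = target.copy()
    n = gap_stop - gap_start + 1
    for i, f in enumerate(range(gap_start, gap_stop), start=1):
        w = i / n
        p_local = (1 - w) * local[0] + w * local[1]  # linear interp in the LCS
        R, o = make_lcs(origin[f], x_axis[f], xy_plane[f])
        filled[f] = R @ p_local + o                  # back to global coordinates
    return filled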
Virtual
Gaps are filled based on the movement of surrounding markers. Similar
to Relational gap fill, except that the filled part is independent of sur-
rounding trajectory data. The user specifies one, two or three context
markers, which are used to define a local coordinate system (LCS). The
filled trajectory represents the movement of the origin of the LCS, with an
optional offset specified by the user. The options can be selected from a
drop down list. Alternatively, you can drag and drop trajectories on the
field if the trajectory is locked. The following options are available:
Origin (required): Marker defining the origin of the LCS, preferably a
marker close to the target marker. If only Origin is defined, the filled
trajectory will be entirely translational.
X Axis: Marker defining the primary axis of the LCS. Preferably, the
movement of the primary axis should be strongly correlated with
that of the target marker. If the target marker is on the same line as
the Origin and the X Axis markers, it should be sufficient to specify
only these two context markers.
Rigid Body: Check this option in case the three context markers are
part of a rigid structure. If checked, the pose of the LCS will be cal-
culated by means of a rigid body fit. The definition of the rigid body
will be based on the average of the relative configuration of the con-
text markers across the gap. This option can only be checked if all
three context markers are specified.
Kinematic
Gaps of markers associated with skeleton segments or rigid bodies are
filled based on the current skeleton or 6DOF data.
NOTE: In some cases there may be a spike in the trajectory just before or
after a gap. Deleting such artifacts before gap filling can improve the qual-
ity of the filled part, in particular for the polynomial method.
Spikes
Spikes are to be understood as discontinuities between consecutive frames
within a trajectory. The Trajectory Editor can be used as a tool to detect
spikes. Two common types of spikes are due to:
1. Select the trajectory you want to edit. You can select the trajectory in one
of the Trajectory info windows, or by clicking on the marker in the 3D
view window.
2. Select a spike or a frame range that includes the spikes you want to
smooth.
3. Press the Smooth button.
Butterworth
Smoothing by means of a fourth order Butterworth low-pass filter. The
Butterworth type is most suitable for reduction of high-frequency noise
across large frame ranges. The available options are:
Cutoff: Cutoff frequency specifying the pass band and the stop band
of the filter. As a rule of thumb, the cutoff frequency should be a
factor 2-3 higher than the highest frequency of interest.
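As an illustration of this type of filtering, the sketch below applies a
Butterworth low-pass filter to a trajectory using SciPy. It is not QTM's
implementation; since the text does not state whether the filter is applied
forward only or forward-backward, this example uses a zero-phase (filtfilt)
variant built from a second-order design, which gives an effective
fourth-order response.

import numpy as np
from scipy.signal import butter, filtfilt

def smooth_butterworth(trajectory, fs, cutoff):
    # trajectory: N x 3 array with one row per frame; fs and cutoff in Hz.
    b, a = butter(2, cutoff, fs=fs, btype="low")
    return filtfilt(b, a, trajectory, axis=0)  # zero-phase filtering

# Example: smooth a noisy 200 Hz marker trajectory with a 10 Hz cutoff.
t = np.arange(0, 2, 1 / 200)
noisy = np.column_stack([np.sin(2 * np.pi * 1.5 * t)] * 3)
noisy += 0.01 * np.random.randn(len(t), 3)
smoothed = smooth_butterworth(noisy, fs=200, cutoff=10)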
1. Select one or more trajectories and right click on the selection in the Tra-
jectory info window or in the 3D View window.
2. In the context menu, click Add new trajectory > Virtual (Average of
selected trajectories). This will create a new trajectory at the geometric
average of the selected trajectories.
3. Name the new trajectory.
The Trajectory Editor can be used to create virtual trajectories, giving more
control to the user to define their location. Follow these steps to create a vir-
tual trajectory with the Trajectory Editor:
4. In the Settings sidebar of the Trajectory Editor window, choose fill type
Constant or Virtual.
5. Specify the fill options, depending on the chosen type:
l For type Constant, specify X, Y and Z coordinates. This will result in
a static virtual trajectory at position X, Y and Z in the global coordin-
ate system.
l For type Virtual select the trajectories on which the virtual tra-
jectory should be based. The options can be selected from a drop
down list. Alternatively, you can drag and drop trajectories on the
field if the trajectory is locked. By default, the virtual trajectory will
correspond to the trajectory selected as Origin. By specifying the X,
Y, and Z values under Offset, you can move the virtual trajectory to a
different position, relative to the local coordinate system as spe-
cified by the fill options. See chapter "Filling of gaps" on page 642 for
more information about the options.
6DOF versus 3D
A single point in the mathematical sense can be fully described by its three
coordinates (X, Y, Z) in a Cartesian coordinate system (3D). This can also be
described as having three degrees of freedom. A physical body, such as a ship
or an aircraft, requires three additional degrees of freedom to fully
characterize its orientation, resulting in six degrees of freedom (6DOF).
A rigid body is a configuration of points at fixed relative positions. For the 6DOF
tracking function to work the rigid body must be defined by at least three
points, which should not be on the same line. When designing a 6DOF body,
consider the following aspects.
Choice of markers
When designing a rigid body, take the following into consideration when choosing
markers:
Marker size
The markers should be large enough for tracking, see chapter "Marker
size" on page 529. Furthermore, the markers should be small enough so
that there is sufficient separation between the markers to minimize the
occurrence of merging of the markers in the camera views.
Marker configuration
The following aspects are important for the tracking of rigid bodies.
The markers should be placed so that they are clearly visible and
well separated in the camera views for optimal tracking.
Marker distribution
Generally, the accuracy of the 6DOF data increases with the distance
between the markers. Try to place the markers as far apart as possible in
different directions, spanning up a plane or a volume.
Asymmetry
Markers should be applied in an asymmetric configuration, so that the ori-
entation of the rigid body can be uniquely determined. When the markers
are placed symmetrically, the measured orientation of the rigid body may
flip.
NOTE: This limitation does not apply when using active markers
with marker ID.
Uniqueness
When tracking multiple rigid bodies simultaneously, they should have
unique configurations so that they can be identified. If the configurations
are the same, it is not possible to distinguish between the rigid bodies.
NOTE: This limitation does not apply when using active markers
with marker ID.
The marker placement can also be important for the definition of the rigid
bodies. For example, markers can be placed so that they can be associated
with important positions, axes or planes in the definition of a model's local
coordinate system.
1. Apply the markers for tracking and the extra markers for the definition of
the rigid body.
2. Place the model in a way that all markers are visible.
l If it is not possible to see all markers in a single measurement, you
can also make several recordings with the model in different ori-
entations.
3. Make an initial rigid body definition, see chapter "Definition of 6DOF bod-
ies" below.
l In case you have multiple recordings, use the Add to rigid body (6
DOF) function from the Trajectory info window menu to add selec-
ted markers to the rigid body definition.
l If needed, you may even add virtual markers to the definition, for
example a point in between two markers.
4. Use the Translate body and Rotate body methods to change the local
coordinate system of the rigid body according to your specifications.
5. You can now remove the extra markers from the physical rigid body.
l You must either remove the corresponding points from the rigid
body definition, or check their Virtual option, so that QTM will not
try to track the point as a marker. When using the virtual option, the
point will be added as a virtual trajectory when measuring the rigid
body in QTM, which can for example be useful for visualization pur-
poses.
Definition of 6DOF bodies
When the 6DOF body has been designed according to the previous chapter you
must add the rigid body definition to the 6DOF Tracking page in QTM. This can
be done using one of the following methods:
III. Use the Load Bodies option from the 6DOF Tracking page to load stored
rigid body definitions from a file,
IV. Use the Add Body and Add Point options from the 6DOF Tracking page
to manually add a rigid body and its points.
The first two methods can be used to create a new rigid body definition based
on a measurement. The latter two require that the points in the rigid body
definition are known, either from prior measurement or by design.
New rigid body definitions are most commonly created from a measurement.
This is the easiest way to define the exact positions of the points for the best
tracking results.
Preparations
Before creating the rigid body definition, the following should be taken into
account:
The Define rigid body (6DOF) method can be used to create a rigid body defin-
ition from selected trajectories in a measurement. This can be done during pre-
view or from a file. An advantage when creating the rigid body from a file is that
the rigid body points can be based on an average of multiple frames. The
object is allowed to move during the recording. This way the rigid body points
can be obtained from an average across many poses, making the definition less
dependent on local measurement errors.
Follow these steps to create a new rigid body definition:
1. Select the trajectories associated with the rigid body in the 3D view win-
dow or the Trajectory info window.
2. Right-click on the selection and choose Define rigid body (6DOF) from
the context menu. Alternatively, use the keyboard shortcuts F8 or Shift +
F8. There are two options:
l Current Frame (Shift + F8): Create rigid body definition based on
marker configuration of the current frame.
l Average of frames (F8): Create rigid body definition based on the
average configuration of the markers in the current capture. The
advantage of this method is that the statistics of the marker pos-
itions are taken into account. The Bone tolerance setting of the
body will be based on these statistics. This option is not available
when in Preview mode.
3. Specify the name of the rigid body.
NOTE: When defining a new body in this way, the 6DOF data is re-cal-
culated in the file, which means that the 6DOF data of all other bodies in
the file will be updated as well.
The Acquire body method can be used to create a rigid body from markers that
are detected during preview. Rigid bodies created using this method are added
to the rigid body list in the Project Options.
Follow these steps to create a rigid body using the Acquire body method:
1. Start QTM and start a preview by clicking the New file icon.
2. Place the rigid body in the measurement volume so that the rotation of
the desired local coordinate system is known in reference to the global
coordinate system. One way is to place the body so that the desired local
coordinate system is aligned with the global coordinate system and then
the local origin can just be translated to the desired position.
3. Check that the markers on the 6DOF body do not merge in any of the cam-
eras' 2D views.
4. Open the Project options dialog in QTM and go to the 6DOF Tracking
page.
5. Click Acquire body to open the Acquire body dialog.
6. Click Acquire.
Rigid body definitions can be loaded from an XML file using the Load bodies
button in the 6DOF Tracking page. This action will replace any rigid bodies
present in the list with those in the file. All rigid bodies that are loaded from a
file are by default enabled.
The XML file can be created by saving rigid bodies using the Save bodies but-
ton in the 6DOF Tracking page. The XML file can also be edited, for example,
you can add, delete or modify rigid bodies in the file before loading it.
You can manually add a rigid body definition to the rigid body list using the Add
body button in the 6DOF Tracking page. You can then add points to the rigid
body definition using the Add point button. This method can be used if you
use a rigid body with known marker positions, for example one created using a
3D printer.
Rigid bodies and points can be edited by double-clicking on a property or by
using the Edit buttons (Edit color, Edit point, Edit label). For information
about the properties, see chapter "Rigid bodies" on page 346.
When you have defined the points of the 6DOF body in QTM you can change
the definitions of the local coordinate system. The local coordinate system is by
default placed in the geometric center of the points.
The local coordinate system is used in the calculation of rotation and position
of the measured rigid body in reference to a reference coordinate system.
Therefore it is important that the local coordinate system is defined according
to the specifications of the measurement. The local coordinate system should
have a well-defined orientation and location in reference to the points in the
6DOF body definition. Use a definition where the normal orientation of the body
corresponds to no rotation, i.e. aligned with the reference coordinate system.
When you have decided where the local coordinate system should be, use the
Translate and Rotate functions on the 6DOF Tracking page to specify the
local coordinate system, see chapter "Rigid bodies" on page 346.
Then you should also decide which coordinate system the 6DOF body data
should refer to. This is done in the Coordinate system for rigid body data dia-
log, which is opened by double-clicking on Global origin on the 6DOF bodies
page. See chapter "Coordinate system for rigid body data" on page 354 for the
alternatives.
NOTE: If you want to change these settings in a capture file you must
reprocess the file with the new settings.
The active Traqr is designed to be used as a rigid body. Each Traqr has the
same marker setup and is instead differentiated by the active IDs. The IDs of
the markers of the active Traqrs can be managed with the Traqr Configuration
Tool, for more information refer to its manual.
Follow these steps to create your active Traqr rigid body in QTM:
1. Place one active Traqr in the volume with the rotation that you want com-
pared to the global coordinate system.
2. Start preview with New on the File menu.
4. Click on Acquire body to create a new rigid body. Repeat this step for
each Traqr.
NOTE: If you know the IDs for each Traqr then you can acquire the
same Traqr and edit the IDs in the Rigid bodies list.
NOTE: If several bodies share the same marker definition and name, the
trajectories will only be shown once in the Labeled trajectories window.
NOTE: To delete all 6DOF data in a file you can reprocess the file with
Calculate 6DOF with an empty list of rigid bodies.
When reprocessing a file the 6DOF data is reprocessed in the following ways
depending on the processing steps.
IMPORTANT: When the pitch (φ) is close to ±90°, small changes in the
orientation of the measured rigid body can result in large differences in
the rotations because of the singularity at φ = ±90°, see chapter "Rotation
angle calculations in QTM" on page 1010.
First the local coordinate system is rotated around the X-axis (roll) with an
angle θ to the new positions y’ and z’ of the Y- and Z-axes.
After the roll, the local coordinate system rotates around the Y-axis (pitch) with
the Y-axis in its new position. The X- and Z-axes are rotated with an angle φ to
the new positions x’ and z’.
After the rotations the rigid body has a new orientation in reference to the
global coordinate system, see figure below.
Another description of rotations is to use the rotation matrix, which does not
have a singularity. QTM uses the rotation matrix internally to describe the rota-
tion of rigid bodies, and when exporting 6DOF to TSV files the rotation matrix is
included for all bodies in all frames, together with roll, pitch and yaw angles.
For a description of the calculation of the angles from the rotation matrix, see
chapter "Rotation angle calculations in QTM" on page 1010.
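As an illustration, the sketch below extracts roll, pitch and yaw from an
exported rotation matrix using SciPy. The intrinsic "XYZ" sequence (roll about
X, then pitch about the new Y, then yaw about the new Z) is an assumption
based on the description above; the exact convention used by QTM is given in
the referenced chapter on rotation angle calculations.

import numpy as np
from scipy.spatial.transform import Rotation

# R would be a 3 x 3 rotation matrix taken from a TSV 6DOF export.
R = np.eye(3)
roll, pitch, yaw = Rotation.from_matrix(R).as_euler("XYZ", degrees=True)
print(roll, pitch, yaw)  # 0.0 0.0 0.0 for the identity matrix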
With the analog output option the information about 6DOF bodies’ positions
and rotations can be used in feedback to an analog control system. To enable
analog output a D/A board must be installed in the measurement computer.
NOTE: In regular capture it will only work when Display the last
fetched frame is selected and then it will only be used on the frames
that are tracked and displayed.
The data values that will be used are selected on the 6DOF Analog export
page, see chapter "6DOF analog export" on page 388. Since the required board
has 16 channels the output is limited to 16 data values of 6DOF bodies. In order
to maximize the use of the 16-bit resolution, the data on each channel can be
scaled; the resulting value is then converted to a voltage which represents the
value’s proportional position within the range.
NOTE: The output of a channel will be 0 V if the body is not found. If the
input value is outside of the input range the output will be either the Out-
put min or the Output max value depending on the input value.
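A minimal sketch of the scaling described above is shown below. It is a
hypothetical helper, not QTM's internal code, and the ±10 V output range is
only an example.

def to_voltage(value, in_min, in_max, out_min=-10.0, out_max=10.0, found=True):
    # 0 V when the body is not found; values outside the input range are
    # clamped to the output limits, as described in the note above.
    if not found:
        return 0.0
    if value <= in_min:
        return out_min
    if value >= in_max:
        return out_max
    return out_min + (value - in_min) / (in_max - in_min) * (out_max - out_min)

print(to_voltage(250.0, in_min=0.0, in_max=1000.0))  # 250 mm -> -5.0 V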
A rigid body mesh can be used to visualize the object tracked by a rigid body.
QTM supports meshes in the Wavefront 3D object (.obj) file format. To
use a 3D mesh, copy the .obj file and all of the related .mtl and texture files to
the Meshes folder of the project. The mesh can be associated with a rigid body
in the following two ways.
From 3D view window
Right-click in the 3D view and select Rigid Body -> Change Mesh of Rigid
Body.
The Rigid Body Mesh Settings dialog is opened, see chapter "Rigid body Mesh
Settings dialog" on page 358. Select the mesh file to use from the list and
modify the settings. Use the Apply button to verify that it looks correct in the
3D view. It is often best to use trial and error to find the position and rotation
for the mesh. Use these tips to position the mesh.
2. Always scale the mesh first before translation and rotation, because the
distance to the mesh coordinate system is changed when scaling.
NOTE: Using really large meshes can slow down the rendering in QTM.
NOTE: The Meshes folder can be set to any folder on the Folder options
page in Project options. This means that the same meshes can be used
by different projects.
When sharing a file with a rigid body mesh it is important to include the mesh
files with the QTM file. This can be done in the following ways:
l Share the whole project.
l Share the QTM file and mesh files together. QTM will find the mesh files if
they are located in the same folder as the QTM file. The other user can also
copy the mesh files to the Meshes folder in their project.
The standard way to use 6DOF bodies is to create a body for each sep-
arate object that you want to measure. In this case the markers on each subject
have no relation to markers on the other subjects. Follow these instructions to
use 6DOF bodies.
In this case the 6DOF bodies are placed on parts that move together, e.g.
clusters placed on a human subject. Then the best approach is to use an AIM
model to identify the markers and then calculate the 6DOF bodies from the
already identified markers.
1. Create the bodies and their definition normally. Make sure that you name
the points of the bodies in the same way as they are named in the AIM
model.
2. Create an AIM model from the subject, see chapter "Generating an AIM
model" on page 625. If you already have an AIM model with the correct
marker setup you can add the current measurement to that AIM model.
NOTE: The AIM model can contain markers that are not included in
6DOF bodies.
NOTE: If AIM fails to identify a marker, the 6DOF calculation will not
try to identify it either, even if you select the Reidentify all body
markers option in reprocessing. In most cases the best way to fix this is to
manually identify the data and add it to the AIM model.
4. There can still be 6DOF bodies in the file that are not included in the AIM
models. These will only be identified and calculated when the Calculate
6DOF option is activated, i.e. if you reprocess the file and only apply the
AIM model the trajectories of the separate bodies will not be reidentified.
How to use virtual markers in an AIM model
The virtual markers in the 6DOF functionality can be used in a regular AIM
model if it contains markers that are actually rigid bodies. This example
describes the case when the 6DOF data is only used to create virtual markers
and therefore the actual 6DOF data is not really important.
6. Make sure that both Apply the current AIM model and Calculate 6DOF
are activated on the Processing page.
Tracking of skeletons
The following chapters describe how to define and track skeletons in QTM. The
main applications of skeleton tracking are animation, sports biomechanics, and
virtual reality and gaming. The specific workflow may depend on the applic-
ation. Generally, tracking of skeletons involves the following steps:
Choice of marker set
Skeleton tracking requires the use of a dedicated marker set. For more
information about the available marker sets and the possibilities to cus-
tomize them, see chapter "Marker sets for skeleton tracking" on the next
page. It is also possible to use a custom skeleton definition in QTM, see
chapter "Using a custom skeleton definition" on page 682.
Skeleton calibration
Create a skeleton definition or update an existing one, see chapter "Ske-
leton calibration" on page 690.
l You can change the names of the default labels, see chapter "Skeleton
marker label mapping" on page 681.
l You can create a custom skeleton, see chapter "Using a custom skeleton
definition" on page 682.
l The use of extra markers, see also chapter "Adding extra markers to a
skeleton" on page 679.
Two AIM models for the Qualisys Animation Marker Set are included in the
installation of QTM in the subfolder Models\AIM\:
Animation.qam
Generic AIM model with the default markers.
Animation_Optional.qam
Generic AIM model including both default and all the optional markers as
described in the Animation Marker Set Guide.
For more information about automatic labeling of the markers, see chapter
"Automatic labeling of markers or Traqr configuration for skeleton tracking" on
page 682.
It is possible to use alternative labels for the markers. This can be useful if you
have existing files with a different markers set. For more information on how to
use a different mapping of markers, see chapter "Skeleton marker label map-
ping" on page 681.
Qualisys Sports Marker Set
The Qualisys Sports Marker Set is dedicated to sports and biomechanics applic-
ations using the Qualisys Skeleton Solver. As opposed to the animation marker
set, the segments associated with the sports marker set are defined in a con-
ventional and biomechanical way which helps to compute and interpret the
joint angles more easily. For detailed information about the Qualisys Sports
Marker Set, you can open the marker set guide via the Skeleton menu.
l The use of optional extra markers, see also chapter "Adding extra mark-
ers to a skeleton" on page 679.
Two AIM models for the Qualisys Sports Marker Set are included in the install-
ation of QTM in the subfolder Models\AIM\:
Sports_Static.qam
Generic AIM model including static markers for the skeleton calibration.
Sports_Dynamic.qam
Generic AIM model without static markers for dynamic measurements.
Both AIM models are pre-trained for trouble free automatic labeling of a wide
range of movements and actors. For guidelines on how to use these
AIM models in different scenarios, see chapter "Using AIM for sports and bio-
mechanics" on page 685.
It is possible to use alternative labels for the markers. This can be useful if you
have existing files with a different markers set. For more information on how to
use a different mapping of markers, see chapter "Skeleton marker label map-
ping" on page 681.
Traqr VR Marker Set
The Traqr VR Marker Set is dedicated to VR and gaming applications using the
Qualisys Skeleton Solver. When using the Traqr VR marker set, the Qualisys Ske-
leton Solver utilizes the 6DOF data from six Traqrs placed on the back, head,
hands and feet to calibrate and track the skeleton. It is recommended to use
the Active Traqr for the best real time performance, even though it is possible
to use the passive Traqr as well.
Skeleton tracking with the Traqr VR Marker Set is easy and requires almost no
preparation once the Traqrs have been configured and set up in your QTM pro-
ject. Since the skeleton tracking relies entirely on rigid body tracking, there is
no need to train an AIM model for labeling markers. For more information
about how to set up the Traqrs for VR skeleton solving, see chapter "Setting up
the Traqrs for VR skeleton tracking" on page 687.
Qualisys Claw Marker Set
The Qualisys Claw Marker Set is dedicated to hand animation applications using
the Qualisys Skeleton Solver. For detailed information about the Qualisys Claw
Marker Set, you can open the marker set guide via the Skeleton menu.
Animation_Optional_Claw.qam
Generic AIM model including default and all the optional markers for the
Animation marker set and default markers for the Claw marker set.
Claw_left.qam, Claw_Right.qam
Generic AIM models with the default markers for the Claw marker set,
for the left and right hand respectively. These AIM models are only for
identifying the hands. They can't be combined with the Animation
AIM models, because then the marker labels will overlap. Use the Anim-
ation_Claw or Animation_Optional_Claw AIM models to combine the Anim-
ation and Claw marker sets.
For more information about automatic labeling of the markers, see chapter
"Automatic labeling of markers or Traqr configuration for skeleton tracking" on
page 682.
It is possible to use alternative labels for the markers. This can be useful if you
have existing files with a different markers set. For more information on how to
use a different mapping of markers, see chapter "Skeleton marker label map-
ping" on page 681.
Qualisys Full Fingers Marker Set
The Qualisys Full Fingers Marker Set is dedicated to hand animation applications
using the Qualisys Skeleton Solver. For detailed information about the Qualisys
Full Fingers Marker Set, you can open the marker set guide via the Skeleton
menu.
Animation_Optional_FullFingers.txt
Label list including default and all the optional markers for the Animation
marker set and default markers for the Full Fingers marker set.
FullFingers_left.txt, FullFingers_Right.txt
Label lists with the default markers for the Full Fingers marker set,
for the left and right hand respectively. These label lists can be used to
create individual AIM models for identifying the hands. The resulting
AIM models can't be combined with the Animation AIM models, because
then the marker labels will overlap. Use the Animation_FullFingers or
Animation_Optional_FullFingers label lists to combine the Animation and Full
Fingers marker sets.
For more information about automatic labeling of the markers, see chapter
"Automatic labeling of markers or Traqr configuration for skeleton tracking" on
page 682.
Extra markers or rigid bodies can be placed on the subject and included in the
skeleton definition. This option can be useful for improving the tracking of the
segments of the skeleton or for creating unique marker configurations when
capturing multiple actors. Follow these steps to add extra markers to a skel-
eton:
l The marker label or rigid body name must have the same prefix as the
skeleton, e.g., JD_... for a skeleton with name JD. The prefix should end
with an underscore; the first underscore in the label will be considered as
the separator for the skeleton name.
l The marker or rigid body will be automatically assigned to a segment
based on its placement. However, if you want to specify a specific seg-
ment to which the marker will be associated you can include the segment
name in the marker label, e.g., JD_Hips_... for a marker associated with the
Hips segment of the skeleton definition of JD. The assignment of extra
markers can also be altered via a dialog when calibrating the skeleton.
TIP: Refer to the marker set guides for the segment names.
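The sketch below illustrates how a label following this naming convention can
be split into skeleton name, optional segment and marker name. It is a
hypothetical helper for illustration only; QTM performs this mapping
internally.

def parse_extra_marker_label(label, segment_names):
    # The first underscore separates the skeleton name; an optional segment
    # name may follow, e.g. "JD_Hips_Extra1" or "JD_Extra1".
    skeleton, _, rest = label.partition("_")
    maybe_segment, _, name = rest.partition("_")
    if maybe_segment in segment_names and name:
        return skeleton, maybe_segment, name
    return skeleton, None, rest  # segment assigned automatically by placement

print(parse_extra_marker_label("JD_Hips_Extra1", {"Hips", "RightHand"}))
# ('JD', 'Hips', 'Extra1')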
The use of extra markers will depend on your camera configuration and the
type of movements you want to capture. For the Animation marker set, here
are some recommendations for the body part location of extra markers:
l On RightHand and LeftHand
When calibrating a skeleton with extra markers or rigid bodies, QTM displays a
dialog showing the assigned segments for each extra marker or rigid body. If
needed, you can change the assignment to a different segment, or specify None
if the marker or rigid body should not be used for skeleton solving. The func-
tions of the buttons are:
OK: Approve the chosen segment assignments and proceed with cal-
ibration and solving.
Skip: Proceed with calibration and solving without using the extra mark-
ers. The extra markers will not be included in the skeleton definition.
The skeleton solver automatically recognizes the default marker labels of the
Qualisys Animation Marker Set and the Qualisys Sports Marker Set. If you want to
use alternative marker labels, for example to apply the skeleton solver to exist-
ing measurements, you can create a custom mapping. If you are using an altern-
ative marker set, the marker positions should correspond closely to those
described in the marker guides of the Qualisys marker sets for Animation and
Sports.
Follow these steps to create a custom mapping.
2. Save the default skeleton marker label mapping by pressing the Save but-
ton. Fill in a name in the Save as dialog and press Save. The file contains
all labels of both the Animation and Sports marker sets.
3. Open the created file in a text editor (for example Notepad).
1. First, make sure that your label mapping file is loaded in the project set-
tings.
2. Reprocess the file with the Solve Skeletons option enabled and select
project settings.
3. After reprocessing the file you can calibrate the skeleton from the file.
4. You can now (batch) reprocess the other files with Skeleton solver project
settings to apply the calibrated skeleton.
It is also possible to solve custom skeletons with QTM by importing an XML file
with a valid skeleton definition. Qualisys supports a workflow for creating and
editing skeleton definitions in Maya that can be used with QTM via QTM-Connect-
For-Maya, available on https://github.com/qualisys.
For sports and biomechanics applications with the Qualisys Sports Marker
Set, see chapter "Using AIM for sports and biomechanics" on page 685.
NOTE: The folder with generic AIM files includes a text file with the label
list for each AIM model.
1. Add the model(s) to the Applied models list on the AIM page in Project
Options.
2. Make sure that the Apply current AIM models option is checked as an action
(real time and capture) on the Processing page under Project Options.
When using the Traqr VR Marker Set, the Traqrs need to be configured and set
up for rigid body tracking, see chapter "Setting up the Traqrs for VR skeleton
tracking" on page 687.
Generating an AIM model for animation
This chapter describes the best practice for using AIM for skeleton tracking for
animation.
It is highly recommended to create a specific AIM model for each actor to be
tracked. To create a new AIM model, follow these steps:
1. On the AIM page in Project Options, add one of the generic AIM models
for animation to Applied models. The following generic AIM models are
available in the QTM installation in the subfolder Models\AIM:
a. The file Animation.qam contains a generic AIM model for the default
markers of the Qualisys Animation Marker Set.
NOTE: You may also load the label lists for each respective
AIM model and label the trajectories manually. The label lists are
available in the subfolder Models\AIM. This option needs to be used
for the Full Finger Marker Set, because there are no generic
AIM models for the Qualisys Full Fingers Marker Set. The reason is
that the identification of the finger markers is improved by cap-
turing a specific AIM model for that actor.
4. If you are using extra markers, add the labels manually to include them in
the new AIM model.
5. Add a prefix to labels. You must end the prefix with an underscore, e.g.,
JD_ for a skeleton with name JD.
NOTE: You must have different prefixes on the right and left hand
if the hands are captured without the body.
There are two generic AIM models for the sports marker set available in the
installation of QTM in the subfolder Models\AIM:
l The file Sports_Static.qam contains all markers, including the static mark-
ers that must be included for the skeleton calibration
l The file Sports_Dynamic.qam contains all markers that are needed for skel-
eton tracking.
To apply one of these AIM models, add it to the Applied models list on the
AIM page under Project Options.
The best way to use AIM for sports and biomechanics depends on your specific
application. Two possible use scenarios are suggested below, but of course you
can choose an alternative approach that works best for your application.
Scenario 1
Use the generic AIM models included with QTM for calibration and track-
ing. The generic AIM models include a standard prefix Q_ for all the
marker labels, corresponding to a standard skeleton name Q. This scen-
ario is only suitable when measuring a single person.
Scenario 2
Create a specific AIM model for an individual person. This scenario is sim-
ilar to the one described for animation applications, see chapter "Gen-
erating an AIM model for animation" on page 683.
2. Optionally, you can rename the prefixes of the markers for creating
a skeleton definition with a different name. The easiest way to do
this is to select the trajectories in the trajectory info window, remove
the prefix of the selected trajectories and add a new prefix. Do not
forget to use an underscore character "_" as a separator.
3. If you want to use extra markers you need to manually label them.
Use the same prefix if you want to include them in the skeleton
definition.
If needed, you can improve the AIM model by training it with more move-
ment data, see chapter "Generating an AIM model" on page 625 for more
detailed information.
If you want to capture multiple persons, repeat the above steps for each per-
son. When done, add all persons to be included in the capture to Applied
models on the AIM page in Project Options.
Setting up the Traqrs for VR skeleton tracking
To get started with VR skeleton tracking using the Traqr VR Marker Set, you will
first need to configure your Traqrs and set them up in your QTM settings.
For setting up your Traqrs, follow these steps:
1. Traqr configuration
When using the Active Traqr make sure that all markers have a unique
marker ID. Use the Traqr Configuration Tool to change the configuration
of the Traqrs if needed.
2. Definition of the rigid bodies
Create a rigid body definition for each Traqr. The easiest way to do this is
to attach a set of 6 Traqrs to a person and do a short T-pose capture.
Then, for each Traqr, select the markers in the 3D view window and press
F8 to create a new rigid body and add it to the project. The rigid bodies
should be named according to the naming conventions described in the
Once the Traqrs have been defined as rigid bodies in the project, only step 3
and 4 are needed to prepare the player(s) for a session.
l For the Qualisys Animation Marker Set a T-pose is required for a correct cal-
ibration. Optional markers can be added for a better definition of the ori-
entation of the segments. For more information, refer to the marker set
guide that can be opened in QTM via the Skeleton menu.
l For the Qualisys Full Fingers Marker Set a Hand calibration pose is
required for a correct calibration.
l For the Qualisys Claw Marker Set a Claw calibration pose is required for a
correct calibration.
l For the Qualisys Sports Marker Set make sure that the static markers are
present. The actor or subject should stand in upright position with both
feet flat on the floor. A specific pose is not required, since the marker pos-
itions in the Qualisys Sports Marker Set provide sufficient information for
correct definition of the skeleton segments.
Press the Calibrate skeletons button (keyboard shortcut F10) to apply the skel-
eton calibration. When applying the skeleton calibration for the first time, a
new skeleton definition will be added to the Skeleton Solver page under Pro-
ject Options. When the skeleton is already defined, the skeleton calibration
will replace the existing skeleton definition in the project.
If there are multiple actors in the preview or capture, the skeleton calibration is
applied simultaneously to all actors.
When the skeleton calibration is applied to a capture, the new skeleton defin-
ition will be automatically applied to the data in the current measurement
range.
It is possible to calibrate and solve upper and lower body parts of the skeleton,
see chapter "Partial skeleton calibration" on page 692.
T-pose
When using the Qualisys Animation Marker Set, the actor must stand in a correct
T-pose when applying the calibration. When applying the skeleton calibration to
a file, make sure that the actor stands in T-pose in the current frame. For a cor-
rect T-pose, make sure that the following requirements are fulfilled:
l Thighs and shanks must be vertical making a small gap between both
ankles.
l Feet must be parallel to each other and pointing to the front of the sub-
ject.
l Head and neck must be aligned with the spine, i.e., the subject must stand
with the head straight, facing forward.
l Arms, forearms and hands must be parallel to the floor. Check that the
HandOut and WristOut markers are horizontally aligned.
l Palms of the hands must face the floor.
l Arm and forearm must be aligned. A virtual line should go through the
glenohumeral joint (underneath ShoulderTop marker), the elbow joint and
the wrist joint.
l When combined with the Qualisys Full Fingers Marker Set for the hands:
l Fingers must be straight, no bending.
When using the Qualisys Full Fingers Marker Set or Qualisys Claw Marker Set, the
actor must have their hands in a correct Calibration pose when applying the cal-
ibration. When applying the skeleton calibration to a file, make sure that the
actor has their hands in a Calibration pose in the current frame. For a correct
Calibration pose, make sure that the following requirements are fulfilled:
For the Sports Marker Set the lower and upper body can be solved. For
detailed information about the required markers, see the Sports Marker
Set guide.
Once the required markers are labeled, calibrating and solving the skeleton
works similar as calibrating and tracking the complete skeleton:
1. You can create an AIM model including the markers of the partial skeleton
to facilitate tracking, see chapter "Automatic labeling of markers or Traqr
configuration for skeleton tracking" on page 682.
The scale factor can be used to indicate the scale of the skeleton tracked in
QTM relative to an animated object, for example, an avatar in an external anim-
ation application. The scale factor can be set in the following ways:
1. By setting the Scale factor (%) value for a skeleton in the Skeletons list
on the Skeleton Solver page. For scaling up the skeleton, use a per-
centage larger than 100%, and for scaling down use a percentage less
than 100%.
2. By modifying the value in the Scale tag in the skeleton definition XML and
re-importing it in the project. In the XML file the scale factor is defined as
the inverse ratio. For example, a Scale value of 0.8 in the XML file cor-
responds to a scale factor of 125% in QTM. Alternatively, the skeleton
definition can also be updated by sending an XML packet via the real time
protocol, see the QTM RT Protocol documentation included in the
QTM installation for more information.
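The inverse relation between the Scale value in the XML and the scale factor
in QTM can be expressed as in the short sketch below (illustrative only).

def scale_tag_to_percent(xml_scale):
    # A Scale value of 0.8 in the XML corresponds to a scale factor of 125 %.
    return 100.0 / xml_scale

print(scale_tag_to_percent(0.8))  # 125.0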
The scale factor is applied to the following data:
l Real time skeleton data streamed by QTM.
NOTE: The scale factor is not applied to the skeleton data that is dis-
played in the Data info window.
1. Using a skeleton template that modifies the parameters that are used
when creating the skeleton definition in the skeleton calibration process,
see chapter "Skeleton template" below.
2. By manually editing the parameters in the skeleton definition after the
skeleton calibration, see chapter "Manual editing of the skeleton defin-
ition" on page 696.
For more information about editing a skeleton definition or template, see
chapter "Skeleton XML editing" on page 697.
Skeleton template
The skeleton solver can use a skeleton template for the skeleton calibration.
The template can modify the degrees of freedom and the weights of markers
for segments used during the skeleton calibration. When the Skeleton tem-
plate option is set to default then QTM uses the predefined parameters for
each skeleton marker set.
NOTE: The skeleton calibration uses the template for all skeleton types.
For example, a skeleton template created from a Sports marker set will
change the corresponding parameters in an Animation marker set as
well.
The skeleton definition or template can be edited in a text editor. The most
important elements that can be edited are listed below. For the complete XML
specification, see the information about Skeleton XML parameters in the RT pro-
tocol documentation.
Marker weight
Edit the value of the <weight> tag to change the relative weighting of a
marker for solving the segment pose.
Degrees of freedom
Add or remove degrees of freedom for a segment by adding or removing
the corresponding tags under the <DegreesOfFreedom> tag.
Segment labels
Edit the segment label to use alternative segment names.
Marker labels
Edit the marker labels, for example to apply the skeleton definition to
existing labeled files with different marker names.
Removing segments
Remove segments for partial skeleton solving. Note that the root segment
and all intermediate segments between the root and the respective end
segments should be present.
NOTE: All elements can be modified when manually editing the skeleton
definition, but only marker weights and degrees of freedom of segments
can be modified in the skeleton template.
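As an example of such an edit, the sketch below halves every marker weight in
a saved skeleton definition using Python's standard XML library. The file name
is hypothetical and only the <weight> tag name is taken from the text above;
the exact element nesting and casing are specified in the RT protocol
documentation.

import xml.etree.ElementTree as ET

tree = ET.parse("JD_skeleton.xml")  # hypothetical exported skeleton definition
for weight in tree.getroot().iter("weight"):
    # Halve the relative weighting of each marker for solving the segment pose.
    weight.text = str(float(weight.text) * 0.5)
tree.write("JD_skeleton_edited.xml")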
NOTE: The real time skeleton data may differ from the skeleton data in a
capture as the process used in real time is optimized for faster cal-
culation.
For animation applications it is good practice to start and end each capture
with the actor(s) standing in T-pose. This allows for the possibility to recalibrate
the skeleton if the quality of the fit is insufficient, for example if a marker has
moved.
Skeleton assisted labeling (SAL) identifies trajectories using the segment mark-
ers in a skeleton. The unidentified trajectory part that is closest to a segment
marker and fulfills the set distance criteria will be identified as the cor-
responding labeled trajectory. The options for SAL can be set at the SAL set-
tings page.
Use the Claim threshold option to set the required closeness between a
marker and a segment marker for claiming the associated trajectory label. The
default value is 20 mm. Use a lower value when markers can be close to each
other, for example when solving fingers.
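The matching rule can be illustrated with the sketch below (illustrative only,
not QTM's implementation): each unidentified trajectory part is claimed by the
closest segment marker within the claim threshold.

import numpy as np

def claim_labels(unidentified, segment_markers, threshold_mm=20.0):
    # unidentified and segment_markers map names to 3D positions (in mm),
    # e.g. taken at the first frame of the unidentified part.
    labels = {}
    for part, pos in unidentified.items():
        best, best_dist = None, threshold_mm
        for label, seg_pos in segment_markers.items():
            dist = np.linalg.norm(np.asarray(pos) - np.asarray(seg_pos))
            if dist <= best_dist:
                best, best_dist = label, dist
        labels[part] = best  # None if nothing is within the threshold
    return labels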
Post-processing
To use SAL as a post-processing step it is required that the file has solved
skeletons. Apply SAL either via the Reprocessing dialog or with the
Identify trajectories using skeleton (SAL) on the Skeleton menu. When
changing the SAL settings, the file should be reprocessed with the new val-
ues for them to take effect. Check that the labeled trajectories are correct
and then run the Solve skeletons processing step again in reprocessing
to update the skeleton data. Repeat the steps of SAL and skeleton solving
as many times as needed to get the skeleton data.
l In a batch capture just before QTM starts Waiting for next meas-
urement/trigger
NOTE: It is important to not stand on the force plate at the start of the
measurement, if you use the Remove offset/drift option on the analog
data. It is also important to not stand on a force plate when the reset sig-
nal is sent.
When the Recalculate forces command is used the settings of the force plate
can be changed in the File reprocessing dialog. The settings in the dialog are
the same as when the file was created. However, if the settings were not spe-
cified for the motion capture the current settings in the Project options dialog
are copied instead.
In the 3D view window the force data is displayed as a vector with its base in
the Center Of Pressure (COP). The vector displays the opposite force applied to
the force plate, that is the reaction force. The purple squares represent the
force plates and are placed at the force plate coordinates that are specified on
the Force plate page in the Project options dialog. The light blue force traces
NOTE: The force plates that are activated on the Force data page will be
shown in the 3D view window even if there is no analog data. This can be
useful if the force is collected by another program, but you want to see
the force plate location in QTM.
NOTE: If you transform the global coordinate system the force plate
coordinates will be the same, which means that you have to change them
to move the force plate to the correct location, see chapter "Force plate
location" on page 382.
To make the most of the force data it can be exported to an analysis software.
The best formats to use are TSV, C3D or MAT, because then the force plate
location (and, for TSV and MAT, the force data) is included in the file, see
chapter "Data export to other applications" on page 710. For example, Visual3D
uses C3D and recalculates the forces from the original data, therefore the
result can differ somewhat from what is displayed in QTM.
Adding events
Events can be used to mark something that is happening. Events are sent in
RT and can be added to a QTM file both during and after the measurement.
There are two ways to create an event.
Trigger event
You can use an external trigger to generate events during a meas-
urement. When using the Qualisys trigger button, it is recommended that
you release the button quite quickly, because releasing the trigger button
can sometimes also generate an event. If you have trouble with extra
events, you can increase the hold-off time on the Synchronization page.
The event functionality is activated by default but can be changed with the
Generate event setting for the external trigger on the Synchronization
page, see chapter "Trigger ports" on page 273.
When events are created during a capture they will be stored in the QTM
file as Trigger or Trigger start events. The Trigger start event is only used
for the start event of a pretrigger measurement. You can change the default
color of these events on the Synchronization page in Project options,
see chapter "Synchronization" on page 266.
The timing of the Trigger event will be the exact time when the signal is
received by the camera. It is therefore the most exact way to set an event,
especially if the trigger signal is generated automatically. Because the
time of the event can be any time between two frames, it is most likely
not placed exactly at the capture of a frame. The frame number of the
event will be rounded to the nearest frame.
When events are created during a capture they will be stored in the QTM
file as Manual events. The timing of the Manual event will be the time
when you press the button in QTM. This means that it is most likely not
placed exactly at the capture of a frame. The frame number of the event
will be rounded to the nearest frame.
Creating an event with the Add event button in a file will open the Add event
dialog. The event will be placed on the current frame in the file. You can
change the Label of the event and also the Time, Frame and Color.
You can also use event shortcuts in the Add event dialog to create the
events. Double-click on an event shortcut in the list to load the label name
and color. To edit the shortcuts click on Edit event shortcuts, which opens
the Events page in the Project options dialog, see chapter "Events" on
page 430.
l You can go to the next and previous event in the file with Page Down and
Page Up, respectively.
You can access all of the events in the Edit event list dialog, which can be
opened by right-clicking on an event and then clicking Edit event list.
Add
Add a new event with the Add event dialog.
Remove
Remove the selected event.
Edit
Open the Edit event dialog where you can change the Label, Time and
Frame.
Goto
Go to the frame of the event in the file.
NOTE: For Visual3D 2020.8.3 or later it is required to use the option Fol-
lowing the C3D.org specification for the C3D export.
For the TSV export you need to activate the Include events option to export
the events. For more information about the TSV format, see chapter "Motion
data (.tsv)" on page 713.
For the MAT export you need to activate the Events option to export the
events. For more information about the MAT format, see chapter "MAT file
format" on page 730.
In QTM you can define the Euler angles as any possible rotation of a right-hand
coordinate system, see chapter "Euler angles" on page 392. By default QTM
uses the Qualisys standard definition, which is described in the chapter "Rota-
tion angles in QTM" on page 663.
l Via the Batch Exporting dialog, see chapter "Batch exporting" below.
Batch exporting
With batch export several files can be exported with the same settings at once.
2. Select the files you want to export in the file dialog with the mouse and by
holding the Control or Shift key, and press Open.
3. Select one or multiple export formats in the Batch Exporting dialog.
l 6DOF data
l Skeleton data
l Analog data
Each data type is exported as a separate file. By default, TSV files have the
same name as the QTM file with an additional suffix, depending on the data
type. In case there are multiple devices (e.g. analog devices, force plates), one
file per device is exported with an index added to the suffix.
Motion data (.tsv)
The motion data file contains the data of the trajectories of the motion capture
and it has two parts: file header and data. It can contain either 2D or 3D data
depending on the settings on the TSV export page in the Project options dia-
log.
Header
The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable value as a
second word. Each variable is on a new line. The following variables are avail-
able in the file header:
FILE_VERSION
Version number of export format, currently 2.0.0.
NO_OF_FRAMES
Total number of frames in the exported file.
NO_OF_CAMERAS
Number of cameras used in the motion capture.
NO_OF_MARKERS
Total number of trajectories (markers) in the exported file.
FREQUENCY
Measurement frequency used in the motion capture.
DESCRIPTION
At present not in use by QTM.
TIME_STAMP
Date and time when the motion capture was made. The date and time is
followed by a tab character and then the timestamp in seconds and ticks
from when the computer was started.
DATA_INCLUDED
Type of data included in the file, i.e. 2D or 3D.
NOTE: The events are only added if the Include events option is active
in the TSV export settings.
MARKER_NAMES
List of trajectory labels (trajectories in the Labeled trajectories window)
separated by tab characters. Unidentified trajectories have no names and
are therefore only represented by a tab character. The number of names
in the list corresponds to the value given by the NO_OF_MARKERS vari-
able.
TRAJECTORY_TYPES
List of trajectory type for each exported trajectory separated by tab char-
acters. The list is sorted in the same order as the MARKER_NAMES. The
available types are Measured, Mixed, Virtual, Gap-filled and - (empty
label).
Data
In 3D export the data part follows on a new line after the last marker name.
The trajectory data (in mm) is then stored in tab-separated columns, where
every row represents one frame. Each trajectory has one column for each dir-
ection (X, Y and Z). The data for the first marker is therefore stored in the first
three columns, the data for the second marker in columns 4, 5 and 6,
and so on.
In 2D export the data part follows on a new line after the variable DATA_
INCLUDED. The data part starts with a row with the number of the cameras.
The marker data for each camera is then given below its corresponding head-
ing, e.g. Camera: 1. Every marker has four columns with data, which is the
same as that shown in the Data info window: x, y, xSize and ySize. Each row in
the data part represents one frame.
There are three options in the TSV export which you can use to add more
information to the file.
When the Export time data for every frame option is checked, two
columns will be added containing time data. The first column contains the
frame number and the second contains the time in seconds relative to the
start of the capture. If the camera frames contain timestamps (SMPTE or
IRIG), a third column with timestamp data is added. The format of the
timestamp string depends on the type of timestamp used.
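As an illustration, the 3D motion TSV file can be read with a short Python sketch
like the one below. The file name capture.tsv is an example, and it is assumed
that the TSV header is included and that the Export time data for every frame
option is not used.

import csv

header = {}
frames = []
with open("capture.tsv", newline="") as f:
    for row in csv.reader(f, delimiter="\t"):
        if not row:
            continue
        try:
            # Data rows start with a numeric value; header rows with a variable name.
            float(row[0])
            # Keep column alignment by mapping empty cells to NaN.
            frames.append([float(v) if v else float("nan") for v in row])
        except ValueError:
            header[row[0]] = row[1:]

n_markers = int(header["NO_OF_MARKERS"][0])
labels = header.get("MARKER_NAMES", [])
# Each marker occupies three consecutive columns (X, Y, Z) in mm;
# marker i of frame k is found at frames[k][3 * i : 3 * i + 3].
first_marker_first_frame = frames[0][0:3]
print(n_markers, labels[:3], len(frames), first_marker_first_frame)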
The TSV export of files with 6DOF data creates a TSV file (.tsv) with a file header
and a data part. The variable names in the file header are followed by a tab
character and then the value. Each variable is on a new line.
Header
The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable value as a
second word. Each variable is on a new line. The following variables are avail-
able in the file header:
NO_OF_FRAMES
Total number of frames in the exported file.
NO_OF_CAMERAS
Number of cameras used in the motion capture.
NO_OF_BODIES
Total number of rigid bodies in the exported file.
FREQUENCY
Measurement frequency used in the motion capture.
DESCRIPTION
At present not in use by QTM.
TIME_STAMP
Date and time when the motion capture was made. The date and time is
followed by a tab character and then the timestamp in seconds from
when the computer was started.
DATA_INCLUDED
Type of data included in the file, i.e. 6D.
EVENT
Each event is added on a new row starting with the word EVENT, followed
by the name of the event, the frame number and the time, each separated
by a tab character.
BODY_NAMES
Tab-separated list with the names of the rigid bodies in the exported file.
BODY_FILTERS
Tab-separated list with the names of the used filter presets for the
respective rigid bodies.
TRANSLATION_ORIGIN
Tab-separated list with the translation origin for each rigid body. The
alternatives are Global, Relative 'Name of reference rigid body' and Fixed [X,
Y, Z]
ROTATION_ORIGIN
Tab-separated list with the rotation origin for each rigid body. The altern-
atives are Global, Relative 'Name of reference rigid body' and Fixed [Rota-
tion matrix]
Data
On a new line after the last rigid body name follows a tab-separated list of
the data headings for the rigid bodies. The headings are:
X, Y and Z
The position of the origin of the local coordinate system of the rigid
body. Where X, Y and Z are the distance in mm to the origin of the
coordinate system for rigid body data, see chapter "Coordinate sys-
tem for rigid body data" on page 354.
Residual
The average of the errors (in mm) of each measured marker com-
pared to the 6DOF body definition. This error is typically larger than
the 3D residual.
Rot[0] - Rot[8]
The elements of the rotation matrix for the rigid body. Where the ele-
ments are placed in the matrix according to the following table:
The data part follows on a new line after Rot[8]. The data is stored in tab-sep-
arated columns, where each row represents a frame. The columns are in the
same order as the heading list described above. If there is more than one rigid
body, their frames are stored on the same rows as the first body. They are just
separated by two tab characters after the Rot[8] data of the previous body.
There are two options in the TSV export which you can use to add more inform-
ation to the file.
When the Export time data for every frame option is checked, two
columns will be added containing time data. The first column contains the
frame number and the second contains the time in seconds relative to the
start of the capture. If the camera frames contain timestamps (SMPTE,
IRIG or Camera time), a third column with timestamp data is added. The
format of the timestamp string depends on the type of timestamp used.
NOTE: Each rigid body name is only entered before its respective X
column header; the following headers only include the contents of
the column.
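As an illustration, the sketch below shows how the pose of a rigid body could be
extracted from one data row of the 6DOF TSV file, based on the 13 headings per
body listed above (X, Y, Z, Residual and Rot[0]-Rot[8]). The column-major
placement of the rotation elements and the removal of the empty separator
columns between bodies are assumptions that should be verified against the
rotation matrix table and your own export.

import numpy as np

def rigid_body_pose(row_values, body_index=0):
    # row_values: one data row split on tab and converted to float, with the
    # empty fields from the double-tab separator between bodies removed
    # (assumption).
    cols_per_body = 13                      # X, Y, Z, Residual, Rot[0]..Rot[8]
    offset = body_index * cols_per_body
    x, y, z, residual = row_values[offset:offset + 4]
    rot = row_values[offset + 4:offset + 13]
    # Assumed column-major placement of Rot[0]..Rot[8]; check against the
    # rotation matrix table before relying on the orientation.
    R = np.array(rot).reshape(3, 3, order="F")
    return (x, y, z), residual, R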
The skeleton data files contain the skeleton data included in the capture. There
will be one file for each skeleton in the capture. The file name has the suffix _s_
<skeleton name>, where the last part corresponds to the name of the respect-
ive skeleton on the Skeleton Solver page in Project Options. The file contains
a file header and a data part. The variable names in the file header are followed
by a tab character and then the value. Each variable is on a new line.
Header
The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable value as a
second word. Each variable is on a new line. The following variables are avail-
able in the file header:
FILE_VERSION
Version number of export format, currently 2.0.0.
NO_OF_FRAMES
Total number of frames in the exported file.
NO_OF_CAMERAS
Number of cameras used in the motion capture.
TIME_STAMP
Date and time when the motion capture was made. The date and time is
followed by a tab character and then the timestamp in seconds from
when the computer was started.
REFERENCE
Reference used for the skeleton data. Global: all segment positions and
rotations are expressed relative to the global coordinate system. Local: all
segment positions and rotations except the Hips segment are relative to
their respective parent segment.
SCALE
The scale setting used for the skeleton. In the TSV file the scale factor is
defined as the inverse ratio. For example, a SCALE value of 0.8 in the TSV
file corresponds to a scale factor of 125% in QTM.
SOLVER
The solver used for the skeleton.
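As an illustration, the SCALE value described above can be converted to the
scale factor shown in QTM as follows:

def qtm_scale_percent(tsv_scale):
    # The SCALE value in the TSV file is the inverse of the QTM scale factor.
    return 100.0 / tsv_scale

print(qtm_scale_percent(0.8))   # -> 125.0, i.e. a scale factor of 125% in QTM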
When the Write column header option is checked, a header line is added
above each column describing the contents of that column. The column
headings for each segment are:
Segment name
Name of the respective skeleton segments. There will be no data in
the column below.
X, Y and Z
Segment position data in mm.
When Write column header is enabled there are 8 columns per seg-
ment. The first column below the segment name is empty, the remaining
7 columns contain the position and orientation data for the segment. The
total number of columns is 176 for all 22 segments.
When the Export time data for every frame option is checked, two
columns will be added containing time data. The first column contains the
frame number and the second contains the time in seconds relative to the
start of the capture. If the camera frames contain timestamps (SMPTE,
IRIG or Camera time), a third column with timestamp data is added. The
format of the timestamp string depends on the type of timestamp used.
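As an illustration, segment positions could be picked out of one skeleton data
row as sketched below, assuming the layout described above (8 columns per
segment, no time columns) and the default 22 segments. The orientation
columns are not parsed in this sketch.

COLS_PER_SEGMENT = 8   # empty column, X, Y, Z, and 4 orientation columns

def segment_positions(row_values, n_segments=22):
    # row_values: one data row split on tab (strings, empty fields kept).
    positions = []
    for i in range(n_segments):
        base = i * COLS_PER_SEGMENT
        x, y, z = (float(v) for v in row_values[base + 1:base + 4])
        positions.append((x, y, z))
    return positions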
Analog data (_a.tsv)
The analog data files contain the data of the analog capture. Each file contains
the data from one analog board or EMG system. If there is only one source of
analog data the file name ends with _a. If there are more than one source of
analog data the files are numbered in the same order as they appear in the
Data info window (_a_1, _a_2 and so on). There are two parts in the file: file
header and data.
Header
The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable. Each variable
is on a new line. The following variables are available in the file header:
FILE_VERSION
Version number of export format, currently 2.0.0.
TOT_NO_OF_CHAN
Total number of channels in the exported file.
NO_OF_CALC_CHAN
At present not used by QTM.
TIME_STAMP
Date and time when the measurement was made. The date and time is fol-
lowed by a tab character and then the timestamp in seconds from when
the computer was started.
DESCRIPTION
At present not used by QTM.
DATA_INCLUDED
Type of data included in the file. Set to ANALOG by the QTM software
CHANNEL_NAMES
List of tab-separated channel names. The number of names in the list cor-
responds to the number of channels given by the TOT_NO_OF_CHAN vari-
able.
CHANNEL_GAIN
List of tab-separated gains for each channel.
At present they are all set to 1 by QTM.
CHANNEL_FREQUENCIES
List of tab-separated analog sampling frequency values for each channel.
The sampling frequencies can differ between channels.
CHANNEL_SAMPLE_COUNTS
List of tab-separated sample count values for each channel. The number
of samples can differ between channels.
Data
The data part follows on a new line after the last header variable. The data of
the analog channels is then stored in tab-separated columns, one column for
each channel and one row per sample. The data is always saved with 6 digits.
NOTE: For analog boards, the units of the exported data are in V. EMG
data from integrated devices are exported in μV or mV, depending on the
units provided by the device. For other data types from integrated
devices, the units are converted to SI units associated with the data type.
There are two options in the TSV export which you can use to add more inform-
ation to the file.
When the Export time data for every frame option is checked, each
data column is preceded by two columns containing time data. The first
column (SAMPLE) contains the sample number and the second column
(TIME) contains the time in seconds relative to the start of the capture.
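As an illustration, an analog TSV file could be read with a Python sketch like the
one below. The file name is an example; it is assumed that the TSV header is
included, that the Export time data for every frame option is not used, and
that channels with fewer samples leave their remaining cells empty.

import csv

def read_analog_tsv(path):
    header, rows = {}, []
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter="\t"):
            if not row:
                continue
            try:
                float(row[0])               # data rows start with a sample value
                rows.append(row)
            except ValueError:
                header[row[0]] = row[1:]    # header rows start with a variable name
    names = header["CHANNEL_NAMES"]
    counts = [int(c) for c in header["CHANNEL_SAMPLE_COUNTS"]]
    channels = {}
    for i, name in enumerate(names):
        column = [r[i] for r in rows[:counts[i]] if i < len(r) and r[i] != ""]
        channels[name] = [float(v) for v in column]
    return header, channels

header, channels = read_analog_tsv("capture_a_1.tsv")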
The force data files contain the data of the force plates. Each file contains the
data from one force plate. The file names end with _f and are indexed with the
number of the force plate in the same order they appear on the Force data
page (_f_1, _f_2 and so on). There are two parts in the file: file header and data.
Header
The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable. Each variable
is on a new line. The following variables are available in the file header:
NO_OF_SAMPLES
Total number of samples in the exported file.
TIME_STAMP
Date and time when the measurement was made. The date and time is fol-
lowed by a tab character and then the timestamp in seconds from when
the computer was started.
FIRST_SAMPLE
Original number of the first frame in the range that is exported from the
QTM software. The start time of the exported data can then be calculated
as FIRST_SAMPLE / FREQUENCY.
DESCRIPTION
Information about the coordinate system of the force plate in the file. It
can be either Force data in local (force plate) coordinates or Force
data in world (lab) coordinates.
DATA_INCLUDED
Type of data included in the file. Set to Force by the QTM software
FORCE_PLATE_TYPE
The type of force plate.
FORCE_PLATE_MODEL
The model of the force plate.
FORCE_PLATE_NAME
The name of the force plate defined on the Force data page.
FORCE_PLATE_CORNER_POSX_POSY_X, FORCE_PLATE_CORNER_POSX_
POSY_Y and FORCE_PLATE_CORNER_POSX_POSY_Z
Position (in mm) of the top left corner when looking at the internal force
plate coordinate system. The position is in the measurement coordinate
system.
FORCE_PLATE_CORNER_NEGX_POSY_X, FORCE_PLATE_CORNER_NEGX_
POSY_Y and FORCE_PLATE_CORNER_NEGX_POSY_Z
Position (in mm) of the top right corner when looking at the internal force
plate coordinate system. The position is in the measurement coordinate
system.
FORCE_PLATE_CORNER_POSX_NEGY_X, FORCE_PLATE_CORNER_POSX_
NEGY_Y and FORCE_PLATE_CORNER_POSX_NEGY_Z
Position (in mm) of the bottom right corner when looking at the internal
force plate coordinate system. The position is in the measurement
coordinate system.
FORCE_PLATE_LENGTH
The length of the force plate (in mm).
FORCE_PLATE_WIDTH
The width of the force plate (in mm).
Data
The eye tracker data files contain the data of eye tracking devices included in
the capture. The data exported depends on the eye tracking device used. For
more information about the export of specific eye tracker devices, refer to the
detailed information of the device.
l For Tobii, see "Process and export Tobii gaze vector data" on page 879.
NOTE: C3D data can be exported even if the file does not contain any 3D
data. For example if you have measured forces or EMG.
The C3D export creates a C3D file, for information about the binary C3D format
see http://www.c3d.org.
The C3D format has the following limitations for analog data: the sample rate
must be an integer multiple of the marker capture rate, and it must be the same
for all analog channels.
If this is not the case for the analog data stored in the QTM file, the analog data
will be resampled at a multiple of the capture rate equal to or higher than the
highest analog frequency. For example, if the marker frequency is 120 Hz, the
EMG frequency is 1500 Hz, and the analog frequency is 1200 Hz, all of the analog
data will be resampled to 1560 Hz.
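The resampling frequency in the example can be calculated as the smallest
integer multiple of the capture rate that is at least as high as the highest analog
frequency, for example:

import math

def c3d_resample_rate(capture_rate_hz, analog_rates_hz):
    # Smallest integer multiple of the capture rate >= the highest analog rate.
    return math.ceil(max(analog_rates_hz) / capture_rate_hz) * capture_rate_hz

print(c3d_resample_rate(120, [1500, 1200]))   # -> 1560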
Multiple subjects can be included in a C3D file, for more information see
chapter "Parameter Groups" on page 401.
When the data from QTM is exported to a MAT file a struct array is saved in the
file. The struct array is named the same as the file. If the file name does not
start with an English letter, a prefix qtm_ will be added to the name of the struc-
ture array. To use the struct array, write the name of the struct array and the
fields with a period between them. If several files have been exported to Mat-
lab, write the variable as QTMmeasurements(1), QTMmeasurements(2) and
so on to get the data of the file.
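The MAT file is primarily intended for use in Matlab, but as an illustration it can
also be inspected from Python using SciPy. The file name capture.mat, and
therefore the struct array name capture, is an example:

from scipy.io import loadmat

mat = loadmat("capture.mat", squeeze_me=True, struct_as_record=False)
q = mat["capture"]                        # the struct array described below

print(q.File, q.Frames, q.FrameRate)      # fields as described below
print(q.Timestamp)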
The struct array contains different fields depending on whether the file includes
3D, 6DOF, analog (including EMG), force, eye tracker, SMPTE timecode and
events data. The fields and their contents are described below:
FileVersion
Version number of export format (array with 3 elements), currently 2, 0, 0.
File
File name and directory path.
Timestamp
Time when measurement was started. In date format YYYY-MM-DD,
HH:MM:SS.SSS, followed by a tab character and the timestamp in seconds
and ticks from when the computer was started.
Frames
Number of frames.
FrameRate
Frame rate in frames per second.
Trajectories
Struct with fields Labeled for labeled markers and, optionally Uniden-
tified for unidentified markers. These fields are structs with the following
fields:
Count
Number of trajectories in the window.
Labels
A list of the trajectory labels.
Data
The location of the 3D points (in mm) of the trajectories in the win-
dow. The data is given in a matrix with the dimensions: Trajectories
* (X, Y, Z direction and Residual) * Frames.
Type
Type specification of the markers per frame. The data is given in a
matrix with the dimensions: Trajectories*Frames. The types have val-
ues 0-4, which indicate: Missing=0, Measured=1, Gap-filled=2, Vir-
tual=3, Edited=4.
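Continuing the Python sketch above, the labeled trajectory data could be
accessed as follows (note that indices are zero-based in Python):

traj = q.Trajectories.Labeled
print(traj.Count, list(traj.Labels[:3]))

# Data has the dimensions Trajectories x (X, Y, Z, Residual) x Frames.
first_marker_xyz = traj.Data[0, 0:3, :]    # X, Y, Z of the first marker, all frames
first_marker_type = traj.Type[0, :]        # 0=Missing, 1=Measured, 2=Gap-filled, ...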
Analog/EMG
Struct array with data from the analog capture. The analog and EMG data
from integrated wireless EMGs are stored in separate struct arrays, but
the structure is the same.
BoardName
The name of the board that was used in the capture.
NrOfChannels
The number of channels that were used in the capture.
ChannelNumbers
The channel numbers that were used in the capture.
Labels
An array with the names of the channels that were used on the ana-
log board.
Range
The range of the channels on the analog board.
NrOfFrames
The number of exported motion capture data frames.
SamplingFactor
The multiplication factor compared with the motion capture frame
rate. The SamplingFactor is specified per channel in a 1 x NrOfChan-
nels array.
NrOfSamples
The number of exported analog samples per channel (1 x NrOfChan-
nels array).
Frequency
The sampling frequency of the analog data per channel (1 x
NrOfChannels array).
NOTE: For analog boards, the units of the exported data are
in V. EMG data from integrated devices are exported in μV or
mV, depending on the units provided by the device. For other
data types from integrated devices, the units are converted to
SI units associated with the data type.
Force
Struct array with data from the force plates. The elements of the struct
array contain the data of the respective force plates present in the file.
ForcePlateName
The name of the force plate that was used in the capture.
NrOfFrames
The number of exported motion capture data frames.
SamplingFactor
The multiplication factor compared with the motion capture frame
rate.
NrOfSamples
The number of samples in the analog capture.
Frequency
The frequency of the analog capture.
Force
The force data in newton (N), the data is given for X, Y and Z dir-
ection.
Moment
The moment data in newton meter (Nm), the data is given for X, Y
and Z direction.
ForcePlateLocation
The location of the force plate in measurement coordinate system.
The corners are in the order upper left, upper right, lower right and
lower left seen in the force plate coordinate system.
ForcePlateOrientation
Coordinate system in which force data is expressed: 0 (local force
plate coordinates), 1 (global coordinate system).
ForcePlateOffset
The force plate offset values as filled in for the specific force plate.
The order is the same as on each respective force plate calibration
settings in Project options.
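Continuing the same Python sketch, the data of the first force plate could be
inspected as below. With several force plates the Force field is an array of
structs; with a single plate it may be returned as a scalar struct when
squeeze_me is used:

import numpy as np

force = q.Force
plate = force[0] if isinstance(force, np.ndarray) else force
print(plate.ForcePlateName, plate.Frequency, plate.NrOfSamples)
# Force and Moment hold the X, Y and Z components in N and Nm; the exact
# dimension order can be checked via their shapes.
print(plate.Force.shape, plate.Moment.shape)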
RigidBodies
Struct with data for the 6DOF bodies. The struct contains the data of all
rigid bodies present in the file.
Bodies
The number of 6DOF bodies.
Name
The names of the 6DOF bodies.
Filter
Struct array with information about the filter used for the respective
rigid bodies.
Preset
Name of the filter preset used for the rigid body.
CoordinateSystem
Struct array with information about the reference coordinate system
used for the respective rigid bodies.
Reference
Coordinate system option used for the rigid body. The possible
options are Global, Relative and Fixed.
DataOrigin
The fixed origin (x,y,z) used for the Fixed coordinate system
option. For the Global and Relative options the value is set to
(0,0,0).
DataRotation
The fixed rotation matrix in relation to the global coordinate
system used for the Fixed coordinate system option. For the
Global and Relative options the value is set to the unit matrix.
Positions
The position of the origin of the measured rigid body’s local coordin-
ate system. It is given as a matrix with the dimensions: Bodies *
Distances (X, Y and Z) * Frames. The distances are in mm to the ori-
gin of the coordinate system of the motion capture.
Rotations
The rotation matrices of the rigid bodies. It is given as a matrix with
the dimensions: Bodies * Rotation matrices (elements 0-8) * Frames.
The elements are placed in the matrix according to the following
table:
RPYs
The roll, pitch and yaw of each rigid body. It is given as a matrix with
the dimensions: Bodies * Rotation angles (roll, pitch and yaw) *
Frames. The rotation angles are in degrees.
Residuals
The residual of the rigid bodies.
Skeletons
Struct array with data of the skeletons. The elements of the struct array
contain the data of the respective skeletons present in the file.
SkeletonName
The name of the skeleton [char array].
Solver
The type of solver used for the skeleton.
Scale
The scale setting used for the skeleton. In the MAT export the scale
factor is defined as the inverse ratio. For example, a Scale value of
0.8 corresponds to a scale factor of 125% in QTM.
Reference
Reference used for the skeleton data. Global: all segment positions
and rotations are expressed relative to the global coordinate sys-
tem. Local: all segment positions and rotations except the Hips seg-
ment are relative to their respective parent segment.
NrOfSegments
Number of segments of the skeleton [double].
SegmentLabels
Names of the segments of the skeleton [1 x NrOfSegments cell array
with char elements].
PositionData
Position data (X, Y, Z) of the segments of the skeleton [3 x NrOfSeg-
ments x Frames double].
SMPTETimecode
Struct array with the SMPTE timestamps of the frames in the file.
Hour
The hour of the timestamp.
Minute
The minute of the timestamp.
Second
The second of the timestamp.
Frame
The SMPTE frame number of the timestamp.
Missing
Indicates if the SMPTE timecode is extrapolated if the SMPTE syn-
chronization source is lost during the measurement.
IRIGTimecode
Struct array with the IRIG timestamps of the frames in the file.
Year
The year of the timestamp.
Day
The day of the timestamp.
Minute
The minute of the timestamp.
Second
The second of the timestamp.
Tenth
The decisecond of the timestamp.
Missing
Indicates if the timecode is extrapolated if the IRIG synchronization
source is lost during the measurement.
CameraTimecode
Struct array with the camera timestamps of the frames in the file. For
more information about Camera time, see chapter "Timestamp" on
page 284.
Tick
Tick value of the camera timestamp. This value represents the time in
seconds multiplied by a factor of 10^7.
Missing
Indicates if the timecode is extrapolated if the synchronization
source is lost during the measurement.
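Since the tick value is the time in seconds multiplied by 10^7, converting it back
is a simple division, for example:

def camera_tick_to_seconds(tick):
    # The tick value represents the time in seconds multiplied by 1e7.
    return tick / 1e7

print(camera_tick_to_seconds(31_415_927))   # -> 3.1415927 s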
Events
Struct array with a list of all the events in the file.
NOTE: This struct array is only included if the capture file contains
events.
Label
The label of the event.
Frame
The corresponding frame of the event. The time will be rounded to
the nearest frame.
Time
The time of the event.
Use the Preview/Resize button to open a window where you can check
and change the dimensions.
Processing step
The AVI export can be done as processing step either directly after a meas-
urement, in reprocessing or in batch processing. The view that is used in
the export is the one saved in Previous settings in the Window settings
list. Since the Previous settings view is changed when you make an
export, it is important that you save any view that you want to use with
the Save view option. Then you can select that view again for the pro-
cessing step before the processing.
It is recommended to use a codec when creating the AVI file, since the video file
will be very large uncompressed. For more information about recommended
codecs, see chapter "Recommended codecs" on page 583.
The video export is done by displaying the view that you want to export in a spe-
cial window that is then captured to an AVI file. Because the video has to be dis-
played in the window, the export will be faster on a computer with a good
graphics card. The processing time can also be reduced by using a frame rate of
no more than 30 Hz and by making the dimensions of the video export smaller.
l USB-1608G
Any analog device which has an output voltage between ± 10 V can be con-
nected to the analog board. QTM will however only make calculations on data
from force plates. For information about force plates and EMG devices, see
chapters "How to use force plates" on page 756 and "How to use EMG" on
page 803.
5. Instacal will detect the USB A/D board and display the following dialog.
6. Click OK. The board will then be listed in Instacal as shown below.
It is important to check that the XAPCR Edge setting matches the setting
for the Synchronization output on the Synchronization page in the Pro-
ject options dialog. The default TTL signal polarity is Negative which
matches XAPCR Edge = Falling.
When the board is properly installed it will be listed on the Input devices page
in the Project options dialog in QTM.
Installing the USB-1608G board
Installing the USB-1608G board requires that InstaCal has been installed on the
computer. If needed, reinstall QTM with the InstaCal option checked. Follow
this procedure to install the USB A/D board:
3. The USB-1608G board should appear in the plug and play dialog. Click OK.
You have the following two options in the warning after you have captured a
file.
NOTE: If you have selected the Remove offset in real-time option, the
warning will appear directly when you start the capture. The capture will
however continue, and if you like you can wait until the capture is finished
before choosing what to do. If you make your choice before the capture
has finished, the warning will not appear again.
If the In RT too option is activated on the Analog board page, then there is
also a warning when you start RT/preview with New on the File menu. If you do
not use the RT output it is recommended to inactivate the Remove offset in
You have the following options when the warning is displayed in RT/preview
when you start a new measurement with New on the File menu.
NOTE: The offset check is performed every time the RT has to be restar-
ted, for example if you change a setting in Project options.
NOTE: If you do not want to turn off the offset compensation for all
of the channels, select Go to setting... and then select on which
channels to use the offset compensation, see chapter "Channels" on
page 297.
Go to settings...
Open the Project options dialog and go to the Analog boards page. You
have to select the correct analog board manually to change the offset set-
tings, see chapter "Compensate for analog offset and drift" on page 295.
When using an AMTI Gen5, OPT-SC (Optima Signal Conditioner) amplifier or the
AccuGait Optimized and AccuPower Optimized plate you need to follow these
steps. The steps are the same for all of the amplifiers and plates so the instruc-
tions only refer to Gen5.
These are the hardware connections that are needed. The next step is to activ-
ate the AMTI Gen5 force plate in QTM.
2. Go to the page for each amplifier under the Force plates page, see
chapter "AMTI Digital force plates" on page 306.
a. Set the Sync source option to the device that the sync cable is con-
nected to.
b. Click on Advanced to change the frequency of the force plate. You
will go to the Synchronization page with just the device used as
Sync source selected. Change the Synchronization output setting
to the desired frequency, see chapter "Synchronization output" on
page 285. It is recommended to use the Camera frequency mul-
tiplier option to set the frequency, because for example when
3. Go to the Force data page in the Project options dialog. Rename the
force plate if you want it to have a custom name.
4. Open the Force plate page for each AMTI Gen5 force plate.
l Enter the position of the force plate, see chapter "Force plate loc-
ation" on page 382. It is good to do this after each new calibration,
especially if the calibration L-frame is not placed at the same position.
5. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured file make sure that it is activ-
ated both for Real-time actions and Capture actions.
6. Test the force plates in QTM. If there is no force, check that the Sync cable
is connected correctly.
l If the cable is connected, make a test in AMTI Netforce to see if there
is force data in that program.
The Arsalis force plates are digitally integrated in QTM. For further information
about the force plates, refer to the manufacturer's documentation.
The sections below describe how to connect the force plates and how to set
them up in QTM.
Hardware connections
Synchronization
Hardware synchronization is required. It requires a Camera
Sync Unit for Arqus or Miqus systems, or a Sync/Trigger splitter for Oqus
systems.
Connect one of the Sync out outputs of the Camera Sync Unit to the Trig
In port of the Arsalis connection box. If you are using an Oqus camera as
sync device, use the Sync out connector of the Sync/Trigger splitter. In
the Synchronization settings, set the Synchronization output mode for the
Sync Out port to Multiplier with a multiplier of 1. Make sure that the polar-
ity matches between the settings in QTM and 3D-Forceplate. Use the
default option of Negative polarity for the sync out signal.
Before setting up the connection in QTM, start a data stream in the 3D-For-
ceplate software.
1. Start the 3D-Forceplate software. If you need to unlock, try the default
password 1234.
2. Press Stream data, and zero the force plate. This opens up a 3D-For-
ceplate Data Streaming window with information about the connection.
Once the 3D-Forceplate data stream is set up, the Arsalis device can be added
and configured in QTM.
2. Click the Add Device button, and select Arsalis in the drop down menu.
5. Fill in the Local server IP address in the IP address setting. If the force
plate is connected to the same computer that is running QTM, you can
use the localhost IP address (127.0.0.1).
6. Fill in the Port Number of the server. Make sure that it is the same as in
the 3D-Forceplate Streaming Window.
7. Press the Locate Force Plates button in QTM. If the connection is estab-
lished, the information about the Arsalis force plates will show up in the
list below.
8. Make sure that the Trigger mode option is set to start only to get syn-
chronized start of the force plates.
9. Finalize configuring the device by setting the sample rate with the Fre-
quency option.
10. You can also zero the force plates from the Arsalis settings page.
When the force plates have been added to QTM, the next step is to configure
the force data calculation. The steps below presume that the force plate loc-
ations for your setup have already been defined in the 3D-Forceplate software.
2. Click on Define Plates to import the definitions for all the current Arsalis
force plates to the Force plates list.
3. Make sure that the plates you want to use are enabled in the list. You do
not need to do any other settings for the plates, but you can open the set-
tings for each force plate by double-clicking on it in the list.
To collect data with the Arsalis force plates, simply start a capture in QTM. The
3D-Forceplate Streaming window should show an active client connection when
QTM is in preview and during capture.
To view the Arsalis data during preview or a capture, open a Data Info window
via the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data. The Arsalis analog data includes forces, COP, free moment
and several other signals. To show the force data calculated by QTM, right-click
on the Data Info window and select Force data.
The Bertec force plates are digitally integrated in QTM when used with the
digital amplifiers AM6500 and AM6800. For further information about the force
plates and how to install them physically, refer to the manufacturer's doc-
umentation.
The sections below describe how to connect the force plates and how to set
them up in QTM.
Hardware connections
Software requirements
Make sure that the latest version of the Bertec Digital Plugin is installed. Follow
these steps to download and install the Bertec Digital Plugin:
Once the Bertec force plate is connected to the computer, the device can be added
and configured in QTM.
2. Click the Add Device button, and select Bertec Corporation Device in the
drop down menu.
Synchronization settings
1. Open the Bertec Corporation Device settings page, see chapter "Bertec
corporation device" on page 311.
2. To change the frequency of the force data, set the Frequency value and
press the Sync Settings button. Check that the frequency values for the
force plate channels are updated.
3. Open the Synchronization page under Project Options > Input Devices
> Camera System.
4. Use the following settings for the used Synchronization output (Out 1,
Out 2 or Synchronization output):
l Mode: Independent frequency
When the force plates have been added to QTM, the next step is to configure
the force data calculation.
2. Click on Define Plates to import the definitions for all the current Bertec
force plates to the Force plates list.
3. Make sure that the plates you want to use are enabled in the list.
4. Make sure that the force plate dimensions are known in the Force plate
status settings list. The force plate dimensions are retrieved automatically
when connecting the force plates in QTM.
5. Set the location of the force plate with the Generate or View/Edit but-
ton, see chapter "Force plate location" on page 382.
6. Optionally, activate the COP threshold to suppress the visualization of
the force vector in QTM when there is no load on the force plate, see
chapter "COP (Center Of Pressure) threshold" on page 386
7. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured files, make sure that it is activ-
ated both for Real-time actions and Capture actions.
To collect data with the Bertec force plates, simply start a capture in QTM. The
force data is automatically synchronized with the start of the capture. Make
sure to re-zero the force plates when needed using the Zero Plates button on
the Bertec Corporation Device settings page, see chapter "Bertec corporation
device" on page 311.
To view the Bertec data during preview or a capture, open a Data Info window
via the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data. The Bertec analog data includes forces and moments. To
show the force data calculated by QTM, right-click on the Data Info window
and select Force data.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
Connecting Kistler digital force plates
Hardware requirements
The Kistler force plate integration supports Kistler force plates with digital out-
put of the following types:
l Force plates with built in digital output (e.g., Digital force plate Type
9667AA...)
l Force plate with charge output after upgrade (e.g., Type 9281EA/E or
9287CA/C with corresponding DAQ 2.0 Type 5437A1). For more inform-
ation about upgrading force plates to digital force plates, contact Kistler
support.
NOTE: If you have force plates connected to a Kistler DAQ Type 5695B,
please refer to chapter "Connecting Kistler DAQ Type 5695B" on
page 780.
Software requirements
The following software is required for configuring and using Kistler digital force
plates with QTM.
Kistler software:
l BioWare (including DataServer)
The Kistler system should be configured for use with QTM according to the
instructions below. For more detailed information, contact Kistler or Qualisys
support.
Network configuration
The force plates are connected to the computer through Ethernet. It is recom-
mended to connect the Kistler system to the same network as the Qualisys cam-
eras. Use the Kistler SetupWizard utility to configure the network settings of the
Kistler devices (force plates and Kistler Sync Box). The Kistler devices should be
configured with static IPv4 addresses.
NOTE: If the static IP addresses of the Kistler devices are known, you can
also configure the Qualisys camera network to use the same subnet as
the Kistler devices, rather than reconfiguring the network settings of the
Kistler devices. For instructions on how to configure the network adapter
settings with QDS, see chapter "Advanced" on page 467.
To configure the network settings of the Kistler devices, follow these steps:
1. Take note of the subnet that is used for the camera network. In QTM, this
can be found on the Camera System page under Project Options > Input
Devices as the Interface in the Camera system settings information tab
(typically, 192.168.254.x).
2. Run SetupWizard.exe and click Find devices. All Kistler devices connected
to the network should show up in the list.
4. Set the IPv4 mode to Static IP. Optionally, change the configured name to
help you identify the force plate. Click Next when done.
6. Press Next to store the configuration changes to the device. The SetupW-
izard should show the confirmation page.
7. Press Start over to configure the next device, or Exit when all devices are
correctly configured.
NOTE: Once you set the IP addresses of the Kistler devices, you may
need to reboot the Qualisys camera system to make sure that they don't
have overlapping IP addresses.
Next, the force plate configuration should be defined in the Kistler BioWare
software and stored in a configuration file. Follow these steps:
2. In the Setup menu, go to Hardware > A/D board and select Ethernet DAQ
Device(s).
3. Open the Device Setup window (Setup > Hardware > Devices) and click
New....
8. Repeat steps 4-6 until all force plates are present in the Active Devices list.
2. Name the file config.xml and save it in the BioWare XML folder (typically
C:\Kistler\BioWare\XML).
3. Open the config.xml file in a text editor and add the following lines after
the <ConfigurationName> line to add the Kistler Sync Box (replace Seri-
alNumber and Address with the actual serial number and IP address,
respectively):
<EthernetTriggerDevice>
<Type>bio-digital</Type>
<SerialNumber>1234567</SerialNumber>
<Manufacturer>Kistler</Manufacturer>
<Name>bio-digital</Name>
<Address>123.456.789.123</Address>
</EthernetTriggerDevice>
Hardware setup
1. Connect up to 16 Kistler digital force plates to the Kistler Sync Box (Type
5699A). The digital Kistler devices can be connected in a daisy chain.
2. Use the Power/Ethernet cable set (Type 5793) to power the Kistler devices
and to connect them to the computer via Ethernet. It is recommended to
use the same network as used for the Qualisys camera system using an
Ethernet switch.
3. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Sync input of the Kistler Sync Box with a BNC cable. If you are using an
Oqus camera as sync device, use the Sync out connector of the Syn-
c/Trigger splitter.
2. Click the Add Device button and select Kistler Force Plates in the drop
down menu.
3. Check the Kistler Force Plates item in the Input Devices list. The Kistler
Force Plates device should now show up as an input device under the
Force Plates category. The force plates should be included in the device
list as defined in the Kistler config.xml file.
Synchronization settings
1. In QTM, open the Synchronization page under Project Options > Input
Devices > Camera System.
2. Use the following settings for the used Synchronization output (Out 1,
Out 2 or Synchronization output):
l Mode: Independent frequency
l Output Frequency: 1 Hz
1. In QTM, open the Force Data page under Project Options > Processing.
2. Click the Define Plates button to add the Kistler force plates to the Force
Plates list.
4. Select a force plate and click the Edit Plate button (or double click) to edit
the force data settings.
5. Make sure that the force plate dimensions are known in the Force plate
status settings list. The force plate dimensions are retrieved automatically
when connecting the force plates in QTM.
6. Set the location of the force plate with the Generate or View/Edit but-
ton, see chapter "Force plate location" on page 382.
To collect data with the Kistler force plates, simply start a capture in QTM. The
force data is automatically synchronized with the start of the capture. The force
plates are automatically re-zeroed when starting a preview or a capture, so
make sure that the force plates are unloaded at these instances.
To view the Kistler data during preview or a capture, open a Data Info window
via the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data. The Kistler analog data includes forces and moments. To
show the force data calculated by QTM, right-click on the Data Info window
and select Force data.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
Connecting Kistler DAQ Type 5695B
Hardware requirements
The Kistler force plate integration supports Kistler force plates that can be con-
nected via the Kistler DAQ Type 5695B.
The following hardware is required:
l Kistler DAQ Type 5695B (or A)
For the synchronization, you need a Qualisys system with a Camera Sync Unit.
If you have an Oqus system, you need a sync/trig splitter cable (art. 510870)
connected to the control port of one of the cameras.
The following software is required for configuring and using the Kistler DAQ
Type 5695B with QTM:
l Instacal (included with QTM, see chapter "Software installation" on
page 54)
l Kistler BioWare (including DataServer)
The Kistler system should be configured for use with QTM according to the
instructions below. For more detailed information, contact Kistler or Qualisys
support.
1. Connect the Kistler DAQ Type 5695B to the computer via USB and make
sure it is switched on.
2. Open the Instacal program. The Kistler DAQ should appear as a USB-2533
board in the list.
2. In the Setup menu, click Hardware > A/D board to open the Data Acquis-
ition Configuration dialog.
3. In the Data Acquisition Dialog, select USB DAQ Device, select the USB-
2533 (Type 5695) board, and click OK.
1. In the Setup menu, click Hardware > Devices to open the Device Setup
dialog.
When done with the configuration of the force plates, save the configuration
file:
2. Name the file config.xml and save it in the BioWare XML folder (typically
C:\Kistler\BioWare\XML).
The force plate range settings are set in the force plate properties and included
in the config.xml file. If you want to change the range settings, you will need to
change the force plate properties in Bioware and overwrite the config.xml file.
1. Make sure that the Kistler DAQ Type 5695B is connected to the computer
via USB and that it is switched on.
2. Connect up to 8 Kistler digital force plates to the Kistler DAQ.
3. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Sync input of the Kistler Trig/sync splitter cable with a BNC cable. If you
are using an Oqus camera as sync device, use the Sync out connector of
the Sync/Trigger splitter.
2. Click the Add Device button and select Kistler Force Plates in the drop
down menu.
Synchronization settings
1. In QTM, open the Synchronization page under Project Options > Input
Devices > Camera System.
2. Use the following settings for the used Synchronization output (Out 1,
Out 2 or Synchronization output):
1. In QTM, open the Force Data page under Project Options > Processing.
2. Click the Define Plates button to add the Kistler force plates to the Force
Plates list.
3. Make sure that the plates you want to use are enabled in the list.
4. Select a force plate and click the Edit Plate button (or double click) to edit
the force data settings.
To collect data with the Kistler force plates, simply start a capture in QTM. The
force data is automatically synchronized with the start of the capture. The force
plates are automatically re-zeroed when starting a preview or a capture, so
make sure that the force plates are unloaded at these instances.
To view the Kistler data during preview or a capture, open a Data Info window
via the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data. The Kistler analog data includes forces and moments. To
show the force data calculated by QTM, right-click on the Data Info window
and select Force data.
Kistler force plates must have both the analog channels and a control cable con-
nected to the analog board. The following is a description of how to install a
Kistler force plate with an analog board. It only describes the connection from
the Kistler connection box to the analog board. For a description of the con-
nection between the force plate and the connection box, please refer to the
Kistler manual.
Start with the hardware connections to the analog board. The picture below is
an example of the setup.
2. Check whether you have a Kistler connection box 5606 or Kistler control unit
5233A2.
3. Connect the analog signals (8 BNC cables) from the Kistler box to 8 channels
on the analog board.
l Make sure that the analog channels are connected in the correct order.
l Do not connect the analog signals from one force plate to different ana-
log boards.
4. Connect the digital cable from the Digital I/O on the analog board to which
the Kistler force plate is connected. The force plate is then controlled with
the settings on the Force plate control settings page in the Project
options dialog.
The connection differs between the analog boards. Check with
Qualisys AB that you have the correct cable.
USB-2533
There are two Digital I/O ports (DSUB15) on the front of the analog
board.
5233A2
You can either use a cable that controls two force plates per I/O port
(230137) or a cable that controls one force plate on the first I/O port
(230129).
Connect the DSUB15 end of the cable to port A on the analog board.
Connect the DSUB37 end of the cable to the DSUB37 connector on
the 5233A2 units.
Remember to press the Remote on the 5233A2 units to activate the
digital control.
NOTE: If you want to use port B on the analog board, you must have
a cable for 2 force plates in port A, so that you can control the
3rd and 4th force plates from port B.
The Digital I/O is also used to reset the force plate before a measurement.
It is important not to stand on a Kistler force plate when the reset signal is
sent. It is sent at the following operations in QTM:
l New file
l In a batch capture just before QTM starts Waiting for next meas-
urement/trigger
These are all the connections needed to connect the force plate to the
analog board. Then you must add the force plate in QTM; follow these steps:
Trigger start
Specify the frequency in multiples of the marker capture rate.
For normal gait measurements you can use a sample rate of
600-1000 Hz. For sport measurements you need a bit higher
sample rate.
c. Go to the Force plate control settings page and add the number of
Kistler force plates that you want to control to the list.
2. Create the force plates on the Force data page in the Project options dia-
log.
3. Open the Force plate page.
a. Enter all of the calibration parameters for the force plates, see
chapter "Kistler force plate calibration parameters" on page 370.
They are found in the manual for the force plate. Use the option
Select by forceplate control, so that the ranges used in the cal-
culation are always correct.
For AMTI and Bertec force plates, only the analog channels need to be con-
nected to the analog board. The following is a description of how to install an
AMTI or Bertec force plate in QTM. It only describes the connection from the
AMTI or Bertec amplifier to the analog board. For a description of the con-
nection between the force plate and the amplifier, please refer to the AMTI or
Bertec manual.
IMPORTANT: For the AMTI portable plate you need an extra box [art.
no. 230009] from Qualisys AB that has an output of the 8 analog channels
as BNC.
Start with the hardware connections to the analog board. The picture below is
an example of the setup.
1. If you have AMTI's or Bertec's software make sure that the force plate is
working in those programs.
b. Select the correct analog channels for each force plate, see chapters
"AMTI force plate settings" on page 364 and "Bertec force plate
settings" on page 369, respectively.
4. Enter the position of the force plate, see chapter "Force plate location" on
page 382. It is good to do this after each new calibration, especially if the
calibration L-frame is not placed at the same position.
5. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured file make sure that it is activ-
ated both for Real-time actions and Capture actions.
6. Test the force plates in QTM. If there is no force, first check whether there
is a signal on the analog channels.
l If there are signals on the analog channels, the error is in the settings
in QTM. Check steps 1-5 above.
l If there are no analog signals, check whether the BNC cables are connected
to the wrong channels.
Synchronization
The use of hardware synchronization is optional but recommended. Hard-
ware synchronization requires a Camera Sync Unit for Arqus or Miqus sys-
tems, or a Sync/Trigger splitter for Oqus systems.
Connect the MEAS. TIME output of the Camera Sync Unit to the Trigger in
port of the Gaitway-3D amplifier. If you are using an Oqus camera as sync
device, use the Sync out connector of the Sync/Trigger splitter and in the
Synchronization settings, set the Synchronization output mode to Meas-
urement time.
Before setting up the connection in QTM, start a data stream in the Gaitway-3D
software.
1. Start the Gaitway-3D software. If you need to unlock, try the default pass-
word 1234.
2. Press Stream data, and zero the force plate. This opens up a Gaitway-3D
Data Streaming window with information about the connection.
Once the Gaitway-3D data stream is set up, the treadmill can be added and con-
figured in QTM.
2. Click the Add Device button, and select Gaitway-3D Instrumented Treadmill
in the drop down menu.
3. Check the Gaitway-3D item in the Input Devices list. The Gaitway-3D
device should now show up as an input device under the Instrumented
Treadmills category.
4. Open the Gaitway-3D settings page, see chapter "Gaitway-3D" on
page 315.
5. Fill in the Local server IP address of the treadmill. If the treadmill is con-
nected to the same computer that is running QTM, you can use the loc-
alhost IP address (127.0.0.1).
1. Open the Gaitway-3D force plate settings under Project Options > Pro-
cessing > Force Data.
2. Make sure that the force plate dimensions are known in the Force plate
status settings list. The force plate dimensions are retrieved automatically
when connecting the treadmill in QTM.
3. Set the location of the force plate. The recommended option is to press
the Use default button to set the default location. This requires that the
L-frame is placed at the origin of the treadmill at the rear right corner
when calibrating the camera system, and that it is level with the Gaitway-
3D surface. For alternative options, see chapter "Force plate location" on
page 382.
Both the Gait and the Running Analysis Module support the automatic decom-
position of force data with Arsalis’ software to reconstruct separate ground
reaction forces for the left and right foot. To use this decomposition, make sure
that:
l The Gaitway-3D is connected to the computer when starting the analysis.
l The export of analog data for TSV is enabled in the Project Options.
For more information, refer to the manual of the Gait or Running Module.
Introduction
With an EMG (electromyography) device the muscle activity can be measured
together with the motion capture. QTM can collect EMG data directly from sev-
eral EMG devices. For an overview of integrated EMG devices and instructions
on how to connect them, see chapter "Wireless EMG systems" on the next
page.
It is also possible to connect an EMG device via an analog board. Then the EMG
device is connected to any free channels on the analog board of the meas-
urement computer, see chapter "How to use analog boards" on page 747. The
channels must be activated on the page for the analog board in the Project
options dialog, see chapter "Channels" on page 297. The EMG device must
have an analog output with an output voltage between ± 10 V.
This chapter describes the Delsys Trigno integration in QTM. This integration
supports Delsys Trigno Centro and Delsys Research+ systems.
Hardware requirements
l Delsys Research+
For both types of base stations, all sensors that are compatible with Delsys
Trigno Discover software are supported.
For the synchronization, you need a Qualisys system with a Camera Sync Unit.
If you have an Oqus system, you need a sync/trig splitter cable (art. 510870)
connected to the control port of one of the cameras.
The following software is required for configuring and using Delsys Trigno with
QTM:
l Delsys Trigno Discover, version 2.0.1.3 or higher.
Please refer to Delsys resources or support for more information about ver-
sion requirements and downloads.
Make sure that the latest compatible version of the Delsys Trigno integration
for QTM is installed. Follow these steps to download and install the integration:
l In QTM, open the Input Devices page under Project Options.
Hardware setup
1. Connect the Delsys Trigno Centro base station to the computer via USB.
2. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
first trigger input of the base station with a BNC cable. If you are using an
Oqus camera as sync device, use the Sync out connector of the Syn-
c/Trigger splitter.
2. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Trigger input of the Delsys Trigger module with a BNC cable. If you are
using an Oqus camera as sync device, use the [Trig in/Sync out] con-
nector of the Sync/Trigger splitter.
Before using the Delsys device with QTM for the first time, the sensors must be
configured with the Delsys Trigno Discover software. This is done in a similar
way for Delsys Trigno Centro and Delsys Research+ systems. Follow these steps
to set up and configure the sensors:
1. Connect the Delsys base station to the computer with USB and make sure
it is switched on.
2. Open the Delsys Trigno Discover software.
For more detailed information, please refer to Delsys help resources or sup-
port.
2. Click the Add Device button and select xxx in the drop down menu.
3. Check the Delsys Trigno item in the Input Devices list. The Delsys Trigno
device should now show up as an input device under the EMGs category.
Device settings
The Delsys Trigno device settings are managed via the Delsys Trigno settings
page.
Synchronize Settings
Synchronize changed settings with the Delsys Trigno device.
The settings list contains a top section with common settings and a section with
individual settings for each sensor.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.
Integration version
The version number for the integration.
SID
Serial number of the sensor.
Battery Percentage
Indication of the charge level of the battery. Press the Synchronize
Settings button to update the reading.
Channels
Information about the channels of the sensor (channel name, units,
sample frequency).
Configuration in QTM
Follow these steps to set up the Delsys device and sensors in QTM:
1. Make sure that the base station is connected to the computer, and that it
is switched on.
2. Take out the sensors from the charger and switch them on by holding
them against the lock decal on the charging dock.
3. Open the Delsys Trigno settings page under Project Options > Input
Devices > EMGs.
4. Enable the Start Input Trigger check box, and disable the Stop Input Trig-
ger and Output Triggers checkboxes.
6. Check that the sensor settings are correct, as configured in the Trigno Dis-
cover software.
l In case you enable or disable sensors, press the Scan button to
update the sensor list.
7. Press OK to close the Project Options dialog.
Synchronization settings
1. In QTM, open the Synchronization page under Project Options > Input
Devices > Camera System.
2. Use the following settings for the port used (Out 1, Out 2, or Syn-
chronization output for Oqus cameras):
l Mode: System Live Time
To stream or collect data with Delsys Trigno, simply start a preview or a capture
in QTM.
To view the Delsys Trigno data during preview or in a capture, open a Data
Info window via the View menu (keyboard shortcut Ctrl + D), right-click in the
window and select Display Analog data.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
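As a rough illustration of this resampling rule, the sketch below (Python, with hypothetical rates) computes the analog rate that would result in the C3D file as the smallest integer multiple of the capture frequency that covers the highest analog sample rate stored in the QTM file. The numbers are examples only, not values prescribed by QTM.

```python
import math

def c3d_analog_rate(capture_freq, analog_rates):
    """Smallest integer multiple of the capture frequency that is at least as
    high as the highest analog sample rate in the file (per the rule above)."""
    return math.ceil(max(analog_rates) / capture_freq) * capture_freq

# Hypothetical example: a 150 Hz capture with analog channels at 1925.93 Hz
# and 148.15 Hz -> the analog data is resampled to 1950 Hz in the C3D file.
print(c3d_analog_rate(150, [1925.93, 148.15]))
```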
Delsys Trigno (API integration)
This chapter describes the Delsys Trigno API integration in QTM. The
API integration is recommended if you are using Trigno Avanti-style sensors
(button-less, with a large arrow-shaped status LED). For a complete list of sup-
ported sensors, see "Hardware requirements" on the next page.
The Trigno API integration does not support Trigno legacy sensors (classic and
IM, with a button and a small status LED). If you have Trigno legacy sensors,
you can use the Delsys Trigno SDK integration instead, see "Delsys Trigno EMG
(SDK legacy integration)" on page 826.
For using the Delsys Trigno API integration with QTM, follow these steps:
2. Add Delsys Trigno API as input device in QTM, see chapter "Setting up
Delsys Trigno (API) in QTM" on page 816.
3. Choose a synchronization method and connect the equipment accord-
ingly, see chapter "Connecting Delsys Trigno (API)" on page 816.
Hardware requirements
Delsys hardware
NOTE: Trigno Lite is also supported but not recommended for use
with QTM. The Trigno Lite does not support hardware syn-
chronization and the bandwidth is limited to 4 data slots.
l Qualisys Camera Sync Unit, or a sync/splitter cable if you are using an Oqus
system.
The following types of sensors are supported in the Delsys Trigno API integ-
ration.
EMG sensors
AUX sensors
NOTE: Trigno classic and IM sensors are not supported by the Delsys
API integration. If you have Trigno legacy sensors, you can use the Delsys
Trigno SDK integration, see "Delsys Trigno EMG (SDK legacy integration)"
on page 826.
Version information
The following firmware is required for use with the Delsys API integration:
l Delsys Trigno base station firmware: MA2919-BE1506-DS0806-US2008-
DA0901/0000.
For more information about compatibility of Delsys firmware and software ver-
sions or up/downgrading Delsys firmware, refer to Delsys documentation or
support.
The API integration requires that the 64-bit version of the Delsys USB drivers is
installed. The easiest way to install the correct drivers is by installing the Delsys
Trigno Discover software. For more information, refer to Delsys documentation
or support.
For measuring data with the Delsys Trigno API, start by adding it as an input
device as follows.
1. Go to QTM Project Options > Input Devices, and press the Add Device
button.
2. Select Delsys Trigno API from the drop down menu and click OK. The
Delsys Trigno API is now added to the Input devices list.
3. Select Delsys Trigno API in the Input devices list. This will add the Delsys
Trigno API settings page under Input Devices > EMGs.
The Delsys Trigno base station must be connected via USB to the computer run-
ning QTM. The exact connection of the hardware depends on the method used
for synchronization. Three possible synchronization methods are described
below:
Follow these steps to connect the hardware and configure the synchronization:
1. Connect the MEAS. TIME output of the Camera Sync Unit to the
START Trigger input of the Delsys Trigger Module, using a BNC cable.
2. In the QTM Project Options, go to Input Devices > EMGs > Delsys
Trigno API and set Synchronization input to Measurement Time.
3. Make sure that the polarities of the synchronization signals are set cor-
rectly, see chapter "Polarity of synchronization signals" on page 819.
NOTE: If you have an Oqus system without a Camera Sync Unit, use the
Sync out connector of the sync/trigger splitter, and set the syn-
chronization output mode to Measurement time in the Synchronization
settings.
When using a Qualisys trigger button for synchronization, connect the hard-
ware according to the below schematic.
1. Use a BNC T-connector to connect the Trig NO port of the Camera Sync
Unit with the Qualisys trigger button and the START Trigger input of
the Delsys Trigger Module.
2. In the QTM Project Options:
a. Go to Input Devices > EMGs > Delsys Trigno API and set Syn-
chronization input to Trigger.
3. Make sure that the polarities of the synchronization signals are set cor-
rectly, see chapter "Polarity of synchronization signals" on the next page.
4. Use the Qualisys trigger button to start your captures.
NOTE: If you have an Oqus system without a Camera Sync Unit, use the
Trig in connector of the sync/trigger splitter, and set the Trigger port
function to Start Capture in the Synchronization settings.
When using the button on the Delsys Trigger Module, connect the hardware
according to the below schematic.
1. Connect the Trig NO port of the Camera Sync Unit to the START Trigger
output of the Delsys Trigger Module, using a BNC cable.
2. In the QTM Project Options:
a. Go to Input Devices > EMGs > Delsys Trigno API and set Syn-
chronization input to Trigger.
3. Make sure that the polarities of the synchronization signals are set cor-
rectly, see chapter "Polarity of synchronization signals" below.
4. Use the button on the START Trigger input side of the Delsys Trigger Mod-
ule to start your captures.
NOTE: If you have an Oqus system without a Camera Sync Unit, use the
Trig in connector of the sync/trigger splitter, and set the Trigger port
function to Start Capture in the Synchronization settings.
QTM uses by default negative polarity for the Measurement time and Trig NO
synchronization signals, which means that the trigger signal corresponds to a
falling edge. It is therefore recommended to set the Edge selectors on the
Delsys Trigger Module to negative polarity as indicated in the below illustration.
The Delsys Trigno API configuration is managed via the Delsys Trigno API set-
tings page under Project Options > Input Devices > EMGs.
The upper section of the Delsys Trigno API settings page displays information
about the connected base station. Make sure that the correct firmware is
installed on the base station, see "Hardware requirements" on page 814.
For populating the sensor list with the available sensors, follow these steps:
2. Open the Delsys Trigno API settings page in the QTM Project Options.
3. Press the Scan button. A dialog comes up with information about how many
sensors were detected. Press OK to proceed with the detected sensors. To
cancel and keep the previous list of sensors, press X (close window) in the
upper-right corner.
The detected sensors appear in the order as they are paired to the Trigno base
station. The pairing of sensors is managed in Delsys Trigno Discover software.
For more information about pairing the sensors to the base station, refer to
Delsys documentation or support.
Configuration of sensors
Once the sensors have been detected, you can view and configure the sensors
in the sensor list.
Name: Name of sensor (editable field). QTM will assign a default name,
based on the sensor type and the position in the list.
Data Slots: Number of data slots occupied by the sensor. The number of
data slots may depend on the bandwidth required for the selected mode.
NOTE: When certain channel types are absent in a mode, any cus-
tomizations made to those channels will be erased when finalizing
the changes.
Edited and updated fields in the sensor and channel list are highlighted in yel-
low. The changes will be finalized when pressing Apply or OK. Press the Cancel
button to discard any pending changes, for example if you changed to an
undesired mode leading to loss of custom channel names.
Configuration of channels
Name: Channel name (editable field). The default name assigned by QTM
is based on the data type.
Edited and updated fields in the channel list are highlighted in yellow. The
changes will be finalized when pressing Apply or OK. Press the Cancel button
to discard any pending changes.
TIP: Once you have completed the configuration of sensors and chan-
nels, create a Project backup, or a Project preset if you want to use the
same sensor configuration for multiple projects.
Synchronization method
In the last section of the Delsys Trigno API, you can select the synchronization
method. Make sure that the synchronization method corresponds to the hard-
ware setup, see "Connecting Delsys Trigno (API)" on page 816. The available
options are:
Trigger: Synchronization using a trigger signal.
WARNING: Using the wrong synchronization method for the used hard-
ware setup will lead to synchronization errors.
The sample rates of the Delsys Trigno data channels are defined per channel,
dependent on the selected sensor modes. In QTM, all channels are auto-
matically upsampled to the rate of the channel with the highest sample rate. This device fre-
quency is determined by the selected sensor modes, and not influenced by the
selection of channels in the channel list. The upsampling is done in real time by
repeating the current value when no new value is available for a channel. This
means that channels with lower sample rates are updated in steps in the ana-
log data stored in QTM.
When measuring with multiple sensors, it is recommended to choose sensor
modes with the same EMG frequency, whenever possible. When using auxiliary
channels (acceleration, gyro, etc.) in combination with EMG, it is recommended
to use modes with auxiliary frequencies that have an integer relation to the EMG
frequency. For example, when using an EMG frequency of 1925.93 Hz in com-
bination with an acceleration frequency of 148.15 Hz, the sample rate factor is 13.
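The Python sketch below illustrates the two points above: the step-wise zero-order-hold upsampling that QTM applies to lower-rate channels, and how to check whether two sensor mode frequencies have an integer relation. The sample values are hypothetical; only the frequencies come from the example in the text.

```python
import numpy as np

def upsample_hold(samples, factor):
    """Repeat each sample `factor` times (zero-order hold), mirroring how a
    lower-rate channel is updated in steps in the stored analog data."""
    return np.repeat(np.asarray(samples), factor)

# Frequencies from the example above: EMG at 1925.93 Hz, acceleration at 148.15 Hz.
emg_rate, acc_rate = 1925.93, 148.15
print(f"sample rate factor: {emg_rate / acc_rate:.3f}")   # ~13, an integer relation

acc = [0.00, 0.05, 0.10]                 # hypothetical acceleration samples
print(upsample_hold(acc, 13))            # each value repeated 13 times
```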
After setting up and configuring the Delsys Trigno device and sensors, you can
start capturing data. Before you start a measurement session, it is good prac-
tice to first scan for the available sensors on the Delsys Trigno API settings
page and make sure that all sensors used in the project are detected and have
sufficient battery level. If the sensors have not been scanned after opening the
QTM project, a scan will be automatically performed at the beginning of the
first measurement.
To view Delsys data during preview or a capture, open a Data Info window via
the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data.
QTM supports integration with Delsys Trigno systems. The Delsys Trigno SDK
integration is a legacy integration, which can be used if you have Delsys Trigno
classic or IM sensors. If you have a Delsys system with Trigno Avanti and other
types of button-less sensors, also recognizable by a large arrow-shaped status
LED, it is recommended to use the newer Delsys Trigno integration, see chapter
"Delsys Trigno Integration" on page 804.
The Delsys Trigno SDK integration in QTM supports the following types of
Trigno and Trigno Avanti sensors.
Trigno sensors
l Trigno IM Sensor
l Trigno DR Sensor
The following instructions for Delsys Trigno EMG will only cover how to connect
to and capture data in QTM. For more detailed information on EMG and Trigno
please refer to the Delsys documentation.
Trigno installation
1. Connect and install the Delsys Trigno system according to the chapters
"Trigno computer installation" on the next page and "Trigno syn-
chronization connections" on page 830.
2. Start the Trigno system.
5. Double-click on the Delsys Trigno SDK line and go to the Delsys Trigno
SDK page.
l Make sure that the Connection settings are working, see chapter
"Delsys Trigno QTM settings" on page 834
6. Activate the EMG and acceleration channels that you want to use.
Before you connect the Trigno system to the computer you need to install the
Delsys Trigno Control Utility program. This is included in the Trigno SDK. It is
highly recommended to download the SDK via QTM (Project Options > Input
Devices > Download device drivers) to make sure that you use a compatible
version.
The SDK integration requires that the 32-bit version of the Delsys USB drivers is
installed. The easiest way to install the correct drivers is by installing the Delsys
EMGWorks software. For more information, refer to Delsys documentation or
support.
NOTE: Make sure that your Trigno base station and sensors have the
firmware installed that is compatible with the Delsys SDK version used for
QTM. For upgrading or downgrading firmware, please refer to Delsys doc-
umentation or support.
a. The first time you use the Trigno system on a computer you need to
pair the sensors. Click Pair on the sensor images in the program and
then hold down the button on the sensor for about 3 seconds until it is
detected. For Trigno Avanti sensors, the sensor is paired by tapping
it over the built-in magnet of the base station, indicated by the lock
decal.
b. Press the Configure button and make sure that the following set-
tings are correct:
l On the tab Orientation Filter, make sure that the option Turn
on Orientation Filter for all sensors is unchecked.
c. You can configure the sensors (e.g., the gain of the accelerometers)
by pressing the configure button of the sensors. For Avanti IMU
sensors, make sure to use the EMG+IMU mode.
For synchronization of the Trigno system with the Qualisys system, the Delsys
Trigger Module is required. There are two synchronization options:
For correct synchronization of the EMG data, it is very important to select the
correct Input option corresponding to the used hardware connection in the
Trigno QTM settings, see chapter "Delsys Trigno QTM settings" on page 834.
WARNING: Using the wrong Input option in the QTM settings will lead
to synchronization errors.
1. Connect the Delsys Trigger Module to the Trigger port on the Trigno
base station.
2. Put a BNC T-coupling on the Start Input BNC connector (green side of
the module) so that you can also connect a trigger button to the input.
The measurement must be started with the trigger button.
3. Connect the Start Input BNC connector to the trigger input of the cam-
era system.
NOTE: The green LED on the trigger button of the Trigger module
lights up when the trigger signal arrives.
1. Under Project Options go to Input Devices > EMGs > Delsys Trigno. Set
Input to Trigger.
2. Open the Synchronization page in the Project Options and manage the
Trigger port(s) settings, see chapter "Trigger ports" on page 273.
l For Oqus, set Function to Start capture and make sure that TTL sig-
nal edge is set to Negative.
l When using a Camera Sync Unit, set Trig NO: Function to Start cap-
ture and make sure that Trig NO: TTL signal edge is set to Negative.
2. Connect the Start Input BNC connector (green side of the Delsys Trigger
Module) to the synchronization output of the Qualisys system.
l When using a Camera Sync Unit, use the Meas. Time output on the
Camera Sync Unit.
l When using an Oqus camera for synchronization, use the Sync out
connector on the control port splitter cable.
3. Make sure that you select the correct starting edge with the Start Input
Edge selector on the Delsys Trigger Module. The default setting in QTM
is negative polarity, or falling edge.
NOTE: The green LED on the trigger button of the Delsys Trigger
Module lights up when the synchronization start signal arrives.
1. Under Project Options go to Input Devices > EMGs > Delsys Trigno. Set
Input to Measurement Time.
l When using a Camera Sync Unit, make sure that under Measurement
time the TTL signal polarity is set to Negative, see chapter "Meas-
urement time (Camera Sync Unit)" on page 290.
The Delsys Trigno SDK page, with settings for the Delsys Trigno EMG, is
included in the Project options dialog when Delsys Trigno is activated on the
Input Devices page. To use Delsys Trigno you need to install the required
drivers and software, see chapter "Trigno computer installation" on page 828.
Command port, EMG data port, ACC data port, IM EMG data port and
IM Aux port
Ports used to communicate with the Trigno Control Utility. The
defaults are 50040, 50041, 50042, 50043 and 50044.
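If you want to verify outside of QTM that the Trigno Control Utility is reachable on these ports, a minimal Python connectivity check could look like the sketch below. It assumes the Trigno Control Utility runs on the same computer (localhost) and uses the default port numbers listed above; it is only a diagnostic aid, not part of the QTM configuration.

```python
import socket

# Default TCP ports of the Trigno Control Utility, as listed above.
TRIGNO_PORTS = {
    "Command": 50040,
    "EMG data": 50041,
    "ACC data": 50042,
    "IM EMG data": 50043,
    "IM AUX data": 50044,
}

for name, port in TRIGNO_PORTS.items():
    try:
        # Assumes the Trigno Control Utility runs on this computer; change the
        # host if it runs elsewhere on the network.
        with socket.create_connection(("127.0.0.1", port), timeout=1.0):
            print(f"{name} port {port}: OK")
    except OSError:
        print(f"{name} port {port}: no connection")
```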
Synchronization settings
Input
Select the synchronization input. The selected input depends on which sig-
nal from the Qualisys system is used as synchronization signal. The
options are:
Trigger: This option should be selected when the trigger button is
connected to the Delsys Trigger Module as described in chapter
"Trigno trigger connection" on page 831. When this option is selec-
ted, the trigger delay from the Qualisys system is compensated for.
NOTE: Activate the Use trigger in real time option when the setup
is finished to minimize the number of times you have to press the
trigger button.
Channels
Channel name
Activate the EMG and Acceleration sensors and enter a name for the
channels. Make sure that the channel number matches those that
are active in the Trigno Control Utility. The name of the EMG channel
is the same as the Channel name. The auxiliary data channels have
the channel name with a suffix, for example _ACC_X, _ACC_Y and
_ACC_Z for the acceleration data.
Auxiliary data
Check this option for the channels for which you want to retrieve
auxiliary data from the sensors (e.g., acceleration, gyro, mag-
netometer). If you do not need the auxiliary data, uncheck it to
reduce the amount of analog data in the QTM files.
1. The first time a Trigno EMG is connected to a computer you must follow
the instructions in the chapter "Trigno computer installation" on
page 828.
2. Make sure to set up the synchronization correctly according to the instruc-
tions in chapter "Trigno synchronization connections" on page 830.
3. Make sure that the Trigno Control Utility program is running on the com-
puter. This program is needed for the communication with the EMG sys-
tem.
5. Go to the Delsys Trigno page and select the settings below, for more
information see chapter "Delsys Trigno QTM settings" on page 834.
l Make sure that QTM has connection with the Trigno Control Utility
program. If it does there is an OK after the port numbers.
l Activate the EMG and accelerometer sensors you want to use in the
list. Do not activate sensors that are not needed since it will only res-
ult in larger files.
6. Close the Project options dialog and start a new measurement.
l When Use Trigger in Real Time is selected, the Trigno data must be
triggered by an external trigger signal to be synchronized in real
time. Then the dialog below will appear every time you start a new
measurement and every time you change something in the Project
options dialog. If you do not use the real-time output from QTM dis-
able the Use Trigger in Real Time option on the Delsys EMG page.
This option requires that the trigger start option is used for syn-
chronization.
7. Open the Data info window on the View menu and then right-click and
select Display Analog data.
8. The data from the Delsys EMG is listed with the analog data. The channels
are called the Channel name for the EMG data. The auxiliary channels
are called the Channel name and then _ACC_X and so on. The Board name is Delsys Trigno.
9. When starting a capture the start of the EMG data will be synchronized
with the motion data according to the selected synchronization method.
The EMG data is resampled at 2000 Hz via the Delsys SDK when recorded in
QTM. Auxiliary sensor data is upsampled in QTM to 2000 Hz. For the export to
MAT and TSV the sample frequency will be 2000 Hz for all Trigno sensor data.
When you are using C3D export it is recommended to choose a capture fre-
quency with a direct integer relationship to 2000 Hz (for example 100, 200, 250,
500 or 1000 Hz) to avoid additional res-
ampling when exporting to C3D. See chapter "C3D file format" on page 728 for
more information.
Cometa EMG
QTM supports integration with the Cometa Wave Plus wireless EMG system.
The supported sensor types are EMG and IMU sensors. A single Cometa base
unit allows the use of a maximum of 16 EMG and/or IMU sensors. It is possible to
connect two base units for a maximum of 32 sensors.
For use with QTM the Cometa system should include the following com-
ponents:
The following chapters cover how to connect to and capture Cometa EMG and
IMU data in QTM. For more detailed information, please refer to the Cometa
documentation.
There is also a tutorial available at QAcademy for connecting and using Cometa
EMG with QTM.
Cometa installation
Before using the Cometa EMG system for the first time you will need to install
the drivers. You can download the current drivers via QTM (Project
Options > Input Devices > Download device drivers). Follow these steps to
install the drivers:
1. Connect the Cometa base unit to a USB port on the QTM computer. Win-
dows will automatically detect the device.
2. Unzip the file containing the drivers (typically called
emgmusbdrivers_...zip).
3. In Windows explorer, locate the folder containing the files that are com-
patible with your computer (e.g. Win10\x64).
4. Right-click on the file EmgMUsb.inf and choose Install from the context
menu.
To add the Cometa device as an input device in QTM, follow these steps:
For synchronizing Cometa with Qualisys motion capture data, the following
accessories are required:
l A Cometa trigger box with BNC connector.
1. Connect the Sync/Trig splitter cable to the control port of one of the
Oqus cameras.
2. Connect the Sync out connector of the Oqus splitter cable with a BNC
cable to the BNC connector of the Cometa trigger box.
The Cometa page, with settings for the Cometa EMG, is included in the Project
options dialog when Cometa is activated on the Input Devices page.
Channels
Channel name: Name of the channel. Click in the text area to edit.
Active: Check sensors that are currently used for data acquisition in
QTM.
Sensor type: Select the type of sensor (Emg or Imu) per channel.
7. For viewing the EMG and IMU data, open a Data Info window in the View
menu, right-click and select Display Analog data.
8. The data from the Cometa EMG is listed with the analog data. The chan-
nels are called the Channel name for the EMG data. The IMU channels
are called the Channel name and then _ACC_X and so on. The Board
name is Cometa.
9. Start a capture via the Capture dialog (Ctrl+M). The start of the EMG and
IMU data will be synchronized with the motion data.
10. EMG and IMU data can be exported to several export formats, see chapter
"Export Cometa EMG and IMU data" below.
Cometa EMG and IMU data can be exported to TSV, MAT and C3D file formats.
For the TSV and MAT export, make sure that the Analog data type is selected.
The sample frequency of all Cometa EMG and IMU data is 2000 Hz. The actual
sample rate of IMU data depends on the used IMU acquisition type, see chapter
"Cometa QTM settings" on page 841.
When you are using C3D export it is recommended to choose a capture fre-
quency with a direct integer relationship to 2000 Hz to avoid additional res-
ampling when exporting to C3D. See chapter "C3D file format" on page 728 for
more information.
Cometa Systems
This chapter describes the Cometa Systems integration in QTM. The following
Cometa systems are supported:
l Cometa WavePlus
l Cometa WaveX
The following chapters cover how to connect and use Cometa System devices
in QTM. For more detailed information about Cometa System devices, please
refer to the Cometa documentation.
Hardware requirements
The Cometa Systems integration requires a Qualisys camera system with a Cam-
era Sync Unit. If you have an Oqus system, you need a sync/trig splitter cable
(art. 510870) connected to the control port of one of the cameras.
WavePlus
For use with QTM the Cometa WavePlus system should include the following
components:
l A receiver unit
l WavePlus compatible sensors: Mini Wave EMG, WaveTrack IMU, Pico EMG
WaveX
For use with QTM the Cometa WaveX system should include the following com-
ponents:
l A receiver unit with power supply
Driver install
Before using the Cometa system for the first time you will need to install the
drivers. You can download the current drivers via QTM (Project
Options > Input Devices > Download device drivers). The following drivers are
available:
l For WavePlus, download the file emgmusbdrivers_...zip.
1. Connect the Cometa base unit to a USB port on the QTM computer. Windows
will automatically detect the device.
3. In Windows explorer, locate the folder containing the files that are com-
patible with your computer (e.g. Win10\x64).
4. Right-click on the file EmgMUsb.inf and choose Install from the context
menu.
Software requirements
The following software is required for configuring and using Cometa Systems
with QTM:
Cometa software:
l EMG and Motion Tools
Hardware setup
How to connect
1. Connect the Cometa Receiver unit with USB to the computer running
QTM.
2. Connect the Synchronization output (Out 1 or Out 2) of the Camera sync
unit to the Cometa trigger box connected to the Cometa Receiver. If
you are using an Oqus camera as sync device, use the Sync out connector
of the Sync/Trigger splitter.
Sensor configuration
The sensor configuration is defined in the Cometa EMG and Motion Tools soft-
ware. The software can be used to create sensor configuration files for both
WavePlus and WaveX systems. The sensor configurations contain information
about the sensor types and labels, which are required for use in QTM. You
can create multiple configurations, which can later be selected in the device
setup in QTM.
The sensor configurations are created as outlined below. For detailed inform-
ation, please refer to the Cometa documentation.
3. In the Sensors tab, select the sensor modes for the respective sensors and
choose the data protocols used for the respective modes.
5. In the IMU tab, define the labels for the respective IMU sensors. This can
be done by dragging and dropping the joints from the Joints map onto the
sensor name fields.
2. Click the Add Device button, select Cometa Systems from the list, and
click OK to add it to the Input Devices list.
3. Check Cometa Systems in the Input Devices list. This will add the
Cometa Systems settings page under Input Devices > EMGs.
Device settings
The Cometa Systems configuration is managed via the Cometa Systems set-
tings page under Project Options > Input Devices > EMGs.
Synchronize Settings
Synchronize changed settings with the Cometa Systems device.
Calibrate IMU
Calibrate the IMU sensors.
The settings list contains a top section with common settings and a section with
sensor information.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.
Integration version
The version number for the integration.
File name
Name of the selected configuration file. If there are multiple con-
figuration files, they can be selected from the dropdown menu.
Message
Display of messages from the Cometa Systems device.
Protocol information
Information about the used protocols and sample rates. In case of
multiple sample rates, all channels are resampled to the highest
sample rate.
Configuration in QTM
Follow these steps to select the sensor configuration that you created earlier
with the EMG and Motion Tools software (see chapter "Sensor configuration" on
page 848):
1. In Configuration file path, specify the folder name containing the stored
Cometa sensor configurations.
2. Press the Synchronize Settings button.
3. Under File name, select the configuration file you want to use.
4. Press the Synchronize Settings button again. The message field should
now show that the configuration was loaded successfully, and the protocol
and channel information should be updated.
1. Open the Synchronization page under Project Options > Input Devices
> Camera System.
2. Go to the settings for the used synchronization output (Out 1, Out 2 or
Synchronization output), and configure it as follows:
After setting up and configuring the Cometa Systems device and sensors, you
can start capturing data. Before you start a measurement session, it is good
practice to make sure that all sensors used in the project are connected and
have sufficient battery level.
If you are using IMU sensors, you must calibrate them before starting the meas-
urements. This is needed to compensate for gyroscope and accelerometer off-
set. Follow these steps to calibrate the IMU sensors:
3. Open the Cometa Systems settings page under Project Options > Input
Devices > EMGs.
4. Press the Calibrate IMU button and wait a few seconds.
For more information about IMU sensor calibration, please refer to Cometa doc-
umentation.
To view Cometa Systems data during preview or a capture, open a Data Info
window via the View menu (keyboard shortcut Ctrl + D), right-click in the win-
dow and select Analog data.
Exporting data
When exporting, all Cometa Systems data will be exported as analog data from
a single device. The sample rate is normally fixed at 2000 Hz, except for WaveX
if there are no sensors in EMG mode included in the sensor configuration. In
the latter case, the output frequency corresponds to the selected IMU mode.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
Noraxon EMG
This chapter describes the Noraxon EMG integration for QTM. The following
Noraxon systems are supported:
l Noraxon Ultium EMG
The following instructions will cover how to connect and use Noraxon EMG
devices in QTM. For more detailed information about Noraxon EMG, please
refer to the Noraxon documentation.
Requirements
Hardware requirements
l Noraxon EMG sensors and a sensor charging dock (IMU and Motion
sensors are not supported)
l A Phono 3.5 mm to BNC sync cable (art. 230049, not included with the Nor-
axon equipment), or a Phono 3.5 mm (mono) to BNC-Female adapter.
For connecting the receiver via USB to the computer, you need the Noraxon
USB driver, which can be downloaded from
https://www.noraxon.com/noraxon-download/noraxon-usb-driver/.
For more detailed information about Noraxon devices, refer to Noraxon doc-
umentation.
Software requirements
Make sure that the latest version of the Noraxon EMG integration for QTM is
installed. Follow these steps to download and install the integration:
The Noraxon EMG device should be configured for use with QTM according to
the instructions below. For more detailed information, contact Noraxon or
Qualisys support.
2. Click the Add Device button and select Noraxon EMG in the drop down
menu.
3. Check the Noraxon EMG item in the Input Devices list. The Noraxon EMG
device should now show up as an input device under the EMGs category.
Hardware setup
2. Connect the sensor docking station to the Ultium receiver with the ded-
icated cable.
3. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Sync input of the Ultium receiver with a phono 3.5 mm to BNC cable. If
you are using an Oqus camera as sync device, use the Sync out connector
of the Sync/Trigger splitter.
Device and sensor configuration
Follow these steps to set up your Noraxon Ultium EMG device and sensor con-
figuration.
1. Open the Noraxon EMG settings page under Project Options > Input
Devices > EMGs.
2. Press the Setup button to open the Noraxon Hardware setup program. The
Ultium device should be shown as one of the detected devices.
3. Drag the Ultium device to the Selected Devices tab. This will open the
Ultium Setup dialog (if not, double click on the Ultium device).
4. In the Ultium General setup, add your sensors to the configuration. The
easiest way is to press the Detect Sensors in Chargers button, making
sure that all sensors you want to use are present in the connected sensor
docking stations.
When all sensors are listed, you can change the labels. The labels will be
used as the channel names in QTM.
6. In the Advanced setup page, make sure to check the Invert sync input
checkbox.
Optionally, check the Enable EMG IMU accel and/or Enable EMG IMU
gyro & mag checkboxes to include auxiliary data from the EMG sensors.
1. Connect the Noraxon Desktop DTS receiver to the computer via USB.
2. Make sure that the sensors are charged in the sensor docking station.
3. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Sync in input of the Desktop DTS receiver with a phono 3.5 mm to BNC
cable. If you are using an Oqus camera as sync device, use the Sync out
connector of the Sync/Trigger splitter.
Device and sensor configuration
Follow these steps to set up your Noraxon Desktop DTS device and sensor con-
figuration.
1. Open the Noraxon EMG settings page under Project Options > Input
Devices > EMGs.
2. Press the Setup button to open the Noraxon Hardware setup program. The
Desktop DTS device should be shown as one of the detected devices.
Configuration in QTM
Device settings
The Noraxon EMG device settings are managed via the Noraxon EMG settings
page under Project Options > Input devices > EMGs.
Synchronize Settings
Update the device and channel information.
Setup
Open the Noraxon EMG hardware setup interface for modifying the
device and channel configuration.
The settings list contains a top section with common settings and a section with
device and channel information.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.
Integration version
The version number for the integration.
Synchronization settings
1. In QTM, open the Synchronization page under Project Options > Input
Devices > Camera System.
2. Use the following settings for the used port (Out 1, Out 2 or Syn-
chronization output):
Smart Eye - smarteye.se
For the Smart Eye hardware it is possible for the Qualisys system to
lock on to the capture frequency of the Smart Eye camera. The
start synchronization is then handled via a real time event from QTM that
is sent to the Smart Eye software. Finally, the Smart Eye and Qualisys
systems can be aligned via markers on the Smart Eye calibration
equipment. For questions about the Smart Eye integration please
contact [email protected].
The connectivity kit can be purchased from Tobii or via Qualisys AB.
To use Tobii eye trackers in QTM, follow these steps:
l Set up the Tobii Glasses in QTM, see chapters "Setting up Tobii Pro
Glasses 2 in QTM" on the next page or "Setting up Tobii Pro Glasses 3 in
QTM" on page 868.
l Add a gaze vector in QTM, see chapter "How to use gaze vectors in
QTM" on page 873.
For information about capturing and processing data, refer to chapters "Mak-
ing a measurement with Tobii" on page 878 and "Process and export Tobii
gaze vector data" on page 879.
For more information about the eye tracker data and the available gaze vector
settings, see chapters "Tobii data in QTM" on page 883 and "Gaze vector set-
tings" on page 887.
To connect the Tobii Pro Glasses 2 eye tracker to QTM, the QTM computer and
the Tobii Recorder must be connected to the same local network. The Tobii
Recorder can be connected via WiFi. Follow these instructions to add Tobii to
your project:
1. Configure the Tobii glasses network connection (WiFi) using the Tobii Pro
Glasses Controller application.
2. To add the Tobii eye tracker device, open the Input Devices page in the
Project options dialog and click on Add device.
3. Select the Tobii Pro Glasses 2 from the list with devices and click OK.
4. Make sure that the Tobii device is enabled in the list. Then double-click on
the Tobii line in the list to open the Tobii page.
5. On the Tobii page enter the following information.
Address
The IP address or name of the Tobii glasses. Press the Find Glasses but-
ton to automatically locate them on the network and update the address
field.
Calibrate
Initiate Tobii calibration. Calibration is done the same way as in the Tobii
Pro Glasses Controller application using the Tobii calibration card.
When the glasses are connected and set up in QTM, you can proceed to add a
gaze vector, see chapter "How to use gaze vectors in QTM" on page 873.
If you want to make sure that the data from the Tobii Pro Glasses 2 is syn-
chronized with the motion capture data, you can use the hardware syn-
chronization for a simultaneous start of the recording.
The following items are required for hardware synchronization:
l A synchronization cable included in the Tobii-Qualisys connectivity kit.
1. Connect the Trigger/Sync splitter cable to the control port of one of the
cameras.
2. Use a BNC cable to connect the Tobii sync cable to Sync out on the split-
ter cable.
3. Connect the 3.5 mm connector of the Tobii sync cable to the Data syn-
chronization port on the Tobii recorder.
4. Do the following settings in QTM:
a. Open the Synchronization page under Project Options.
For using the Tobii Pro Glasses 3 with QTM, the QTM computer and the Tobii
Recording unit must be connected to the same local network. For connecting
and setting up your Glasses 3 device, follow the instructions at
For connecting a single Glasses 3 device for use with QTM, the following con-
nection options can be used.
l WiFi access point: connect the QTM computer to the network with name
"TG03B-XXXXXXXXXX" (serial number of Recording unit). Default pass-
word: TobiiGlasses.
l Cable (router): Connect the Tobii recorder with an Ethernet cable to a
router. When using a wireless router, the computer running QTM can be
connected via WiFi.
For a detailed description of the connection options, refer to the Connection
Guide in the Glasses 3 Controller application.
After connecting the Glasses 3 device, it should be accessible in the Glasses 3
Controller application.
For connecting multiple Glasses 3 devices for use with QTM, the following con-
nection options can be used.
l Cable (router): Connect the Tobii Recorder units with an Ethernet cable to
a router. When using a wireless router, the computer running QTM can be
connected via WiFi.
l Alternatively, the Tobii Recorder units can be connected wirelessly through
a WiFi router. This requires setting up an alternative network con-
figuration on the Tobii Recorder devices.
For a detailed description of the connection options, refer to the Connection
Guide in the Glasses 3 Controller application or other Tobii resources.
1. Open the Project Options and navigate to the Input Devices page.
3. In the Add Device dialog, select Tobii Pro Glasses 3 from the Select
device drop down menu.
4. Open the Tobii Pro Glasses 3 settings page under Project Options >
Input Devices > Eye Trackers.
5. Press the Locate Glasses button. After a short while the connected
Glasses 3 devices should be shown in the settings list.
The Glasses 3 settings are shown on the Tobii Pro Glasses 3 settings page
under Project Options > Input Devices > Eye Trackers.
Hardware Synchronized
Enable synchronized capture start between QTM and Tobii glasses.
See chapter "Setting up hardware synchronization with Tobii Pro
Glasses 3" on the next page for how to set up hardware syn-
chronization.
When the glasses are connected and set up in QTM, you can proceed to add a
gaze vector, see chapter "How to use gaze vectors in QTM" on page 873.
Latency information
The Tobii data stream arrives in QTM over the network with a certain latency.
This will affect the real time calculation of the gaze vector. For reliable syn-
chronization of captured data, it is recommended to use hardware syn-
chronization. If you cannot use hardware synchronization, you can estimate the
latency as outlined in chapter "Tips for compensating latency" on page 877.
When not using hardware synchronization or when using the Tobii data in real
time, please take note of the following:
l The latency of Tobii data may vary between trials.
l The latency of the Tobii device data (e.g. pupil data), stored as analog data
in QTM is not compensated for when applying latency compensation to
the gaze vector calculation.
l The latency may increase when using Glasses 3 software to record data
and stream video from the Tobii glasses.
l When using multiple glasses, latencies can be quite large (e.g. up to a few
seconds) and may differ between devices.
l The latency may depend on the quality of the network.
If you want to make sure that the data from the Tobii glasses is synchronized
with the motion capture data, you can use the hardware synchronization for a
simultaneous start of the recording.
The following items are required for hardware synchronization:
l A synchronization cable included in the Tobii-Qualisys connectivity kit.
b. The TTL signal polarity can be set to any value (Negative or Positive).
After connecting and setting up a Tobii eye tracker device, follow these steps to
add a gaze vector in QTM.
1. Go to the Gaze Vector page under Project Options > Processing. For an
overview of the gaze vector options, see "Gaze vector settings" on
page 887.
2. Click on Add to add a new gaze vector. The gaze vector includes the defin-
ition for both left and right eye.
3. Double click on the gaze vector to open the Gaze vector dialog.
5. Associate a rigid body with the eye tracker. The rigid body is needed to
project the gaze vector in the 3D space. You can select the rigid body from
the Rigid body drop-down list.
l The list contains predefined rigid bodies for the selected
Tobii eye tracker, see chapter "Tobii rigid body definitions" below for
more information.
l It is also possible to choose a custom rigid body from the rigid bod-
ies that are included on the 6DOF Tracking page, for example if you
want to use a refined rigid body definition.
l For tips on how to optimize the tracking of the glasses, see chapter
"Tips for improving the tracking of the glasses" on the next page.
6. If you are not using hardware synchronization you can specify an offset to
compensate for the latency of the recorded eye tracker data.
l For tips on how to estimate the latency, see chapter "Tips for com-
pensating latency" on page 877.
7. Optionally, check the Use median filter option for smoothing the gaze
position and gaze vector data.
8. On the Processing page under Project Options, make sure that the pro-
cessing steps Calculate 6DOF and Calculate gaze vector data are selec-
ted for both Real time actions and Capture actions.
The Qualisys-Tobii connectivity kits for Tobii 2 and Tobii 3 include marker sets
that can be attached to the glasses. QTM includes predefined rigid body defin-
itions corresponding to the available attachments, which can be chosen from the Rigid body drop-down list in the Gaze vector dialog.
For Tobii Pro Glasses 3 there are separate attachments for the left
and the right side of the glasses. The marker set number is spe-
cified on the box containing the kit items. When you have multiple
glasses, the left and right attachments can be used in four different
combinations. The predefined rigid body definitions in QTM are
named Tobii3-Set-Lx-Ry, where x and y need to be replaced with
the respective set numbers. Left (L) and Right (R) are considered
from the perspective of the person wearing the glasses.
The predefined rigid body definitions in QTM for the Tobii marker attachments
give a good starting point for tracking the glasses. However, the actual marker
positions may deviate from the predefined ones, which may lead to suboptimal
tracking results.
The tracking can be improved in the following ways:
l Create a refined rigid body definition (see section below).
l Add a marker to the rigid body used to track the glasses, or remove one,
to make it more asymmetric.
l In some cases it can also help to use an AIM model for labeling the mark-
ers, for example to avoid swapping of the left and right clusters in QTM.
Refinement of the Tobii rigid body definition
TIP: Instead of following the steps below, you can use the Refine rigid
body script, which is included with the scripting tools at
https://github.com/qualisys/qtm-scripting.
1. Open the Gaze Vector page under Project Options and select it as the
rigid body for your Tobii eye tracker.
2. Open the 6DOF Tracking page under Project Options, set the Bone
length tolerance value to 5 mm, and remove the unused rigid body defin-
itions.
When not using hardware synchronization, there will be a certain latency of the
recorded Tobii eye tracker data. The latency needs to be compensated for in
QTM for a correct calculation of the gaze vector.
The latency may be dependent on a number of factors, such as the network
connection or simultaneous use of the Tobii Glasses Controller software. When
using multiple Tobii Glasses 3 devices, the latency increases for the subsequent
devices since they are started sequentially.
The latency can be estimated as follows:
1. In QTM, configure the gaze vector using no rigid body and zero offset.
2. Make a capture with the glasses on. During the capture, look at a fixed
point, for example a marker in front of you, while gently shaking or nod-
ding your head.
3. Create a plot of the pitch or roll angle of the Tobii 6DOF data and another
plot of the gaze vector data, and align the plots.
l The latency can be estimated by measuring the delay between the
peaks in the 6DOF angle and the corresponding peaks in the gaze
vector (see the sketch after these steps).
4. Reprocess the gaze vector data with the estimated latency.
l For Tobii Glasses 2 the latency is specified by the Offset value (in mil-
liseconds) in the Input device settings page.
6. When you reprocess the gaze vector data associated with the rigid body
definition of the glasses, the gaze vector should point at the fixed position
you were looking at during the capture.
To make sure that the latency is consistent, capture a number of trials and
check that the latency compensation is correct for all trials.
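A minimal sketch of the delay estimation described in the steps above, using cross-correlation instead of reading the peaks off a plot. It assumes you have already exported the 6DOF angle and the corresponding gaze vector component as arrays resampled to a common rate; the synthetic signals below only demonstrate the method.

```python
import numpy as np

def estimate_latency(angle, gaze, fs):
    """Estimate how much `gaze` lags behind `angle` (in seconds), both sampled at fs Hz."""
    a = (angle - np.mean(angle)) / np.std(angle)
    g = (gaze - np.mean(gaze)) / np.std(gaze)
    corr = np.correlate(g, a, mode="full")       # cross-correlation over all lags
    lag = np.argmax(corr) - (len(a) - 1)         # positive lag = gaze lags behind the angle
    return lag / fs

# Synthetic example: a 2 Hz nodding motion and a copy delayed by 80 ms.
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
pitch = np.sin(2.0 * np.pi * 2.0 * t)
gaze_component = np.roll(pitch, int(0.08 * fs))
print(f"estimated latency: {estimate_latency(pitch, gaze_component, fs) * 1000:.0f} ms")
```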
NOTE: The latency compensation is only applied to the gaze vector cal-
culation in QTM. The recorded eye tracker data or analog data from the
Tobii device is not affected by the compensation.
The eye tracker data is collected together with the marker data in QTM and can
be reprocessed and exported together with the other data, see chapter "Pro-
cess and export Tobii gaze vector data" on the next page. Follow these steps to
capture eye tracker data in QTM:
1. Make sure that the Tobii eye tracker device(s) are connected and correctly
set up in QTM.
2. Make sure that you have activated Calculate gaze vector data and Cal-
culate 6DOF on the Processing page in Project options.
3. Calibrate the glasses using the Tobii calibration card.
l For Tobii Glasses 2, the calibration can be done using the Tobii Pro
Glasses Controller application or via the Input device settings page
in QTM, see chapter "Setting up Tobii Pro Glasses 2 in QTM" on
page 866.
l For Tobii Glasses 3, the calibration can be done using the Glasses 3
Controller application or via the web interface, see chapter "Setting
up Tobii Pro Glasses 3 in QTM" on page 868.
The Gaze vector data can be reprocessed in a file, both in reprocessing and
batch processing. This is useful if you need to update the gaze vector data
because of changed 6DOF data.
If you have changed the 6DOF data then you need to reprocess the Gaze vector
data to update it. Follow these steps to reprocess it.
l If you need to change the rigid body that is associated with the gaze
vector, then go to the Gaze vector page and double click on the gaze
vector in the list.
The gaze vector and eye tracker data can then be exported and analyzed in
external programs. You can either export it to TSV or MAT files.
TSV
To export the eye tracker data select the data type Eye tracker in the TSV
export options. The gaze vector and eye tracker data will be exported in sep-
arate files, for example *_g_1.tsv and *_g_2.tsv with gaze vector data for the left
and right eye, respectively. The data included in the gaze vector export (_g.tsv)
is:
GAZE_VECTOR_NAME
The name of the gaze vector.
NO_OF_SAMPLES
The number of samples in the gaze vector data.
FREQUENCY
The frequency of the gaze vector data.
TIME_STAMP
Date and time when the motion capture file was made. The date and time
is followed by a tab character and then the timestamp in seconds from
when the computer was started.
START_OFFSET
The offset time for the first gaze vector frame. It is needed since you may
not always cut the measurement range exactly at the start of a gaze vector
frame. Subtract the offset from the start time of the gaze vector data so
that it actually starts before the marker data.
HW_SYNC
Indicates if hardware sync was used (value: YES) for the eye tracker data
or not (value: NO).
FILTER
Indicates if the gaze vector data was filtered (value: YES) or not (value:
NO).
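A minimal sketch of reading such a _g.tsv file in Python is shown below. It assumes the header fields listed above appear as tab-separated key/value lines before the numeric data, and uses a hypothetical file name; the exact column layout may vary between QTM versions, so treat this as a starting point rather than a definitive parser.

```python
import csv

header, samples = {}, []
with open("capture_g_1.tsv", newline="") as f:        # hypothetical file name
    for fields in csv.reader(f, delimiter="\t"):
        if not fields:
            continue
        try:
            samples.append([float(v) for v in fields if v != ""])
        except ValueError:
            # Non-numeric line: one of the header keywords listed above
            # (GAZE_VECTOR_NAME, NO_OF_SAMPLES, FREQUENCY, TIME_STAMP,
            # START_OFFSET, HW_SYNC, FILTER) or a column header row.
            header[fields[0]] = fields[1] if len(fields) > 1 else ""

print(header.get("FREQUENCY"), "Hz,", header.get("NO_OF_SAMPLES"), "gaze vector samples")
print("first sample:", samples[0] if samples else None)
```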
For Tobii Pro Glasses 2 devices, the eye tracker data is exported as *_e.tsv files.
The data included in the eye tracker data export (_e.tsv) is:
EYE_TRACKER_NAME
The name of the eye tracker device.
NO_OF_SAMPLES
The number of samples in the eye tracker data.
FREQUENCY
The frequency of the eye tracker data.
TIME_STAMP
Date and time when the motion capture file was made. The date and time
is followed by a tab character and then the timestamp in seconds from
when the computer was started.
START_OFFSET
The offset time for the first eye tracker data frame.
HW_SYNC
Indicates if hardware sync was used (value: YES) for the eye tracker data
or not (value: NO).
For Tobii Pro Glasses 3 devices, the eye tracker data is exported as analog data,
see chapter "Analog data (_a.tsv)" on page 722.
MAT
To export the eye tracker data select the Eye tracker option for the
MATLAB file export. The struct array of the MAT file then includes the following
data:
NrOfSamples
The number of samples in the gaze vector data.
StartOffset
The offset time for the first gaze vector frame. It is needed since you
may not always cut the measurement range exactly at the start of a
gaze vector frame. Subtract the offset from the start time of the gaze
vector data so that it actually starts before the marker data.
Frequency
The frequency of the gaze vector data.
HWSync
Indicates if hardware sync was used for the eye tracker data or not.
The values can be 0 (no hardware sync) or 1 (hardware sync).
Filter
Indicates if the gaze vector data was filtered or not. The values can
be 0 (not filtered) or 1 (filtered).
GazeVector
Array with the gaze vector data in the following order: X Pos, Y Pos, Z
Pos, X Vec, Y Vec, Z Vec.
EyeTracker
Struct array with the following eye tracker data.
EyeTrackerName
The name of the eye tracker device.
NrOfSamples
The number of samples of eye tracker data.
StartOffset
The offset time for the first eye tracker data frame.
HWSync
Indicates if hardware sync was used for the eye tracker data or not.
The values can be 0 (no hardware sync) or 1 (hardware sync).
EyeTracker
Array with the pupil diameter data in mm in the following order: left
pupil, right pupil.
For Tobii Pro Glasses 3 devices the eye tracker data is exported as analog data,
see chapter "MAT file format" on page 730.
Tobii data in QTM
The Gaze vector data is displayed in the 3D view of the window as a vector;
properties such as its color and length can be modified on the 3D view settings
page in Project options.
There are two different types of data available for the Tobii eye trackers in
QTM.
The gaze vector is provided by the Tobii recorder and then transformed in QTM
to global 3D coordinates through the pose of the associated rigid body (a geo-
metric sketch of this transformation follows the list below). The gaze vector
data can be viewed in a Data Info window as Gaze Vector data.
The gaze vector data includes:
Device
The device is the name of the Gaze vector. (L) and (R) stands for left and
right eye.
X Pos
This is the X position (mm) for the origin of the Gaze vector in the Global
coordinate system.
Y Pos
This is the Y position (mm) for the origin of the Gaze vector in the Global
coordinate system.
Z Pos
This is the Z position (mm) for the origin of the Gaze vector in the Global
coordinate system.
X Vec
This is the X value of the Gaze vector in the Global coordinate system.
Y Vec
This is the Y value of the Gaze vector in the Global coordinate system.
Frame Number
This is the frame number of the Tobii data. Since the frame rate of the
Tobii data is usually lower than that of the marker data, the Tobii
data is updated less often than the marker data.
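The transformation mentioned above, from the eye tracker's local frame to global coordinates via the rigid body pose, is performed internally by QTM. Purely as a geometric illustration, and not as the QTM implementation, it can be sketched as follows, assuming the pose is given as a rotation matrix R and a translation t (in mm); the pose and gaze values are hypothetical.

```python
import numpy as np

def gaze_to_global(R, t, origin_local, direction_local):
    """Transform a gaze origin and direction from the rigid body's local frame
    to the global coordinate system (rotation R, translation t)."""
    origin_global = R @ origin_local + t     # positions are rotated and translated
    direction_global = R @ direction_local   # directions are only rotated
    return origin_global, direction_global

# Hypothetical pose: rigid body rotated 90 degrees about Z and raised 1500 mm.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([0.0, 0.0, 1500.0])
origin, direction = gaze_to_global(R, t, np.array([0.0, 30.0, 0.0]), np.array([0.0, 0.0, 1.0]))
print("X/Y/Z Pos:", origin, "X/Y/Z Vec:", direction)
```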
The eye tracker data from Tobii Pro Glasses 2 devices can be viewed in a Data
Info window as Eye Tracker data. The eye tracker data for Tobii Pro Glasses 2
includes:
Device
The name of the Tobii eye tracker device under Input devices. The name
for a single Tobii device is by default "Tobii". Additional Tobii devices are
numbered with 2, 3, etc.
L Pupil D
The diameter of the left pupil in mm.
R Pupil D
The diameter of the right pupil in mm.
Frame Number
Frame number of the Tobii data. Note that the Tobii frame number is not
the same as the QTM frame number.
The eye tracker data from Tobii Pro Glasses 3 devices can be viewed in a Data
Info window as Analog data. The eye tracker data for Tobii Pro Glasses 3
includes:
Channel
Channel name indicating the type of data.
Value
Value of the recorded data including the units.
Board Name
Name of the Tobii Glasses 3 device (serial number of the Recorder unit).
Channel No
Analog channel number.
The Gaze Vector page contains a list of the current gaze vectors and the set-
tings needed to calculate a gaze vector from the eye tracker data. Use the fol-
lowing options to modify the list.
Add
Add a new gaze vector to the list.
The options in the dialog define how to calculate the Gaze vector.
Eye tracker
Select an eye tracker from the list.
Rigid body
Select the rigid body that is mounted on the eye tracker. This is used
as reference for the gaze vector calibration.
NOTE: In Project options the list contains all of the rigid bod-
ies that are available on the Input devices page. For repro-
cessing with file settings the list contains all of the eye trackers
that were used in the capture.
Delete
Delete the selected gaze vector.
From the menu you can Change name of the selected gaze vector and
Remove gaze vector.
The MANUS integration allows for the use of MANUS gloves for finger tracking
in combination with the skeleton solver in QTM.
MANUS setup
3. Make sure that the gloves are recognized and correctly configured in the
MANUS Core Dashboard.
After connecting you can close the MANUS Core Dashboard. MANUS Core will
continue to run in the background even when the MANUS Core Dashboard is
closed.
3. Click the Add Device button and select Manus Gloves in the drop down
menu.
4. Open the Manus Gloves settings page under Input Devices > Gloves to
access the MANUS device settings, see chapter "Manus Gloves" on
page 317.
5. Specify the IP address of the computer running the Manus Core software
(use 127.0.0.1 when the gloves are connected to the same computer run-
ning QTM).
6. Choose a Model Type depending on the skeleton type used in QTM
(Qualisys Skeleton/Metahuman).
7. Press the Synchronize Settings button to locate the gloves and syn-
chronize the settings.
Create a skeleton
Apply the markers to the actor according to the Qualisys Animation marker set
guide. The markers are the same for the Qualisys Animation and the MetaHu-
man skeleton models. Attach the LHandOut and RHandOut markers to the
gloves in the positions specified in the marker set guide. Use a T-pose to cal-
ibrate the skeleton.
The use of gloves is supported for the Qualisys Animation skeleton without any
further customizations. For using the MetaHuman skeleton model, a custom
skeleton is required. Contact Qualisys support for more information.
Create bindings
Go to the Glove processing page under Project Options > Processing, see
chapter "Glove" on page 344.
The glove processing step settings dialog is used to create bindings, which
associate a glove with the skeleton to which its data will be applied. To create
a new binding, select an available glove in the bottom row of the bindings grid
and then select the associated skeleton.
Once the glove bindings are created, you can stream and capture skeleton data
including the hands driven by the gloves.
Capturing, viewing and exporting data
Make sure that the latest compatible version of the StretchSense Gloves Integ-
ration for QTM is installed. Follow these steps to download and install the integ-
ration:
Follow the instructions below to set up the StretchSense gloves. For more
detailed information, refer to the StretchSense resources at
https://stretchsense.com/support/.
4. Set up the joints as follows. Press Submit and close the window when
done.
Setup streaming:
2. Switch on TCP Streaming for the left and the right hand.
2. Click the Add Device button and select StretchSense in the drop down
menu.
3. Check the StretchSense item in the Input Devices list. The StretchSense
device should now show up as an input device under the Gloves category.
Device settings
The StretchSense device settings are managed via the StretchSense settings
page.
The StretchSense page contains the following buttons to communicate with the
gloves and a list with settings for the gloves included in the configuration.
Restore Default Settings
Reset settings to their default values.
Synchronize Settings
Synchronize changed settings to the StretchSense device.
Integration version
The version number for the integration.
Hand Engine IP
IP address of the Hand Engine server.
Glove Ports
Specify the port numbers of the gloves according to the con-
figuration in Hand Engine. Use commas to separate the port num-
bers for the respective gloves.
Channels
List of channels and sample frequency.
Configuration in QTM
1. Open the StretchSense device settings page under Project Options >
Input Devices > Gloves.
2. Specify the Hand Engine IP address. Use 127.0.0.1 when on the same
computer.
3. Specify the port numbers for the gloves, separated by commas.
The use of glove data requires a calibrated skeleton in QTM with matching
hand hierarchies. Glove data is natively supported by the Qualisys Animation
skeleton. The following processing steps need to be set up to use StretchSense
glove data.
Enable glove processing in QTM:
Create bindings:
Note that you need one Blackmagic Design card for each video source that you
want to capture. You therefore also need a PCI Express slot on the computer for
each card. Computers made before 2008 may not have that many PCI Express
slots on the motherboard. It is also
5. When the installation is finished, open the Desktop Video Setup applic-
ation.
a. Make sure that the input is set to HDMI Video & HDMI Audio,
unless you are using another input.
b. Close the application and click OK if an elevation prompt appears.
For more information about how to install the card see the manuals from Black-
magic Design.
Connecting a video source to the Intensity Pro card
The Intensity Pro card can capture from different types of video sources (HDMI,
Component, S-video and Composite). The card can also capture the sound on
the HDMI input or from an analog RCA input. However, it can only capture from
one video input at a time, so follow these instructions when connecting a
video source.
1. Select the correct input for your video source in the Blackmagic Control
Panel in the Windows control panel. You get the best image with the HDMI
input, which you can use with either HDMI audio or Analog RCA audio.
You do not have to change any other setting in the control panel.
2. Connect the video source to the input you have chosen in the Blackmagic
Control Panel. You have to use the breakout cable for other inputs than
HDMI.
IMPORTANT: If you are using the breakout cable, make sure that
you read the labels on the connectors so that you use the input con-
nectors and not the output connectors. For more information about
the analog inputs see the manual from Blackmagic Design.
The Decklink Mini Recorder card can capture video and sound from either
HDMI or SDI. However, it can only capture from one video input at a time, so
follow these instructions when connecting a video source.
1. Select the correct input for your video source in the Blackmagic Control
Panel in the Windows control panel. On a standard camcorder the output
is always HDMI, so set the video source to HDMI input. The SDI output is
only available on professional camcorders.
You do not have to change any other setting on the control panel.
2. Connect the video source to the input you have chosen in the Blackmagic
Control Panel.
Using Blackmagic Design video source in QTM
QTM uses DirectShow to capture the video and audio from the Intensity Pro
card. It will, therefore, appear as a video device called Decklink Video Capture
on the Video Devices page in Project options. To use the board follow the
instructions below:
4. The Video view is most probably black, because the Video format is set to
the default value. Right-click on the Video view for the Blackmagic card
and select Video camera settings.
5. Select the Video Format of the video source and click OK. Check which
format is actually used by the video camera; there is usually a separate
setting for the HDMI output. If you select the wrong format the image will
be black in QTM.
The options for the Video Format go from NTSC to HD 1080p 24. The
first five settings for PAL and NTSC are ambiguous, but they correspond
to the following formats.
l NTSC
l NTSC 23.98
l PAL
l NTSC Progressive
l PAL Progressive
The rest of the formats can be interpreted directly from the settings, e.g.
the HD 720p 59.94 option is 720p at 59.94 Hz. If you use the HDMI input,
QTM will automatically detect the number of pixels in the image and scale
the Video view accordingly.
6. If you want to record sound with the video, right-click on the Video view
again and select the correct audio source under the Connect to audio
source option. QTM will remember the audio source as long as the Intensity
Pro card is installed on the computer.
On a new Sony HDR-CX330 camera you need to change the HDMI output and
some other settings to make it work best with QTM and the Blackmagic Design
card. Click on Menu with the control button to change the video settings; the
settings menu is displayed on the camera.
HDMI Resolution
First of all you must set the HDMI resolution so that you know what Video set-
tings to use in QTM.
1. Go to Setup and scroll down to HDMI Resolution and open that option.
2. Select the option that you want to use. The recommended option is
720p/480p which will give you 720p 59.94 Hz.
l 1080p -
Not supported by Intensity Pro because the frequency is too high.
l 1080i - HD 1080i 59.94 - 8 bit 4:2:2 YUV
This option is interlaced which is not recommended when being
viewed on a computer, because when played there will be horizontal
lines in the image.
l 720p - HD 720p 59.94 - 8 bit 4:2:2 YUV
Recommended option because it is the highest possible resolution
that uses progressive scanning. The image is good, but the files will
be large so it is recommended to compress the files in QTM.
Then it is recommended to change the two settings below that make the
camera operate better.
Demo mode
Turn off Demo mode on the camera, otherwise the camera will start showing
you demo pictures after a while.
Face detection
l Go to Camera/Mic on the first menu and scroll down to the Face Detec-
tion option and turn it off. If not turned off there will be a rectangle in the
image as soon as a face is detected.
On a new Sony HDR-CX430V camera you need to change the HDMI output and
some other settings to make it work best with QTM and the Blackmagic Design
card. Click on Menu in the top left corner of the touch screen to change the
video settings; the settings menu is displayed on the camera.
First of all you must set the HDMI resolution so that you know what Video set-
tings to use in QTM.
1. Go to Setup and scroll down to HDMI Resolution and open that option.
2. Select the option that you want to use. The recommended option is
720p/480p which will give you 720p 59.94 Hz.
The list below shows the different resolutions and the matching Video
format setting in QTM. For information about how to change the setting
in QTM, see chapter "Using Blackmagic Design video source in QTM" on
page 901.
l Auto - Unknown
l 1080p/480p -
Not supported by Intensity Pro because the frequency is too high.
l 1080i/480i - HD 1080i 59.94 - 8 bit 4:2:2 YUV
This option is interlaced which is not recommended when being
viewed on a computer, because when played there will be horizontal
lines in the image.
Then it is recommended to change the two settings below that make the
camera operate better.
Demo mode
Turn off Demo mode on the camera, otherwise the camera will start showing
you demo pictures after a while.
l Go to Setup and scroll down in the list until you find the Demo mode
option and turn it off.
l Go to Camera/Mic on the first menu and scroll down to the Face Detec-
tion option and turn it off. If not turned off there will be a rectangle in the
image as soon as a face is detected.
Panasonic AW-HE2
To setup the Panasonic camera for use with QTM, connect the power adapter
and an HDMI cable. Refer to the Panasonic manual for how to connect the
power adapter and switch on the camera. The LED on the front of the camera
should light green when the camera is switched on.
Connect the Panasonic AW-HE2 camera with a standard HDMI cable to the
input of the Blackmagic Design card. Follow the instructions under "Using Black-
magic Design video source in QTM" on page 901 to enable video capture in
QTM.
For Video format (step 5 in "Using Blackmagic Design video source in QTM" on
page 901) use HD 720p 59.94 Hz - 8 bit 4:2:2 YUV.
DV/webcam devices
Only DV/webcam devices that support the Microsoft DirectShow API can be
used with the QTM software. This is true for most webcams; however, there are
very few video cameras equipped with a DV output or FireWire 4-pin connection,
which is a requirement to be able to stream out video in real time. The option
for video cameras without FireWire is to use the Intensity Pro card, see chapter
"Video capture with Blackmagic Design cards" on page 899.
There can be several devices connected to QTM, but there might be problems
with the video capture if too many are connected. The reason is that the video
data stream becomes too large and the frame rate of the video devices will be reduced.
NOTE: For information about how to connect the video device to the
computer see the manual of the video device.
1. Open the View window menu by right-clicking in an open Video view win-
dow.
2. Then click on Set time offset and enter the starting time of the video in
the dialog. The starting time is in the measurement time, i.e. with a 1 s
starting time the video file will start at 1 s when looking in the 3D view win-
dow.
3. Play the file and check that it is ok.
4. Repeat the steps for each video file that has been captured. The offset will
be remembered for the video device until it is unplugged from the com-
puter.
2. Right-click on the Video view that you want to set the audio source for and
click on Connect to audio source in the menu.
Connect the h/p/cosmos treadmill to an open Ethernet port on the same com-
puter as QTM. Refer to the h/p/cosmos manual for information on how to con-
figure the treadmill.
Once an h/p/cosmos treadmill is connected to the computer, the device can be
added and configured in QTM.
2. Click the Add Device button, and select hpcosmos treadmill in the drop
down menu.
3. Check the hpcosmos treadmill item in the Input Devices list. The hpcos-
mos treadmill device should now show up as an input device under the
Generic category.
4. Open the hpcosmos treadmill settings page, see chapter "h/p/cosmos
treadmill" on page 318.
5. Enter the IP address for the treadmill. You can find the IP address under
the External control setting in the treadmill software.
Capturing, viewing and exporting data
To collect data with the h/p/cosmos treadmill, simply start a capture in QTM. The
treadmill data is automatically captured with the start of the capture. The tread-
mill data is captured at 1 Hz, so there is no need for an exact synchronization of
the capture start.
To view the h/p/cosmos data during preview or a capture, open a Data Info
window via the View menu (keyboard shortcut Ctrl + D), right-click in the win-
dow and select Analog data. The h/p/cosmos analog data includes speed, elev-
ation, and heart rate. Note that the slow sample rate means that the plot can
look strange if it includes too few samples. To make the real-time plots look
good, it is recommended to increase the Default Real-Time Plot Size option
on the GUI page to around 20-30 seconds.
Analysis Modules
Analysis Modules are predefined applications based on the Project Automation
Framework (PAF) in QTM. They are used to streamline the motion capture work-
flow for specific applications. In QTM, projects with PAF functionality contain a
structured Project view pane guiding the user to collect captures and meta-
data, start analyses, and generate reports, see chapter "PAF Project view" on
page 921.
Qualisys offers Analysis Modules for a range of biomechanical applications,
providing an integrated workflow for collecting, analyzing and reporting data.
The Analysis Modules generally use Visual3D software by HAS-Motion for per-
forming the biomechanical calculations. Analysis Modules are available for the
following applications:
l Baseball
l Cycling
l Equine lameness
l Functional Assessment
l Gait
l Golf
l Running
Other modules available for download that are based on the Project Auto-
mation Framework are:
l CalTester
You can download installers, documentation and demo projects of your pur-
chased Analysis Modules via the Qualisys dashboard. For information about
how to install or update Analysis Modules, see chapter "PAF module install-
ation" below.
For more information about the available analysis modules, please refer to the
product information on the Qualisys website or contact your sales rep-
resentative or [email protected].
The Open Project Automation Framework (Open PAF) is freely available and can
be used to create your own custom projects. For more information, see "Pro-
ject Automation Framework (PAF)" on page 1016.
The required files and licenses are available via the Qualisys client login, see
http://www.qualisys.com/my/. Log in with your Qualisys user account asso-
ciated with your QTM registration.
New installation
Follow these instructions if you are installing the module for the first time. If
you are updating an existing installation, see next section.
1. Download the analysis module installer via the Analysis modules page at
http://www.qualisys.com/my/.
2. Run the installer and follow the instructions.
3. Start QTM and create a new project. Use this project to collect all future
data that you want to analyze with the analysis module:
a. Select File > New Project.
b. Enter a name for the project. Change the location of the project folder
if you wish; however, we recommend keeping it in Documents
unless multiple users need access.
c. If you have collected data previously and you want to use the same
camera settings, choose "Settings imported from another project"
and select a project that contains the camera settings that you
want to use. Otherwise select "Use default settings".
d. Check Use PAF module and select the correct module from the drop-
down.
4. If you selected to import the settings from another project, choose the set-
tings file in the next dialog.
5. Enter the module license when prompted to do so. Select the name of the
module in the Plug-in drop-down. For Gait modules, always select PAF Gait
Module, regardless of the exact name of the module you are using:
6. Open the Project Options, select Folder Options and specify the loc-
ation of the computational engine. If using Visual3D, locate Visual3D.exe
by clicking the "…" icon (on most computers the path is 'C:\Program
Files\Visual3D v6 x64'). If using QBE, navigate to 'C:\Program Files\Run-
ning_analysis\application'.
7. Check that the C3D export settings in Project Options match the
following:
l For the Event Output Format, choose Following the c3d.org spe-
cification
l Export units are millimeters
Zero force baseline can be used if the force plate is unloaded at the begin-
ning of the measurement, but note that, depending on which module you
are using, the Visual3D script is set up to apply a zero baseline correction
as well. The Visual3D correction will overwrite the zero frame range that is
set here.
8. Open the Project View by selecting View > Project View (shortcut Ctrl+R).
If you have an existing analysis module that needs to be updated, follow these
steps to update to a new version.
3. Start QTM and load the project that you want to update. QTM will ask if
you want to upgrade the project. Click Yes to confirm.
4. If you have modified template files within the PAF project and there is a
conflict because the same files have also been changed in the PAF
installer, the following dialog is shown:
NOTE: For AIM files, your modified files are kept by default. If only
AIM files are affected by conflicts, the dialog is not shown.
NOTE: If you have modified a file, but the original file has not
changed between module versions, your modified file will be kept
automatically.
5. The update is complete. You can verify the module version number in
QTM under Help > About.
1. The Project data tree shows all files that are part of the project (i.e. files
contained in the Data subfolder of the project root folder).
You can use the buttons at the bottom of the project tree:
Add
Add new subjects, sessions, and other types as defined in the PAF
settings.
Open
Open the selected file.
Find
Search files or folders within the PAF project.
Find next
Search for the next occurrence of the current search term.
2. The Details area allows editing the properties of the selected item (for
example personal data, session details, file details).
3. The Project automation area shows contents depending on the selected
item. If the selected item is a session, this area shows buttons that cor-
respond to files and the Go button for data collection.
At the top of the Project automation area there is a breadcrumb trail to
navigate back to the session, subject or root level.
When a session has been selected, colored buttons appear in the Project auto-
mation area:
b. Click a red field, select Edit Settings and Capture. Changes to the dur-
ation of the measurement period will be stored and be used the
next time you make a measurement of the same type.
c. Click the Go button. QTM starts at the top of the file list and records
all files of the first measurement type (for example static). If you click
again, it continues with all measurements of the second measurement
type, and so on. Activate the external trigger button in Project
Options > Timing and use it to control when the captures start. At
any point, press the Esc key to stop the data collection.
2. The following options are available for a recorded file:
a. Add comments by clicking the field next to the green file name.
b. Use the plus/minus button to show more/less file buttons (to make
additional captures).
c. Un-check a file to exclude it from the processing (only checked files
will be exported).
3. Once the minimum number of files has been recorded, the Start Pro-
cessing button can be used. Click the button to run the default processing
step. Depending on the PAF module, there may be multiple processing
steps. Click the triangle on the button to show all processing steps. The
default processing step is topmost in the list.
4. Click the Show guide button to show the marker set guide, if available.
Calqulus
Calqulus uses a cloud-based approach to perform analysis on motion capture
data, where the engine and scripts are stored in the cloud. The engine is
called the Calqulus engine, and the scripts are called Calqulus pipelines, which
consist of a succession of Calqulus steps. The engine can be considered the
glue that puts together the pipelines and steps to process the data and create
the web reports.
With this cloud-based approach, Web Reports are automatically updated when
pipelines are modified, and the Calqulus pipelines and steps are public. The
pipelines and steps are hosted on the following public GitHub repositories,
which makes them easy to share, track and modify:
l https://github.com/qualisys/Calqulus-Pipelines
l https://github.com/qualisys/Calqulus-Steps
Calqulus functionality is added to a QTM project via the Calqulus module, which
can be downloaded via the Qualisys dashboard. For information about
installing and using Analysis Modules, see chapter "PAF module installation" on
page 916.
The Calqulus module includes predefined sessions for, among others, Base-
ball, Cycling, Cricket and Running. The Generic session can be used for col-
lection of unspecified biomechanical trials.
For more information about Calqulus, please refer to the product information
on the Qualisys website or contact your sales representative or
[email protected].
l Report Center
l Calqulus
All modules give the option to visualize the results as a Web Report. Each Web
Report is hosted on Report Center. Demo reports are available on the main
page of our Report Center.
The main Web Report features are:
l Synchronized 3D view, videos and charts
l Annotate charts
l Export to PDF
l Compare sessions
Report Center, home of all the Web Reports, also offers some valuable features
such as:
l Sharing Web Reports
Technical reference
The below table gives an overview of the sensor specifications and sensor
modes for all Qualisys camera models, when in marker mode.
Camera model             Sensor mode    Max capture rate (Hz)    Resolution
Miqus Video (VC / VM)1   N/A            340                      1920×1088
The below table gives an overview of the sensor specifications and sensor
modes for all Qualisys video camera models when using in-camera MJPEG com-
pression.
Camera type              Sensor mode         Max capture rate (Hz)    Resolution
Miqus Video Plus         N/A1                100                      1920×1440
                                             120                      1920×1080
Miqus Video (VC / VM)    N/A1                86                       1920×1080
Miqus Hybrid             N/A1                86                       1920×1080
Miqus M5                 N/A                 30                       2048×2048
Miqus M3                 N/A                 30                       1824×1088
Miqus M1                 N/A                 30                       1216×800
Oqus 2c                  2 MP @ 24 Hz        24                       1920×1088
                         0.5 MP @ 62 Hz      62                       960×544
                         0.13 MP @ 204 Hz    204                      480×272
Oqus 7+                  12 MP @ 3 Hz        3                        4096×3072
                         3 MP @ 10 Hz        10                       2048×1536
Oqus 6+                  6 MP @ 6 Hz         6                        3072×1984
                         1.5 MP @ 20 Hz      20                       1536×992
Oqus 5+                  4 MP @ 13 Hz        13                       2048×2048
                         1 MP @ 50 Hz        50                       1024×1024
                         0.25 MP @ 176 Hz    176                      512×512
                         0.06 MP @ 557 Hz    557                      256×256
1Miqus Video cameras automatically switch to a high-speed sensor mode
at lower resolution presets.
Camera output modes      Marker coordinates / Intensity map / Video preview
Security attachment      Kensington Lock
Outdoor tracking         Active filtering, Sun filter1
1Sun filter included for protected models, optional for standard models.
Arqus model:                        A26           A12           A9            A5

Normal mode (full FOV)
  Image size (pixels)               26 MP         12 MP         9 MP          5 MP
                                    5120×5120     4096×3072     4224×2160     2560×1920
  Max. frame rate                   150 Hz        300 Hz        300 Hz        703 Hz
  Camera latency                    6.7 ms        3.3 ms        3.3 ms        1.4 ms

High-speed mode (full FOV)
  Image size (pixels)               6.5 MP        3 MP          2.5 MP        1 MP
                                    2560×2560     2048×1536     2112×1080     1280×960
  Max. frame rate                   297 Hz        1040 Hz       591 Hz        1400 Hz
  Camera latency                    3.4 ms        1.0 ms        1.7 ms        0.7 ms

Motorized lens                      No            Yes           No            No
Description of Arqus cameras
1. LED ring
LED ring for camera identification and indication of status during startup
sequence.
l Green light: Active camera indicator (see "Identifying the cameras
with the identification tool" on page 480).
l Pulsing green light: Camera system being calibrated.
3. Camera display
o Display of camera ID.
4. Mounting plate
o Quick release for Manfrotto and Arca Swiss, two 1/4'' camera
mounts.
Mechanics
Physical specifications
Mounting
The Arqus camera has an integrated mounting plate that is compatible with the
quick-release mounts from Manfrotto and Arca Swiss.
There are also two UNC 1/4"-20 tripod mounting points.
This is only needed for camera models with manual focus and aperture.
Press the strobe unlock button while pulling the strobe mechanics away from
the camera body until the aperture and focus ring is accessible. For cameras
with protected housing, the strobe mechanics must be pulled all the way out so
that the lens cover can be removed.
Follow the instructions above but pull the strobe all the way out.
Add or remove the filter and remount the strobe. Make sure that there is
enough room between the filter and flash glass to allow for focus adjustments.
All lenses can be changed by the user by following the steps above, unscrewing
the lens and replacing it with a new lens.
Electrical specifications
Power supply
Arqus cameras are powered by 48 VDC (R1 power supply) or 56 VDC (R2 power
supply) through the Data/Power connectors. The operating voltage range is 36-
58VDC. The power supply should be dimensioned for a minimum of 40W for
each camera.
The external AC/DC converter available for the Arqus camera is capable of deliv-
ering 200W@48VDC or 250W@56VDC and supplying up to five Arqus cameras.
The maximum total cable length for a full chain with Arqus cameras is 50 m per
power supply.
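As a quick check of these figures (an illustration only, not an official sizing rule), dividing the converter power by the per-camera power budget gives the number of cameras it can feed:

```python
# Simple power-budget check based on the figures above (illustration only):
# a 200 W converter at a 40 W per-camera budget feeds at most 5 Arqus cameras.
supply_power_w = 200
per_camera_budget_w = 40
print(supply_power_w // per_camera_budget_w)  # -> 5
```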
Power consumption
Idle    13 W
The Arqus camera can communicate with a host computer through the Gigabit
Ethernet interface. For detailed information, see section "Ethernet (Gigabit)" on
page 977.
Digital IO
The Arqus camera does not provide any digital inputs or outputs. For more
information about digital inputs and outputs to a Qualisys system, see section
"Camera Sync Unit" on page 949.
Miqus cameras
Miqus model:                        M5            M3            M1            Hybrid

Normal mode (full FOV)
  Image size (pixels)               4 MP          2 MP          1 MP          2 MP
                                    2048×2048     1824×1088     1216×800      1824×1088
  Max. frame rate                   183 Hz        340 Hz        250 Hz        340 Hz
  Camera latency                    5.5 ms        2.9 ms        2.1 ms        2.9 ms

High-speed mode (full FOV)
  Image size (pixels)               3 MP          0.5 MP        N/A           0.5 MP
                                    1024×1024     912×544                     912×544
  Max. frame rate                   362 Hz        667 Hz                      667 Hz
  Camera latency                    2.8 ms        1.5 ms                      1.5 ms
Miqus Video specifications
Miqus Video models: VC+, VC, VM, Hybrid
Max. frame rate                480 fps (540p 4:3); 714 fps (480p 1:1)
Filter (built in)              IR cut-off; IR cut-off; Dual band pass
In-camera MJPEG compression    Yes (all models)
Video overlay                  Yes (all models), through regular wand calibration
Description of Miqus cameras
1. LED ring
LED ring for camera identification and indication of status during startup
sequence.
l Green light: Active camera indicator (see "Identifying the cameras
with the identification tool" on page 480).
l Pulsing green light: Camera system being calibrated.
Mechanics
Physical specifications
Mounting
The Miqus camera has two UNC 1/4”-20 tripod mounting points on the bottom
of the camera.
Put the lock lever in the open position, then move the strobe mechanics away
from the camera body until the aperture and focus ring is accessible.
Follow the instructions above but pull the strobe all the way out. Make sure
that the new strobe unit is mounted correctly (see up-label on strobe rails).
All lenses can be changed by the user by following the steps above, unscrewing
the lens and replacing it with a new lens.
Electrical specifications
Power supply
Miqus cameras are powered by 48 VDC (R1 power supply) or 56 VDC (R2 power
supply) through the Data/Power connectors. The R2 power supply only works
with Miqus cameras and Camera Sync units with serial number 28123 or
higher. The operating voltage range is 20-58VDC. The power supply should be
dimensioned for a minimum of 20W for each camera.
The external AC/DC converter available for the Miqus camera is capable of deliv-
ering 200W@48VDC or 250W@56VDC and supplying up to ten Miqus cameras
and one Camera Sync Unit. The maximum total cable length for a full chain with
Miqus cameras is 120 m per power supply.
Power consumption
Idle    8 W
The Miqus camera can communicate with a host computer through the Gigabit
Ethernet interface. For detailed information, see section "Ethernet (Gigabit)" on
page 977.
Digital IO
The Miqus camera does not provide any digital inputs or outputs. For more
information about digital inputs and outputs to a Qualisys system, see section
"Camera Sync Unit" below.
Camera Sync Unit
The Camera Sync Unit (CSU) is an optional accessory to a Qualisys camera sys-
tem that provides trigger and sync inputs and outputs.
Operating temperature range    0–35°C (32–95°F)
1. Trig NO indicator
Turns on for 0.5 s when the Trig NO input transitions from high to low.
2. Trig NO input
Trigger input (TTL, 0-5 Volt, normally open). The base voltage of the port is
5 Volt (high).
Physical specifications
Mounting
Electrical specifications
Idle         8 W
Measuring    8 W
Digital IO
Trigger inputs
Event/IRIG input
The Event input is dedicated for generating events. The Event input is pulled
high and can be used with the Qualisys trigger button. For an overview of the
available event port settings, see chapter "Event port (Camera Sync Unit)" on
page 276.
This input doubles as IRIG input. It is possible to use the IRIG timecode as a syn-
chronization input source and/or to timestamp the data frames with the IRIG
timecode.
NOTE: IRIG cannot be used when there are any Oqus cameras included
in the system.
Synchronization input
SMPTE input
This input is dedicated for SMPTE time code signals. It is possible to use the
SMPTE timecode as a synchronization input source and/or to timestamp the
data frames with the SMPTE timecode. For an overview of the settings and syn-
chronization scenarios, see chapters "Timestamp" on page 284, "External
timebase" on page 278 and "Using SMPTE for synchronization with audio
recordings" on page 512.
This input is dedicated for Genlock signals. It is possible to use this input as a
synchronization input source. For an overview of the settings, see chapter
"External timebase" on page 278.
Synchronization outputs
The CSU has three synchronization outputs: Measurement time (MEAS. TIME),
Output 1 (OUT1) and Output 2 (OUT2). For an overview of the available set-
tings, see chapters "Synchronization output" on page 285 and "Measurement
time (Camera Sync Unit)" on page 290.
All outputs are fused and capable of driving 50 Ohm transmission lines.
Oqus cameras
General specifications
Wired communication          Hub-less daisy-chained Ethernet 802.3 @ 100 Mbps
Wireless communication2      WiFi 802.11b/g @ 54 Mbps
Built-in camera display      128×64 graphical high contrast OLED
Available camera housing     Standard (IP10), Weather protected (IP67), Underwater (IP68), MRI (EMI shielded)
Position data noise level    ±1 sub-pixels
Maximum frame buffer size    1152 MB (Oqus high-speed video)
1Depending on camera model and configuration
2Optional feature
Oqus model:                         7+            6+            5+            5

Normal mode (full FOV)
  Image size (pixels)               12 MP         6 MP          4 MP          4 MP
                                    4096×3072     3072×1984     2048×2048     2352×1728
  Max. frame rate                   300 Hz        450 Hz        179 Hz        178-182 Hz2
  Camera latency                    3.3 ms        2.2 ms        5.6 ms        5.6 ms

High-speed mode (full FOV)
  Image size (pixels)               3 MP          1.5 MP        1 MP          N/A
                                    2048×1536     1536×992      1024×1024
  Max. frame rate                   1121 Hz       1662 Hz       355 Hz
  Camera latency                    0.9 ms        0.6 ms        2.8 ms

Max frame rate (fps, reduced FOV)   10000         10000         10000         10000

Marker resolution1 (µm @ 1 m)       3.9           5.4           6.9
Oqus model:                         4             3+            3             1

Normal mode (full FOV)
  Image size (pixels)               3 MP          1.3 MP        1.3 MP        0.3 MP
                                    1696×1710     1296×1024     1280×1024     640×480
  Max. frame rate                   476 Hz        502 Hz        503 Hz        247 Hz
  Camera latency                    2.1 ms        2.0 ms        2.0 ms        4.0 ms

High-speed mode (full FOV)
  Image size (pixels)               N/A           0.3 MP        N/A           N/A
                                                  648×512
  Max. frame rate                                 1740 Hz
  Camera latency                                  0.6 ms

Max frame rate (fps, reduced FOV)   10000         10000         10000         1000

Motorized lens                      No            No            No            No
Lens mount                          C             C             C             C
1For standard lens.
Streaming video
For an overview of Oqus cameras that can be used for streaming video and their sensor specifications, see
"Qualisys video sensor specifications (in-camera MJPEG)" on page 927.
High-speed video
A selection of Oqus camera types can be equipped to capture full-frame, full-speed, full-resolution high-speed
video. In this configuration the camera is equipped with a large 1.1 GB buffer memory and a clear front glass to
get the best possible performance out of the image capture.
The below table gives an overview of the sensor specifications, sensor modes and video buffer capacity for all
Oqus high-speed cameras, when in video mode. The specifications are only applicable to uncompressed video
(8-bit raw images). Note that the number of frames that can be stored in the buffer can be increased by cropping
the image.
The maximum measurement duration (in seconds) at full resolution and maximum capture rate is indicated in
the table below. For other combinations of image size (X, Y in pixels) and capture frequency use the following for-
mula to calculate the maximum capture duration:
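The formula itself is not reproduced here. As a rough sketch only, assuming uncompressed 8-bit raw frames (1 byte per pixel), the maximum duration is approximately the buffer size divided by X · Y · f; verify against the formula in the official manual.

```python
# Rough sketch of the capture-duration estimate, assuming uncompressed 8-bit raw
# frames (1 byte per pixel). Verify against the formula in the official manual.
def max_capture_duration_s(buffer_mb, width_px, height_px, capture_rate_hz):
    buffer_bytes = buffer_mb * 1024 * 1024
    bytes_per_second = width_px * height_px * capture_rate_hz
    return buffer_bytes / bytes_per_second

# Illustrative values only: a 1152 MB buffer at 1296x1024 pixels and 500 Hz.
print(round(max_capture_duration_s(1152, 1296, 1024, 500), 2))  # about 1.82 s
```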
[Table: Camera type, Sensor mode, Resolution, Max capture rate (Hz), Max video buffer capacity (frames / s @ max capture rate)]

NOTE: When using Oqus high-speed video, take into account prolonged data fetch times after a capture.
The data transfer time for a full buffer memory (1.1 GB) can be several minutes.
Description of Oqus devices
The Oqus camera has a large graphical OLED display and three LEDs on the
front to inform the user of the current status of the camera. The display shows,
among other things, the camera number and the number of markers currently
seen by the camera.
NOTE: The display will be turned off when the camera enters stand-by
mode, i.e. if the camera has not been in use for 2 hours. Start a preview
in QTM to light up the display again.
NOTE: If the camera has never been connected to QTM the last
three digits of the serial number (upper part) and the last octet of
the IP-number assigned to the camera (lower part) will be shown
instead. This can also be activated from the QDS menu, see "QDS"
on page 462.
8. Marker area
During a marker measurement this area shows the number of markers
currently seen by the camera. When the camera is idle or is collecting
video, this area shows ’-----’.
9. Text area
This area is used for scrolling text messages, for example during startup.
The back of the camera holds six connectors for power, data and control con-
nections. The view differs slightly depending on the type of camera. The image
below shows the standard version of the camera. The water protected version
uses different connectors and lacks the LEDs found on the standard version.
1. Oqus connector
Connector for control port on an Oqus camera.
2. Trig In indicator
Lit when in Trig in mode.
3. SMPTE indicator
Lit when in SMPTE mode.
4. Sync In indicator
Lit when in Sync in mode.
5. Video In indicator
Lit when in Video/Genlock mode.
6. Sync out activity indicator
Lit green for 0.5 s when output transitions from high to low.
7. Sync in activity indicator
Lit green for 0.5 s when signal transitions from high to low.
8. Sync output
Programmable synchronization output (TTL, 0-5 Volt).
l Video input when Signal source set to Video sync. Compatible with
Composite video (s-video) and Component video (YPbPr / GBR).
10. Trig input
Mode dependent on QTM project settings.
l Trigger input (TTL, 0-5 Volt) when in Trig in mode (default).
Mechanics
The Oqus camera is made of a three-piece die-cast aluminum assembly. The fin-
ish is anodized and powder painted with "Structured black" and "Clear silver".
Some versions of the Oqus camera come in other colors: for example, the MRI
camera is white.
Physical specifications
Industrial Standard 3× M6
All optics available to the Oqus camera have adjustable aperture and focus. In
the standard camera the optics is easily accessible by turning the strobe part
counter-clockwise. After adjusting aperture/focus the opening is closed by turn-
ing the strobe clockwise. In the IP-classified version the locking screws on the
side must be loosened and the strobe unit pulled out.
All lenses on the Oqus cameras can be changed by the user by following the
steps below.
1. Make sure the power is turned off and the strobe is in its closed position.
2. Unscrew or dismount the lens and replace it with the new lens.
Electrical specifications
Power supply
The Oqus camera can be powered by either 48VDC through the connectors
marked POWER or by 12VDC through the connector marked BATT. The power
supply should be dimensioned for a minimum of 30W for each camera. The sup-
ply should be able to deliver higher peak currents during the short periods
when the strobe is lit.
The external AC/DC converter available for the Oqus camera is capable of deliv-
ering 240W@48VDC and supplying up to five Oqus cameras.
Power consumption
The below table specifies the maximum power consumption of Oqus 7+ cam-
eras. The power consumption is dependent on the duty cycle of the flash. The
values for other models may be lower.
Idle    15 W
Digital IO
The control port of the Oqus camera provides an interface for digital I/O. For
the available control connections, see below.
The following digital I/O signals are available on the Oqus control port:
Trigger input: The Trigger port on the Oqus camera is mainly similar to
the TRIG NO input on the Camera Sync Unit, see chapter "Trigger inputs"
on page 953.
Control connections
The following splitter cables can be used for connecting to the control port.
They all have BNC connectors where BNC cables can be connected to extend
the length.
Sync out/Sync in/Trig splitter
This splitter has three connectors. One BNC female for Sync out, one BNC
female for Sync in and one BNC male for Trigger in.
Sync in splitter
This splitter has one connector, a BNC female for Sync in.
USB-2533
The USB A/D board (USB-2533) is a portable A/D board that can easily be con-
nected to any computer with a USB port. The board has 64 analog channels
and is distributed in a case with BNC connections for the analog signals and the
synchronization signal. For instructions on how to install the board, see chapter
"Installing the USB-2533 board" on page 748.
NOTE: The port can also be used to control other applications. Pins
1-12 on each port are then the digital I/O and pins 13-16 are ground.
EXTERNAL TRIGGER
BNC connection for synchronous start of the analog capture.
The following connections are available on the rear view of the board:
SYNC
BNC connection for frame synchronization of the analog capture.
INTERFACE
POWER
Connection for the power supply, which is supplied by Qualisys.
USB
USB connection to the measurement computer. The cable is supplied by
Qualisys.
POWER LED
The Power LED is lit when the board has power. The power can come
either from the USB port or from an external power supply. To use the
external power supply, it must be plugged in before the USB connection;
however, the Power LED will not be lit until the USB cable is connected.
USB-1608G
The USB-1608G is a portable A/D board that can easily be connected to any
computer with a USB port. The board has 16 analog channels and is distributed
in a case with BNC connections for the analog signals and the synchronization
signal. For instructions how to install the board see chapter "Installing the USB-
1608G board" on page 750.
The following connections are available on the front view of the board:
CH1 - CH16
BNC connections for the 16 analog channels.
The following connections are available on the rear view of the board:
SYNC
BNC connection for frame synchronization of the analog capture.
DIGITAL OUT
Not in use.
USB
USB connection to the measurement computer. The cable is supplied by
Qualisys.
STA. LED
The Status LED turns on when the device is detected and installed on the
computer.
ACT. LED
The Activity LED blinks when data is transferred, and is off otherwise.
USB-1608G specifications
Communication
Arqus and Miqus
Arqus and Miqus cameras communicate with a host computer through the Gig-
abit Ethernet interface.
The Oqus camera can communicate with a host computer through the fol-
lowing interfaces.
Ethernet (Oqus)
Ethernet (IEEE 802.3) is a low level communications protocol which normally car-
ries IP traffic in Local Area Networks (LANs). The physical transmission media is
generally twisted pair cables. The standard used by the Oqus cameras is
100BaseTX/802.3i, Fast Ethernet, with a communications speed of 100 Mbps.
To comply with the standard, cables classified as Cat 5 or better should be used
and the cable length between each node should be limited to 100 m. For best
performance, it is recommended to keep cable lengths to 50 m or shorter.
Oqus cameras are equipped with daisy-chained Ethernet, which means that a
maximum of 15 cameras can be connected in a chain without the need for an
external hub or switch. It is, however, possible to connect the system in a star
configuration, which could improve performance in very large systems.
Oqus cameras use both TCP/IP and UDP/IP to communicate with a host com-
puter and other cameras within a system.
The Oqus system can run with wireless communication from the camera sys-
tem to the computer. The camera uses the 802.11b/g @ 54 Mbps standard.
However, the communication speed can be reduced depending on the signal
strength or the presence of other wireless networks.
A wireless configuration requires one camera in the system to be equipped
with WLAN. The cameras are connected to each other by cables and the host
communicates wirelessly with the entire system by using the WLAN camera as a
gateway to the other cameras.
Setting up Oqus for wireless communication requires that the wireless adapter
of the computer is set up as a hosted network. This requires a computer run-
ning Windows 7. This feature is no longer supported for most wireless adapters
in Windows 10 or higher. The configuration requires a special version of QDS,
contact [email protected] for more information.
Environmental protection
Qualisys cameras are available with several types of environmental pro-
tection. For an overview and a general description, see chapter "Qualisys cam-
era types" on page 432.
Underwater
Marker cameras
Arqus underwater
                       Arqus A9u            Arqus A12u
Resolution (pixels)    4224×2160 (9 MP)     4096×3072 (12 MP)
Weight                 4.0 kg
Classification         IP68, pressure tested to 5 bar (40 m depth)
Miqus underwater
                       Miqus M5u    Miqus M3u
Optional FOV           N/A          N/A
Motorized lens         No           No
Weight                 2.5 kg
Buoyancy               Neutral
Operating voltage      24 VDC
Oqus underwater
                          Oqus 7+u     Oqus 5+u
Optional FOV (narrow)     24°×18°      19°×19°
Weight                    8.4 kg
Buoyancy                  Neutral
Operating voltage         48 VDC
Video cameras
                               Miqus VMu                  Miqus VCu
Color                          No                         Yes
Strobe light                   12 high power Blue LEDs    12 high power White LEDs
Weight                         2.5 kg
Buoyancy                       Neutral
Operating temperature range    0–35°C
How to connect
An Oqus system is connected in a similar way, but make sure that you use
the correct Oqus 48V connection units and power supplies.
Weight                         4.0 kg
Physical dimensions            Length: 166 mm
                               Diameter: 180 mm
                               Width (including mounting bracket): 192 mm
                               Height (including mounting bracket): 229 mm
Operating temperature range    0–35°C

Weight                         2.5 kg
Physical dimensions            Length: 262 mm
                               Diameter: 110 mm
                               Width (including mounting bracket): 143 mm
Operating temperature range    0–35°C
Storage temperature range
Connection unit
Weight                               2.6 kg
Dimensions (excluding connectors)    Width: 210 mm
                                     Height: 77 mm
                                     Depth: 221 mm
Classification                       IP65
Mechanics
Dimensions of the 24V Underwater Connection Unit.
Mount foot
Wall mount
A Quick Attach (QA) mount is available for both Arqus and Miqus underwater
cameras. The Quick Attach mount consists of two parts:
l A Quick Attach mount base, which is similar for Arqus and Oqus Quick
Attach mounts.
l A Quick Attach camera mount for Arqus and Miqus cameras, respectively.
The carbon fiber wand kit consists of a carbon fiber wand with a length of 600
mm and an L-frame where the long arm is about 600 mm long.
The wand is attached to the handle by pressing and turning it in the track until
the handle locks. Make sure that you lock the handle so that the wand is not
dropped and the markers damaged.
The L-frame rests on three points: a static corner point, which is the origin when
calibrating in QTM, and two resting points on the arms that are adjustable with
the adjustment wheels. Check the spirit levels on the frame and adjust the points
so that the L-frame is level.
There are also force plate positioning plates on the sides of the L-frame, so that
the L-frame can be placed in the same position on the force plate for every cal-
ibration. Loosen the screws to fold down the positioning plates, then tighten
the screws before placing the L-frame on the force plate with the positioning
plates on the sides of the force plate.
When folded, the L-frame is held together by a magnet. To open the L-frame,
pull the arms apart and unfold them to the maximum position. The arms are then
locked; to fold the frame, pull the red locking pin away from the center.
It is recommended to store the L-frame in one piece. In case the L-frame needs
to be disassembled, for example for transport, follow these instructions.
1. Press the release button and pull gently on the short arm of the L-struc-
ture to separate it into two pieces. Make sure the locking setscrews are
disengaged.
3. The two pieces are now separated. Assembly is done in the reverse order.
The active 500 mm calibration kit has compound white and IR LEDs, and can be
used for the calibration of both marker and video cameras. To use the active
calibration kit, the cameras need to be in active marker mode, see chapter
"Marker mode" on page 229. The recommended exposure time of the cameras
is 400-500 μs.
For outdoor use, it is recommended to use active filtering, see chapter "Active fil-
tering" on page 246.
NOTE: When using a system with only Miqus video cameras for mark-
erless motion capture, the cameras should be equipped with strobes that
contain one IR LED. Contact [email protected] if you need to
upgrade the strobes of your Miqus video cameras.
Configuration
The active 500 mm L-frame and wand can be configured via the USB-C port.
The options that can be configured are: triggering mode (triggered,
untriggered), LED activation (IR, white or both), and intensity (default power
and high power). The active 500 mm calibration kit is by default configured in
triggered mode, with both IR and white LEDs activated and a flash time of 400
μs.
The untriggered mode can be used if the L-frame needs to be visible for non-
Qualisys cameras, e.g. standard video cameras.
For more information about how to configure the active calibration kit, contact
[email protected].
The L-frame and wand can be charged using any standard USB-C charger. The
charging time is less than 2 hours. The battery time for the default con-
figuration (triggered, white and IR LEDs) is 10 hours or more for the L-frame
and 20 hours or more for the wand.
Active marker types
Naked Traqr
A component kit for embedding Qualisys active tracking into custom
objects or props. For more information about the hardware, see chapter
"The Naked Traqr" on page 1003.
The Active Traqr is a compact lightweight trackable object for 6DOF tracking.
The Active Traqr has four active markers that are identified using sequential
coding. This facilitates robust real-time rigid body tracking and identification of
3. Battery indicator
o Green light: Battery status 16-100%.
o Yellow light: Battery status 6-15%.
o Red light: Battery status 0-5%.
4. USB-C connector
Connector for charging and configuration of the device.
5. Active markers
Four active markers with diffusor.
6. IR eye
Infrared detector for synchronization of the device.
7. Reset button
Recessed button for resetting the device.
8. Mount
Bayonet mount for attaching several types of mounts. A screw mount is
included with the active Traqr.
Range            >35 m1
Battery time     20 hours @ 100 fps continuous measurement
Charging time    2 hours
Connector        USB-C
The Naked Traqr consists of a circuit board and a set of components that can
be embedded in an object that needs to be tracked. The Naked Traqr works
just like the Active Traqr with the added benefit that it allows you to seamlessly
integrate Qualisys active marker technology into custom objects. A single
Naked Traqr unit supports up to eight sequence coded markers.
The Naked Traqr can be used in a wide variety of applications. Two typical
examples are:
Location Based Virtual Reality
The Naked Traqr can be embedded into props, such as HMDs or VR
weapons. This way the props can be reliably tracked without the need to
add markers that are visible to the eye.
Engineering applications
The Naked Traqr can be embedded in moving objects for reliable real-
time tracking. The active markers can be used on both rigid and flexible
objects, making use of the sequential coding for identification of single
markers or markers in a rigid body configuration.
of the use of active markers is that it helps to avoid the interference of
extra reflections otherwise caused by the strobes of the cameras when
l IR detector
l IR LEDs (8x)
A short manual with instructions on how to assemble and power the Naked
Traqr is included in the package.
NOTE: Optionally, a battery can be used to power the Traqr. The battery
is not included in the package.
Range                          >35 m1
Battery time                   20 hours @ 100 fps continuous measurement with 4 markers
Charging time                  2 hours
Connector                      USB-C
LEDs                           up to 8 wide angle, NIR LEDs (850 nm)
Sequence coding                0-8 LEDs2
Synchronization                Optical
Input voltage                  5V3
Size (PCB)                     45 x 30 x 6.3 mm
Weight                         6 g
Operating temperature range    0-50°C
1Depending on camera resolution and LED separation. For example, about 35 m with Arqus A12, or 16 m with Miqus M3.
2Maximum number of unique sequence coded LEDs in one system: 740
3If it is not possible to supply 5V or use the battery, contact Qualisys support.
The Short Range Active Marker consists of a driver unit capable of driving up to
32 markers through four daisy-chained outputs. The markers are small and
lightweight and can be attached to the skin with simple double adhesive tape.
The driver has an integrated battery which can either be charged in the driver
unit or replaced with another battery and charged in a standalone charger. The
battery is selected to last a day of normal measurements.
The use of the Short Range Active Marker (SRAM) can be beneficial in the fol-
lowing situations:
Outdoors
The high intensity of the active marker allows for effective tracking when
outdoors at frequencies up to 500 Hz.
IR in/Sync In
Connector for the IR eye or Sync in. The standard use is to connect an IR
eye to the input; the eye is triggered by a modulated pulse from the Oqus
camera.
The sync in signal must be coded to be correct and must therefore be
sent either from another active marker driver or from an Oqus camera
with the sync out signal set to Wired synchronization of active
markers.
Sync out
Use the Sync out connector to daisy chain several active marker drivers to
one IR eye. Connect the Sync out signal from one driver to the Sync in con-
nector on the next driver.
Charge/ 9 VDC
Use this connection to charge the battery with the supplied power supply.
The driver can still be used while it is being charged.
CAUTION: Do not use the driver with the power connected while
the driver is mounted on a person.
ON/OFF
Turn on the driver by pressing the button. Turn off by holding down the
button a couple of seconds.
Battery indicator
When pressing the power button, the four LEDs indicate the battery
status. One LED means that you soon have to charge the battery.
Status indicators
The three status indicators at the top display the following information:
Power/Batt Low
The LED is lit green when the power is on and starts flashing green
when the driver goes into the power save mode. The battery is low
when the LED is lit or flashing red.
Charging
The LED is lit green when the driver is being charged; if something is
wrong, the LED is lit red.
Sync active
The sync is active when the LED is flashing orange.
The active markers are mounted on a daisy-chained wire that can have up to 8
markers and a length of 3 meters. The size of the markers is comparable to 12
mm passive markers. The IDs of the markers are decided by the connector on the
driver. When a chain has less than 8 markers, the remaining IDs of that connector
are skipped, e.g. if you have chains of 4 markers connected to the first two
connectors, these will have IDs 1-4 and 9-12.
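In other words, each of the four outputs on the driver reserves a block of eight consecutive IDs, and unused positions in a chain are skipped. A small sketch of that numbering rule (the function is illustrative only, not part of any Qualisys software):

```python
# Illustrative sketch of the ID rule described above: each driver output reserves
# a block of 8 IDs, so a chain of n markers on output k (k = 1..4) gets IDs
# (k - 1) * 8 + 1 ... (k - 1) * 8 + n; the remaining IDs of that output are skipped.
def chain_ids(output_number, markers_in_chain):
    base = (output_number - 1) * 8
    return [base + i for i in range(1, markers_in_chain + 1)]

print(chain_ids(1, 4))  # [1, 2, 3, 4]
print(chain_ids(2, 4))  # [9, 10, 11, 12]
```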
The battery in the short range active marker is a rechargeable Li-Polymer bat-
tery. It can power the active marker for a day of normal measurements. The bat-
tery can be recharged either in the driver with the power supply or separately
in the battery charger. It takes about 3 hours to fully charge the battery.
CAUTION: Do not use the driver with the power connected while the
driver is mounted on a person.
To remove the battery, press the button on the side of the driver and lift the
bottom lid. Make sure that you insert the battery according to the image inside
the driver.
Driver                               83 x 52 x 15 mm, 99 g
IR marker                            D 16 x H 11 mm, 3 g
Frequency range                      1 - 500 Hz
Number of markers per driver         Up to 32 markers
Total number of markers              Up to 5 drivers can be used, up to 160 sequential coded markers
Maximum measurement distance         more than 25 m
Battery time, 200 fps, 16 markers    4.5 h
Time to fully charge battery         2-3 h
The Long Range Active Marker is usually used for industrial applications, for
example marine measurements. The marker is synchronized with the cameras
via a pulsed strobe signal. The same signal is used for spherical and reference
marker. Because of the synchronization the marker can be lit only during the
exposure time, which means that the LEDs can be brighter. For more inform-
ation about the long range marker please refer to the information included
with the marker or contact Qualisys AB.
Marker maintenance
The passive markers should be attached to the object by double adhesive tape.
The passive markers can be cleaned from grease and dirt with soap and water.
Do not use very hot water when cleaning the markers and do not use a brush
when cleaning the markers.
The 6DOF tracking function uses the rigid body definition to compute Porigin,
the positional vector of the origin of the local coordinate system in the global
coordinate system, and R, the rotation matrix which describes the rotation of
the rigid body.
The rotation matrix (R) can then be used to transform a position Plocal (e.g. x'1,
y'1, z'1) in the local coordinate system, which is translated and rotated, to a pos-
ition Pglobal (e.g. x1, y1, z1) in the global coordinate system. The following equa-
tion is used to transform a position:
Pglobal = R · Plocal + Porigin
If the 6DOF data refer to a coordinate system other than the global coordinate
system, the position and rotation are calculated in reference to that coordinate
system instead. The coordinate system for rigid body data is then referred to the
global coordinate system.
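For reference, the transformation above is straightforward to apply in code. The sketch below is a generic numpy illustration of the equation (not QTM code), using an example rotation and origin:

```python
# Generic numpy illustration of P_global = R * P_local + P_origin (see above).
import numpy as np

def local_to_global(R, p_local, p_origin):
    return R @ p_local + p_origin

# Example: a body rotated 90 degrees around the global Z axis, origin at (100, 0, 0) mm.
theta = np.deg2rad(90.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
print(local_to_global(R, np.array([10.0, 0.0, 0.0]), np.array([100.0, 0.0, 0.0])))
# -> approximately [100., 10., 0.]
```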
The rotation matrix (R) is then calculated by multiplying the three rotation
matrices. The order of the multiplications below means that roll is applied
first, then pitch and finally yaw.
The following equations are then used to calculate the rotation angles from the
rotation matrix:
The range of the pitch angle is -90° to 90° because of the nature of the arcsin
function. The range of the arccos function is 0° to 180°, but the range of roll
and yaw can be expanded by looking at the r23 and r12 elements, respectively.
From the matrix above, the roll can be calculated in the range ±180°.
IMPORTANT: With the definitions above, roll, pitch and yaw are unambiguous
and can describe any orientation of the rigid body. However,
when the pitch is close to ±90°, small changes in the orientation of the
measured rigid body can result in large differences in the rotations
because of the singularity at pitch = ±90°.
NOTE: If you use rotations around global axes, the order of multiplication
of the individual rotation matrices is reversed, and if you use a left-hand
system, change the positive direction to counterclockwise, which means
that the sign of the angle is swapped.
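For readers who want to experiment with the angle calculations, the sketch below recovers roll, pitch and yaw from a rotation matrix with SciPy. It assumes rotations around the X, Y and Z axes of a right-handed system with fixed (global) axes, applied in the order roll, pitch, yaw; this is an assumption for illustration and does not reproduce the exact element indices used in QTM's own equations.

from scipy.spatial.transform import Rotation

# Build R from roll (X), pitch (Y) and yaw (Z), applied in that order
# around fixed axes (example angles in degrees).
R = Rotation.from_euler("xyz", [10.0, 20.0, 30.0], degrees=True).as_matrix()

# Recover the angles from the rotation matrix.
print(Rotation.from_matrix(R).as_euler("xyz", degrees=True))  # [10. 20. 30.]

# Near the singularity (pitch close to +/-90 degrees) the decomposition
# becomes ill-conditioned: roll and yaw are no longer uniquely defined.
R_singular = Rotation.from_euler("xyz", [10.0, 89.999, 30.0], degrees=True).as_matrix()
print(Rotation.from_matrix(R_singular).as_euler("xyz", degrees=True))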
First, there are two types of rotation matrices: those with three different rotation
axes and those with the same rotation axis for the first and third rotation.
You have to look at the rotation matrix to see which indexes and signs
should be used. The singularity will always be at ±90° for the second rotation,
and at the singularity the third rotation is always set to 0°.
In the second type the third rotation is around the same axis as the first rotation.
This means that one of the individual rotation matrices below is used as
the last rotation, depending on which axis is repeated.
The rotation matrix (R) is then calculated by multiplying two of the individual
rotation matrices from the first type and then one of the matrices above. In the
example below the rotations are around the x, y and then the x axis again.
The rotation angles can then be calculated according to the equations below.
These equations are similar for other rotation matrices of this type; only the
indexes and signs are changed.
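As a complement, the sketch below shows the repeated-axis case using SciPy's built-in x-y-x decomposition, mirroring the x, y, x example mentioned above. The angles are arbitrary illustration values.

from scipy.spatial.transform import Rotation

# Rotation built from an x-y-x sequence (same axis for the first and
# third rotation), angles in degrees.
R = Rotation.from_euler("xyx", [40.0, 25.0, -70.0], degrees=True).as_matrix()

# Recover the three angles; for this type of sequence the singularity
# lies where the second rotation is 0 or 180 degrees.
print(Rotation.from_matrix(R).as_euler("xyx", degrees=True))  # [ 40.  25. -70.]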
Developers' resources
The following chapters contain information about the possibilities for
developers to communicate with QTM's interface and extend its capabilities.
Real-time protocol
The real-time protocol makes it possible to write external applications that can
control QTM and receive real-time data. The real-time protocol is implemented
in a range of SDKs and QTM Connect modules that provide real-time control of
and communication with QTM from external software. All resources are available on
GitHub (https://github.com/qualisys) and the Qualisys website
(https://www.qualisys.com/downloads/).
SDKs are available for:
l C++
l C# (.NET)
l Python
l JavaScript
l ROS resources
QTM also supports the Open Sound Control (OSC) protocol for sound and mul-
timedia applications and devices.
Full documentation of the real-time protocol is included with the QTM installer
and is available online at https://docs.qualisys.com/qtm-rt-protocol/.
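As an illustration of what a real-time client can look like, the sketch below streams 3D marker data with the Qualisys Python SDK. The package name (qtm-rt), module name (qtm_rt) and the exact call signatures are assumptions based on recent versions of the SDK and may differ in your installation; the protocol documentation linked above is the authoritative reference.

# Minimal real-time client sketch; assumes the Qualisys Python SDK
# (pip install qtm-rt). Names and signatures may vary between versions.
import asyncio
import qtm_rt

def on_packet(packet):
    # Print the number of 3D markers in each streamed frame.
    header, markers = packet.get_3d_markers()
    print(f"Frame {packet.framenumber}: {header.marker_count} markers")

async def main():
    connection = await qtm_rt.connect("127.0.0.1")  # QTM on this computer
    if connection is None:
        return
    await connection.stream_frames(components=["3d"], on_packet=on_packet)
    await asyncio.sleep(10)  # stream for ten seconds
    await connection.stream_frames_stop()

asyncio.run(main())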
The QTM Scripting Interface provides, among other things, the following
functionality:
l Access and modification of recorded data (captures)
l Extending the main menu, adding new QTM commands and keyboard
shortcuts
l Rendering of text and graphical elements in the 3D View window
The components in QTM related to scripting are shown in the above figure:
Terminal button: Show/hide the Terminal window.
Reload scripts button: Reload script files that are used in the project
(keyboard shortcut F5).
Scripting settings: Terminal settings and script files used in the project.
The settings for scripting can be accessed via Project Options >
Miscellaneous > Scripting. In the settings, you can select the interpreter language
that is used for the terminal and add scripts that are loaded into the project,
see chapter "Scripting" on page 431 for more information.
TIP: The import step can be automated in a startup script that can be
added to the Script files in the QTM project.
Scripting resources
Demo scripts
Collection of scripts that demonstrate various capabilities of the scripting
engine.
Tools
Collection of tools that can be helpful add-ons to QTM, or can serve as
examples for developing your own tools.
Using scripts
Scripts can be used in QTM by simply adding them to the Script files list in the
QTM project. To load script files:
1. Open Project Options > Miscellaneous > Scripting.
2. Press the Add button and locate the script file to be added in the file dialog.
You can also add your own scripts to QTM. You can write Python or Lua scripts
in any text editor, for example Visual Studio Code.
Here is a simple example of a Hello World! script, showing how you can add a
menu to QTM with a button that displays “Hello world!” in the terminal and the
3D View window.
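A minimal sketch of such a script is given below. The qtm module functions used here (insert_menu_submenu, add_command, set_command_execute_function, insert_menu_button, terminal.write and message.add_message) are assumptions about the scripting API and may differ between QTM versions; check the demo scripts and the scripting documentation for the exact names.

# Hello World sketch for the QTM Scripting Interface (Python).
# The qtm.* names below are assumptions; see the scripting documentation.
import qtm

def hello_world():
    qtm.gui.terminal.write("Hello world!")                    # write to the Terminal window
    qtm.gui.message.add_message("Hello world!", "", "info")   # show in the 3D View window

def add_menu():
    # Add a custom menu with a button that runs hello_world.
    menu_id = qtm.gui.insert_menu_submenu(None, "Hello Menu")
    qtm.gui.add_command("hello_world")
    qtm.gui.set_command_execute_function("hello_world", hello_world)
    qtm.gui.insert_menu_button(menu_id, "Say hello", "hello_world")

# The script body runs when the script is loaded, so build the menu here.
add_menu()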
When your script is ready, you can add it to the script files list under Project
Options > Miscellaneous > Scripting. The script will be loaded when pressing
Apply or OK, and when starting the project.
When you make modifications to your script, you can reload it in QTM using the
Reload button.
Use of external packages
You can add external packages for Python using pip install, following these
steps:
l Open cmd.exe (as administrator).
l Change the current path to the folder in which QTM is installed (usually
C:\Program Files\Qualisys\Qualisys Track Manager).
l Type the command: .\python.exe -m pip install <name of the
package to install> (replace <> with the package you want to
install).
For example, to install numpy: .\python.exe -m pip install
numpy
NOTE: Some external packages may not work properly in QTM, for
example packages with GUI functionality.
QDevice API
The QDevice API can be used to integrate external data acquisition devices with
QTM. The SDK including documentation can be downloaded from the Qualisys
website at https://www.qualisys.com/downloads/.
Troubleshooting QTM
Troubleshooting connection
Symptoms
QTM cannot find the connection to the camera system, or does not find the
whole system.
Resolutions
l Check that the camera system is turned on.
l Check that the cable between the computer and the master camera
is connected.
l Check that the cables between the cameras in the camera system
are connected.
l Check that the cameras are working and that the cables are not dam-
aged.
Symptoms
The camera does not get an IP address (Arqus/Miqus: status ring keeps
pulsing yellow; Oqus: status bar on the display gets stuck at about 75%).
Resolutions
l Restart the computer.
l Check that the cable between the computer and the master camera
is connected.
l If the computer has two network cards, remove the other network
cable, then go to Network connections in Windows and check that the
correct network is connected.
l Check that the network has the correct static IP-address settings,
see "Network card setup" on page 461. Run the QDS wizard if you do
not know how to change the IP-address settings.
l Check that QDS is not blocked by a firewall. Most importantly check
that the Windows firewall allows the program.
Troubleshooting calibration
Symptoms
Resolutions
l Check that the Exact wand length on the Calibration page in the
Project options dialog is correct.
l Check that all cameras can see the L-shaped reference structure.
l Check that the calibration wand has not been moved too fast.
l Check that the calibration wand has been moved inside the calibration
volume during the calibration, i.e. check that there are at
least 500 points used for each camera in the Calibration results dialog.
l Check that the camera positioning is OK, see chapter "Camera pos-
itioning" on page 439.
Symptoms
Symptoms
Resolutions
l The camera is deactivated on the Linearization page in the Project
options dialog. Activate the camera and perform the calibration
again or reprocess the calibration, see chapter "Recalibration" on
page 563.
Troubleshooting capture
Symptoms
There are no visible markers in the 2D view window in the preview mode.
Resolutions
l Check that the cameras are focused.
Symptoms
The markers in the 2D view are too small (check the size in the Data info
window).
Resolutions
l Check that the marker is not worn; if it is, replace it with a new one.
Symptoms
Resolutions
l Check the Exposure time and Marker threshold settings, see
chapter "Tips on marker settings in QTM" on page 483.
Symptoms
Resolution
l Check the aperture, see chapter "Tips on setting aperture and focus"
on page 481.
Symptoms
Resolutions
l Check that the Bounding box setting on the 3D Tracking page in
the Project options dialog is large enough.
Resolution
l Check that the Use external trigger option is not selected on the
Synchronization page in the Project options dialog.
Symptoms
A camera is indicated as Not used for tracking in the 2D view and the 3D
view window.
Resolutions
l The camera is deactivated on the Linearization page in the Project
options dialog. Activate the camera and retrack the measurement if
you have already made measurements, see chapter "Reprocessing a
file" on page 601.
Symptoms
There are error messages saying that the camera did not reply three
times in a row.
Resolutions
l Check that you are using the built-in wired Ethernet. Sometimes the
communication does not work with Ethernet adapters.
l Check the Ethernet cables so that they are not broken.
Symptoms
Resolutions
l Check the tracking parameters, see chapter "3D Tracker parameters"
on page 325.
l Make a new calibration.
l Check that the camera positioning is OK, see chapter "Camera pos-
itioning" on page 439.
l Check that the calibration wand has been moved in the area where
the segmentation occurs, e.g. close to the floor in gait measurements.
Symptoms
Resolutions
l Check if the markers on the measurement object can be smaller.
Symptoms
Symptoms
Resolutions
l Make sure that it is the correct AIM model that is applied.
l Make sure that the model was generated from a file where all of the
trajectories could be viewed during the whole measurement and
where there were no erratic data, see chapter "Generating an AIM
model" on page 625.
Symptoms
Resolutions
l 2D tracking has been used instead of 3D tracking. If you want 3D
tracking in the following captures change to 3D on the Processing
page in the Project options dialog. To 3D track a saved capture file
with 2D tracked data, you have to Batch process the file with 3D
tracking, see chapter "Batch processing" on page 605.
Troubleshooting reflections
Symptoms
TIP: You can use the Video mode to locate the reflections.
l Check that the cameras cannot see each other's IR flashes or the
reflections of the IR flashes.
l Check that there are no very glossy materials in the measurement
volume.
l Use exposure delay, see chapter "Delayed exposure to reduce reflec-
tions from other cameras" on page 534.
l Use the marker masking function if you cannot remove the reflec-
tions, see chapter "Marker masking" on page 536.
Symptoms
Resolutions
l Check that the calibration parameters for the force plate have been
entered correctly, for information about the settings see chapter
"Force plate settings" on page 362.
Symptoms
Symptoms
Resolutions
l Check the settings for the force plate location on the Force plate
page in the Project options dialog.
Symptoms
The area representing the force plate is not placed at the correct place in
the 3D view window.
Resolutions
l Change the Force plate location settings on the Force plate page
in the Project options dialog.
Troubleshooting 6DOF
Symptoms
Resolutions
l Check that Calculate 6DOF is activated on the Processing page in
the Project options dialog. If you have a file, reprocess it with
Calculate 6DOF activated and the correct 6DOF bodies.
l Check that the 6DOF body is within the measurement volume.
Symptoms
There is no 6DOF body data in the Data info window even though it is dis-
played in the 3D view window.
Resolutions
l Check if the 6DOF body uses the Use the current position of this
rigid body option in the Coordinate system for rigid body data
dialog. If it does it will not be displayed in the Data info window
when the other body is not visible.
Troubleshooting update
Symptoms
You are using an old version of QTM and you want to update the soft-
ware.
Resolutions
l Look for the latest software on this web page: https://www.qualisys.com/my/
and follow the instructions for installing the QTM software,
see chapter "Software installation" on page 54.
Symptoms
QTM needs to update the firmware and you cannot find it on your com-
puter.
Troubleshooting other
Symptoms
There are fewer analog channels displayed in the Data info window than
there should be.
Resolutions
l Check that all of the analog channels have been activated on the
Analog board (...) page in the Project options dialog.
Symptoms
The Status bar shows the message ”No analog trigger received” after a
motion capture.
Resolutions
l Check that the synchronization cable between the camera and the
analog board is connected.
Symptoms
There is a Too fast pacer rate error for the USB-2533 board even though
the analog capture rate is low.
Resolution
l Check the sync cable that is connected between the camera system
and the analog board.
Symptoms
Resolution
l Retrack the file with new tracking parameters.
NOTE: This will delete all processing that has been made to
the capture file.
Symptoms
Resolutions
l A maximum of 30 View windows can be open for a qtm-file; this also
includes the Plot windows. If you have 30 windows open, close
one to open a new window.
2D
Two dimensions
2D tracking
Tracker that uses the 2D data of a single camera to calculate trajectories in a plane.
2D view window
Window with the 2D views of the cameras.
3D
Three dimensions
3D point
Point that is specified with the three coordinates of the 3D space.
3D tracking
Tracker that uses the 2D data of all cameras in the system to calculate marker pos-
itions in three dimensions.
3D view window
Window with a 3D view calculated from the 2D data.
6DOF
Six degrees of freedom
6DOF tracking
Tracker that calculates the position and rotation of a rigid body in the 3D view.
Analog capture
QTM can capture analog voltage data in synchronization with the motion capture
data, if you have an analog board.
Analog output
With analog output, 6DOF data can be used as feedback to an analog control system, if
you have an analog output board.
Aperture
The size of the opening in the camera’s lens. This opening can be adjusted with the
adjustment ring.
B
Bit
Computer unit, which can be either 0 or 1.
BNC
Type of contact for coaxial cables.
Bone
Visible connection between two trajectories in the 3D view.
Byte
Computer unit. 1 byte = 8 bits
C3D
Standard file format in motion capture
Calibration
Process that defines the position of the cameras in the 3D space. The calibration is
used for the 3D reconstruction.
Calibration kit
Equipment that is needed for a wand calibration, e.g. calibration wand and L-shaped
reference structure.
Calqulus
Online platform for biomechanical data analysis hosted by Qualisys AB.
Camera ray
The 2D position of a marker projected into the 3D space based on the position and ori-
entation of the camera.
Capture
Measurement which collects several frames at a fixed frame rate.
Capture file
A qtm-file with motion capture data (.qtm).
Capture rate
Frame rate in Hz that is used for the motion capture.
Capture view
View that is used during motion capture.
Coordinate system
A system of axes which describes the position of a point. In QTM all of the 3D coordin-
ate systems are orthogonal, right hand systems.
D/A board
Digital/analog board, which converts a digital value to an analog signal.
E
Extended calibration
Method that extends the motion capture volume, when using Wand calibration.
External timebase
Device that controls the frame rate of the camera system.
External trigger
Device that triggers the motion capture system to start the measurement.
FBX
FBX (filmbox) is a widely used file format for exchange of 3D and skeleton data for
animation software.
Fill level
The percentage of the capture period in which a trajectory has 3D data (been visible).
Focus
Changes the focal length of the camera to achieve a clear image.
Force plate
Plate that can measure the force that is applied on top of it.
Frame
Single exposure of the camera system.
Frame rate
Frequency of the motion capture.
Gap
Missing part within a trajectory.
Gap fill
Function that calculates a probable path between trajectory parts to associate them.
IR
Infrared
IR marker
A marker which reflects or transmits IR light.
L
Label
Name of a trajectory in the Identification windows.
LED
Light Emitting Diode
Linearization
Correction data which is needed for each camera to make the capture as good as pos-
sible.
Marker
Item that is attached to the moving object to measure its position.
Marker – Active
Marker with an infrared LED that is activated by the camera’s flash in each frame.
Marker – Passive
Marker with reflective material.
Marker discrimination
Option that reduces unwanted reflections or marker sizes during capture.
Marker mode
The default mode in 2D view windows, which shows the markers captured by the cam-
era.
Markerless mocap
Video based motion tracking without the use of markers.
Max residual
Maximum distance for a 2D ray to be included in a 3D point during tracking.
Measurement computer
Computer which is connected to a camera system, which must have the QTM applic-
ation installed.
Measurement range
The range that is set with the boxes on the Timeline control bar. It defines the frames
which are included in the analysis.
Mesh
Wavefront 3D object (.obj ) files and associated .mtl and texture files for visualization
of 3D objects.
Motion capture
Measurement which records a motion.
Motion glove
Glove used for tracking finger motions.
O
Open GL
Standard graphic presentation language.
PAF
Project Automation Framework for structured data collection, analysis and reporting.
Pitch
Rotation around the Y-axis in Qualisys standard rotation.
Plot window
Window with data plots.
Pretrigger
Setting in QTM where the frames before the trigger are saved in the camera’s memory
and are then added to the measurement when a trigger event occurs.
Preview mode
Mode when QTM is showing the measured data before the start of a capture.
QDevice API
API for the integration of external data acquisition devices into QTM.
QFI
Program for installing firmware in the Qualisys cameras.
QTM Scripting Interface
API providing scripting support for QTM, implemented for Python, Lua and REST.
Reference marker
Special kind of active marker which is used in fixed camera systems and is visible at
long distances.
Reference structure
The L-shaped part in the calibration kit of the Wand calibration.
Remote computer
Computer which receives 6DOF data from the RT output.
Residual
In most cases in QTM this is the minimum distance between a 2D marker ray and its
corresponding 3D point or an average of this measure.
Residual (3D)
The average of the different residuals of the 2D marker rays that belong to the same
3D point.
Residual (6DOF)
The average of the errors of each measured marker compared to the 6DOF body
definition.
Residual (calibration)
The Average residual in the Calibration results dialog is the average of the 3D resid-
uals of all the points measured by the camera during the calibration.
Rigid body (6DOF body)
Body that is defined by points and a local coordinate system, i.e. location and rotation.
Roll
Rotation around the X-axis in Qualisys standard rotation.
SAL
Skeleton Assisted Labeling.
Scripting
Possibility to extend QTM's functionality through Python or Lua scripts, or the REST API.
Skeleton
Series of segments organized in joint chains with hierarchical relationships.
Skeleton calibration
Function that creates a skeleton including degrees of freedom, joint boundaries, positions
of segment origins and positions of markers expressed in their respective segment
coordinate systems.
Skeleton solver
Function that fits a calibrated skeleton definition to measured 3D trajectories.
Smooth
Operation applied to data in order to reduce noise while preserving important pat-
terns.
Spike
Discontinuity between consecutive frames within a trajectory.
Subpixel
Unit used to express marker position and size in the 2D data of a Qualisys camera.
The number of subpixels is obtained by multiplying the number of pixels in each
dimension by a factor of 64.
TCP/IP
Protocol for communication between computers.
Trace
Traces the position of a trajectory in the 3D view.
Trace range
Range of frames for which the trace is shown.
Tracking
Process that calculates 3D data, 6DOF or skeleton data.
Trajectory
3D data of a marker in a series of frames.
Translate
Move the center of rotation and zoom in the current 2D plane of the 3D view.
Traqr
Compact lightweight object with active or passive markers for rigid body tracking, part
of the Qualisys Traqr ecosystem.
Trigger
Signal that starts a measurement.
Tripod
A very stable three-legged stand.
Twin system
Connect two separate camera systems and merge their 3D data in one file, for
example with above- and under-water systems.
USB
Hardware interface for connecting peripheral units to a computer.
View window
Window in QTM which shows 2D, 3D or Video views.
Virtual trajectory
Artificial trajectory added during post processing without a direct relation to meas-
ured 2D data.
Volume
The defined measurement’s height, length, depth.
Wand
T-shaped object that is used in Wand calibration to scale and define the axes of the
coordinate system of the motion capture.
Wand calibration
Calibration method which uses a wand and an L-shaped structure to calibrate.
WLAN
Wireless local area network.
Yaw
Rotation around the Z-axis in Qualisys standard rotation.
Zoom
Zoom in to get a close-up of your 3D view or zoom out to see more of the page at a
reduced size.
Index
2D data
graphic display 84
plot 170
2D tracking 618
description 618
settings 330
2D view window 84
3D data 138
plot 151
outline 567
3D tracking 614
description 614
Maximum residual 326
parameters 325
settings 325
test 616
troubleshooting 1028
bones 119
contents 109
menu 131
Skeletons 123
trajectories 116
settings 389
creating 650
definition 653
settings 346
troubleshooting 1031
calculation 1011
description 665
settings 387
description 659
introduction 649
parameters 345
channels 297
connect 752
drivers 747
settings 291
USB-1608G 975
USB-2533 973
Active markers
Settings 531
AIM 624
settings 339
Skeleton 682
capture 752
plot 175
Analyze 153
Aperture
Arqus 447
Miqus 451
Oqus 458
Tips 481
Arqus
Camera Sync Unit (front) 950
Communication 976
Description 933
description 605
settings 322
installing 899
settings 901
Bones 119
create 119
export 727
settings 400
Calibration 543
perform 543
reprocess 563
result 558
settings 253
transformation 259
troubleshooting 1024
Description 950
Digital IO 953
Mechanics 953
Cameras
Communication 976
How to set up 441
Types 432
Underwater 978
Capture 566
guidelines 479
how to 567
troubleshooting 1025
Connection 477
troubleshooting 1022
local 658
transformation 259
Data export 710
description 710
menu 168
plot 168
2D data 169
3D data 138
window 137
E
Cometa 838
description 709
example 663
Events 706
Tips 483
settings 278
F
Settings 409
QFI 471
settings 257
Focus
Arqus 447
Miqus 451
Oqus 458
Tips 481
export 727
settings 362
troubleshooting 1030
Force plates
analog 790
Arsalis 759
digital 756
Kistler digital 768
How to 642
Kinematic 645
Linear 643
Methods 642
Polynomial 643
Relational 643
settings 338
Static 643
Virtual 644
Generic devices
Hardware
Arqus 933
EMG device 803
Miqus 944
Pretrigger 493
video 909
description 579
Settings sidebar 91
Identification 624
manual 620
Instrumented treadmills
Linearization 485
how to 485
settings 249
data 117
maintenance 1010
passive/active 529
placement 530
settings 227
sizes 529
Tips 483
Markerless mocap
Mesh
Static 424
Miqus
Communication 976
Description 944
Mixed system
Motion glove
Manus 889
Settings 344
StretchSense 892
Oqus
Communication 977
connectors 965
display 963
high-speed 579
How to connect 455
PCI boards
Pitch 1011
definition 1013
example 663
view 170
Plot 179
3D data 151
Panning 182
Shortcuts 215
window 179
Zooming 182
Pretrigger 277
Settings 277
Processing 600
AIM 624
settings 322
Description 1016
Resources 1016
Project view 62
Projects 60
backup 73
creating 69
presets 74
using 66
QDevice
Resources 1022
QDS 462
advanced 467
menu 462
QFI 471
calibration 576
Rays 130
description 590
Resources 1015
settings 387
Recalibration 563
Retracking 601
Rigid body (6DOF body) 650
creating 650
definition 653
settings 346
Roll 1011
definition 1013
example 663
view 170
description 709
example 663
Safety notices 44
Scripting
Description 1017
Resources 1019
Skeleton solver
Calibration 690
Data 172
FBX export 409, 742
Processing 700
Settings 341
T-pose 690
AIM 682
Introduction 671
Skeletons 123
Smoothing 646
Butterworth 647
Methods 646
SMPTE
Synchronization 512
Software masks
Spikes 645
Detection 646
Smoothing 646
Threshold 646
Streaming video 576
Settings sidebar 91
Synchronization 266
Audio 512
Pretrigger 493
Settings 266
SMPTE 512
settings 285
System
requirements 48
Timestamp 284
settings 284
Timing
Trace 111
range 133
Tracking
retracking 601
test 616
troubleshooting 1028
Trajectories 137
delete 152
display 144
in 3D views 116
overlapping 142
windows 137
Menu 165
Shortcuts 211
Smoothing 645
Spikes 645
Toolbar 161
data in 138
menu 144
Shortcuts 214
Traqr
Active 1000
Naked 1003
Trigger
External 273
Keyboard 267
Software 267
Wireless 267
Settings 273
export 711
settings 397
calibration 519
files 522
settings 333
setup 514
Upgrade firmware
QFI 471
USB-1608G 975
USB-2533 973
Video 898
compression 910
settings 304
View window 84
2D 84
3D 109
video 100
W
settings 253
tips 548
Window layout 82
Settings 267
Yaw 1011
definition 1013
example 663
view 170