
Qualisys Track Manager

Version: 2025.1

Qualisys AB, 4/30/2025

www.qualisys.com

No part of this publication may be reproduced, transmitted, transcribed, stored in a retrieval system, or translated into any language in any form by any means without the written permission of Qualisys AB.
In no event shall Qualisys AB be liable for any incidental, indirect, or consequential damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or any other pecuniary loss) arising out of the use of or inability to use the software or hardware.
Copyright © 2025 Qualisys AB. All Rights Reserved.
Qualisys Track Manager® and QTM® are registered trademarks of Qualisys AB.

Manufactured by:
Qualisys AB
Kvarnbergsgatan 2
411 05 Göteborg
Sweden

Table of Contents
General information 44
Important information 44
Intended use 44
Safety notices 44
IR radiation notice 44
User safety 44
Installation risks 45
Computer and internet 45
EU customer information 46
Waste Electrical and Electronic Equipment (WEEE) 46
Qualisys WEEE Collection Process 46
Hazardous Substances Declaration 47
System requirements 48
Qualisys system and computer 48
External devices 49
Hardware compatibility and version requirements 50
Getting started 52
Qualisys Motion Capture System 52
Qualisys cameras and devices 52
Qualisys software 52
Qualisys user account 53
Training resources 53
Software installation 54
QTM registration 55
Adding licenses 55
Import licenses from a file 57



QTM user interface 58
Running QTM 58
Starting QTM 58
QTM main functions 59
Projects 60
Project folder 61
Project view 62
Project data tree 62
Details 66
Using projects 66
Projects with qualified users 67
Projects with students 68
Using projects on multiple computers 68
Creating a new project 69
Manage projects 72
Backup of project settings 73
Project presets 74
Maintenance of presets 75
Opening a project, restoring or importing project settings 76
QTM windows 76
Overview of window types in QTM 76
Main window 76
Primary windows 78
Floating windows 78
Main status bar 79
Window handling 81
Docking and floating 81
Arranging windows 81
Arranging windows on multiple computer displays 82
Window layouts 82



View windows 84
2D view window 84
Camera feeds 85
Camera information 85
Image area 86
Selecting cameras, zooming and panning 87
2D view toolbar 89
Camera settings sidebar 91
Camera Mode 92
Marker settings 93
Streaming Video settings 96
Video settings 98
Lens Control 100
External video devices in 2D view 100
2D view window menu 103
3D data overlay 107
Review camera settings 108
3D view window 109
Overview of graphical elements 109
Volume and calibration 110
Cameras 110
Motion capture data 111
Other information 111
Navigating in the 3D view 112
Other useful mouse actions 113
Projections 113
3D view toolbar 114
Trajectories in 3D views 116
Bones in 3D views 119
Create bones 119
Modify bones 120



Bone menu 120
6DOF bodies in 3D views 121
Skeletons in 3D views 123
Volumes in 3D views 125
Camera view cones in 3D views 128
Rays in 3D views 130
3D View window menu 131
Timeline control bar 133
Timeline parameters 135
Events menu 136
Trajectory info windows 137
Data in Trajectory info windows 138
Sort trajectories 140
Select and move trajectories 141
Overlapping trajectory parts 142
Trajectory info window menu 144
Trajectory management 151
Plot 3D data 151
Split part after current frame 151
Swap parts 151
Delete trajectories 152
Analyze 153
Filters 154
Calculation 154
Output name 157
Label lists 158
Labels menu 158
Trajectory Editor window 159
Plot area 160
Trajectory Editor toolbar 161
Points of Interest sidebar 163



Settings sidebar 164
Trajectory Editor window menu 165
Trajectory Overview window 166
Data info window 167
Data info window menu 168
Data types 169
2D data information 169
6DOF data information 170
Skeleton data information 172
Analog data information 174
Force data information 176
Calculate 178
Messages window 179
Plot window 179
Plot menu 181
Zooming, panning and other plot interactions 182
Plotting from file or during preview 183
Recommendations when using saved Window layouts 184
Menus 184
File 184
C3D import 187
TRB/TRC import 188
Edit 188
View 190
File Information 192
Play 193
Capture 194
AIM 196
Skeleton 196
Tools 197
Window 197



Help 198
Toolbars 199
Standard toolbar 199
Playback toolbar 200
Capture toolbar 201
GUI Control toolbar 201
AIM toolbar 202
Trajectory/Views toolbar 203
Keyboard shortcuts 203
Menu shortcuts 203
Workflow shortcuts 203
Capture file shortcuts 204
Editing shortcuts 204
Display and window shortcuts 205
2D/3D view shortcuts 205
2D view shortcuts 206
3D view shortcuts 206
Trajectory info window shortcuts 208
Playback keys 209
Project options shortcuts 211
Trajectory Editor shortcuts 211
Keyboard shortcuts 211
Mouse gestures 213
Trajectory Overview shortcuts 214
Zooming 214
Panning 214
Timeline 214
Scrolling 214
Selection 215
Layout 215



Plot window shortcuts 215
Zooming 215
Panning 215
Legend 216
Time line 216
Dialogs 216
QTM dialogs 216
Project options dialog 217

Project options 218


Input Devices 218
Camera system 220
Marker capture frequency 221
Real time frequency 221
Camera system settings 222
Locate System 222
Finding camera system 223
Camera system information 224
Auto order 225
Cameras 225
Camera settings 227
Camera Mode 227
Marker settings 227
Capture rate 227
Exposure time 228
Marker threshold 228
Marker mode 229
Marker ID range 230
Image size 230
Marker masking 232



Marker circularity filtering (Oqus) 234
Marker limits 235
Exposure delay 236
Sensor mode 238
Video settings 238
Capture rate 238
Exposure time 239
Flash time 239
Gain 239
Image size 240
Image resolution 241
Video compression 242
Color/Grayscale 244
Auto exposure 244
Color temperature 245
Sensor mode 246
Active filtering 246
Lens aperture 247
Lens focus distance 248
2D view rotation 248
Start delay 248
Linearization 249
Camera linearization parameters 249
Managing the linearization files 250
Selecting and deselecting cameras for tracking 252
Calibration 253
Calibration type 253
Wand calibration 253
Calibration kit 254



Coordinate system orientation and translation 256
Maximum number of frames used as calibration input 256
Apply coordinate transformation 256
Fixed camera calibration 257
Reference marker locations 257
Camera locations and markers seen by each camera in order from left to right 258
Apply coordinate transformation 259
Transformation 259
Rotate axis to line 261
Transform coordinate system to rigid body (floor calibration) 262
Current calibration 264
Calibration quality 265
Synchronization 266
Wireless/software Trigger 267
UDP start/stop 270
UDP packet parameters 271
QTM sends start packet 271
QTM sends stop packet 272
QTM receives start packet 272
QTM receives stop packet 273
Trigger ports 273
Event port (Camera Sync Unit) 276
Pretrigger 277
External timebase 278
Timestamp 284
Synchronization output 285
Camera frequency - Shutter out (Oqus) 286
Frequency multiplier/Frequency divisor 287
Independent frequency 288
Measurement time (Oqus) 288



100 Hz continuous 289
System live time 289
Wired synchronization of active markers (Oqus) 290
Controlled by Analog board 290
Measurement time (Camera Sync Unit) 290
Analog boards 291
Analog board settings 292
Sample rate 292
Board settings 294
Force plate control 295
Compensate for analog offset and drift 295
Channels 297
Force plate control settings 301
Analog board type 301
Force plate auto-zero 302
Force plate control list 302
AV devices 304
AV device settings 305
Force plates 305
AMTI Digital force plates 306
Arsalis 308
Bertec corporation device 311
Kistler Force Plates 313
Instrumented treadmills 315
Gaitway-3D 315
EMGs 317
Gloves 317
Manus Gloves 317
Generics 318
h/p/cosmos treadmill 318
Processing 320



2D Preprocessing and filtering 323
Non-circular marker settings (Oqus) 323
Filtering 324
3D Tracking 325
3D Tracker parameters 325
Prediction error 326
Maximum residual 326
Minimum trajectory length 327
Minimum ray count per marker 327
Ray length limits 328
Rays 328
Auto join 328
Bounding box restricting 3D data 329
Auto range 330
2D tracking 330
Tracking settings 331
Auto join 331
2D to 3D settings 332
Twin System 333
Twin Slave System 333
Capture Frequency 334
Trajectory Conflicts 335
Twin System Calibration 336
Twin System Calibration dialog 337
Trajectories 338
Gap Fill Settings 338
AIM 339
AIM models 339
AIM model application parameters 340
Skeleton solver 341
SAL 343



Glove 344
6DOF tracking 345
6DOF Tracker parameters 345
Rigid bodies 346
Translate body 350
Rotate body 352
Coordinate system for rigid body data 354
Acquire body 356
Smoothing 6DOF data 356
Rigid body Mesh Settings dialog 358
Force data 360
General settings 360
Force plates 361
Force plate settings 362
Force plate type 362
AMTI calibration and settings (6 channels) 362
AMTI force plate calibration parameters 363
Force plate dimensions 363
Inverted sensitivity matrix 363
AMTI force plate settings 364
AMTI portable calibration and settings (8 channels) 365
AMTI 8-channel force plate calibration parameters 365
Force plate dimensions 366
Calibration matrix 366
AMTI 8 Ch force plate settings 366
Bertec calibration and settings 367
Bertec force plate calibration parameters 368
Dimensions 368
Calibration matrix 368
Bertec force plate settings 369



Kistler calibration and settings 370
Kistler force plate calibration parameters 370
Force plate dimensions 370
Kistler COP Correction 371
Kistler scaling factors 371
Calculating scaling factors with external amplifier 372
Maximum measurable force 373
Kistler force plate settings 373
Generic 6 ch (c3d type-1) 374
Force plate calibration parameters 374
Force plate dimensions 375
Force plate settings 375
Generic 6 ch (c3d type-2) 376
Force plate calibration parameters 376
Force plate dimensions 376
Force plate settings 377
Generic 8 ch (c3d type-3) 378
Force plate calibration parameters 378
Force plate dimensions 378
COP correction 379
Force plate settings 379
Generic 6 ch with matrix (c3d type-4) 380
Force plate calibration 380
Force plate dimensions 381
Calibration matrix 381
Force plate settings 381
Force plate location 382
Generate force plate location from a capture 382
Manual revision and specification of force plate location 386
COP (Center Of Pressure) threshold 386



Force plate settings status window 387
Real-Time output 387
6DOF analog export 388
Analog channel settings 389
Test output 390
Range calibration 391
Euler angles 392
Select Euler angle definition 393
Definition of custom rotation axes 393
Type of Euler axes 393
Rotation order and angle conventions 394
TSV export 397
Data type 397
General settings 398
2D settings 399
3D Settings 399
C3D export 400
3D Data 400
Label Format 401
Event Output Format 401
Parameter Groups 401
Units 402
Matlab file export 402
AVI Export 404
Window settings 404
Video settings 406
FBX export 409
File type 409
Exported data 409
Naming convention 410
JSON export 411



TRC export 412
STO export 413
Start program 414
GUI 415
2D view settings 417
3D view settings 419
Static mesh objects 424
Static Mesh Settings dialog 425
Compatibility of meshes 426
Miscellaneous 427
Folder options 427
Startup 429
Events 430
Event shortcuts 430
Scripting 431

System setup 432


System hardware 432
Qualisys camera types 432
Marker cameras 432
Video cameras 434
Streaming video 434
Miqus Video 434
Oqus color video camera (2c-series) 435
Miqus Hybrid 435
Oqus high-speed video camera 436
Cameras for special environments 436
Weather protected cameras 436
Underwater systems 437
MRI compatible cameras 438
Qualisys accessories 438



Traqr Configuration Tool 439
Setting up the capture space 439
Camera positioning 439
3D motion capture 440
2D motion capture 440
Connecting the system 441
Connecting the cameras 441
Technical notes 441
Power and camera cable requirements 441
Mixing Arqus and Miqus 443
Adding Miqus Hybrid or Miqus Video cameras 443
Connecting a Qualisys system through an Ethernet switch 444
Connecting an Arqus system 445
The Arqus startup sequence 447
Setting aperture and focus 447
Connecting a Miqus system 449
The Miqus startup sequence 451
Setting aperture and focus 451
Connecting a Miqus Hybrid or Video system for markerless mocap 452
Connecting an Oqus system 455
Oqus startup sequence 457
Setting the aperture and focus 458
Setup Oqus system for wireless communication (deprecated) 458
Connecting a mixed system 459
Network configuration 461
Network card setup 461
QDS 462
QDS menu 462
Network configuration wizard 465
Advanced 467



Camera blocklist 469
QDS conflict 470
Firmware update 470
Firmware update when locating system 470
Firmware update when starting preview 471
How to use Qualisys Firmware Installer (QFI) 471
Advanced firmware installer settings 475
Starting up the system 477
Locating the camera system in QTM 477
Outline of how to locate the camera system 477
Starting a preview 478
Optimizing the camera settings 479
Identifying and ordering the cameras in QTM 479
Identifying the cameras with the identification tool 480
Automatic ordering of the cameras 481
Tips on setting aperture and focus 481
Tips on marker settings in QTM 483
Linearization of the cameras 485
About camera linearization 485
Linearization file warning 486
Linearization procedure and instructions 486
Linearization concept 486
Feedback during the linearization procedure 487
Linearization instructions 488
Preparations 488
Performing the linearization 489
Evaluating the linearization 490
Synchronization 492
Timing hardware 492
How to use external trigger 492



How to use pretrigger 493
Measurement with analog capture while using pretrigger 493
How to use external timebase 494
Using External timebase for synchronization to a periodic TTL signal 494
Real-time 495
Capture 496
Using External timebase for synchronization to a time code signal 497
SMPTE 497
IRIG 498
Timestamps 498
Using Qualisys video with External timebase 498
External timebase with bursts of signals with constant period (cycle) time and with delays between the bursts 499
How to use PTP sync with an external clock master (Camera Sync Unit) 501
Connecting to an external clock master 502
How to use PTP sync with an external clock master (Oqus) 502
Connecting to an external clock master 504
Synchronizing external hardware 505
How to synchronize external hardware 505
Using Sync out for synchronization 505
Using Trig in for synchronization 508
Using Oqus sync unit for synchronization 509
Using SMPTE for synchronization with audio recordings 512
Audio recording with the MOTU 828mk3 513
Combining multiple systems 514
Twin systems 514
How to use frame synchronized twin systems with separate volumes 514
Requirements 514
Procedure 515
Performing a Twin calibration 519
Preparations 519



Performing a twin calibration 520
Reprocessing a twin calibration 521
Working with QTM twin files 522
Merging Twin files in reprocessing 523
Twin 3D data interpolation 524
Twin system with a shared volume 524
Setting up multiple video systems 525
Connect each video system 525
Configure each system to prepare for connecting them together 526
Connect the video systems together 527
Configure QTM for synchronous start 528

Running the system 529


Preparations 529
Choice of markers 529
Passive vs Active markers 529
Marker size 529
Marker placement 530
Qualisys active markers 531
How to use active markers 531
QTM settings for active markers 531
How to use sequential coding 532
How to use Active + Passive mode 533
Removing unwanted reflections 534
Delayed exposure to reduce reflections from other cameras 534
When to use delayed exposure 534
Setting up delayed exposure 535
Guidelines for use of exposure groups for fast movements 535
Marker masking 536
How to use marker masking 537
How to use auto marker masking 538



Active filtering for capturing outdoors 539
How to use active filtering 540
On-camera marker discrimination 540
Marker circularity filtering (Oqus only) 541
How to use non-circular marker settings 541
Calibration of the camera system 543
Introduction to calibration 543
Calibration dialog 545
Wand calibration method 547
Outline of how to calibrate (Wand calibration) 547
Calibration tips 548
How to move the wand 548
Extended calibration 549
Refine calibration 550
Advanced calibration (beta) 552
Requirements for the advanced calibration 552
How to perform an advanced calibration 553
Evaluation of the calibration quality 553
Restoring to factory linearizations in a project 555
Use scenarios 555
Starting to use the advanced calibration 555
Mixing advanced and standard calibrations 556
Using projects with different camera setups 556
Translating origin to the floor 556
Fixed camera calibration method 557
Calibration results 558
Quality results 559
View Calibration 560
Calibration failed 560
Calibration quality warning 561
Recalibration 563



Merge calibration files 565
Capturing data 566
Introduction to capture 566
Outline of how to capture 567
Start capture 569
Capture period 569
Capture delay and notification 570
Automatic capture control 570
Camera systems settings 571
Batch capture 571
Auto backup 572
Store real-time data 572
Qualisys video capture 574
Using and calibrating Qualisys video cameras 575
Calibrating Qualisys video cameras 576
Capture streaming video 576
In-camera MJPEG compression 576
Streaming Video settings 577
Maximum capture rate for streaming video 578
Miqus Video (VC+) 578
Miqus Video (VC, VM) 579
Oqus 2c 579
Capture high-speed video 579
Information about high-speed video capture 580
Outline of how to capture high-speed video 580
Codecs for Oqus high-speed video files 582
Recommended codecs 583
Video preview in QTM 584
Qualisys video files 586
3D data overlay on video 587



Capturing with multiple video systems for markerless tracking 588
Calibration 588
Setting up synchronization 589
Processing with Theia 3D markerless mocap software 589
Real-time streaming 590
Real time protocols 590
Resources 591
How real time works in QTM 592
Streaming data 592
Controlling QTM 593
Real-time frequency 593
Optimization of real-time performance 594
Real-time streaming of data from external input devices 594
Real time latency 595
Outline of how to use real time 596
Server mode 598

Processing data 600


Introduction to data processing 600
Reprocessing data 601
Reprocessing a file 601
Changing the calibration 604
Batch processing 605
Loading a saved calibration 608
Processing 2D data 609
How to use circularity filter (Oqus only) 610
How to use software marker masks 611
Adding and applying software masks 611
Undoing and removing software masks 612
How to use marker size filter 613
3D tracking measurements 614



Advice on how to set 3D Tracker parameters 614
3D tracking evaluation 616
Tracking and accuracy 617
2D tracking of data 618
Using 2D tracking 619
Identification of trajectories 620
Manual identification of trajectories 620
Quick identification method 620
Drag and drop method 621
Identify method 622
Tips and tricks for manual identification 622
Visualization tips 622
Identification methods 623
Other tips for navigation and trajectory management 623
Automatic Identification of Markers (AIM) 624
How AIM identifies trajectories 624
Generating an AIM model 625
General instructions 625
Create new model 628
Create new model based on marker connections from existing AIM model 630
Add to existing model 630
Guidelines for data added to AIM models 631
How to verify and edit AIM bones 632
Applying an AIM model 634
AIM models for multiple subjects in the same measurement 637
Unique subjects 638
Similar subjects 639
Use generic AIM model to create unique models 639
Editing of trajectories 640
Gaps 641



Identification and selection of gaps 641
Filling of gaps 642
Spikes 645
Detection and selection of spikes 646
Smoothing 646
Adding virtual trajectories 648
Create virtual trajectories via the Trajectory info window menu 648
Create virtual trajectories using the Trajectory Editor 648
6DOF tracking of rigid bodies 649
6DOF versus 3D 649
Creating 6DOF bodies 650
How to design a 6DOF body 650
Physical design of the rigid body 650
Choice of markers 651
Marker configuration 651
Using extra markers for the rigid body definition 652
Definition of 6DOF bodies 653
Creating new rigid body definitions from a measurement 654
Preparations 654
Method I: Define rigid body (6DOF) 655
Method II: Define 6DOF bodies using Acquire body method 656
Adding an existing rigid body definition 657
Method III: Load from file 657
Method IV: Manually create a rigid body definition 657
Editing rigid bodies 657
Definition of local coordinate system 658
Creating an active Traqr rigid body 659
Tracking 6DOF bodies 659
Calculating 6DOF data 660
Virtual markers calculated from 6DOF data 662
Rotation angles in QTM 663



6DOF real-time and analog output 665
6DOF analog output 665
Rigid body meshes 667
Sharing a file with a rigid body mesh 668
Examples of how to use 6DOF bodies 668
How to use 6DOF bodies 668
How to use 6DOF bodies in an AIM model 669
How to use virtual markers in an AIM model 670
Tracking of skeletons 671
Marker sets for skeleton tracking 672
Qualisys Animation Marker Set 673
Qualisys Sports Marker Set 674
Traqr VR Marker Set 675
Qualisys Claw Marker Set 676
Qualisys Full Fingers Marker Set 677
Adding extra markers to a skeleton 679
Assigning extra markers to a segment 680
Skeleton marker label mapping 681
Using a custom skeleton definition 682
Automatic labeling of markers or Traqr configuration for skeleton tracking 682
Generating an AIM model for animation 683
Using AIM for sports and biomechanics 685
Setting up the Traqrs for VR skeleton tracking 687
Skeleton calibration 690
T-pose 691
Hand calibration poses 692
Partial skeleton calibration 692
Scale factor 693
How to modify the skeleton definition 694
Skeleton template 694



Manual editing of the skeleton definition 696
Skeleton XML editing 697
How to measure skeleton data 699
How to process skeleton data 700
How to use SAL 700
Marker count threshold 701
Export and streaming of skeleton data 702
Force data calculation 703
Calculating force data 703
Viewing force data 704
How to use events 706
Adding events 706
Viewing and editing events 708
Exporting events 709
How to use Euler angles 709
Data export to other applications 710
Batch exporting 710
Export to TSV format 711
Motion data (.tsv) 713
Header 713
Data 715
6DOF data format (_6d.tsv) 716
Header 716
Data 718
Skeleton data (_s.tsv) 720
Header 720
Data 722
Analog data (_a.tsv) 722
Header 722
Data 724



Force data (_f.tsv) 724
Header 724
Data 726
Eye tracker data (_g.tsv, _e.tsv) 727
Export to C3D format 727
C3D file format 728
Export to MAT format 729
MAT file format 730
Export to AVI file 739
Export to FBX file 742
Export to JSON file 743
Export to TRC file 744
Export to STO file 745

External devices and integrations 747


How to use analog boards 747
Installing drivers for the A/D board 747
Installing the USB-2533 board 748
Installing the USB-1608G board 750
Connection of analog board 752
Analog offset warning 753
Analog saturation warning 756
How to use force plates 756
Digital force plate integrations 756
Connecting AMTI Digital force plates 756
Connecting Arsalis force plates 759
Hardware connections 759
Set up a data stream 760
Set up and configuration in QTM 761
Capturing, viewing and exporting data 763
Connecting Digital Bertec force plates 764



Hardware connections 764
Software requirements 765
Set up and configuration in QTM 765
Add input device 765
Synchronization settings 766
Set up of force data 767
Capturing, viewing and exporting data 768
Connecting Kistler digital force plates 768
Hardware requirements 768
Software requirements 769
Configuration of force plates 770
Network configuration 770
Force plate configuration 772
Hardware setup 776
Set up and configuration in QTM 777
Add input device 777
Synchronization settings 778
Set up of force data 778
Capturing, viewing and exporting data 780
Connecting Kistler DAQ Type 5695B 780
Hardware requirements 780
Software requirements 781
Configuration of force plates 781
Configuration of the Kistler DAQ Type 5695B 781
Force plate configuration 782
Hardware setup 786
Setup and configuration in QTM 786
Add input device 786
Synchronization settings 787
Set up of force data 788
Capturing, viewing and exporting data 789



Connection of analog force plates 790
Connecting Kistler force plates 790
Connecting AMTI and Bertec force plates 795
How to use instrumented treadmills 797
Connecting a Gaitway-3D instrumented treadmill 797
Hardware connections 798
Set up a data stream 799
Set up and configuration in QTM 800
Capturing, viewing and exporting data 802
Decomposition of force data 803
How to use EMG 803
Introduction 803
Wireless EMG systems 804
Delsys Trigno Integration 804
Hardware requirements 804
Software requirements 805
Hardware setup 805
Delsys Trigno Centro 805
Delsys Research+ 806
Sensor configuration 807
Setup and configuration in QTM 809
Add input device 809
Device settings 809
Configuration in QTM 811
Setting up Delsys Trigno in QTM 811
Synchronization settings 812
Capturing, viewing and exporting data 813
Delsys Trigno (API integration) 813
Hardware requirements 814
Delsys hardware 814



Hardware needed for synchronization 814
EMG sensors 814
AUX sensors 815
Sensors with limited support 815
Version information 815
USB drivers 816
Setting up Delsys Trigno (API) in QTM 816
Connecting Delsys Trigno (API) 816
Measurement time synchronization 817
Trigger synchronization with Qualisys trigger button 817
Trigger synchronization with Delsys Trigger Module button 818
Polarity of synchronization signals 819
Configuration of Delsys Trigno (API) in QTM 820
Scanning of sensors 821
Configuration of sensors 822
Configuration of channels 823
Synchronization method 824
Notes on sample rate per channel 825
Capturing, viewing, and exporting data 825
Delsys Trigno EMG (SDK legacy integration) 826
Trigno Avanti sensors 826
Trigno sensors 826
Trigno installation 827
Trigno computer installation 828
Trigno synchronization connections 830
Trigno trigger connection 831
Trigno Measurement Time connection 832
Delsys Trigno QTM settings 834
Making a measurement with Trigno 836
Export Trigno EMG and auxiliary data 838
Cometa EMG 838



Cometa installation 839
Cometa synchronization setup 840
Cometa QTM settings 841
Making a measurement with Cometa 842
Export Cometa EMG and IMU data 844
Cometa Systems 844
Requirements 845
Hardware requirements 845
WavePlus 845
WaveX 845
Driver install 846
Software requirements 846
Hardware setup 847
How to connect 847
Sensor configuration 848
Setting up Cometa Systems in QTM 850
Add input device 850
Device settings 850
Configuration in QTM 852
Select sensor configuration 852
Synchronization settings 853
Capturing, viewing and exporting data 853
Calibration of IMU sensors 853
Viewing data 854
Exporting data 854
Noraxon EMG 854
Requirements 854
Hardware requirements 854
Software requirements 855
Hardware setup and configuration 855
Add input device 855



Hardware setup 856
Noraxon Ultium EMG 856
Hardware connections 856
Device and sensor configuration 857
Noraxon Desktop DTS 859
Hardware connections 859
Device and sensor configuration 859
Configuration in QTM 862
Device settings 862
Synchronization settings 863
Capturing, viewing and exporting data 864
How to use eye tracker devices 864
Eye tracking hardware in QTM 864
Tobii eye trackers 865
Setting up Tobii Pro Glasses 2 in QTM 866
Set up and configuration in QTM 866
Setting up hardware synchronization 867
Setting up Tobii Pro Glasses 3 in QTM 868
Set up and configuration in QTM 868
Connecting a single Glasses 3 device 869
Connecting multiple Glasses 3 devices 869
Adding Tobii Pro Glasses 3 to QTM 870
Tobii Pro Glasses 3 device settings 870
Calibrating the glasses 871
Latency information 872
Setting up hardware synchronization with Tobii Pro Glasses 3 872
How to use gaze vectors in QTM 873
Setting up a gaze vector in QTM 873
Tobii rigid body definitions 874
Tips for improving the tracking of the glasses 875
Refinement of the Tobii rigid body definition 875



Tips for compensating latency 877
Making a measurement with Tobii 878
Process and export Tobii gaze vector data 879
Reprocessing of gaze vector data 879
Export formats for gaze and eye tracker data 880
TSV 880
MAT 881
Tobii data in QTM 883
Gaze vector data 884
Eye Tracker data 885
Tobii Pro Glasses 2 885
Tobii Pro Glasses 3 886
Gaze vector settings 887
ASL EyeHead integration 889
How to use motion gloves 889
Connecting Manus gloves 889
Overview 889
Components and requirements 889
MANUS setup 889
Connect the Manus Gloves 890
Setup of Manus Gloves in QTM 890
Making a measurement with Manus Gloves 891
Create a skeleton 891
Create bindings 891
Capturing, viewing and exporting data 891
Connecting StretchSense gloves 892
Components and requirements 892
Installing the StretchSense Gloves Integration 892
Setting up the StretchSense gloves 892
Connecting the gloves to the computer 892
Glove calibration and mapping 893



Setup and configuration in QTM 895
Add input device 895
Device settings 896
Configuration in QTM 897
Setting up the device in QTM 897
Processing settings 898
Capturing, viewing and exporting data 898
How to use external video devices 898
Video capture with Blackmagic Design cards 899
Installing the BlackMagic Design card 899
Connecting a video source to the Intensity Pro card 900
Connecting a video source to the Decklink Mini Recorder card 901
Using Blackmagic Design video source in QTM 901
Settings on Sony HDR-CX330 904
HDMI Resolution 904
Other recommended settings 905
Demo mode 905
Face detection 906
Settings on Sony HDR-CX430V 906
HDMI Resolution 907
Other recommended settings 908
Demo mode 908
Face detection 909
Panasonic AW-HE2 909
DV/webcam devices 909
Compression of video from video cameras 910
Video offset 911
Selecting audio source 911
Import video link 912
How to use generic devices 912



Connecting the h/p/cosmos treadmill 912
Hardware connections 912
Set up and configuration in QTM 913
Add input device 913
Capturing, viewing and exporting data 913

Applications 915
Analysis Modules 915
PAF module installation 916
Downloading installation files 916
New installation 916
Upgrading an existing installation 919
PAF Project view 921
Calqulus 923
Qualisys Cloud ecosystem 924

Technical reference 926


Qualisys technical reference 926
Overview of camera models and sensor specifications 926
Qualisys camera sensor specifications (marker mode) 926
Qualisys video sensor specifications (in-camera MJPEG) 927
Device specifications and features 929
Arqus cameras 929
General specifications and features 929
Arqus model specifications 931
Description of Arqus cameras 933
Arqus camera: front side 933
Arqus camera: back side 934
Mechanics 935
Physical specifications 935
Physical dimensions 936
Mounting 936



Optics and strobe 937
How to adjust aperture and focus 937
How to change the strobe unit 937
How to add or remove the sun filter 937
How to change the lens 937
Electrical specifications 938
Power supply 938
Power consumption 938
Communication 939
Digital IO 939
Miqus cameras 939
General specifications and features 939
Miqus model specifications 941
Miqus Video specifications 942
Description of Miqus cameras 944
Miqus camera: front side 944
Miqus camera: back side 945
Mechanics 946
Physical specifications 946
Miqus dimensions 947
Mounting 947
Optics and strobe 947
How to adjust aperture and focus 947
How to change the strobe unit 948
How to change the lens 948
Electrical specifications 948
Power supply 948
Power consumption 948
Communication 949
Digital IO 949
Camera Sync Unit 949



Specifications and features 949
Description of Camera Sync Unit 950
Camera Sync Unit: front side 950
Camera Sync Unit: back side 952
Mechanical and electrical specifications 953
Physical specifications 953
Mounting 953
Electrical specifications 953
Digital IO 953
Trigger inputs 953
Event/IRIG input 954
Synchronization input 954
SMPTE input 954
Genlock input 955
Synchronization outputs 955
Oqus cameras 955
General specifications 955
Oqus model specifications and features 957
Oqus video specifications 960
Streaming video 960
High-speed video 960
Description of Oqus devices 963
Oqus camera display 963
Oqus camera connectors 965
Oqus Sync Unit 967
Mechanics 968
Physical specifications 968
Mounting 969
Optics and strobe 970
How to adjust aperture and focus 970
How to change strobe unit 970



How to change lens 971
Electrical specifications 972
Power supply 972
Power consumption 972
Digital IO 972
Control connections 973
Analog boards 973
USB-2533 973
USB-1608G 975
USB-1608G specifications 976
Communication 976
Arqus and Miqus 976
Ethernet (Gigabit) 977
Oqus 977
Ethernet (Oqus) 977
WLAN (Oqus) 978
Environmental protection 978
Underwater 978
Cameras and specifications 979
Marker cameras 979
Arqus underwater 979
Miqus underwater 981
Oqus underwater 983
Video cameras 985
How to connect 987
Arqus and Miqus 987
Oqus 988
Mechanics and physical specifications 988
Arqus physical specifications 988
Housing and mount foot 988
Miqus physical specifications 989



Housing 990
Connection unit 990
Mechanics 990
Underwater mounting options 991
Arqus wall mount 991
Miqus wall mount 992
Mount foot 992
Wall mount 992
Quick Attach mount 992
Quick Attach mount base 993
Arqus Quick Attach mount 994
Miqus Quick Attach mount 994
Qualisys accessories specifications and features 995
Qualisys calibration kits 995
Carbon fiber 600 mm wand kit 996
Active 500 mm wand kit 996
Indicators and buttons 996
Assembly and disassembly 997
How to use the active 500 mm calibration kit 998
Configuration 999
Battery and charging 1000
Active marker types 1000
The Active Traqr 1000
Active Traqr description 1001
Active Traqr specifications 1002
The Naked Traqr 1003
Naked Traqr description 1004
Naked Traqr specifications 1005
The Short Range Active Marker 1005
Short range driver 1006
Active markers and IR eye 1008



Battery and charging 1009
Short range active marker specifications 1009
The Long Range Active Marker 1010
Marker maintenance 1010
Rotation angle calculations in QTM 1010
6DOF tracking output 1011
Calculation of rotation angles from the rotation matrix (Qualisys standard) 1011
Calculation of other rotation matrices 1013
Developers' resources 1015
Real-time protocol 1015
Project Automation Framework (PAF) 1016
QTM Scripting Interface 1017
Scripting components in QTM 1018
Scripting resources 1019
Using scripts 1019
Creating your own scripts 1020
Use of external packages 1021
Installing external packages for Python 1021
External packages for Lua 1022
QDevice API 1022
Troubleshooting QTM 1022
Troubleshooting connection 1022
Troubleshooting calibration 1024
Troubleshooting capture 1025
Troubleshooting tracking 1028
Troubleshooting reflections 1029
Troubleshooting force calculation 1030
Troubleshooting 6DOF 1031
Troubleshooting update 1032
Troubleshooting other 1033



Glossary 1035

Index 1049



General information

Important information

Intended use
Qualisys cameras are high-performance motion capture cameras intended to
be used within optical motion capture systems that capture three-dimensional
trajectories of reflective markers or objects. Software is available to calculate
derived measures based on this data.
If the motion capture system is intended for clinical use, characterized by performing measurements on patients to benefit the clinical assessment of that individual, Qualisys recommends using the QCS (Qualisys Clinical System) setup. If you would like to make a transition to the QCS, please contact [email protected].

Safety notices
IR radiation notice

The Qualisys camera uses short but quite strong infrared flashes to illuminate
the markers. The flash is generated by LEDs on the front of the camera. The
Qualisys cameras belong to the exempt group according to IEC / SS-EN 62471,
which means that the LED radiation is not considered to be hazardous.
However, any light of high intensity might be harmful to your eyes. Because
infrared light is invisible to the human eye, you can be exposed to IR light
without noticing. Therefore we recommend that you do not stare directly at the
LEDs at a short distance for a prolonged time period when the camera is running.
User safety

If any serious incident has occurred to a patient/user/other person possibly related to the usage of the product, please report this to us (the manufacturer) and the applicable competent authority of the state in which the user and/or patient is established.

Installation risks

Prioritizing safety is crucial when installing cameras. To minimize the risk of injury from falling cameras, it is important to follow proper procedures and precautions. Working at heights shall follow local routines and regulations.
Here are some guidelines to ensure a safe camera installation process:
- Do not stand below someone mounting a camera.

- Secure cameras placed at heights over 3 meters, for example using a Kensington lock.

- Use high-quality tripod heads (ask for advice from Qualisys AB if needed).

- Ensure mounts are securely attached to walls or ceilings.

Computer and internet


Computers purchased from Qualisys AB come with reliable hardware that works well together with compatible external devices, such as force plates, EMG devices and other compatible third-party hardware. Connecting the computer to the internet is optional. The security and safety of the data generated by the Qualisys system rely on the safety measures built into the Windows operating system and its proper use. Regardless of whether the computer is connected to the internet or not, make sure the following safety measures are followed:

- Make sure user accounts are set up with password requirements that follow standards set by the IT department. In the absence of such standards, we recommend passwords of at least 10 characters, containing lowercase and uppercase letters, numbers and special characters. Passwords should be changed at least every 6 months.

- Do not use accounts with administrative rights in the daily operation of the system. Administrative accounts should only be used when updating or servicing the software and/or the computer.

- Set the computer to lock the screen after at most 15 minutes of inactivity.

- Make sure Windows Defender is activated and up-to-date. If your IT department requires third-party firewalls or antivirus software, contact Qualisys support at [email protected] to make sure they will not interfere with the operation of the Qualisys system.

- To minimize the risk of external threats to the computer, it is recommended not to check email or receive messages through other channels on the QTM computer.

- Be sure to regularly, preferably every day, back up data/files to a server or an external hard drive.

- Software updates are available in the client login area of our website (https://www.qualisys.com).

EU customer information
Waste Electrical and Electronic Equipment (WEEE)

In the European Union (EU), waste from electrical and electronic equipment (WEEE) is now subject to regulation designed to prevent the disposal of such waste and to encourage prior treatment measures to minimize the amount of waste ultimately disposed. In particular, the EU WEEE Directive 2002/96/EC requires that producers of electronic equipment be responsible for the collection, reuse, recycling and treatment of WEEE which the producer places on the EU market after August 13, 2005. Qualisys is providing the following collection process to comply with the WEEE Directive.
Qualisys WEEE Collection Process

If you have purchased Qualisys products in the EU on or after August 13, 2005, and are intending to discard these products at the end of their useful life, please do not dispose of them in a landfill or with household or municipal waste. Qualisys has labeled its electronic products with the WEEE label to alert our customers that products bearing this label should not be disposed of with waste in the EU. Instead, Qualisys requests you to return those products using the instructions provided here, so that the products can be collected, dismantled for reuse and recycled, and properly disposed of.

Qualisys will take back WEEE, i.e. all of the electrical equipment which is part of
Qualisys equipment, from its customers within the EU. Please visit the website
www.qualisys.com/weee or contact Qualisys AB at [email protected] for
information on how to return your WEEE.

Hazardous Substances Declaration
In accordance with the electronic industry standard SJ/T11364-2006 of the People's Republic of China, this section provides the hazardous materials declaration for the Oqus, Miqus and Arqus series manufactured by Qualisys AB.

Toxic or hazardous substances and elements: lead (Pb), mercury (Hg), cadmium (Cd), hexavalent chromium (Cr(VI)), polybrominated biphenyls (PBB), polybrominated diphenyl ethers (PBDE).

Part name                        Pb   Hg   Cd   Cr(VI)   PBB   PBDE
Printed circuit assemblies       x    o    o    o        o     o
Display                          x    o    o    o        o     o
Buttons                          o    o    o    o        o     o
Internal wiring                  o    o    o    o        o     o
Housing                          o    o    o    o        o     o
Lens                             x    o    o    o        o     o
External cables and connectors   o    o    o    o        o     o
AC/DC power supply               o    o    o    o        o     o
Paper documentation              o    o    o    o        o     o
CD manual                        o    o    o    o        o     o

O: Indicates that the content of this toxic or hazardous substance in all homogeneous materials of this part is below the limit specified in the SJ/T 11363-2006 standard.
X: Indicates that the content of this toxic or hazardous substance in at least one homogeneous material of this part exceeds the limit specified in the SJ/T 11363-2006 standard.

System requirements

Qualisys system and computer


The measurement system consists of the following parts:

1. A Qualisys camera system.

2. A calibration kit.

3. A stationary or portable measurement computer meeting at least the below requirements. Contact Qualisys AB for more specific recommendations of configurations for a good user experience.
a. Operating system for running QTM: Windows 10 or 11 (64-bit). The
minimum required version is Windows 10, Version 20H1 OS Build
10.0.19041 (also known as version 2004). The Home edition and the
Windows 11 SE Education edition are not supported.
b. For external graphic boards (Nvidia or ATI) it is required to have at
least 512 MB built-in memory and support OpenGL 2.1 or higher. For
integrated Intel graphics the version must be HD Graphics 4600 or
later.

NOTE: QTM will start a wizard with fixes for the graphic
board, if a problem with the graphic board is detected. Follow
the instructions in the wizard to fix the problem.

c. It is required to have a dual- or multi-core processor with at least 2 GHz per core.
d. It is recommended to have at least 8 GB of internal memory and it is
required to have 2 GB for linearization of cameras.

e. It is required to have at least 900 pixels in vertical resolution to
make sure that all of the camera settings are visible in the 2D view
window.
f. There must be an Ethernet card that supports 1000Base-T.

g. For the built-in help to work you need a web browser that can
handle basic HTML5.

WARNING: QTM does not support the use of network drives or synchronized drives, such as OneDrive. The use of such storage facilities may lead to corrupted data files.
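Several of the computer requirements listed above can be checked programmatically before installing QTM. The following is a minimal sketch, not taken from the manual, using only the Python standard library on Windows; the thresholds (build 10.0.19041, 900 pixels of vertical resolution, 8 GB of memory) are the figures quoted in the requirements above.

```python
import ctypes
import sys

def check_qtm_computer():
    """Rough pre-installation check against the requirements listed above."""
    issues = []

    # Windows 10 build 10.0.19041 (version 2004/20H1) or later is required.
    win = sys.getwindowsversion()
    if (win.major, win.build) < (10, 19041):
        issues.append(f"Windows build {win.major}.{win.minor}.{win.build} is older than 10.0.19041")

    # At least 900 pixels of vertical resolution so all camera settings fit in the 2D view.
    screen_height = ctypes.windll.user32.GetSystemMetrics(1)  # SM_CYSCREEN
    if screen_height < 900:
        issues.append(f"Vertical resolution {screen_height} px is below 900 px")

    # At least 8 GB of internal memory is recommended.
    mem_kb = ctypes.c_ulonglong(0)
    ctypes.windll.kernel32.GetPhysicallyInstalledSystemMemory(ctypes.byref(mem_kb))
    if mem_kb.value < 8 * 1024 * 1024:
        issues.append(f"Installed memory {mem_kb.value / 1024 / 1024:.1f} GB is below the recommended 8 GB")

    return issues or ["Basic requirements look OK"]

if __name__ == "__main__":
    for line in check_qtm_computer():
        print(line)
```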

External devices
The following additional equipment can be used with Qualisys systems. For Arqus or Miqus systems a Camera Sync Unit is required for synchronizing or triggering external equipment. Alternatively, for Oqus systems without a Camera Sync Unit, a trigger/sync splitter cable or an Oqus Sync Unit connected to the control port of one of the cameras can be used.
- Analog interfaces (Measurement Computing), which allow capture of up to 64 channels of analog data (e.g. force plate data, EMG sensor data or other user specific analog data).

- Force plates, digital integration (AMTI Gen5/Optima, Arsalis, Bertec, Kistler, Gaitway3D).

- Wireless EMG devices (Cometa, Delsys and Noraxon).

- Blackmagic Intensity Pro and Decklink Mini Recorder cards for video capture, DirectShow compatible DV cameras or standard USB web cameras, with which QTM can record video sequences for documentation.

- Eye tracking glasses (Tobii).

Hardware compatibility and version requirements

Cameras

                              Win 11       Win 10
Arqus                         Yes          Yes
Miqus                         Yes          Yes
Oqus                          Yes          Yes

Analog boards/Force plates

                              Win 11       Win 10
USB-2533 1)                   Yes          Yes
USB-1608G 1)                  Yes          Yes
PCI-DAC6703 1) 2)             Not tested   Yes
AMTI Gen5/Optima              Yes          Yes
Gaitway 3D 3)                 Yes          Yes
Arsalis 4)                    Yes          Yes
Bertec 5)                     Yes          Yes
Kistler DAQ 5695 A/B          Yes          Yes
Kistler Digital force plates  Yes          Yes

1. With Instacal 6.73 or later (required for Win 11).

2. Analog output only.

3. Requires Gaitway-3D version 1.7.1 or later.

4. Requires 3D-ForcePlate version 1.1.5 or later.

5. For AM6500 and AM6800 amplifiers, the minimum required firmware version is 1130 (shown as "46A" on the AM6800 LED display).

Other hardware

                                             Win 11   Win 10
Cometa EMG                                   Yes      Yes
Delsys Trigno SDK (Legacy sensors) 1)        Yes      Yes
Delsys Trigno API (Avanti style sensors) 2)  Yes      Yes
Blackmagic Intensity Pro 3)                  Yes      Yes
Noraxon Desktop DTS                          Yes      Yes
Noraxon Ultium EMG                           Yes      Yes
Blackmagic Decklink Mini Recorder 4)         Yes      Yes
Tobii Pro Glasses 2                          Yes      Yes
Tobii Glasses 3                              Yes      Yes

1. It is recommended to download the Delsys SDK in QTM via Project Options > Input Devices > Download device drivers to make sure that you use a compatible version. For the current version of QTM the Delsys SDK version is 3.6. The SDK integration requires that the 32-bit version of the Delsys USB drivers is installed.

2. Requires base station firmware MA2919-BE1506-DS0806-US2008-DA0901/0000 and sensor firmware 40.49 or later. The API integration requires that the 64-bit version of the Delsys USB drivers is installed.

3. Requires a computer with a PCI Express slot and a graphics board that can handle hardware acceleration. Contact Qualisys AB to make sure that it works on your computer.

4. Requires Blackmagic drivers 9.7.2 or later, a computer with a PCI Express slot and a graphics board that can handle hardware acceleration. Contact Qualisys AB to make sure that it works on your computer.

Getting started

Qualisys Motion Capture System


A Qualisys motion capture system consists of Qualisys cameras connected to a computer running Qualisys Track Manager (QTM) software. The system captures and streams various types of motion data. It can also integrate with third-party data acquisition devices, such as force plates, EMG systems, and eye trackers.
Qualisys cameras and devices

Qualisys offers “marker cameras” for marker detection and “video cameras”
for synchronized video recording. The camera system can be calibrated
through various methods, primarily using wand calibration with a “Qualisys
calibration kit.” To synchronize with external devices, a “Qualisys Camera
Sync Unit” must be included in the system.
For detailed information about setting up a Qualisys motion capture system,
please refer to the System setup chapter.
Qualisys software

Qualisys offers the following software and resources.

Qualisys Track Manager (QTM)


The main software for Qualisys Motion Capture systems.

QTM Connect
Real-time QTM clients for specific external programs, such as Matlab,
LabView, MotionBuilder, etc.

Analysis modules
Predefined applications based on the Project Automation Framework
(PAF) in QTM, see chapter Applications.

QCloud
Online resources, including online processing and the web report center,
see chapter Applications.

Developers' resources
SDKs for building real-time clients for QTM, QTM scripting resources, and
Open Project Automation Framework (OpenPAF) resources, available via
the Qualisys GitHub page at https://github.com/qualisys.

The online resources for Qualisys users are available via the user dashboard at https://www.qualisys.com/my/. Additional resources are available via https://www.qualisys.com/downloads/.
For an overview of help and training resources, see chapter "Training
resources" below.
Qualisys user account

To access the Qualisys user dashboard, you need a Qualisys user account. To
create an account associated with your QTM license, follow the instructions
below:

1. Navigate in your browser to Qualisys.com, and hover over the lock icon to
log in or to sign up for an account.
2. When you log in for the first time, enter your QTM username and license
key so that you can access the relevant content. If you are a lab manager,
selecting the checkbox will let you add team members and customize the
online report center for your lab.
If you are new in a lab that already uses Qualisys, ask your lab manager to create a new account.

Training resources
Besides the QTM manual the following training resources are available.

Getting Started with your Qualisys System


The Getting Started with your Qualisys System guide (PDF) is included in
the welcome information for new customers. The Getting Started guide is
based on QAcademy tutorials, which can be referred to for more detailed
information. An online version of the Getting Started guide is available via
https://docs.qualisys.com/getting-started/content/getting_started.

QAcademy
QAcademy is the official Qualisys online training library, including video tutorials, courses and guides covering a wide range of topics from basic camera setup to advanced data processing. Most video tutorials also include a written manual. A selection of basic tutorials is publicly available. For access to all QAcademy resources, an active support contract is required. QAcademy can be accessed online via https://www.qualisys.com/my/qacademy/#!/.

Documentation
The installation of QTM contains a Documentation folder, including several PDF documents, for example, the latest Getting Started guide, a keyboard shortcut reference, and marker set guides for sports and animation. Documentation about the real time protocol of QTM is available in the RT Protocol folder or can be accessed online via https://docs.qualisys.com/qtm-rt-protocol/.

Software installation
Make sure you are logged in with an administrator account before you start installing QTM. To install the software, insert the USB installation stick, and locate and execute the QTM installer (QTM_yyyy_x_Build_xxxx_Setup_xxxxxxxx.exe). You can also download the QTM installer via your registered client account at http://www.qualisys.com/my/.
Follow the instructions given during the installation. In the installer you can select the language for the menus and dialogs in QTM. There are three available languages: English (default), Chinese and Japanese.
During the installation you can select the components that you want to include. The following components can be selected:
- Instacal (A/D board driver).

Enter the user name and the license ID that you have received from Qualisys AB, see chapter "QTM registration" on the next page.

If there is an internet connection, QTM will automatically check for updates when it is started. You can also use the Check for updates option on the Help menu, or find the latest software updates by logging in with your registered client account at http://www.qualisys.com/my/.
QTM registration

The first time QTM is started you must enter a user name and a license key. These are provided on the front cover of the QTM installation USB.

NOTE: If the license is time limited you must check the Time limited
checkbox and enter the correct expiration date.

Once you have registered QTM you can proceed to create a project, see chapter
"Starting QTM" on page 58.
Adding licenses

For some analysis modules or plug-ins a license request will appear when you start QTM. In those cases just enter the user name and license key in the dialog. However, other plug-ins must be installed after QTM has started, e.g. the MotionBuilder plug-in. To enter a plug-in license in QTM, click on About Qualisys Track Manager in the Help menu.

In the About Qualisys Track Manager dialog you can see information about
the current version of QTM. Click on Licenses to view the installed licenses and
add new licenses.

Click on Add plug-in license to install a new license.

Enter the license key in the dialog and then click OK.

NOTE: If the license is time limited you must check the Time limited
checkbox and enter the correct expiration date.

Import licenses from a file

Alternatively, the licenses can be imported from a text file (*.licenses). The information in the file is organized per row in the following format (replace text with registration data, using exact names for the QTM user name and the plug-ins, which you can find at http://www.qualisys.com/my/):

NOTE: File import is not possible for time limited licenses.

QTM user interface

Running QTM

Starting QTM
The first time you start QTM on a new computer, QTM will prompt you to create
a project.

There are three options:

Create project
This is the default option because you must have a project to capture data
in QTM, see chapter "Creating a new project" on page 69.

Open project...
Use this option to open a project folder that has been copied from
another computer.

No project
If you only want to open QTM files you can start QTM without a project,
but you will not be able to capture any data or change any project
options.

Once you have created one or more projects on the computer, QTM will open
by default with the Manage projects dialog to select a project. For more
information about the dialog see chapter "Manage projects" on page 72.



It is also possible to choose to open the most recent project or a selected project when starting QTM. To set these, click the button, see chapter "Startup" on page 429.
You can also start QTM with a project by double clicking the Settings.qtmproj file
in the project folder.

NOTE: QTM will use the latest calibration made on the computer that was made with the same cameras (placed in the same order), even if it is not included in the current project.

QTM main functions


After opening QTM with a project, the QTM main window is displayed.

You can now start managing your project and capturing data. The main functions in QTM are:



1. Open the Project Options
This is the place where you can manage your camera system and other
input devices, set up your processing steps and configure your
QTM workspace, see chapter Project options.

2. Start a preview
Press the New measurement button (Ctrl + N) to start the cameras in
Preview (real-time) mode. This requires that a Qualisys camera system
is connected.
In Preview mode the motion capture data is displayed in real-time in the 2D and/or 3D View windows, see chapter "View windows" on page 84. The data can also be accessed via a real time TCP/IP protocol, see chapter "Real-time streaming" on page 590 and the short streaming sketch after this list.

3. Start a calibration
Calibrate your camera system for capturing 3D motion data, see chapter
"Calibration of the camera system" on page 543.

4. Start a capture
Start a capture to record your motion capture data in a file, see chapter
"Capturing data" on page 566.

5. Open a file
Open an existing file with recorded motion capture data. QTM will display
the data in File mode in which you can process, manage and edit the
recorded motion capture data, see chapter Processing data.
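As a complement to point 2 above, the real-time data can also be consumed from a script via the TCP/IP protocol. The following is a minimal sketch, assuming the Qualisys Python real-time SDK is installed (for example via pip install qtm-rt) and that QTM is streaming on the same machine; check the SDK documentation for the exact package and function names.

    # Minimal sketch: print the number of 3D markers in each real-time frame.
    # Assumes the Qualisys Python real-time SDK (pip install qtm-rt) and a QTM
    # instance streaming on localhost; adjust the host address as needed.
    import asyncio

    import qtm_rt


    def on_packet(packet):
        # Called once per streamed frame.
        header, markers = packet.get_3d_markers()
        print(f"Frame {packet.framenumber}: {len(markers)} markers")


    async def main():
        connection = await qtm_rt.connect("127.0.0.1")
        if connection is None:
            print("Could not connect to QTM")
            return
        # Stream the 3D component and handle each frame in on_packet.
        await connection.stream_frames(components=["3d"], on_packet=on_packet)
        await asyncio.sleep(10)  # stream for ten seconds
        await connection.stream_frames_stop()


    if __name__ == "__main__":
        asyncio.run(main())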

Projects
QTM needs a project to capture measurement data. The project is a folder that
contains all the files and information needed for QTM to process the data. A
project can therefore be easily transferred to for example another computer
with all the settings and files needed for the processing. To create and use pro-
jects follow the instructions in the chapters below.



Project folder
The project folder is used for organizing a QTM project. The folder is displayed
in Windows explorer with a special icon to indicate that it is a QTM project
folder. It contains the following files and folders:

Data
This is the default location for the captured QTM files. You can create sub-
folders in this folder if you want to sort the files, for example for different
subjects.

AIM models
This folder contains all of the AIM models created in the current project.

Calibrations
This folder contains all of the calibrations made in the current project.

Meshes
This folder contains all of the meshes associated with the current project.

Messages
This is the folder for the messages log files.

Settings
This folder contains the backups of project settings.

Settings.qtmproj
This file contains the current settings of the project.

NOTE: The project file may change format with a new version of
QTM. A backup of the previous version of the file is saved; it can be
named, for example, Settings.qtmproj.ver_100-101.
Before QTM 2.9 the project file was called settings.qps. A backup
(.bak) of this file is saved when it is converted to the qtmproj format.
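Put together, a project folder on disk typically looks like the listing below (the project name, the sub-folder under Data and the QTM file name are only examples):

    MyProject\
        AIM models\
        Calibrations\
        Data\
            Subject01\
                Trial01.qtm
        Meshes\
        Messages\
        Settings\
        Settings.qtmproj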

QTM creates a default project folder called Qualisys in My Documents when installing QTM on a new computer. However, you can create a new project folder anywhere. For example, if you want different users on the computer to access the same project, it can be saved under the Public user in Windows 7. Copy the whole folder if you want to share the project with someone else.
There are other settings and files which are needed for all users that use QTM on a computer. To see where these are saved, check the Folder options page in the Project options, see chapter "Folder options" on page 427.

NOTE: The QTM program and other components installed by the QTM
installer are placed in Qualisys Track Manager folder under \Program
Files\Qualisys.

WARNING: QTM does not support the use of network drives or syn-
chronized drives, such as OneDrive. The use of such storage facilities may
lead to corrupted data files.

Project view
The Project view window displays the data files included in the current project.
The view consists of two parts: Project data tree and Details. The name of the
current project is displayed at the top of the Project view.
To open the Project view window, go to the View menu and select Project view; the shortcut to toggle the window is Ctrl + R.
Project data tree

The Project data tree window is used to display and open the files in the Data
folder of the current project. This includes folders and QTM files, but also all
other files. You can drag and drop files to the Project data tree from for
example Windows explorer. If the file is dragged from another folder on the
same hard drive it is moved to Data folder. You can toggle whether to copy or
move the file with the Ctrl key. If the file is on another hard drive or the net-
work then a new copy is made in the Data folder.



There are two commands at the bottom of the data tree: Add (New folder or New measurement) and Open.

Add

New folder
When adding a folder it is placed in the currently selected folder; if none is selected it is placed in the root of the Data folder.

New measurement
Adding a new measurement is the same as starting a new capture with the Capture command on the Capture menu; the folder to save the file in is set to the currently selected folder in the project view.

Open
The Open command is only available when a file is selected and it will then open that file. If it is a QTM file it is opened in the current QTM window or a new window, depending on the current settings on the GUI page in Project options. Other files are opened with their corresponding Windows program.



Find
The data tree can be searched with Find (Ctrl + F). When searching the
first occurrence of the search term will be highlighted in the data tree.
The Find function is also available via the Edit menu.

Find next
Use Find Next (or F3) to search for the next occurrence of the current
search term.

The QTM file that is open in the current QTM window is displayed in bold. If you
open a file that is already open in another QTM window you will switch to that
window.

Right-click on a file or a folder to get the following options.

Open (only available for files)


Opens the selected file in QTM if it is a QTM file. Otherwise it is opened by
its corresponding Windows program.

Open in New window (only available for QTM files)


Opens the QTM file in a new QTM window. It overrides the Close current
file option on the GUI page.

Open folder in explorer


Open the currently selected folder in Windows explorer so that you can
for example move the files.



Copy folder path
Copy the folder path for the currently selected folder.

Copy file path


Copy the file path for the currently selected file.

Rename
Rename the selected file or folder.

Batch Process...
Batch process the currently selected files. It will open the Batch processing dialog so that you can select which processing steps you want to apply. For more information about batch processing, see chapter "Batch processing" on page 605.

Batch Export...
Batch export the currently selected files. It will open the Batch exporting
dialog for selecting the export formats and their options. For more inform-
ation about batch exporting, see chapter "Batch exporting" on page 710.

Delete
Delete the selected file or folder.

Add

New folder
Add a new folder in currently selected folder. If no folder is selected
it is placed in the root of the Data folder.

New measurement
Add a new measurement, which is the same as Capture on the Capture menu; the folder to save the file in is set to the currently selected folder in the project view.

Refresh
Refresh the content in the Project data tree so that it matches the con-
tent of the project data folder. This is usually done automatically by QTM.



Details

The Details window displays information about the currently selected file. The
displayed data is:

File name

File type

File size

File created
This is not the time when the data was captured, but the time when the file was created on the computer, which is not the same if the file has been copied.

File modified

Full path

Using projects
The following sections describe some typical use scenarios and recom-
mendations on how to use projects. However, projects can be used in many dif-
ferent ways, depending on how you want to organize and share your settings.
Think about the following when deciding how to use the projects.



l When to create a new project.
l It is usually recommended to use one project per marker setup.
Then you can save different subjects in sub-folders of the project
data folder.
l You may of course create any number of projects based on, for
example, the people using the lab.
l Settings to use when creating a new project, see chapter "Creating a new
project" on page 69 and "Project presets" on page 74.
l It is recommended to save all of the measurement in the project data
folder, because then you can browse them in the Project view window,
see chapter "Project view" on page 62.
l Do you want to backup settings, see chapter "Backup of project settings"
on page 73.
l Do you want to view the list of recent projects at startup or always open a
certain project, see chapter "Starting QTM" on page 58.
Projects with qualified users

For qualified users, projects are helpful to manage the settings of multiple pro-
jects or studies. Below follow some suggestions.
l Create a new project when there is a new study, for example with a spe-
cific marker set.
l Save all of the QTM files in the project data folder as you can then browse
the data in the Project view in QTM, see chapter "Project view" on
page 62.
l Make a backup of the project settings if you like to be sure that you can
always go back to the settings you know are correct, see chapter "Backup
of project settings" on page 73.
l Pin projects that are often used in the Open project dialog, see chapter
"Manage projects" on page 72.
l Create a Project preset if you create a lot of new projects and want to
use the same settings when it is created, see chapter "Project presets" on
page 74.



NOTE: If someone uses a project, any changes they make to the settings are
saved automatically.
If you want to completely protect your projects from other users, it must
be done via Windows, for example by having multiple logins to the computer.

Projects with students

When working with students using QTM, projects can be used as follows.
l Create a Project preset so that the students can start with the same set-
tings, see chapter "Project presets" on page 74.
l Tell the students to create a new project with the Project preset and save
it at a specific place. The default folder is Documents in Windows, but
it can be changed, see chapter "Folder options" on page 427.
l Make sure that the QTM files are saved in the project data folder. It is the
default path if nothing has been changed in the project settings.

NOTE: It is a good idea to use a different Windows login for the students
so that they cannot access other projects.
If the students can access the other projects make sure that you backup
the settings of the projects, see chapter "Backup of project settings" on
page 73.

Using projects on multiple computers

If you use a project on multiple computers, you can either synchronize the
whole project, including the project settings, or only selected folders, for
example the Data and AIM folders.
l Synchronizing the Settings.qtmproj file involves the risk of losing settings,
for example recent changes of the camera settings.
l On the other hand, when not synchronizing the Settings.qtmproj file, you
have to remember to change settings in both projects manually, if
needed.
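
As an illustration of synchronizing only selected folders, the minimal sketch below copies just the Data and AIM models folders from one local project folder to another, leaving Settings.qtmproj untouched. The paths are placeholders, and any file synchronization tool can be used instead.

    # Minimal sketch: copy only the Data and "AIM models" folders between two
    # local QTM project folders, leaving Settings.qtmproj untouched.
    # The project paths below are placeholders; adjust them to your own setup.
    import shutil
    from pathlib import Path

    SOURCE_PROJECT = Path(r"C:\Users\Public\Documents\GaitStudy")
    TARGET_PROJECT = Path(r"D:\Projects\GaitStudy")

    for folder in ("Data", "AIM models"):
        src = SOURCE_PROJECT / folder
        dst = TARGET_PROJECT / folder
        if src.is_dir():
            # dirs_exist_ok merges into an existing folder (requires Python 3.8+).
            shutil.copytree(src, dst, dirs_exist_ok=True)
            print(f"Copied {src} -> {dst}")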



If you move a complete project to a computer with a different camera system,
you need to decide on how to handle the camera settings.
l When you press the New button, QTM will automatically detect if the cam-
eras have changed and you will get the option to import all the settings
from another project. Doing so will overwrite all the settings in the cur-
rent project, so you will need to manually change any settings that differ
again, for example the used AIM models.
l If you locate the system first, QTM will not give a warning for a new camera system with the option to import settings. In this case, you will need to configure the camera settings from scratch.
If you just move or synchronize the data files, there is no problem in opening
them on the other computer because all of the information needed to repro-
cess the file is included in the file.

WARNING: QTM does not support the use of network drives or syn-
chronized drives, such as OneDrive. The use of such storage facilities may
lead to corrupted data files.

Creating a new project


A new project is created with the New project option on the File menu or in
the Switch project dialog. The following dialog is opened with options to cre-
ate a new project.

Enter the following information to create the project:



Project name
The project name will also be the name of the project folder. The folder is created automatically if it does not exist at the path specified in Root folder path.

NOTE: To change the project name after it has been created use
the Rename Project option on the File menu.

Project folder path


This is the path of the project folder. The default is the Documents folder
in Windows. Click Browse to change the path.

NOTE: The Default path for new projects can be set on the
Folder options page in the Project options dialog.

NOTE: If you want several computer users to access the same pro-
ject, it must be saved in a folder which all users have access to, for
example in the C:\Users\Public folders on the computer.

Base the new project on:


Select the source of the settings that are used when creating the project
with the drop-down box.

Custom preset: ...


Any project presets available on the computer are displayed at the
top of the list, for more information about the presets see chapter
"Project presets" on page 74.



NOTE: QTM will use the latest calibration made on the com-
puter, that was made with the same cameras (placed in the
same order). It means that the calibration may be loaded from
another project when switching projects.

The current settings


The settings are copied from current settings in Project options.
This is the default option when creating a project with the New pro-
ject option on the File menu.

NOTE: There is no current project when you start QTM; therefore, the default option is then Settings imported from another project.

Settings imported from another project


Browse for the project that you want to base the new project on and
open the project file.

Default settings
All settings are set to the default values and all of the current set-
tings are deleted.

Use PAF module


Check if you want to create a project with project automation. QTM
includes example projects for Excel, Matlab and Python. If you have PAF
analysis modules installed on your computer, they will show up in the list.
For more information about PAF, see chapter "Analysis Modules" on
page 915.

NOTE: If you select any of the options other than the Current settings,
you will lose the current settings in Project options. If you haven't got
any projects in the recent projects list, then the settings are saved in a
backup in C:\ProgramData\Qualisys\Workspace backups.



Manage projects
Use Manage projects on the File menu to manage the projects, for example to switch to a recently used project or open another project. When switching projects, all of the currently open files will be closed and you have to save any unsaved files. Then the settings in Project options and other settings, such as those in the Start capture dialog, are loaded in QTM from the new project.

NOTE: QTM files outside the project can still be opened and processed if you like. QTM will then indicate in the title bar that the file is not in the current project.

NOTE: QTM will use the latest calibration made on the computer, that
was made with the same cameras (placed in the same order). It means
that the calibration may be loaded from another project when switching
projects.
The calibration is also checked if the camera configuration has changed
at Locate or New. If there is a matching calibration for the new camera
configuration, it will be loaded in the project.

The list in the Manage projects dialog displays the 100 most recently used projects. Double-click on a project to open it. You can pin projects so that they are always at the top of the list; note that you have to click Open for the pinning change to take effect.
The current project is shown with [Current project] next to the name. The current project is also displayed in the title bar of QTM and Project options.



NOTE: When QTM starts up there is no current project, since it has not
been loaded yet.

The following options are available in the dialog:

Browse
Use this option to open a project that is not in the list above, e.g. one
copied from another computer.

New project
Create a new project, see chapter "Creating a new project" on page 69.

Open settings for the startup of QTM, see chapter "Startup" on page 429.

Open
Open the project selected in the list above.

Backup of project settings


The project settings can be saved in a backup file. The backup can then be used
if you want to use two different sets of settings in the same project or if you
want to be sure that the settings are not changed by someone else, because
the project settings are saved automatically when they are changed in QTM.
Click on Settings backup and restore in the File menu to create and manage
the backups.



Backup
Enter the Backup name and click Create backup to save a backup of the
current settings in Project options.

Restore
The list displays all of the backups saved in the current project. Select one
and click Restore backup to copy the settings to Project options.

WARNING: All current settings in Project options will be replaced


with the settings from the backup.

NOTE: QTM will use the latest calibration made on the computer,
that was made with the same cameras (placed in the same order). It
means that the calibration may be loaded from another project
when switching projects.

Project presets
The project presets can be used when creating projects to make sure that you start with the same settings. A project preset contains all of the QTM settings (Project options and other settings such as those in the Start capture dialog). The preset also contains any AIM models that were in the project AIM folder when the preset was created.

NOTE: The presets are saved globally, in the folder C:\Users\...\AppData\Roaming\Qualisys\Project presets\, so that they can always be accessed when creating projects in QTM.



To create a preset follow these steps:

1. Open a project with the settings and AIM models that you want to use.

2. Go to the File menu and select Settings management/Project presets to open the Project presets dialog.

3. Enter the name of the preset and click on Create preset.

To use a preset follow these steps:

1. Go to the File menu and select New project.

2. Enter a Project name and select the settings to Base the new project on
from the drop-down list. The presets are listed as Custom preset: followed
by the name.

Maintenance of presets

If you want to change any settings in a preset you need to create a project with
the preset and then change the settings. Then create the preset again with the
same name.
A preset can be deleted from the Project presets dialog.



Opening a project, restoring or importing project settings
When opening a project, restoring project settings that have been saved as a
backup, or importing project settings from another project, all of the settings in
the Project options are changed. The only exception is the calibration. QTM
will use the latest calibration made on the computer, that was made with the
same cameras (placed in the same order). It means that the calibration may be
loaded from another project when switching projects.
The calibration is also checked if the camera configuration has changed when
locating the camera system or starting the cameras in preview mode. If there is
a matching calibration for the new camera configuration, it will be loaded into
the project.

QTM windows
QTM can display various types of windows. Some windows are confined to the
QTM main window, whereas other windows can be either floating or docked.
Floating windows are not confined to the QTM main window and can be freely
positioned anywhere on the computer screen, even when using multiple dis-
plays. Floating windows can also be docked to specific docking locations in the
QTM main window or another floating window. The maximum number of win-
dows that can be simultaneously displayed in QTM is 30.

Overview of window types in QTM


Main window

The main window is opened when QTM is started.



The main window contains the following elements:

Title bar
Title bar of the QTM window displaying the current measurement or file,
the current project, and the current user in case a user has logged in to
QTM.

Menu bar
List of drop-down menus located at the top of the main window for access
to all QTM commands, see chapter "Menus" on page 184.

Toolbars
Collection of toolbars located below the menu bar for access to the most
important QTM commands, see chapter "Toolbars" on page 199.



Status bar
Status bar located at the bottom of the main window, showing real-time
system information during a measurement, see chapter "Main status bar"
on the next page.

Timeline control bar


Control bar for time navigation when in file mode, see chapter "Timeline
control bar" on page 133.
Primary windows

The following windows are shown within the QTM main window:

Primary 2D or 3D View window


The primary 2D or 3D View window is always displayed when measuring or when a file is open. For more information, see "View windows" on page 84.

Project view window


Window showing the content of the Data folder in the current
QTM project, see chapter "Project view" on page 62. The Project view win-
dow is always displayed on the left side of the QTM main window when
activated.

Messages window
Log of events since the start of QTM, see chapter "Messages window" on
page 179. The Messages window is always displayed at the bottom of the
QTM main window when activated.
Floating windows

The following windows can be used as floating or docked windows:

Additional 2D or 3D View windows


Additional 2D or 3D View windows can be opened via the Window menu
or the Standard toolbar. By default, the first two additional windows will
be docked to the right of the primary View window.



Trajectory info windows
Windows showing information about 3D trajectories, see chapter "Tra-
jectory info windows" on page 137. The default docking location is on the
right side of the QTM main window.

Data info windows


Windows displaying various types of data, see chapter "Data info window" on page 167. The default docking location is on the left side of the primary View window.

Plot windows
Windows containing graphs of data, see chapter "Plot window" on
page 179. The default docking location is on the left side of the primary
View window.

Trajectory Editor window


Window for displaying and editing 3D trajectories, see chapter "Trajectory
Editor window" on page 159. The default docking location is at the bottom
of the primary View window.

Trajectory Overview window


Window for displaying an overview of labeled trajectories, see chapter
"Trajectory Overview window" on page 166. The default docking location
is at the bottom of the primary View window.
Main status bar

The main Status bar contains messages about what is going on in QTM, e.g.
when QTM is capturing or processing data. There can also be status messages
for the real time processing and the camera synchronization.
It also shows the latency and the different frequencies during real-time and
when capturing a measurement. The frequencies are updated continuously so
that if the computer cannot process the data fast enough the frequencies will
decrease. Next to the frequencies is a symbol that shows the status of the cal-
ibration, see chapter "Introduction to calibration" on page 543.

Time code
Displays current value of incoming time code (SMPTE, IRIG, Camera time).



Latency
The current real-time latency. Only displayed if enabled on the GUI page
in the Project options dialog. For more information see chapter "Real
time latency" on page 595.

GUI
This is the update frequency of the QTM GUI. It can be changed on the
GUI page in the Project options dialog.

RT
This is how fast the data is processed by QTM in real-time. The frequency
is set on the Camera system page in the Project options dialog or for a
measurement in the Start capture dialog.
The RT frequency can be lower than the camera frequency in two cases.
First in RT/preview and Capture mode if the camera frequency is too high
so that the computer cannot process all of the data. The second case is
during a measurement in capture mode if Reduced real time frequency
is selected.

NOTE: During real-time QTM can display a warning text in the main status bar, if not all frames are received from the cameras (dropped frames) or if some cameras are lagging behind (frame mismatch). Both of these problems are caused by a real-time frequency that is too high for what you are measuring.

This is how fast the data is captured by the cameras. In RT/preview mode the frequency depends on the Real time frequency setting on the Camera system page in the Project options dialog. When Reduced real time frequency is selected, the frequency will be displayed as reduced in RT in the status bar. During a measurement the displayed frequency is always the same as the Marker capture frequency.



NOTE: When external timebase is used EXT will be displayed in
front of the frequency.

Window handling
Docking and floating

You can float docked windows by clicking and holding the title bar of the window and dragging it from its current dock. For tabbed windows, you can click the tab and drag it from the dock.
To dock a window, click the title bar, drag it to the workspace where you want to dock it, and drop it onto one of the dock symbols that appear in the interface. When hovering with the mouse over a dock symbol, the dock position of the window is indicated by a colored area.
The dock locations are:

Dock in the center of the workspace as a tab

Dock on the left side of the workspace

Dock on the right side of the workspace

Dock on the top side of the workspace

Dock on the bottom side of the workspace

Arranging windows

Floating windows can be manually resized and freely arranged on the com-
puter screen. Specific window arrangements and customizations can be easily
restored by saving them as a Window layout.



TIP: An easy way to create floating windows of the same size is by first
adding them as tabs in a floating window and then untabbing them.

Arranging windows on multiple computer displays

It is also possible to distribute floating windows across multiple computer displays with display extension and store the arrangements as window layouts in QTM.

NOTE: When using multiple displays, all displays must have the same
scaling in the Windows display settings.

Window layouts

With the window layouts you can save customized layouts, which include the
placements of all QTM windows, both docked and floating. The layouts are
saved in the project and can therefore be reused on any capture file.
To use window layouts, click Window layouts on the Window menu. There are 5 shortcut layouts, which can also be applied with keyboard shortcuts (Ctrl + 1-5), and two default layouts. The default layouts are for file and capture mode; they are used when opening a saved file and when opening a new capture file before a measurement, respectively.

To save the current layout click Save as and then the desired layout. The 5
shortcut layouts can also be saved with the keyboard shortcuts Ctrl + Shift + 1-
5.



The list below shows which objects are saved in a window layout. For all of the objects the placement and size are saved. For some of the windows, other properties are also saved in a layout as specified below.
2D view windows
l The displayed cameras and their zoom.

3D view windows
l Zoom and orientation of the coordinate system and the trace range.

Data info window


l The data type

NOTE: If the selected data type is not available in the file the
2D data is shown instead.

Plot windows
l The analysis or data plot which was used when saving the layout.
The measurement must have labeled trajectories with the same
name as in the saved layout or the same data in the Data info win-
dow.

NOTE: If all labeled trajectories were selected for the plot, the
labels of the trajectories are insignificant. And therefore the
layout will work for any file with labeled trajectories.

l Modified settings in the Plot menu.

l Plot style (dark or light mode).



Trajectory info windows

Trajectory Editor window

Trajectory Overview window

Timeline
l The display settings for the Timeline, not for example the meas-
urement range.

Toolbars

View windows
In a View window the motion capture data can be viewed in 2D or 3D. The video data of Oqus cameras and DV/webcams is displayed in the 2D view window.
For each view there is a View window menu with settings. The menu is
accessed by right-clicking in a View window.
The Timeline control bar is common for all View windows and placed at the
bottom of the QTM window, see chapters "Timeline control bar" on page 133.
2D view window



The 2D view window contains the camera feeds of the cameras connected to
QTM. The different types of data that can be displayed are:
l Marker data of Qualisys cameras in Marker mode.

l Colored intensity maps of Qualisys cameras in Intensity mode.

l Video images of Qualisys cameras in Video mode or external video devices.

Camera feeds

The following items are displayed in the 2D view for a camera.

Camera information

Camera ID
The number in the lower left corner of the 2D view of a camera is the cam-
era id. A motion capture camera is displayed as for example #1, and a
DV/webcam camera is displayed as for example #1V.

Camera type
After the camera ID, the type of camera is indicated for Qualisys cameras.

Exposure group
When delayed exposure is enabled, the exposure group number is dis-
played after the camera type, e.g. (exp. group: 1) for cameras in expos-
ure group 1.

Number of detected markers


The number in the lower right corner is the current number of markers
seen by the camera.
l A V is displayed instead, if a motion capture camera is in video
mode.

A red warning sign


If the camera encounters a problem with the detection of markers, a red
warning sign is displayed in the top left corner. Hover the mouse over the
warning sign to get more information.



NOTE: The most common problem is that the camera does not
have time to calculate all of the markers. This is actually caused by
too much background light. If this happens, reduce the Exposure
time or increase the Marker threshold.

Image area

Image size
The current Image size of the camera is shown as a red square and the
part of the image that is outside the image size is greyed out. For most
types of Qualisys cameras the Image size can be changed with the Image
size tool on the 2D view toolbar, see chapter "2D view toolbar" on
page 89.

NOTE: In a file, only the active part of the sensor is displayed. I.e. if
the Image size has been reduced, then the aspect ratio of that 2D
view will match the reduced image size.

Marker masks
l Green squares indicate camera marker masks, see chapter "Marker
masking" on page 536.
l Blue squares indicate software marker masks, see chapter "How to
use software marker masks" on page 611.

Detected markers

Position and size


Position and size of detected markers for Qualisys cameras in marker mode. The center of the 2D markers is indicated by a crosshair.



Marker colors
The default color of the detected markers is white. In case rays are
displayed the colors of the markers correspond to the associated 3D
trajectories.

Marker segments
The 2D markers are color coded when marker filtering is activated
(only available for Oqus cameras), see chapter "Marker circularity fil-
tering (Oqus only)" on page 541.

NOTE: If the markers are grey and it says Not used for tracking in the middle of the view, that camera has been deactivated in the Project options dialog.

Video feed
The image from Qualisys cameras in video mode is also shown in the 2D
view window, both in preview and in a file. This means that all of the
actions, like zoom and 3D overlay, can be performed both in preview and
in a file. How to capture video using Qualisys cameras is described in the
chapter "Qualisys video capture" on page 574.

NOTE: In a file, only the active part of the video image is displayed.
If the Image size has been reduced, then the aspect ratio of that
video view will match the reduced image size.

External video
The external video devices are displayed after the motion capture cam-
eras in the 2D view window. You can use zoom on the video in both pre-
view and file mode. The video cameras will appear in the same order as
they are on the Video devices page in the Project options dialog. For
more information about video devices see chapter "External video devices
in 2D view" on page 100.

Selecting cameras, zooming and panning

The appearance of the 2D view can be modified with the following options.



l Use the camera buttons at the bottom of the window to select which cam-
eras to display. When a button is pressed down the corresponding cam-
era is displayed in the 2D view. To hide a camera release the button. The
All button will show all cameras and None will hide all cameras.

NOTE: The DV/webcam cameras are called for example '1V' and are
always placed last in the list.

l Hold the Ctrl key and click on a camera button to display just that cam-
era. When only one camera is displayed hold Ctrl key and click on the but-
ton for that camera to display all cameras.
l Double-click in the area of a camera to just display that camera in the 2D
view window. Use the arrow buttons to step to the next and previous
camera in the system.
l Use the mouse or the buttons on the 2D view toolbar to change the zoom
and translation of a camera view. The 2D views can be zoomed and trans-
lated individually.

Zoom in/out the 2D view


Hold down both mouse buttons in the 2D view window and move the mouse backward or forward to zoom the view out or in, relative to the original position of the cursor. The zoom position is displayed in a miniature view in the top right corner of the camera.

The mouse wheel can also be used to zoom. Click on the Zoom but-
ton to use the left mouse button for zoom.

NOTE: With the mouse buttons the zoom is continuous, while with the mouse wheel it is done in steps.



Translate the 2D view
Translate can only be used if the 2D view is zoomed in. Hold down
the right mouse button in the 2D view window and move the mouse
to translate. Click on the Translate button to use the left mouse but-
ton for translation.

2D view toolbar

The 2D view toolbar contains settings for manipulating the 2D view of the dif-
ferent cameras and for switching between 2D and 3D view. From left to right
the icons have the following use.

3D View Button
Switch to 3D view.

Selection
Use the normal mouse behavior.

Translation
Use the left mouse button to translate the 2D view.

Zoom
Use the left mouse button to zoom the 2D view.



NOTE: You can also use the mouse wheel for zooming.

Image Size Tool


Use the left mouse button to draw a new image size for a camera. The
new image size will be marked with a red frame and everything that is out-
side the frame is greyed out. To modify an existing image size frame use
the normal cursor and drag the edges or move the whole frame. For more
information about image size see chapter "Image size" on page 230.

NOTE: When changing the capture rates from the Camera settings
sidebar the image size is reduced automatically if the frequency is
higher than max frequency at full image size. However when chan-
ging the capture rate from the Project options dialog the image
size must be reduced first. For example if you have one camera in
video mode you still have to reduce the image size in video mode
for all of the cameras.

NOTE: When showing a marker view or video image in a file, only the active part of the marker view or video image is displayed. I.e. if the Image size of the camera has been reduced, then the aspect ratio of that view will match the reduced image size.

Marker Mask Tool


Use the left mouse button to draw a new marker mask for a camera. Marker masks can either be added in preview mode (Marker or Marker intensity), in which case they are added to the list of camera masks, or added to a camera in a file, in which case the masks are only used when reprocessing that file.
A new camera marker mask will be added as a green area and the markers inside the mask will be removed immediately, see chapter "Marker masking" on page 536.



A new software marker mask is added as a blue area and the 2D data is
not affected until the file is reprocessed, see chapter "How to use soft-
ware marker masks" on page 611.

Reorder Tool
Use the left mouse button and drag and drop the whole camera view to
change the camera order in QTM. Use this cursor to change the number
on the camera display so that they come in the order that you want.

Auto Exposure Tool (only available for Miqus Video and Oqus 2c cam-
eras)
Draw the Auto exposure area with this tool. The area is displayed as a
gray rectangle when the tool is active. By default the area is maximized to
the current image size, but if for example there is a very bright part of the
image the auto exposure will work better if the area is reduced. For Miqus
Video Color the auto exposure area is also used to set the white balance.

Identification Tool
Click to switch on the green LED ring on the selected Arqus or Miqus cam-
eras.

3D Overlay
Click to turn on/off 3D overlay for the selected cameras.

Camera settings sidebar

The Camera settings sidebar contains the basic camera settings for the
Qualisys cameras. It is displayed at the right side of the 2D view window when
the camera system is live. Use these settings when setting up the camera sys-
tem to get the best data. The settings are also available on the Cameras page
in the Project options dialog, refer to chapter "Cameras" on page 225 for more
details about the settings.
The sidebar is pinned by default, so that it is always visible. When unpinned, it slides out when you move the mouse over the right edge of the 2D view window.
The sidebar will only display settings that are available for the currently visible (selected) cameras. For example, when all cameras are in marker mode the Video settings are hidden. All of the settings, except the Marker Capture Rate, apply only to the currently visible cameras in the 2D view window. Therefore, it is important to use the buttons at the bottom of the 2D view window to select the cameras for which you want to change a setting.

NOTE: If the currently visible cameras have different settings it will say Differs. When changing such a value, all the currently visible cameras will be set to the same setting.

The Marker and Video settings apply to all marker camera models, as well as Oqus high-speed video cameras. The settings of the streaming video cameras (Miqus Video and Oqus 2c) are available under Streaming Video.
The following settings are available on the sidebar.

Camera Mode

These settings change the mode of the camera and you can also activate some
other options.

Marker, Intensity, Video


Switch between the different camera modes. For a description of the dif-
ferent camera modes see chapter "Video preview in QTM" on page 584.
The mode can also be changed individually on a camera from the 2D view
window menu.

3D Overlay
Toggle the 3D overlay on and off.
The 3D overlay can also be turned on individually for a camera from the
2D view window menu.



Active Filtering (not available for Oqus 3 and 5 series)
Enable the Continuous setting for the Active filtering mode; the cameras will capture two images to remove the background light. For information on how to use active filtering see chapter "Active filtering for capturing outdoors" on page 539.

NOTE: Active filtering is not available for Oqus 3 and 5 series cam-
eras. If you have a system that partly consists of these camera
types, you can still turn on active filtering for the others.

Advanced...
Clicking the Advanced… link will open the Cameras page in the Project
Options. The currently selected cameras will be selected in the camera
list.

Marker settings

These settings change marker settings for the visible cameras.

Capture Rate
The capture rate that is used by cameras measuring markers. The capture
rate applies to all cameras.



The maximum capture rate shown above the capture rate buttons applies
to the cameras at full image size. If the camera system includes different
camera types, the maximum capture rate is determined by the camera
with the lowest maximum.
Buttons with a value exceeding the maximum capture rate are indicated
in red. When selecting a frequency beyond the maximum capture rate,
the image size of the cameras of which the maximum is exceeded is auto-
matically reduced.

NOTE: When the image size is automatically reduced, the aspect ratio is preserved. However, for most camera types, the capture rate is only dependent on the vertical dimension. You can increase the width of the image manually by dragging the sides of the red rectangle.

Exposure and Flash Time


The time used by the camera to capture the image in marker mode, for
advice on this setting see chapter "Tips on marker settings in QTM" on
page 483. The current maximum Exposure and Flash Time is displayed
with a dark blue bar.

NOTE: If the camera system includes different camera types, then the maximum Exposure and Flash Time can differ depending on which cameras are displayed in the 2D view.

Marker Threshold
The intensity level in the image used to detect markers, where the default value is 17. For example, a lower value means that areas of less bright pixels will become markers; for advice on this setting see chapter "Tips on marker settings in QTM" on page 483.
Below the slider is the color scale which is used for color-coding the video image in Marker intensity mode. The image will be green at the marker threshold, blue below the threshold, and yellow to red above the threshold.



Marker Masks
Toggle the marker masks. If deselected the marker masks in the current
cameras will not be used by the camera. The masks will be grey and the
markers below them will appear.

Auto-Mask
Create masks over all of the visible markers in the current cameras.
It is important to make sure that it is only unwanted reflections that
are visible when pressing the button. For more information about
marker masking see chapter "How to use auto marker masking" on
page 538.

NOTE: The number of masks per camera is limited to 20 for Arqus and Miqus cameras and 5 for Oqus cameras. If there are more unwanted reflections you can manually edit the masks.

Sensor Mode
Switch Sensor mode for the current cameras in marker mode. You can
select between a full size mode and high speed sensor modes. Use the
high speed modes for example if you need to capture at higher fre-
quencies and still want the full FOV, but you do not need the full res-
olution. For an overview of available sensor modes per camera type, refer
to the table in "Qualisys camera sensor specifications (marker mode)" on
page 926.



Streaming Video settings

These settings change streaming video settings for the visible streaming video
cameras (Miqus Video or Oqus 2c).

Capture Rate
The capture rate that is used by cameras capturing video. The video cap-
ture rate can be set by pressing one of the available buttons. The buttons
show commonly used values for video capture rate. Integer divisions or
multiples of the current marker capture rate are indicated bold. The num-
ber of buttons and their exact values depend on the current maximum
video capture rate. The maximum capture rate for the current settings
(resolution and aspect ratio) is displayed above the buttons.

NOTE: The video capture rate can differ between cameras. When
changing the video capture rate, the change is applied to the cur-
rently selected cameras in the 2D View window.

Resolution
Set the resolution of the video image by pressing one of the four buttons.
The available values are 1080p, 720p, 540p and 480p (the values indicate
the vertical dimension of the image in pixels).



Aspect ratio
Set the aspect ratio of the video image by pressing one of the three but-
tons. The available aspect ratios are 16:9, 4:3 and 1:1.

Auto Exposure
Check to use automatic exposure for streaming video cameras. When
activated the Exposure Time and Gain options are hidden and controlled
by the auto exposure. Optionally, use the Auto Exposure Tool to limit the
image area used to set the exposure (see chapter "2D view toolbar" on
page 89).

Auto white balance


Check to use automatic white balance. This option is only available for the
Miqus Video Color camera. Optionally, use the Auto Exposure Tool to
limit the image area used to set the white balance (see chapter "2D view
toolbar" on page 89).

Exposure Compensation (auto exposure only)


The Exposure Compensation option can be used to adjust the exposure
of the image when using Auto Exposure. Increase the value (positive EV)
for a brighter image or decrease the value (negative EV) for a darker
image.

Exposure Time (manual exposure only)


The exposure time used by the selected streaming video cameras. The cur-
rent maximum Exposure time is displayed with a dark blue bar.

Gain (manual exposure only)


The Gain option can be used to change the sensitivity of the sensor.
Increase the gain for a brighter image. The available values are 1, 2, 4
(default), 8 and 16. Note that the image quality may decrease at high gain
values.



Video settings

These settings change video settings for the visible cameras when in video
mode (uncompressed video).

Capture Rate
The capture rate that is used by cameras capturing video. The maximum
capture rate at full image size is displayed above the slider. If the camera
system includes different camera types, the maximum capture rate is
determined by the camera with the lowest maximum, also indicated by
the dark blue bar in the slider.

When setting a frequency beyond the maximum capture rate, the image
size of the cameras of which the maximum is exceeded is automatically
reduced. If the image size has been set manually for a camera the
reduced image size will have the same relations for x and y.

NOTE: The video capture rate can differ between cameras. When
changing the video capture rate, the change is applied to the cur-
rently selected cameras in the 2D View window.

Exposure time
The time used by the cameras in video mode. Set it to a value where the image is bright enough, for more information see chapter "Outline of how to capture high-speed video" on page 580. The current maximum Exposure time is displayed with a dark blue bar.

NOTE: If the camera system includes different camera types, the maximum Exposure time can differ depending on which cameras are selected in the 2D view.

Flash time
The time of the IR flash in video mode. This setting can be set to Off (0)
microseconds unless you have markers placed on the subject that you
want to be visible in the video. The current maximum Flash time is dis-
played with a dark blue bar.

Gain
Set the gain for the current cameras in video mode to get a brighter video
image. Depending on the camera type you can use gain values of 1, 2 or 4
and for some cameras also 8 and 16.

Compression
The Compression setting can be used to switch between None, In-cam-
era MJPEG and Software compression. The default for Oqus 2c, 5+, and
7+ is In-camera MJPEG, which is the recommended setting for those cam-
eras. For the other cameras the default is None, however, most of the
time it is recommended to select Software and a Codec to reduce the
video file size.

Sensor Mode
Switch sensor mode for the current cameras in video mode. You can
select between a full size mode and high speed sensor modes. Use the
modes for example if you need to capture at higher frequencies and still
want the full FOV, but you do not need the full resolution. For an overview
of available sensor modes per camera type, refer to the tables in
"Qualisys video sensor specifications (in-camera MJPEG)" on page 927 and
"High-speed video" on page 960.



Lens Control

The lens control settings are only available for cameras with a motorized lens.
The type of lens on the camera is shown above the settings.

Focus
Set the focus for the current cameras. The same setting is used for
marker and video mode. The minimum distance is 1 m and furthest dis-
tance is 20 m.

Aperture
Set the aperture for the current cameras. The same setting is used for
marker and video mode. The available aperture values are dependent on
the lens in the camera.

NOTE: Once focus and aperture have been set for the cameras, it is pos-
sible to disable lens control using the Qualisys Firmware Installer. This
way the current lens settings will be fixed. For more information, see
"How to use Qualisys Firmware Installer (QFI)" on page 471.

External video devices in 2D view

The external video devices, such as video from Blackmagic cards or web cam-
eras, are displayed last in the 2D view window. The video data can be used for
documentation purposes, but it is not used in the calculation of trajectories.



You need to save a default capture and file layout with a 2D view window to see the video automatically during the capture and in the file. To open a 2D view window with only the video cameras activated you can click on the Video button. Place the windows as you want them and then save the layout on the Window menu.

NOTE: You can have one video camera per 2D view window if you like.
Use the camera buttons at the bottom of the 2D view window to select
which cameras to view.

In preview mode the picture is the current view of the web camera. In file mode
the picture is taken from the saved video file and it shows the video frame that
corresponds to the capture frame. The video probably has fewer frames than
the motion capture and therefore the numbers will not be the same. For
information on how to record a video file see chapter "How to use external
video devices" on page 898.



In file mode the current frame number and time are displayed in the upper left corner, e.g. Frame: 10 and Time: 0.1 s. This is probably not the same frame as the motion capture frame. The time does not have to be the same as the measurement time either, if the video offset has been changed, because the time that is displayed is the time from the start of the video file.
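
As a rough illustration of this correspondence (not the exact mapping used by QTM), assuming the video offset is expressed as the start time of the video relative to the start of the capture, the video frame shown for a given capture frame can be estimated as follows:

    # Rough illustration only; the exact mapping is handled internally by QTM.
    # Assumes the offset is the video start time relative to the capture start.
    def video_frame_for_capture_frame(capture_frame, capture_rate, video_rate, offset_s=0.0):
        capture_time = capture_frame / capture_rate  # seconds since capture start
        video_time = capture_time - offset_s         # seconds since video start
        return max(0, int(video_time * video_rate))

    # Example: marker capture at 100 Hz, external video at 25 Hz, no offset.
    print(video_frame_for_capture_frame(50, 100.0, 25.0))  # -> 12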
The settings for the video cameras are presented in the View window menu, which is opened by right-clicking on a video camera in the 2D view window. Note that the settings are individual per video camera, so you have to right-click in each video camera view if you want to change settings on several video cameras.

The following options are different from the options on the regular 2D view window menu. The available options depend on whether QTM is in RT/preview or File mode.

Video camera settings (Preview mode)


These settings are specific for each type of video camera. For more information, see "How to use external video devices" on page 898 or the manual of the camera for camera-specific information.

Connect to audio source (Preview mode)


Choose a DirectShow audio source that is associated with the video cam-
era. For more information see chapter "Selecting audio source" on
page 911.

Choose video compressor (Preview mode)


Choose a video codec from a list of the installed codecs on the computer.
The compressor is selected individually per DV/webcam. For more inform-
ation see chapter "Compression of video from video cameras" on
page 910.



Video compressor settings (Preview mode)
Open the video compressor settings for the currently selected codec of
the camera.

Use alternate start method (Preview mode)


Use this alternative if the video capture does not start or does not start at
the correct time.

NOTE: Only try this alternative if the default for some reason
doesn't work.

Set time offset ...


Set the time offset between the video file and marker data, see chapter
"Video offset" on page 911.

Remove this file link (File mode)


Delete this video file from the QTM file. The AVI file will not be deleted, only the link in the QTM file. A video file can also be imported with File/Import/Add link to video file.

2D view window menu

The options for individual cameras are presented in the 2D view window
menu, which is opened by right-clicking in a 2D view. It can also be opened on
the View menu. The available options depend on whether QTM is in RT/preview or File mode.
The below options are available for Qualisys cameras. For external video cam-
eras (DV/webcam), additional options are available, see chapter "External video
devices in 2D view" on page 100.



The following options are available in the 2D view:

Switch to 3D view
Switch the View window to 3D view.

Mode (Preview mode)


For a description of the different camera modes see chapter "Video pre-
view in QTM" on page 584.

NOTE: This option is only available when QTM is in preview mode, before you start a capture.

Markers
Switch to the default Markers mode.

Marker intensity
Switch to the Marker intensity mode.

Video
Switch to the Video mode.

Show 3D data overlay


Overlay the 3D data on the 2D view. This can be done on any camera inde-
pendent of whether it is Marker or Video mode. For more information
see chapter "3D data overlay" on page 107.



Reset image size for this camera (Preview mode)
Reset to full image size for a camera with reduced image size.

Reset image size for all cameras (Preview mode)


Reset all cameras to full image size.

Zoom to full view


Reset the zoom so that the whole image is shown for the camera. This is
the same as double-clicking on the miniature in the top right corner.

Set exposure group


Select the exposure group for the camera view that you right-clicked on,
see chapter "Delayed exposure to reduce reflections from other cameras"
on page 534.

NOTE: The option is only available when the Exposure delay mode called Camera groups is selected on the Cameras page in Project options.

Export view to AVI


Export the camera view that you right-clicked on to an AVI file, see chapter
"Export to AVI file" on page 739.



Choose video compressor (Preview mode)
Choose a video codec from a list of the installed codecs on the computer.
The compressor is selected for all cameras in Video mode. For more
information see chapter "Video compression" on page 242.

Video compressor settings (Preview mode)


Open the video compressor settings for the currently selected codec of
the camera.

Camera information (Preview mode)


Show the camera information in a dialog, i.e. Camera type, IP address, etc.

Review camera settings (File mode)


Show the settings used for the camera in the current file, for more inform-
ation see chapter "Review camera settings" on page 108.

Rotate view
Change the rotation of the displayed 2D view so that it matches the cam-
era rotation. For example if the camera is placed upside down you can
use the 180 degree rotation.

The rotation is stored with the file, so you can rotate the cameras after you have calibrated the camera system. The 2D view rotations will then be stored in the QTM file. It is also possible to rotate the 2D views in a QTM file, but to save the rotation you must make sure to make another modification to the file as well.

NOTE: To change multiple cameras at the same time you can use
the 2D view rotation setting on the Cameras page in Project
options.



2D View settings...
Open the 2D view settings page in the Project options dialog so that you
can change the 2D view settings, for example the 3D overlay settings. For
more information see chapter "2D view settings" on page 417.

3D data overlay

The 3D data can be overlaid on the 2D image area to show the 3D view from
the camera viewpoint. This can, for example, be used for showing the force
arrow in the video of someone stepping on a force plate, or to check which 2D
markers are used for the 3D calculation.
Follow these steps to activate the 3D data overlay:

1. Calibrate the camera system, including the Qualisys cameras that are
used in video mode.
2. Open a 2D view window in RT/preview or a file.

3. Right-click on the camera where you want to turn on the overlay and
select Show 3D data overlay.

NOTE: The 3D data can also be overlaid on the 2D data, for example to
check how much of the measurement volume is seen by the camera.

4. The 3D elements displayed in the overlay and the opacity can be changed
on the 2D view settings page in the Project options dialog, see chapter
"2D view settings" on page 417.

NOTE: The marker and video data display in the 2D view can be
switched between linearized and unlinearized on the 2D view set-
tings page. To match the 2D data with the 3D data the data must be
linearized.

Review camera settings

To review the camera settings in a file, right-click on a camera 2D view and
then select the Review camera settings option. This opens the Review settings
dialog, which shows the settings used during the capture for the camera sys-
tem and other input devices.

The settings are greyed out so that they cannot be changed, but otherwise you
can navigate in the same way as in the Project options dialog. Any settings that
were not available in the QTM version in which the file was captured will be set
to their default values. For more information about the settings, refer to the
chapters "Cameras" on page 225, "Synchronization" on page 266, and the
chapters of the respective input devices.

3D view window

In the 3D view window the motion capture data is shown in 3D, using the
coordinate system that was defined in the calibration.

Overview of graphical elements

The following graphical elements can be shown in a 3D View window.


All graphical elements can be enabled or disabled and modified in the 3D View
Settings page. The 3D View settings can be accessed by right-clicking in the 3D
View window and selecting 3D View Settings... in the 3D View window menu.
For detailed information about the 3D View settings, refer to chapter "3D view
settings" on page 419.

Volume and calibration

Axes
Axes of the coordinate system of the motion capture (X = red, Y = cyan
and Z = dark blue).

Grid
A grid showing the floor of the measurement volume (e.g. Z = 0, if Z is the
vertical axis of the measurement setup).

Volumes
Covered and calibrated volumes, see chapter "Volumes in 3D views" on
page 125.

Bounding box
A white outlined box for the bounding box used by the 3D tracker, see
chapter "Bounding box restricting 3D data" on page 329.

Force plates
Rectangular areas showing the location of the force plates.

Mesh objects
Static mesh objects as defined in the Static Mesh Objects settings page
or Rigid Body mesh objects, see chapters "Static mesh objects" on
page 424 and "Rigid body meshes" on page 667.

Cameras

Cameras
The position and orientation of each camera in this specific setup.

Rays
Rays showing which cameras contributed to the selected trajectories, see
chapter "Rays in 3D views" on page 130.

Camera view cones


Camera view cones showing the volume that is covered by individual cam-
eras, see chapter "Camera view cones in 3D views" on page 128.

Camera tooltip
Hover with the mouse over a camera to show camera information. The
tracking information is the same as in the File information dialog, see
chapter "File Information" on page 192.

Motion capture data

Markers
Markers for the current position of the trajectories, see chapter "Tra-
jectories in 3D views" on page 116.

Traces (File mode)


Traces of the trajectories.

Bones
Bones between markers, see chapter "Bones in 3D views" on page 119.

Rigid bodies
6DOF bodies, see chapter "6DOF bodies in 3D views" on page 121.

Skeletons
Skeleton segments and segment markers, see chapter "Skeletons in 3D
views" on page 123.

Force vectors
Force vectors (red) displaying the current forces on the force plates, see
chapter "Viewing force data" on page 704.

Force traces (File mode)


Force traces (blue) displaying the force applied to the force plates during
the measurement.

Gaze vectors
Gaze vectors (yellow) displaying the current gaze vectors of eye tracking
devices.

Other information

Text labels
Text labels can be enabled for some of the graphical elements.

Trajectory count
Trajectory count showing the number of selected trajectories and the
total number of trajectories in the current frame.

Labeled trajectory information


Labeled trajectory information showing the current number of labeled tra-
jectories and the total number of labels.

Tooltips
Hover with the mouse on an object to display detailed information.

Navigating in the 3D view

The 3D view can be rotated, translated or zoomed so that the data can be seen
from the viewpoint that you want. For most actions which change the view of
the 3D view a red crosshair is shown indicating the center of the 3D view. By
double clicking anywhere in the background, the center is automatically moved
to the geometrical center of all 3D points at the current frame.

The Selection cursor is the default cursor. With modifiers it can be used to
perform a wide range of tasks in the 3D view, e.g., selecting and identifying tra-
jectories or navigating in the 3D view.
The following actions can be performed with the mouse to navigate in the 3D
view:

Rotate the 3D view


Hold down the left mouse button in the 3D view window and move the
mouse to rotate the view around the red crosshair. The rotation is limited
to azimuth and elevation.

Translate the 3D view


Hold down the right mouse button in the 3D view window and move the
mouse to move the center in the 3D view. Other ways to move the center
are:
l Hold down Shift + C and click on a trajectory to move the center to
that trajectory.
l Double click to move the center to the geometrical center of all tra-
jectories in the current frame.

Zoom in/out the 3D view
Hold down both mouse buttons in the 3D view window and move the
mouse backward and forward. The view is zoomed out and in, respect-
ively, relative to the red crosshair. The mouse wheel can also be used to
zoom. Hold down Ctrl to zoom relative to the mouse position.

NOTE: With the mouse buttons the zoom is continuous, while with
the mouse wheel it is done in steps.

In addition, it is possible to follow the movement of one or more selected
markers by enabling the Follow Selected Markers function. This function can be
enabled by pressing the Follow Selected Markers button on the GUI Control
toolbar or with the keyboard shortcut Ctrl + Alt + 0. When this function is enabled,
the focus of the 3D view will be continuously translated to the geometric center
of the selected marker(s).

Other useful mouse actions

The following mouse actions can be useful when manually checking and man-
aging trajectories in measurement files.

Scrubbing
Hold the Ctrl key, press left mouse button anywhere in an empty part of
the 3D view and drag sideways to move forward or backward through the
measurement's time line.

Trace range zooming


Hold the Shift key and roll the mouse wheel to increase or decrease the
trace range.

Projections

The default projection used for the 3D view is Perspective. You can change the
projection to specific orthogonal projections. To change the projection, right-
click to open the 3D view window menu and select a projection under
View. Alternatively, press the P key to toggle between Perspective and the
closest orthogonal projection.

NOTE: Rotation actions are not possible when using orthogonal pro-
jections.

3D view toolbar

The 3D view toolbar contains tools for manipulating the 3D view and the tra-
jectories and for switching between 2D and 3D view. From top to bottom the
icons have the following use.

2D button
Switch to 2D.

3D button
Switch to 3D.

Selection cursor
Use the normal mouse behavior. It has the following keyboard modifiers:
l Shift - Add trajectories to the selection.

l Ctrl - Add/Remove trajectories to the selection.

l Shift + drag - Select trajectories with an area. If there are traces
below the area these trajectories will be selected as well.
l Alt - Select a trace.

l Alt + Shift - Add trace to the selection.

Rotate cursor
Use the left mouse button to rotate the 3D view. The difference to the
Selection cursor is that you can no longer select anything with this
cursor.

Translate cursor
Use the left mouse button to translate the 3D view. This is the same
alternative as using the right mouse button with the Selection cursor.

Zoom cursor
Use the left mouse button to zoom the 3D view. This is the same
alternative as using both mouse buttons or the mouse wheel with the Selection
cursor. It has the following modifier:
l Ctrl - Use the alternative zoom method. With the default settings the
alternative method is to zoom to the current position of the cursor.

Center trajectory cursor


The view is centered on the trajectory that you select. This is the same as
using the Shift + C key with the Selection cursor.

Quick identification cursor


Identify the trajectory that you select as the selected label in the Labeled
trajectories window, for more information see chapter "Manual iden-
tification of trajectories" on page 620. This is the same as holding down
Ctrl + Alt with the Selection cursor. It has the following modifier:
l Shift - Join the trajectory to the previous label in the Labeled tra-
jectories window. This option can for example be used if the last
identified trajectory only covers a part of the measurement range,
then you can add the rest by holding down Shift when clicking on
the next part.

Cut trajectory trace cursor
Click on a trace to cut it in two parts. This is the same as using the Shift +
X key with the Selection cursor.

NOTE: The gap between the two parts is there to visualize that
there are two parts; it is not a missing frame.

Create bones sequence cursor


Click on trajectories in sequence in the 3D view to create bones between
them. This is the same as using the Shift + B key with the Selection
cursor. Click in the background to start a new series of trajectories.

View cones menu


Click on the icon to open the Camera view cones menu, for more inform-
ation see chapter "Camera view cones in 3D views" on page 128.

Volume visualizations menu


Click on the icon to open the Volumes menu, for more information see
chapter "Volumes in 3D views" on page 125.

Trajectories in 3D views

The trajectories of a measurement are shown with markers on their current
position in the 3D view, where the colors of the trajectories are the same as in
the Trajectory info windows.
In File mode, the position throughout the measurement can be displayed with a
trace, where the length of the traces is controlled by the bottom sliders on the
Timeline control bar. The trace of a trajectory is thicker when the focus is set
on the trajectory in the Trajectory info window, i.e. there is a dashed box around it.

You can change settings for the marker and trace display on the 3D view set-
tings page in the Project options dialog. Among other things you can turn on
the display of the marker labels.
The trajectories can either be created with 3D or 2D tracking, but the display in
the 3D view window is the same except that 2D tracked trajectories are dis-
played in a plane. The data of the marker is shown in a Trajectory info win-
dow, see chapter "Trajectory info windows" on page 137.
The following actions can be performed with the Selection cursor on a tra-
jectory in the 3D view.

Select
Click on the marker or the trace of the trajectory to select it. Use the fol-
lowing keys to modify the action when you click on a trajectory.
l Hold down Alt to only select a part of the trajectory.

l Use Shift or Ctrl to add trajectories to the selection; only Ctrl can also
remove trajectories from the selection.
l Hold down Shift and drag the mouse to use area selection.

Drag and drop for identification of trajectories


Trajectories can be drag-and-dropped from the 3D view to labeled tra-
jectories to change their identity. For example, drop an unidentified tra-
jectory from the 3D view onto an empty label in the Labeled trajectories
window to identify it. In case the labeled trajectory is not empty, this
requires that there is no overlap between the trajectories.
A trajectory can also be dropped on the trace of another trajectory in the
3D view to join them. This can be used in combination with scrubbing
(hold down Ctrl and drag the mouse in empty 3D space) and trace
range zooming (hold down Shift and roll the mouse wheel).
When using skeletons, the trajectories can be dropped on available (red)
segment markers to label them.

NOTE: The file must be reprocessed to update the 6DOF or Skeleton data.

Trajectory info window menu
Right-click on a trajectory to open the Trajectory info window menu for
the selected trajectories, see chapter "Trajectory info window menu" on
page 144.

Information
Place the cursor over a marker or a trace to see a tool-tip with inform-
ation about the trajectory.

Delete
Use the Delete key to delete selected trajectories and parts of trajectories
directly in the 3D view window.

NOTE: Trajectories in the Discarded trajectories window are hidden by default.

Quick identification
The quick identification method provides an easy way to manually identify
the markers. First select an empty label. Then hold down Ctrl + Alt or
select the Quick identification cursor to activate this method. Then click
on a marker and it will be identified as the currently selected empty label
in the Labeled trajectories window. For more information about quick
identification, see section "Manual identification of trajectories" on
page 620.

Create bones
To create a bone, hold down Shift and select a pair of labeled trajectories
by clicking in the 3D view window. Then press B and there will be a bone
between the pair. Several bones can be created with the Create bones
sequence tool.

Center on trajectory
Select a trajectory and press C to change the viewpoint so that it centers
on that marker. You can also use the Center trajectory cursor and click
on the marker.

Bones in 3D views

Bones are used to visualize the connection between two markers in 3D views,
e.g. if the measurement is done on a leg, bones can connect hip to knee, knee
to foot and so on. The bones can have different colors and the bone colors that
are used are saved with the AIM model.

Create bones

To create bones you need to select at least two labeled trajectories, e.g. by hold-
ing Shift and clicking in the 3D view window. Then press B or click Create bone
in the Bone menu to create bones between the selected trajectories. Several
bones can be created in succession by using the Create bones sequence tool
in the 3D view toolbar. The bones will then be created between the trajectories
in the order that you click on them.

NOTE: If bones are included in the capture file that is used to generate
an AIM model, the bones and their colors will be created again when the
AIM model is applied. The bones are also saved in label lists, so that they
are loaded if a label list is loaded.

Modify bones

Right-clicking on a bone opens the Bone menu. The Bone menu can be used to
change the bone color or delete bones. The action can be applied to several
bones at once. Multiple bones can be selected as follows:
l Hold Ctrl and click on successive bones to add them to the selection.

l Select one bone, and then press Shift and drag the mouse over the area
in which you want to select the bones.
l To deselect a bone, hold Ctrl and click on the selected bone.

To delete one or more bones, select the bones. Then press Delete or right-click
on a bone and click Delete bone. To delete all bones use Delete all bones
on the Bone menu.
The bones visualization settings are set on the 3D view settings page in the
Project options dialog.

NOTE: You can use the Bones button on the GUI Control toolbar or Alt
+ 2 to show or hide the bones.

Bone menu

The Bone menu is opened when right-clicking on a bone, but it can also be
opened on the Edit menu and in the 3D view window menu.

Create bone
Create bones between all selected labeled trajectories.

Delete bone
Delete the selected bones.

Delete all bones


Delete all existing bones.

Change bone color


Select a bone color for the selected bones.

Use trajectory color


Use the color of the first trajectory used to create the selected bone.

Set random color


Select a random color for the selected bones.

6DOF bodies in 3D views

Measured 6DOF bodies are displayed as separate local coordinate systems in a
3D view window, where the origin is placed at the position of the local origin of
the 6DOF body. The axes of the local coordinate system have the same color
codes as the coordinate system of the motion capture.
Virtual markers corresponding to the points of the 6DOF body definition are
also displayed for each 6DOF body. The markers will therefore be displayed
with two colors in the 3D view. The colors of the rigid body are set for each
6DOF body definition on the 6DOF Tracking page in the Project options dialog.
The actual trajectories will automatically get a slightly brighter color.
You can change settings for the 6DOF body display on the 3D view settings
page in the Project options dialog.
The 6DOF data can be viewed in the Data info window or exported to a TSV file
and to Matlab; see chapter "6DOF data information" on page 170, and chapters
"Export to TSV format" on page 711 and "Export to MAT format" on page 729,
respectively.
Place the mouse over the local coordinate system to see information about the
body in the current frame.

NOTE: In a capture file, definition, name and color of a 6DOF body can
only be changed by reprocessing the file, see chapter "Reprocessing a
file" on page 601.

Skeletons in 3D views

Skeletons are displayed as sets of connected cone-shaped segments. The
graphical properties of skeletons can be changed in the 3D view settings. The visual
elements of the skeletons are:

Segments
Cones representing the position and orientation of the rigid parts of the
skeleton, with the base indicating the proximal ends and the apex the
distal ends. Place the mouse over the segment to view the segment name,
the skeleton it belongs to, the segment markers used by the segment and
the DOFs used for the segment.

Segment coordinate system axes


X, Y and Z axis representing the position and orientation of the segments.

Skeleton labels
Text labels attached to skeletons showing their name.

Segment labels
Text labels attached to the respective segments showing their name.

Segment markers
Three dimensional cross-shaped markers indicating the position of the
markers according to the calibrated skeleton definition. This element is
only used for the marker-based skeletons. The segment markers can have
three different colors depending on their status in the current frame.
l White: Default color, indicating a good fit with the measured marker
position.
l Yellow: Indication of a bad fit (deviation of 5 cm or more) with the
corresponding measured marker position.
l Red: Corresponding marker is missing.

Place the mouse over the segment marker to view the segment marker
name and the skeleton and segment it belongs to.

Segment rigid body


Three dimensional cross-shaped marker with a coordinate system, indic-
ating the position of the rigid bodies according to the calibrated skeleton
definition. This element is only used for the rigid-body-based skeletons.
The segment rigid bodies use the same color scheme as the segment
marker.
Place the mouse over the segment rigid body to view the segment rigid
body name and the skeleton and segment it belongs to.

Volumes in 3D views

QTM can help you see the volume in which you will be able to measure by cal-
culating the covered and the calibrated volumes. The view cones can also be
used for visualizing the FOV, see chapter "Camera view cones in 3D views" on
page 128.

The covered volume is displayed as light blue cubes. It is the volume that is
seen by a certain number of cameras, specified by the user with the Cameras
required to consider volume covered setting on the 3D view settings page
in the Project options dialog. The covered volume can be used to determine
where 3D data can be measured and is calculated by combining the view cones
and is therefore affected by the length of them, i.e. the Smallest marker size
visible setting.

NOTE: You can also use the Volume buttons on the GUI Control toolbar
to toggle the display of the volumes.

The default marker size differs between the camera models, e.g. for the Oqus
3-series the default marker size is 12 mm. The covered volume is also cut
at the floor level by default, but that can be changed by disabling the Cut
covered volume at floor level option on the 3D view settings page in
the Project options dialog.

The default number of required cameras for calculating the covered volume is
three, which most of the time gives the most likely covered volume. If you use
only two cameras to calculate the volume, there will be some parts that are
actually very difficult to reach with the wand.

It is important to notice that the covered volume does not consider whether
it is likely that markers are occluded by the subject or not. To simulate this
you can use the camera selection in the Volumes menu, see below.

The calibrated volume is displayed as light red cubes. It is the volume that the
wand moved through when the camera system was calibrated. It therefore
indicates where the most accurate 3D results can be expected and can be used
to evaluate if the wand needs to be moved in a larger volume. 3D data will how-
ever be calculated outside the calibrated volume as well. This is usually not a
problem as long as the markers are within a few decimeters of the calibrated
volume. Use the 3D residual to evaluate if the data is good enough.

NOTE: The calibrated volume can only be displayed in files calibrated
with a calibration file which is processed in QTM 2.3 or later.

The volumes can be enabled from the Volumes menu on the 3D view toolbar.

Click on the Volumes button to open the dialog and enable the volumes
with the Show calibrated volume checkbox and the Show covered volume
checkbox. The features can also be enabled on the 3D view settings page in
the Project options dialog.

The maximum distance from the cameras is determined by the Smallest
marker size visible setting, which defines the marker size that should be
visible in the entire volume.
Select which cameras are used in the calculation with the All/None buttons
and camera check boxes. By choosing which cameras are considered
when creating the covered volume, you can for example determine what hap-
pens to the volume when one camera is occluded. Another case that can be
simulated is when the cameras are mounted on two sides and can only see
markers on one side of the subject, e.g. on a human walking through the
volume. Then you can turn off all of the cameras on one of the sides to see the
volume where the subject can be viewed by the rest of the cameras.

Camera view cones in 3D views

The camera view cones display the field of view of the camera, i.e. what a cam-
era will be able to measure. It can be used to evaluate how the cameras are
placed in the system so that the placement can be better optimized. By
enabling multiple cones, you can also study what is covered by a certain subset
of the camera system.

The view cones can be enabled per camera from the Camera view cones
menu on the 3D view toolbar. Click on the Camera view cones button to
open the dialog and enable the cones with the Show camera view cones
checkbox. The feature can also be enabled on the 3D view settings page in the
Project options dialog.

NOTE: You can also use the Camera view cones button on the
GUI Control toolbar to toggle the display of the cones.

The length of the cones is set in meters by the corresponding setting. The cones are by default
cut at the floor level, but that can be changed by disabling the Cut covered
volume at floor level option on the 3D view settings page in the Project
options dialog.
Select which camera cones are to be shown with the All/None buttons and
camera check boxes.

Rays in 3D views

Camera rays in the 3D view window show which cameras have contributed to
the selected trajectories. The colors of the rays correspond to the colors of the
selected trajectories.
The camera rays are based on a mapping between 2D data of the cameras and
the 3D trajectories. Rays are only shown for cameras that have actually con-
tributed to the 3D tracking. The rays may not perfectly intersect with the cal-
culated 3D position of the trajectories since they represent the actual projected
2D position on the sensor based on the used calibration. That means that the
rays can be used to visualize the residual of the trajectories.

Rays can only be shown if the rays were stored during the processing of the
capture. To store the rays, make sure that the Store option is enabled in the
Rays section under Project Options > Processing > 3D Tracking, see chapter
"Rays" on page 328.
To show rays in the 3D view window, make sure that the Camera Rays button
on the GUI Control toolbar is enabled, or that the option Enable camera
tracking rays is enabled in the Rays section in the 3D view settings.

NOTE: Reprocessing of 3D tracking with the ray Store option enabled is
required to show rays in files captured in QTM versions before QTM
2018.1. Note that by reprocessing the file all manual editing will be lost.

3D View window menu

The 3D View window menu is opened by right-clicking in a 3D View window. It
can also be opened on the View menu. Note that if you right-click on bones or
markers in the 3D view, the Bone menu or the Trajectory info window menu,
respectively, is opened instead.

In a 3D View window the following actions can be performed:

Switch to 2D View
Switch the View window to 2D view.

3D View Settings...
Open the 3D view settings page in the Project options dialog so that you
can change the display options for the objects in the 3D view, see chapter
"3D view settings" on page 419.

Reset Viewpoint
Reset the camera viewpoint to the center of mass of the tracked markers
(same as double-clicking in the 3D view window).

Export Window to AVI


Export the current 3D view window to an AVI file, see chapter "Export to
AVI file" on page 739.

View
Toggle projection between Perspective and Orthogonal (keyboard short-
cut P), and choose viewpoint of orthogonal projection (orthogonal front,
back, etc.). When toggling the projection from Perspective to
Orthogonal, the viewpoint snaps to the closest orthogonal projection.
The Grid Rotation option rotates the grid in accordance with the selected
orthogonal view when checked (only available in orthogonal projection
mode).

Trajectory
Open the Trajectory info window menu for the selected markers, see
chapter "Trajectory info window menu" on page 144.

Bone
Create and delete bones and change the bone color. For more inform-
ation see chapters "Bone menu" on page 120 and "Bones in 3D views" on
page 119.

Rigid Body
Change mesh settings or change color of selected rigid body. The changes
apply to the rigid body definition in the file, not the one in the project.

Timeline control bar

The Timeline control bar is shown at the bottom of the QTM window in file
mode. It is used to indicate and to select the current frame, trace range,
measurement range and events.
The current frame is indicated by the top slider; the exact frame is shown in
the text below the control bar. To go to a frame, left-click on the desired
position on the timeline, or drag the top slider with the mouse. In the 3D view
window you can also use the scrubbing feature (Ctrl + drag) to browse through
the measurement.
The trace range is the amount of trace that is shown in the 3D view window
and it is selected with the two bottom sliders. Drag the sliders to the desired
positions for the trace range; the exact position of the sliders is shown in the
Status bar. You can also use the trace range zooming feature (Shift + mouse
wheel) to increase or decrease the trace range.
The measurement range is the amount of the original measurement that is
used in the analysis, i.e. when plotting, exporting or playing the data. It is
selected with the two scroll boxes at the ends of the time range. Drag the boxes
to the desired positions for the measurement range. The exact measurement
range can be set in the Timeline parameters dialog; double-click on the
Timeline control bar to open it.
The events are displayed as red triangles above the timeline. Place the mouse
over an event on the timeline to see information about the event. Right-click
on an event to open the Events menu, see chapter "Events menu" on page 136.
For more information about how to use events see chapter "How to use events"
on page 706.

Use the Timeline menu to modify the Timeline control bar.

Set timeline parameters


Set the parameters of the Timeline control bar, see chapter "Timeline
parameters" on the next page. The dialog can also be opened by double-
clicking on the Timeline control bar.

Use time scale


Show time scale in seconds in the timeline.

Use frame scale


Show marker frames in the timeline.

No scale
Do not display any scale.

Show text
Toggle the display of information about Marker frames, Marker trace,
Video frames, Time and time stamp (SMPTE/IRIG/Camera time) in the
timeline.

No trace range
Reset the trace range to no trace.

Reset measurement range


Reset the measurement range so that it contains the whole meas-
urement.

Events
Open the Events menu, see chapter "Events menu" on the next page.
Timeline parameters

In the Timeline parameters dialog you can set the parameters of the Timeline
control bar.

The parameters in the dialog are the same as those that can be set manually in
the bar. The numbers inside the parentheses show the possible values of each
parameter. When setting the different parameters you will be warned if a
parameter is outside its possible range. The parameters are as follows:

Current position at frame
The current frame of the measurement, indicated by the top slider.

First selected frame
The first frame in the measurement range, indicated by the scroll box.

Last selected frame
The last frame in the measurement range, indicated by the scroll box.

Trace frames before current frame
The start of the trace range, indicated by the left bottom slider.

Trace frames after current frame
The end of the trace range, indicated by the right bottom slider.

NOTE: The trace parameters can be negative, which then means that the
trace can start after the current position or stop before the current position.

Events menu

The Events menu is opened by right-clicking on an event in the Timeline
control bar. The Events menu is also available on the Timeline menu and Edit
menu, but then the actions are limited to the ones not related to a current
event.

Edit event
Edit the current event. It will open the Edit Event dialog, where you can
edit Label, Time and Frame.

Go to event
Move the current frame of the measurement to the current event.

Remove event
Remove the current event.

Create new event


Open the Add event dialog to create a new event at the current frame, for
more information see chapter "Adding events" on page 706.

[Event shortcuts] (if any are defined)
The event shortcuts that are listed on the Events page in the Project
options dialog are available on the menu. You can use them to easily cre-
ate an event with that name and color.

Edit event list


Open the Edit event list dialog to edit all of the events in the current file,
for more information see chapter "Viewing and editing events" on
page 708.

Remove all events


Remove all events in the current file.

Set range start


Set the start of the measurement range to the frame of the current event.

Set range stop


Set the stop of the measurement range to the frame of the current event.

Trajectory info windows


In the Trajectory info windows the trajectories of the measurement are listed.
The trajectories can either be created with 3D or 2D tracking, but they are
handled in the same way in the Trajectory info windows. The following three
Trajectory info windows are used in QTM:

Labeled trajectories window


Contains the trajectories that have been identified (labeled).

Unidentified trajectories window


Contains unidentified trajectories.

Discarded trajectories window


Contains trajectories that have been manually deleted.

The windows are tool windows and the Labeled and Unidentified trajectories
windows can be opened in preview mode. By default the Trajectory info win-
dows are placed on the right side of the main window, but they can be floated
or docked in other locations, see chapter "Window handling" on page 81.

Data in Trajectory info windows

The Trajectory info windows display the data about each trajectory for the cur-
rent frame. The data can be sorted by clicking on the column headers, see
chapter "Sort trajectories" on page 140.
The following data is listed in all of the Trajectory info windows:
Trajectory
The label of the trajectory and the color of its marker in 3D views. In addi-
tion to the color the symbol next to the label also shows if the trajectory
and its trace are displayed in 3D views. The following symbols are used:
Both trajectory and trace are displayed.
Just the trajectory is displayed.

When a trajectory is empty at the current frame, this is indicated by an
open circle.
If a trajectory consists of more than one part, there is a plus sign (+) in front
of the label. The parts are shown by clicking the plus sign or by using the
left and right arrows.

NOTE: In the Labeled trajectories window the label can be edited by
double-clicking on the label. However, in the other two windows the
labels are always Unidentified and Discarded followed by a sequential
number, which can be used to separate the trajectories.

Type
The type of trajectory, which can be one of the following types.
Measured
A trajectory or a part that has been tracked from the measurement
data.

Mixed
A trajectory with a mix of the other trajectory types.

Gap-filled
A trajectory or part that has been calculated with a gap fill function.

Virtual
A trajectory or part that has been calculated from a 6DOF body.

Edited
A trajectory or part that has been edited using a smoothing function.

Measured slave
A trajectory or part that has been imported from a Twin slave file.

Gap-filled slave
Gap-filled trajectory or part that has been imported from a Twin
slave file.

ID#
The sequential id of the short range active marker.

Fill level
The percentage of the current measurement range in which the trajectory or
part is visible. The measurement range is selected using the Timeline con-
trol bar.

Range
The range of frames, within the measurement range, with data for the tra-
jectory or part.

X, Y and Z
The position (in mm) of the trajectory in the current frame. The coordin-
ates use the coordinate system of the motion capture system set up when
the system was calibrated.

Residual
The average of the different residuals of the 3D point. This is a quality
check of the point’s measured position.

NOTE: The residual is not available for 2D tracked trajectories.
The residual is 0 for edited, virtual, gap-filled and interpolated twin
slave trajectories.

NOTE: The number after the window name is the number of trajectories
in the window.

Sort trajectories

The data in the Trajectory info windows can be sorted by clicking the column
headers. The trajectories are then sorted in ascending order for that column.
Click on the column header to toggle between ascending and descending
order.

To revert to the manual sorting in the Labeled trajectories window, right-click
on the column header to open the Labels menu of the window and then select
the Revert to manual sorting option. The window will then show the tra-
jectories in the order that the file was saved with.

NOTE: The sorting is only changed temporarily when looking at an open
file. The sorting is reverted back to the manual sorting as soon as you
close the file. If you want to change the manual sorting of the trajectories
in a file you have to use the Move label up and Move label down
options on the Trajectory info window menu, see chapter "Trajectory
info window menu" on page 144.
If you for some reason cannot revert to manual sorting, then try sorting
the trajectories again to activate the Revert to manual sorting option.

Select and move trajectories

Select a trajectory in the Trajectory info window by clicking on the trajectory.
It is possible to select multiple trajectories with standard Windows multiple
selection methods, i.e. Ctrl+Left-click, Shift+Left-click and Ctrl+A (select all).
To deselect all trajectories, either click on the Trajectory heading or use
Ctrl+Left-click on the selected trajectories. The keyboard can also be used to
select and deselect trajectories; use Ctrl+Space to select a trajectory and the
up and down arrows to move in the list.
The trajectories can be moved to another Trajectory info window by a simple
drag and drop, either one by one or in a group. If the trajectory is dropped in
the Labeled trajectories window the trajectory is labeled 'New 0000', where
the number depends on how many trajectories there are in the window. Use
the options Move label up and Move label down in the Trajectory info window
menu to change the position of a trajectory in the Labeled trajectories window.
Trajectories can also be joined by drag and drop. Select a trajectory and drag
and drop it on another trajectory or on an empty label. The trajectory will then
contain a new part. If the trajectories overlap there will be a clash error. Tra-
jectories where one of the trajectories or a part is completely overlapped by
the data of the other trajectory cannot be joined at all.

NOTE: The trajectories can also be drag and dropped from the 3D view
window, see chapter "Trajectories in 3D views" on page 116.

Overlapping trajectory parts

Two trajectories cannot be joined as long as they have data in the same frames.
This will result in a clash and then the Overlapping ranges dialog is displayed.

Click OK to open the Overlapping trajectory parts dialog, with which the
correct data can be chosen for the joined trajectory.

The data of the trajectories are shown in the chart area of the Overlapping tra-
jectory parts dialog. This is used to visualize the data which will be included in
the joined trajectory. The data is shown for the X, Y and Z coordinates, where
the coordinate that is displayed is chosen with the Plot coordinate option. The
names and colors of the trajectories are shown in the legend box below the
plot area.

Move the range selectors (the yellow lines) to choose the data that will be
included in the joined trajectory. Choose the range so that the data that seems
to be wrong is deleted. It is sometimes better to delete some extra frames
and gap fill that part instead of using the original data. The area between the
range selectors can either be gap filled or deleted, which is determined by the
Use Gap fill option. If the gap fill function is used the gap filled part is shown
as a yellow line between the range selectors.
To zoom in on an area of the plot, click and drag a rectangle in the plot area.
Zoom out to the previous zoom by right-clicking in the area.
If there is more than one clash, the Next and Previous options can be used to
step between the gap ranges. The range and the number of the gap are shown
below the plot.

NOTE: The area between the range selectors must always include at
least two frames, which must be gap filled or deleted.

Trajectory info window menu

The Trajectory info window menu contains options for the tracked tra-
jectories in a file. Most of the options are not available in real-time. The menu
can be accessed by right-clicking in the following places:
l On a trajectory or a part in one of the Trajectory info windows.

l On a marker or a trace in a 3D view window.

l It is also available on the Edit menu.

When multiple trajectories are selected, the options on the menu are applied to
all of the trajectories. The following options are available on the menu:
Show trace
Toggle the display of the trace of the selected trajectories.

NOTE: There are also settings for the display of traces on the 3D
view settings page in Project options.

Center trajectory in 3D
Center the 3D view on the selected trajectory or part. The time will be
moved to the first frame of the selected marker, if the trajectory is not vis-
ible in the current frame.

NOTE: This option is available in RT/preview mode.

Jump to
Jump in time to the next unidentified trajectory or part (see illustration
below). The trajectory or part is selected and the 3D view is centered.
Next unidentified trajectory
Jump to the next unidentified trajectory (keyboard shortcut J).

Next unidentified part


Jump to the next unidentified part (keyboard shortcut Shift+J).

Change trajectory color


Change the color of the selected trajectories.

Set different colors


Change the color of the selected trajectories to different colors. It steps
through 256 colors, but with different steps depending on the number of
trajectories.

Set random color


Change the color of the selected trajectories to random colors; note that
this can result in similar colors.

Plot
Plot the 3D data of the selected trajectories, see chapter "Plot 3D data" on
page 151.

NOTE: This option is available in RT/preview mode.

Identify
Identify a selected trajectory by choosing an available label from the list. If
you have selected multiple trajectories, you can move them to another
Trajectory info window.

Rename
Rename the selected trajectory. This is the same as double-clicking on the
label.

NOTE: This option can only be used on one trajectory at a time.

Add prefix to selection...


Add a prefix to the selected trajectories.

Remove prefix from selection...


Remove a prefix from the selected trajectories.

Change prefix for all...


Change the prefix for all trajectories with the same prefix as the selected
trajectory. Note that the prefix separator must be an underscore so that
QTM can detect what to change. In the dialog, replace the current prefix
with the new prefix; do not add an underscore at the end.

Move label up
Move the label up one step in the Labeled trajectories list.

Move label down


Move the label down one step in the Labeled trajectories list.

Split part after current frame


Split a trajectory or part of a trajectory after the current frame into two
parts, see chapter "Split part after current frame" on page 151.

Swap parts
Swap parts between two selected trajectories, see chapter "Swap parts"
on page 151.

NOTE: The parts that are swapped must not overlap any other
parts in the trajectories.

Swap all parts


Swap all selected parts in the two trajectories (keyboard shortcut: S).

Swap current frame part


Swap the parts of the current frame in the two trajectories (key-
board shortcut: W).

Delete
Delete the selected trajectories or parts. If you delete a trajectory in the
Labeled trajectories window, it is moved to the Unidentified tra-
jectories window. For more information, see chapter "Delete trajectories" on
page 152.

Delete gap-filled parts


Delete the gap-filled parts of the selected trajectories or parts.

Delete virtual parts
Delete virtual parts of the selected trajectories or parts.

Add new trajectory


Add a new trajectory. The options are:
Empty: Create a new empty trajectory.

Virtual (Average of selected trajectories): Create a new virtual tra-


jectory based on the geometrical average of the currently selected
trajectories.

Virtual (Static from current frame): Create one or more new static
virtual trajectories at the current frame position of the currently
selected trajectory or trajectories.

Gap-fill trajectory
Fill the gaps of the selected trajectories using one of the following gap-fill
methods.

Linear
Fill the gaps of the selected trajectories using linear gap filling.

Polynomial
Fill the gaps of the selected trajectories using polynomial gap filling.
The maximum gap size is set to the Max frame gap setting under
Project Options, Trajectories (default 10 frames) used at the time
the file was being processed. For gap filling single trajectories with
alternative gap fill settings or methods, see chapter "Filling of gaps"
on page 642.

Relational
Apply gap filling of relational type according to the order of the selection.
There is no maximum gap limitation. A sketch of the general principle is
given at the end of this menu description. The selection order is as follows:
1. Trajectory to be filled (required)

2. Trajectory used as origin (required)

3. Trajectory used to define the X axis

4. Trajectory used to define the XY plane

For more information about relational gap filling, see chapter "Filling
of gaps" on page 642.

Relational (Rigid body)


Apply relational gap-filling to trajectories which are in a rigid relation
to each other. Select at least four markers to apply Rigid body rela-
tional gap-fill. Gaps in any of the selected trajectories are filled using
the relation to the other trajectories. If there are overlapping gaps in
some of the trajectories then they are not gap-filled.

Kinematic
Fill gaps of selected trajectories using kinematic gap filling based on
current skeleton data. The selected trajectories must be included in
a skeleton definition.

Smooth trajectory (moving average)


Apply moving average smoothing to selected trajectories. The window
size is taken from the Smooth settings in the Settings sidebar of the Tra-
jectory Editor window, see chapter "Settings sidebar" on page 164.

Smooth trajectory (butterworth)


Apply butterworth smoothing to selected trajectories. The cutoff fre-
quency is taken from the Smooth settings in the Settings sidebar of the
Trajectory Editor window, see chapter "Settings sidebar" on page 164.

Generate AIM model from selection


Generate an AIM model using the currently selected trajectories only. For
more information, see chapter "Generating an AIM model" on page 625.

Define rigid body (6DOF)


Create a new 6DOF body on the 6DOF bodies page. The available options
are:

Current Frame: Create rigid body definition of selected trajectories
based on the current frame. This method is automatically applied
when in Preview mode.

Average of frames: Create rigid body definition based on the aver-


age relative positions of the selected trajectories across the capture.
The advantage of this method is that the statistics of the marker pos-
itions are taken into account. The Bone tolerance setting of the
body will be based on these statistics.

The 6DOF body will be added both to the 6DOF Tracking page in Project
Options and in the current file. The origin of the rigid body will be the geo-
metric average of the included marker positions. The orientation cor-
responds to the orientation of the rigid body at the first frame of the file.

NOTE: Make sure that the body includes at least three points.

NOTE: The trajectories will keep their label name unless they are
named 'New XXXX' or are unidentified.

Add to rigid body (6DOF)


Add current frame positions of selected markers to a rigid body. The point(s)
will be added to the rigid body definition in the current file and to that
in the Project Options if it exists.

Analyze…
Calculate parameters from the 3D data and/or filter the 3D data, see
chapter "Analyze" on page 153. Analysis can also be accessed from the
Analyze trajectory button on the AIM toolbar.
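
The relational gap-fill option above reconstructs missing marker positions from
markers that keep an approximately fixed relation to the filled marker. The sketch
below illustrates the general principle for a single gap frame, assuming three
reference markers that are visible both in a frame where the filled marker is
present and in the gap frame. It is a simplified illustration in Python with NumPy,
not the actual QTM implementation, and all marker positions are hypothetical
example values.

import numpy as np

def local_frame(origin, x_marker, plane_marker):
    # Build a local coordinate system from three reference markers:
    # the origin, a marker defining the X axis and a marker in the XY plane.
    x = x_marker - origin
    x = x / np.linalg.norm(x)
    z = np.cross(x, plane_marker - origin)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack((x, y, z))  # rotation matrix with the local axes as columns

# Frame where all four markers are visible: store the local position of the marker to fill.
origin = np.array([0.0, 0.0, 0.0])
x_marker = np.array([100.0, 0.0, 0.0])
plane_marker = np.array([0.0, 100.0, 0.0])
target = np.array([30.0, 40.0, 20.0])
R = local_frame(origin, x_marker, plane_marker)
local_target = R.T @ (target - origin)

# Gap frame where only the reference markers are visible: reconstruct the missing marker.
origin_gap = np.array([10.0, 5.0, 0.0])
x_marker_gap = np.array([110.0, 5.0, 0.0])
plane_marker_gap = np.array([10.0, 105.0, 0.0])
R_gap = local_frame(origin_gap, x_marker_gap, plane_marker_gap)
filled = origin_gap + R_gap @ local_target
print(filled)  # [40. 45. 20.], since the reference markers were only translated in this example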

Trajectory management

Plot 3D data

To plot 3D data, select one or more trajectories, click Plot on the Trajectory
info window menu and then select the type of data. The data that can be plotted are:
3D X-position, 3D Y-position, 3D Z-position and Residual. For information
about the Plot window see chapter "Plot window" on page 179.
The curve of each trajectory will have the same color as in the Trajectory info
windows. Trajectories in the Unidentified trajectories window can be sep-
arated by their sequential number.
If trajectories with the same or similar colors are plotted, the plot will
automatically use different colors. This means that if you used Set different colors
on a large number of trajectories, two trajectories close to each other in the
list may change color in the plot.

Split part after current frame

Split part after current frame (shortcut X) on the Trajectory info window
menu splits a trajectory or a part of a trajectory into two parts after the current
frame. The first part will end at the current frame and the other part will start
on the frame after the current frame. This means that there is no gap between
the two parts, but in the trace in the 3D view it is visualized as a gap between
the two parts to show the split.
You can also use the Cut trajectory trace tool in the 3D view window to split a
trajectory. Click on a trace with the tool to split the trajectory at that position.

Swap parts

The Swap parts function on the Trajectory info window menu makes it easy
to swap parts that have been added to the wrong trajectories. Follow these
steps to swap the parts.

1. Expand the trajectories in which you want to swap parts, by clicking on
the + sign next to the label.
2. In the 3D view check where the tracking error starts and stops.

a. Make sure to check that there is a new part at the start and stop.
Otherwise you need to split that part for the swap to work.
3. Select the parts that need to be swapped and use the Swap parts/Swap
all parts option on the Trajectory info window menu (shortcut S).
a. If it is just one part in each trajectory that needs to be swapped you
can use the Swap current frame part option instead (shortcut W).
This option will automatically swap the parts of the current frame.
b. Parts can be selected directly in the 3D view window, by holding
down Alt+Shift while clicking on a trace.
c. Make sure that you delete any gap filled parts at the start and end of
the swap.

NOTE: If there is any overlap with parts outside the parts that are
swapped, then QTM will give you a warning and you may select to
swap all of the parts that are overlapped.

Delete trajectories

To delete a trajectory or a part of a trajectory, select it in a Trajectory info
window and click Delete on the Trajectory info window menu. You can also use
Delete on the keyboard; this also works in a 3D view window. When a tra-
jectory is deleted from the Labeled trajectories, the trajectory is moved to the
Unidentified trajectories window. Then if you delete it in Unidentified tra-
jectories window, the trajectory is moved to the Discarded trajectories win-
dow.
When Delete is used on a trajectory in the Discarded trajectories window, the
trajectory will be deleted from the file. A confirmation dialog is displayed
before the action is performed.
The trajectories in the Discarded trajectories window are not shown in the 3D
view windows and are not included in exported data. To show a discarded tra-
jectory and its trace in the 3D view, select the trajectory in the Discarded tra-
jectories window, see picture below.

Analyze

Calculations and filters can be applied to the trajectory data with Analyze on
the Trajectory info window menu. Select the trajectories that you want to
analyze and click on Analyze to open the Analyze dialog. The dialog can also
be opened with the Analyze trajectory button on the AIM toolbar. To
export the results to a TSV file, check the Export to TSV file option.

Filters

Filters can be applied both before and after the calculation, by selecting Use fil-
ter under the respective heading. Set the number of frames that are used for
each point in the filter with the Frames in filter window option. The number
of frames must have an odd value.
There are two available filters:
Fit to 2nd degree curve
This filter uses a 2nd degree curve when processing the data. For each
frame, the filter will first find the 2nd degree curve that best fits the data
in the filter window around the current frame. Then the data of the cur-
rent frame is set to the value of that curve at the current frame.

Moving average
For each frame, this filter first finds the average of the data in the filter
window around the current frame. Then the filter sets the data of the cur-
rent frame to the average found. (This can also be seen as fitting the data
of the filter window to a polynomial of degree zero.)
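
The following sketch illustrates the general principle of the two filters on a
single coordinate of a trajectory. It is a simplified illustration in Python with
NumPy, not the implementation used by QTM; the array contents and the window
size of 11 frames are hypothetical examples.

import numpy as np

def moving_average_filter(values, window=11):
    # Replace each sample by the mean of the samples in the surrounding
    # window (the window size must be odd); edge samples are left unchanged.
    half = window // 2
    out = values.copy()
    for i in range(half, len(values) - half):
        out[i] = np.mean(values[i - half:i + half + 1])
    return out

def second_degree_curve_filter(values, window=11):
    # Replace each sample by the value, at the current frame, of a 2nd degree
    # polynomial fitted to the samples in the surrounding window.
    half = window // 2
    x = np.arange(-half, half + 1)
    out = values.copy()
    for i in range(half, len(values) - half):
        coefficients = np.polyfit(x, values[i - half:i + half + 1], 2)
        out[i] = np.polyval(coefficients, 0)
    return out

# Example: filter a noisy Z coordinate (hypothetical data, 200 frames).
z = np.sin(np.linspace(0, 2 * np.pi, 200)) + 0.05 * np.random.randn(200)
z_filtered = second_degree_curve_filter(z, window=11)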

Calculation

Under the Calculation heading there are five parameters that can be cal-
culated:
Position
No calculation is performed. Use this setting to filter the motion data,
together with selecting Use filter either before or after calculation.
Unless the data is filtered or you select the Magnitude radio button there
is no difference between the result of the analysis and the original data.

Velocity
Calculates the velocity of the trajectories. Select Magnitude to calculate
the speed.

Acceleration
Calculates the acceleration of the trajectories.

Distance
Calculates the distance between two trajectories over time.

NOTE: Distance can only be calculated when just two trajectories
are selected.

Distance traveled
Calculate the distance that the trajectory has traveled in the measurement.
The distance increases for every frame in which the marker position has
changed more than 0.2 mm compared with the last time the marker moved.
The 0.2 mm hysteresis is there to remove the static noise; otherwise a static
marker would still accumulate a distance traveled over time. If there is a gap
in the marker data the distance traveled will restart at 0. See the sketch at
the end of this Calculation section.

IMPORTANT: Because noise is accumulated in this sort of analysis,
the accuracy can be low depending on the settings. The error will be
larger with a high capture rate, because for every sample more
noise is added to the distance. Even with the hysteresis it is
recommended to use a filter to remove noise. Also remove any data
in the trajectory that you do not want to add to the measurement,
because spikes in the static data will not be removed by the filter or
the hysteresis.

Angle
Calculates the angle between two lines in space, using either two, three or
four trajectories. Select the trajectories that you want to match with the
markers in the diagram to calculate the angle between the lines. For three
and four markers the angles are calculated in the range 0 – 180 degrees.
When four trajectories are selected the dialog looks like this:

When three trajectories have been selected the dialog looks like this:

When two trajectories have been selected the dialog looks like this:

For two markers the angle is calculated between the line and the planes
of the coordinate system. Select which planes you want to use with the
checkboxes XY, XZ and YZ. The angle is calculated between -90 and 90
degrees. Which sign the angle has depends on how the line from the Center
Trajectory to the Distal Trajectory is pointing. For example, if you are using
the XY plane, the angle is positive when the line from the central marker is
pointing in the positive Z direction.
The angle between the line and a coordinate axis is the complementary angle
to the angle to the perpendicular plane, according to the formula: "angle to
axis" = "angle to perpendicular plane" - 90 degrees. For example, a line that
makes a 60 degree angle with the XY plane makes a 30 degree angle (in
magnitude) with the Z axis.

Angular velocity
Calculates angular velocity for an angle defined in the same way as for the
angle calculations (see above). The angular velocity is the first derivative
of an angle, i.e. the rate of change of an angle, in degrees per second.

NOTE: Two to four trajectories must be selected to calculate angular velocity.

Angular acceleration
Calculates angular acceleration for an angle defined in the same way as
for the angle calculations (see above). The angular acceleration is the
second derivative of an angle, i.e. the rate of change of an angular
velocity, in degrees per second squared.



NOTE: Two to four trajectories must be selected to calculate angu-
lar acceleration.

For each of the calculations it is possible to choose whether the output will be
the Magnitude or the Components of the results.
• For Position, Velocity, Acceleration and Distance, the Components means
that the result is shown for the three coordinates (X, Y and Z) separately. The
components of Distance are the distance projected on the three axes (X, Y
and Z).
The Magnitude means that the result is shown as the length of the vector
made up of the three components. For Position this means the distance in
mm from the origin of the coordinate system and for Velocity this means
the speed (in mm/s) of the trajectories. For Acceleration it does not have a
separate name, it is simply the magnitude of the acceleration. For Distance,
the magnitude is probably the result that is most appropriate, since it is the
actual distance between the two trajectories.

• For Angle and Angular velocity, the Components means that the angle is
projected onto the three perpendicular planes (YZ, XZ and XY respectively),
while the Magnitude is simply the angle between the two arms.
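
As a simple numeric illustration of the Magnitude output: for a velocity, the
magnitude (the speed) is the Euclidean norm of the three components. A minimal
Python/NumPy sketch, with made-up values:

    import numpy as np

    # Velocity components (mm/s) for one frame
    v = np.array([120.0, -45.0, 300.0])

    speed = np.linalg.norm(v)    # magnitude in mm/s
    # equivalent to np.sqrt(v[0]**2 + v[1]**2 + v[2]**2)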

Output name

The name under the Output name heading is used in the Plot window and in
the TSV file. When the Add as suffix to the trajectory name setting is selec-
ted the trajectory name is added before the Output name.
The Add as suffix to the trajectory name setting can only be changed when
a single trajectory is selected. If more than one trajectory is selected and Pos-
ition, Velocity or Acceleration calculations are performed, the setting is
always selected. Otherwise all the results would have the same name. On the
other hand, when Angle, Angular Velocity or Distance calculation is per-
formed, the setting is always deselected, since there are two, three or four tra-
jectories selected but the result is a single value.



Label lists

The labels of the trajectories in the Labeled trajectories window can be saved
in label lists. The label list contains information about the names and colors of
the trajectory labels and the bones between the trajectories in the 3D view.
The label lists are controlled from the Labels menu.
To save a label list from the current capture file, click Save on the Labels
menu. Enter the name of the list and click Save. By default, the label list
file is saved as an XML file in the project folder.

To load a label list, click Load on the Labels menu and open the label list
file. Any existing trajectories in the Labeled trajectories window will be
discarded when a label list is loaded.

Labels menu

The Labels menu can be opened from the Edit menu or by either right-clicking
in the empty area of the Labeled trajectories window or on the names of the
columns.

Add new trajectory


Add a new empty trajectory to the Labeled trajectories window.

Load label list...


Load a label list to the Labeled trajectories window.

NOTE: Any existing trajectories in the Labeled trajectories window will be
discarded when a label list is loaded.

Save label list...


Save the label list to an XML file.



Revert to manual sorting
Revert the sorting of the Labeled trajectories window to the sorting that
the file was saved with.

Trajectory Editor window


The Trajectory Editor window can be used for display and editing of tra-
jectory data. The editing facilities include several methods for gap filling and
smoothing of data.
For information on how to use the Trajectory Editor, see chapter "Editing of
trajectories" on page 640.
The Trajectory Editor window can be docked into QTM or it can be used as a
floating window, and the Window arrangements can be saved as Window lay-
outs. For detailed information, see chapter "Window handling" on page 81.

For an overview of keyboard shortcuts and mouse gestures, see chapter "Tra-
jectory Editor shortcuts" on page 211.



Plot area

The plot area of the Trajectory Editor window shows the selected data series.
The layout of the plot area depends on the view settings. The view can be mod-
ified via the buttons on the toolbar, see chapter "Trajectory Editor toolbar" on
the next page. The plot area contains the following elements.
Data series
Red, green and blue lines for X, Y and Z, respectively.

Vertical axis (optional)


Values of respective data series. The units are mm, m/s and m/s² for pos-
ition, velocity and acceleration, respectively.

Time axis (frames or time in seconds)


Horizontal axis with frame or time values.

Tool tip
Show data value at current mouse pointer position

Current frame
Gray line and frame/time value

Selection
Gray area, edge values and width

Gap indicators
Amber and blue indicators below the time axis indicating unfilled and
filled gaps, respectively, and corresponding areas in the plot area.

Spike indicators
Red indicators below the time axis indicating detected spikes.



Trajectory Editor toolbar

The toolbar of the Trajectory Editor window contains the following elements.
The left side of the toolbar shows the label of the currently selected trajectory.
The lock button to the left of the label can be used to lock the currently selec-
ted trajectory.
To the right of the label the action buttons are located. The action applies to
the current selection of the trajectory. The label of the selected trajectory is
shown on the right side of the action buttons. If needed, the latest actions can
be undone by pressing Undo in the Edit menu or Ctrl + Z.
Delete (Del)
Delete current selection. The selection will be permanently deleted.

Smooth (S)
Smooth current selection of trajectory using the smoothing method spe-
cified in the Trajectory Editor settings sidebar.

Fill (F)
Fill gaps within current selection using the fill method specified in the Tra-
jectory Editor settings sidebar. When expanding the button, you can
choose to fill all gaps (F) or only unfilled gaps (Shift + F).

The remaining buttons can be used to modify the view of the data series and
the layout of the Trajectory Editor window.
Vertical Axis
Toggle visibility of vertical axes.

Auto Zoom Vertical (A)


Toggle zoom vertical axis automatically.

Auto Zoom Horizontal (B)


Toggle zoom horizontal axis automatically when selecting trajectories.

Zoom To Extents (X)


Zoom horizontal and vertical axis to data extents.

Zoom Horizontal (H)


Zoom horizontal axis to data extents



Zoom Vertical (V)
Zoom vertical axis to data extents

Center Current Frame (C)


Center visual range on current frame

Zoom To Selection (Z)


Zoom to currently selected range

Zoom to Point of Interest (G)


Toggle automatic zoom to selected point of interest.

Series
Select data series to show in the chart (X, Y, Z, (R)esidual, (V)elocity, (A)c-
celeration)

View
The options are: (1) Combined view, (2) Component view, (3) Merged view.

Points of Interest Sidebar


Show/hide Points of Interest Sidebar.

Settings Sidebar
Show/hide Settings Sidebar.



Points of Interest sidebar

The Points of Interest sidebar contains information about gaps and detected
spikes.
The Gaps pane shows the start and end frames of gaps contained in the selec-
ted trajectory as well as their fill status.
The Spikes pane shows the start and end frames of detected spikes, as well as
their width (number of frames). The detection is based on the acceleration
level, which can be set in the Trajectory Editor settings.



Settings sidebar

In the settings sidebar of the Trajectory Editor window you can select meth-
ods for gap filling and smoothing, and change their parameters. In addition,
you can set the spikes detection level.
The following gap fill types are available. For more information on how to use
the different gap fill types, see chapter "Filling of gaps" on page 642.
Static
Gap filling by adding a fixed 3D position (virtual point).

Linear
Gap filling by means of linear interpolation.

Polynomial (default)
Gap filling by means of a cubic polynomial interpolation.

Relational
Gap filling by interpolation based on the movement of surrounding mark-
ers selected by the user.

Virtual
Gap filling by adding a virtual trajectory based on the movement of sur-
rounding markers selected by the user and an optional offset.

Kinematic
Kinematic gap fill of markers associated with skeleton segments or rigid
bodies.



The following smoothing methods are available. For more detailed information
about smoothing, see chapter "Smoothing" on page 646.
Moving average
Smoothing by means of an unweighted moving average.

Butterworth
Smoothing by means of a Butterworth filter.

The spike detection is based on a threshold of the acceleration magnitude.
The threshold level can be adjusted to control the strictness of the spike
detection.
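
As a rough sketch of the idea (not the exact QTM algorithm; the
finite-difference estimate of the acceleration and the function name are
assumptions made for this example), spike detection can be thought of as
flagging the frames where the acceleration magnitude exceeds the threshold:

    import numpy as np

    def detect_spike_frames(positions, frame_rate, threshold):
        # positions: (N, 3) array of marker positions
        # frame_rate: capture rate in Hz
        # threshold: acceleration threshold, in the same units as the
        #            estimated acceleration below
        dt = 1.0 / frame_rate
        velocity = np.gradient(positions, dt, axis=0)
        acceleration = np.gradient(velocity, dt, axis=0)
        magnitude = np.linalg.norm(acceleration, axis=1)
        return np.flatnonzero(magnitude > threshold)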
Trajectory Editor window menu

The Trajectory Editor window menu contains options for the selected tra-
jectory. The menu can be accessed by right-clicking in the plot area of the Tra-
jectory Editor.
The following options are available:
Fill
Fill all gaps included in the selected frame range using the current gap fill
settings.

Fill Unfilled
Fill all unfilled gaps included in the selected frame range using the current
gap fill settings.

Smooth
Smooth the data included in the selected frame range using the current
smoothing settings.



Smooth Spikes
Smooth the spikes included in the selected frame range using the current
smoothing settings.

Delete
Delete the data included in the selected frame range.

Delete Frame xxx


Delete the frame at the mouse pointer position.

Delete Spikes
Delete spikes included in current selection. Only the frames with accel-
eration values above the current acceleration threshold will be deleted,
excluding the additional margins.

Set Current Frame


Set current frame to mouse pointer position.

Next Gap
Select next gap.

Previous Gap
Select previous gap.

Next Spike
Select next spike.

Previous Spike
Select previous spike.

Vertical Axis
Show/hide vertical axis information.

Horizontal Time Axis


Toggle between frame or time (seconds) units for the horizontal axis.

Trajectory Overview window


The Trajectory Overview window can be used to get an overview of the tra-
jectories.



The Trajectory Overview window displays the following features:
Trajectory labels are shown in the left pane. Selected trajectories are
highlighted.
The trajectories are shown as bars spanning the time line.
Gaps and filled gaps are indicated by amber and blue areas, respect-
ively.
Spikes are indicated by red markings.
The Trajectory Overview window can be docked into QTM or it can be used as a
floating window, and the Window arrangements can be saved as Window lay-
outs. For detailed information, see chapter "Window handling" on page 81.
The Trajectory Overview window supports several interactive mouse gestures
for zooming, panning, scrolling, etc., see chapter "Trajectory Overview short-
cuts" on page 214 for a comprehensive list.

Data info window



The data for the current frame can be viewed in the Data info windows. There
can be three Data info windows open at the same time, so that different types
of data can be displayed. Open the windows by clicking Data info 1 to Data
info 3 on the View menu. Use Ctrl+D to toggle the display of the Data info 1.
The Data info windows can be opened both in file mode and in preview mode.
It is a tool window that can be placed anywhere and is by default placed on the
left side of the main window.
Data info window menu

The Data info window menu is accessed by right-clicking in the Data info win-
dow. With the Data info window menu the data can be switched between the
data types and plotted. The following options are available on the menu:
Display 2D data
Current camera
Display the current camera, i.e. the last camera that you clicked
on, in the 2D view window.

Camera …
Choose which camera’s 2D data will be displayed.

Display 6DOF data


Display the 6DOF data of the motion capture.

Display Analog data


Display the analog data of the analog board.

Display Force data


Choose which force plate’s force data will be displayed.

Plot
Plot the selected data, see chapter "Plot window" on page 179.



Plot Filtered
Filtered plot for the selected data, see chapter "Plot window" on page 179.

Calculate
Calculate magnitude of distance from origin of a 6DOF body, see chapter
"Calculate" on page 178.

Zero All Force Plates (only when displaying force data)


While in preview zero all of the force plates, used for Kistler and digital
AMTI plates.

Data types

2D data information

Click Display 2D data in the Data info window menu and then click a camera
to show its 2D data in the Data info window. Data can only be shown for one
camera at a time. Use Current camera to display the data for the current cam-
era in the 2D view window.



When 2D data is displayed, each row in the Data info window represents a
marker. The data of the markers is for the current frame and it is displayed in
the following columns:
label
The labeled 3D trajectory (current frame) associated with the marker. The
label can only be displayed if the option to store rays is checked on the
3D Tracking page under Project Options.

x, y
The position of the marker on the sensor in subpixels. The 2D data in this
window is unlinearized.

xSize, ySize
The size of the marker in subpixels.

To plot the data select the data for one or more markers, click Plot or Plot
filtered on the Data info window menu and then the type of data. With Plot
filtered you can apply a Fit to 2nd degree curve or Moving average filter.
For information about the Plot window see chapter "Plot window" on page 179.

6DOF data information

The data of the 6DOF bodies in the current frame can be viewed in the Data
info window. The bodies will be shown in the same order as on the 6DOF
Tracking page. The data is relative to the reference coordinate system as
defined for the respective rigid bodies (global coordinate system by default),
see chapter "Coordinate system for rigid body data" on page 354. The angles
are expressed in Euler angles according to the definition on the Euler angles
page.
Click Display 6DOF data in the Data info window menu to show the 6DOF
data in the following eight columns:



Body
The name of the 6DOF body.

x, y and z
The position (in mm) of the origin of the measured rigid body’s local
coordinate system relative to its reference coordinate system.

If the 6DOF data cannot be calculated, the x column displays "Not found".
When the 6DOF body is disabled in the 6DOF Tracking settings, the x
column displays "Disabled".

Roll, Pitch and Yaw


The rotation (in degrees) of the measured rigid body relative to its ref-
erence coordinate system, expressed in Euler angles as defined on the
Euler angles page.

Residual
The rigid body residual, calculated as the average of the errors (distances) in
mm of the measured markers compared to the points in the rigid body
definition.
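
As an illustration of this definition (a sketch only, not the QTM source; the
function and variable names are made up for this example), the residual for
one frame can be computed as the mean distance between the measured markers
and the corresponding points of the rigid body definition, after the body's
rotation and translation have been applied:

    import numpy as np

    def rigid_body_residual(measured, body_points, R, t):
        # measured:    (N, 3) measured marker positions in mm
        # body_points: (N, 3) points of the rigid body definition (local coords)
        # R, t:        (3, 3) rotation matrix and (3,) translation of the body
        predicted = body_points @ R.T + t
        errors = np.linalg.norm(measured - predicted, axis=1)
        return errors.mean()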

To plot the data select the data for one or more 6DOF bodies, click Plot or Plot
filtered on the Data info window menu and then the type of data. With Plot
filtered you can apply a Fit to 2nd degree curve or Moving average filter.

You can also plot the velocity and acceleration in the three directions of the
coordinate system for rigid body data and the angular velocity and acceleration
for the three rotation angles. It is recommended to use Plot filtered and apply
the filter before the calculation for Velocity and Acceleration, because the
noise is amplified by the calculations.



For information about the Plot window see chapter "Plot window" on page 179.

Skeleton data information

The data of the skeletons in the current frame can be viewed in the Data info
window. The skeletons will be shown in the same order as on the Skeleton
solver page and the data will use the definitions for angles and local coordinate
system on the Euler angles page.
Click Display Skeleton data in the Data info window menu to show the skel-
eton data in the following eight columns:
Skeleton
Name of the skeleton.

Segment
Name of the skeleton segment.

X, Y, Z
Segment 3D positions.



Roll, Pitch, Yaw
Segment orientation expressed in Euler angles. For custom Euler angle
definitions the order and names of these columns are as defined by the
user.

The following menu options are available when right clicking in the Skeleton
data information window.

Local Coordinates
Display local coordinates of segments relative to their respective parent
segment when checked. Display global coordinates of segments when
unchecked.

NOTE: Local coordinates with the Qualisys Sports Marker set give
you the joint angles in Roll, Pitch and Yaw.

To plot skeleton segment data, select one or more segments, click Plot or Plot
filtered in the Data info window menu and select the type of data to plot.
With Plot filtered you can apply a Fit to 2nd degree curve or Moving aver-
age filter.



You can also plot the velocity and acceleration in the three directions of the
coordinate system for segment data and the angular velocity and acceleration
for the three rotation angles. It is recommended to use Plot filtered and apply
the filter before the calculation for Velocity and Acceleration, because the
noise is amplified by the calculations.

For information about the Plot window, see chapter "Plot window" on
page 179.

Analog data information

Click Display Analog data in the Data info window menu to show the analog
data of the analog board in the Data info window. It will then show the data
for the current frame in the following columns:
Channel
The name of the analog channel.



NOTE: The names can be changed under the Analog boards
branch in the Project options dialog.

Value
The voltage input to the analog board.

NOTE: If the data is not in V, the unit is displayed after the data. If
you have a sensor on the EMG device that outputs data other than
voltage, the data is in the SI unit of that data type, except for
accelerometers, which are in g.

Board Name
The name of the analog board or of other analog device.

Channel No
The channel number on the analog device.

Right click on an analog channel to open the Data info window menu with the
following settings.

To plot the voltage, select one or more channels, click Plot or Plot
filtered on the Data info window menu. With Plot filtered you can
apply a Fit to 2nd degree curve or Moving average filter. For inform-
ation about the Plot window see chapter "Plot window" on page 179.

NOTE: The number of seconds in the analog RT plot is specified on the
GUI page in the Project options dialog, see chapter "GUI" on page 415.



Apply offset compensation
Activate/deactivate this option to apply/remove offset compensation for
the selected channels, i.e. where the line is blue. The option can only be
changed in a file and the settings used will be those set before the meas-
urement on the Analog board page in the Project options dialog, see
chapter "Compensate for analog offset and drift" on page 295.
In RT/preview however it will be greyed out and display the status of the
compensation on the Analog board page. I.e. if the compensation is
turned on, the option is activated to display that it is applied in RT/preview.

Apply drift compensation


Activate/deactivate this option to apply/remove drift compensation for
the selected channels, i.e. where the line is blue. The option can only be
changed in a file and settings used will be those set before the meas-
urement on the Analog board page in the Project options dialog, see
chapter "Compensate for analog offset and drift" on page 295.
In RT/preview the option is greyed out and deactivated since drift com-
pensation is never applied in that mode.

Force data information

Click Display Force data in the Data info window menu, to show the force
data for all force plates in the Data info window.



NOTE: To show the force data the calibration settings of the force plate
must have been set on the Force plate page in the Project options dia-
log and the Calculate force data option must have been used during
processing or Recalculate forces applied on the capture file, see chapter
"Calculating force data" on page 703.

The Data info window will then contain data for the force plate's Force,
Moment and COP (Center Of Pressure) vectors. The data is for the actual force
on the force-plate. The displayed data is for the current frame in the following
five columns:
Parameter
The name of the vector.

X, Y, Z
The size (in N, Nm and mm, respectively) of the vector in the internal coordin-
ate system of the force plate, if you are using the Local (Force plate) set-
ting. The Z direction of the force plate coordinate system is pointing
down.

Force plate
The name of the force plate for the data in that row.

The coordinate system that is used (plate or lab) is displayed in the title
of the window. You can switch between the coordinate systems on the Force
data page in the Project options dialog, see chapter "General settings" on
page 360.



NOTE: The Force and the COP are also visualized with a vector in the
3D view window. The vector shows the opposite of the force applied to
the force plate, i.e. opposite to that shown in the Data info window.

To plot a vector select one parameter, click Plot on the Data info window
menu and then click Parameter. The data of the X, Y and Z directions are plot-
ted for the selected parameter in one window. It is not possible to plot more
than one parameter in one window. For information about the Plot window
see chapter "Plot window" on the next page.
The number of seconds in the force plot is specified on the GUI page in the Pro-
ject options dialog, see chapter "GUI" on page 415.
Calculate

The Calculate dialog is used to calculate the magnitude of the distance from
the origin of the reference coordinate system to the current position, under
the Location of body heading, or the current rotation of the body, under the
Angle of body heading. Click on Calculate under the respective heading to get
the result; the value is only updated when you click Calculate.
The distance can be calculated for different planes; select the plane with the X,
Y and Z options. The angle can only be calculated for one of the three rotations
at a time. The data is calculated as the average of the data in the number of
frames specified in the Calculate the mean of the last option.



Messages window

The Messages window contains the processing messages, camera error messages
and some status information about what has happened in QTM since it was last
started. The messages are displayed with the oldest first in the list; scroll
down to the bottom to see the latest messages.
The following mouse actions are supported:
Use the scroll wheel to scroll up and down the message list.

Double click on an error to open an error dialog with more specific
information.

Right click and select Clear list to clear the message list.

Plot window
The Plot window is used for displaying graphs of various types of data via the
Plot or Analyze Trajectory commands.
The Plot window can be docked into QTM or it can be used as a floating win-
dow, and the Window arrangements can be saved as Window layouts. For
detailed information, see chapter "Window handling" on page 81.



The Plot window contains the following elements:
The title showing what is plotted in the graph.

The axes showing the X and Y values of the data and their units. The X
units can be Frame or Time, dependent on the type of data displayed.

The plot area showing:


Data series
Line plot of plotted data. The colors of the lines correspond to the
trajectory colors. For other types of data, the color cycles through a
preset color map.

Events
Vertical lines representing events on the time line. The colors cor-
respond to the event colors.

Legend
The legend shows the names and colors of the plotted data series.
The legend box allows for the following interactions:
• Hover with the mouse over a label to highlight it in the graph.

• Click on a label to show or hide the data series.



Time cursor
The time cursor shows the position of the current frame. Interactions:
• Click and drag to move the time cursor.

Tool tip
When moving the mouse in the plot area a tool tip shows the Y values of
the displayed data series at the current X position of the mouse. The color
of the tool tip corresponds to that of the data series.

Mouse position
When moving the mouse in the plot area, the X and Y values of the
mouse pointer position are shown in the lower-right corner.

Plot range
Areas outside the measurement range are grayed out. If the range filter in
the plot menu is disabled, data outside of the selected measurement
range is shown in the gray area. The plot range allows for the following
interactions:
• Click and drag the plot range edges to modify the measurement
range.
Plot menu

The Plot menu contains the settings and layout options of the plot. It can be
accessed by right-clicking anywhere in the Plot window.
The following options are available:
Edit Title
Open a dialog for modifying the plot title. Use the Reset button to revert to
the initial title of the plot.



X Axis / Y Axis
View or modify the respective axis settings. Edit Min or Max to set custom
plot limits. Tick the check boxes for Min or Max to use fixed values for the
plot limits, or Auto-Fit for automatically adapting to the data range.

Legend
Show or hide the legend.

Range Filter
Enable or disable the range filter. When the Range Filter is enabled, only
data within the selected measurement range is shown. When disabled, all
data within the measurement range is shown, and data outside of the
selected range is shown in the gray area.

Events
Show or hide events.

Style
Choose between dark or light mode.

When using Window layouts, Plot menu settings are stored as part of the Win-
dows layout.
Zooming, panning and other plot interactions

The Plot window supports the following interactions.


Zooming
The zooming behavior depends on the axis settings in the Plot menu.
When checking Min, Max or Auto-Fit for an axis, the zooming behavior is
restricted. The following zooming interactions are supported:
Scroll wheel (plot area)
Zoom in/out relative to the current mouse position.



Scroll wheel (axis area)
Zoom in/out horizontally (X Axis) or vertically (Y Axis) relative to the
current mouse position.

Click and drag


Zoom to selected area.

Double click (plot area)


Reset to data limits.

Double click (axis area)


Reset to horizontal (X Axis) or vertical (Y Axis) data limits.

Panning
The panning behavior depends on the axis settings in the Plot menu.
When checking Min, Max or Auto-Fit for an axis, the panning behavior is
restricted. The following panning interactions are supported:
Right-click and drag (plot area)
Panning in both X and Y directions.

Right-click and drag (axis area)


Panning horizontally (X Axis) or vertically (Y Axis).

Legend
Hover
Hover with the mouse over a label to highlight the data series.

Selection
Click on a label to show or hide the data series.

Timeline
Click and drag (time cursor)
Click and drag the time cursor to move the current frame.

Click and drag (plot range edges)


Click and drag the plot range edges to modify the selected meas-
urement range.
Plotting from file or during preview

The Plot window can be used to graph data series from a file or during preview.
When creating a new plot, the plot settings depend partly on if the plot is cre-
ated from a file or during a preview.



When creating a new plot from a file:
• The X and Y limits are by default adapted to the range of the data
included in the plot.
• None of the X or Y Axis check boxes are ticked.

When creating a new plot during preview:


• The X limits are set according to the Default Real-Time Plot Size setting on
the GUI project settings page.
• The Auto-Fit check box for the Y Axis is checked by default.

Recommendations when using saved Window layouts

When including Plot windows as part of a saved Window layout, the current
plot settings are saved with the following exception:
• The X and Y Axis limit values (Min, Max) are not stored when the check
boxes are unticked.
As a result, the reset behavior of the axes may depend on if the Window layout
is applied to a file or a preview. Furthermore, the zoom and panning behavior
is dependent on the Axis settings. It is therefore recommended to use Window
layouts with plot settings that are optimal for either files or preview. For
example, when creating a Window layout to be used as the default capture lay-
out, it is good practice to create the plots while in preview, or to make sure the
Auto-Fit check box for the Y Axis settings is ticked.

Menus
The following chapters contain a short description of the menu items available
in the QTM menu bar.
For a description of the popup window menus, please refer to the chapters of
the respective windows, e.g. the Trajectory info window menu in the chapter
about the Trajectory info window.

File
The File menu contains the following items:



New
Open a new empty capture file, i.e. go to preview mode. The cameras will
start in the same mode as last preview or measurement.

NOTE: This is needed to start a new measurement.

Open
Open an existing capture file. The following file types can be opened in
QTM:
QTM files: capture files (.qtm) and calibration files (.qca).

C3D files: import of C3D files (.c3d), see chapter "C3D import" on
page 187.

TRB or TRC files: import of TRB/TRC files (.trb/.trc), see chapter


"TRB/TRC import" on page 188.

Batch Process...
Open configuration dialog for the batch processing function, see chapter
"Batch processing" on page 605.

Save
Save a new capture file or save an existing file that has been changed.

Save as...
Save the active file with a new name.

NOTE: When saving files you can always go to the Project data
folder with the link at the top of the list to the left in the Save dia-
log.

Close
Close the active capture file.

New project...
Create a new project, see chapter "Creating a new project" on page 69.



Open project...
Browse for an existing project.

Save project
Save the current project settings to the Settings.qtmproj file.

Rename project
Rename the current project, this will rename the project folder.

Open project folder


Open the project folder where for example the calibration files are saved,
see chapter "Project folder" on page 61.

Manage projects...
Open the Manage Projects dialog, see chapter "Manage projects" on
page 72.

Settings management

Backup and Restore


Open a dialog to save and restore backups of the project settings,
see chapter "Backup of project settings" on page 73.

Import from another project


Browse for the project that you want to import the settings from. All
of the settings in your current project will be overwritten.

Project presets
Open a dialog to save project presets that can be used when cre-
ating projects, see chapter "Project presets" on page 74.

Export
Export the capture file to one of the following formats, see chapter "Data
export to other applications" on page 710.

Batch export
Select files and export formats in batch exporting dialog, see chapter
"Batch exporting" on page 710.

To TSV



To C3D

To MAT

To AVI

To FBX

To JSON

To TRC

To STO

Import

Add link to video file


Import a link to an AVI file, see chapter "Import video link" on
page 912.

Recent projects
List of recently opened projects

1-10...
The last ten opened QTM files.

Exit
Exit the application.
C3D import

When opening a C3D file, it will be imported in QTM. The import includes:
• 3D data

• Analog data

• Force data

• Force plate positions

You can display, manage and edit the imported data using the regular QTM
functionality. The imported data can be saved as a QTM file.



NOTE: It is not possible to retrack the trajectories since the imported file
does not contain any 2D data.

NOTE: When importing force data, a force plate may be converted to a


generic type-2, type-3 or type-4 plate. The generic types are used when
there is not enough information to select one of the vendor specific force
plates.

NOTE: When exporting the imported file as C3D, the exported C3D file
may have a different format than the original C3D file.

NOTE: Not all C3D files may be compatible with the import function.
Contact Qualisys support at [email protected] if you have problems
importing your C3D files.

TRB/TRC import

When opening a TRB or TRC file it will be imported in QTM.


A TRC file (.trc) is a text-based 3D data file format, including meta data
about the capture, such as capture rate.

A TRB file (.trb) is a binary 3D data file format, including meta data about
the capture, as well as marker colors and bones.

It is assumed that the Y-axis is defined as the upward axis. When importing into
QTM, the data will be transformed so that the Z-axis is pointing upwards.
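
A common way to perform such a conversion is a 90 degree rotation about the
X axis, so that the old Y (up) axis becomes the new Z axis. The Python/NumPy
sketch below only illustrates the principle; the exact axis mapping that QTM
applies is not documented here and the example values are made up.

    import numpy as np

    # Rotate 90 degrees about the X axis: Y-up coordinates -> Z-up coordinates.
    # A point (x, y, z) with Y up maps to (x, -z, y) with Z up.
    R_x90 = np.array([[1.0, 0.0,  0.0],
                      [0.0, 0.0, -1.0],
                      [0.0, 1.0,  0.0]])

    point_y_up = np.array([100.0, 1700.0, 250.0])   # mm, height 1700 along Y
    point_z_up = R_x90 @ point_y_up                 # -> [100., -250., 1700.]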

Edit
The Edit menu contains the following items:

Undo
Undo the last edit action on trajectories.



Redo
Redo the last edit action on trajectories.

Delete
Delete trajectories or Delete bones depending on what is selected in
the file.

Trim Measurement
Trim the measurement to the currently selected range on the time line.

NOTE: When trimming the file, the time stamp of the file cor-
responding to the start of the capture and event times are recal-
culated.

NOTE: The QTM file size will not change when trimming a file that is
included in a PAF project, but the data is trimmed to the selected
range.

Find
Search for the first occurrence of the search term in the Project data
tree, see chapter "Project view" on page 62.

Find Next
Search for the next occurrence of the search term in the Project data
tree.

Trajectory
The Trajectory info window menu for the selected trajectories, see
chapter "Trajectory info window menu" on page 144.

Bone
The Bone menu for creating and deleting bones, see chapter "Bone
menu" on page 120.



Labeled trajectories
The Labeled trajectories menu for the trajectories in the Labeled tra-
jectories window, see chapter "Labels menu" on page 158.

Events
The Events menu for creating and editing events, see chapter "Events
menu" on page 136.

Rigid Body
Change mesh settings or change color of selected rigid body. The changes
apply to the rigid body definition in the file, not the one in the project.

View
The View menu contains the following items:

Trajectory info windows


For information about Trajectory info windows, see chapter "Trajectory
info windows" on page 137.

Labeled
Toggle the display of the Labeled trajectories window, which shows
the trajectories that have been identified and labeled.

Unidentified
Toggle the display of the Unidentified trajectories window, which
shows the trajectories that are unidentified.

Discarded
Toggle the display of the Discarded trajectories window, which
shows all trajectories that have been discarded.

Toolbars
For information about toolbars, see chapter "Toolbars" on
page 199.

Capture toolbar
Toggle the display of the Capture toolbar.



Playback toolbar
Toggle the display of the Playback toolbar.

AIM toolbar
Toggle the display of the AIM toolbar.

Standard toolbar
Toggle the display of the Standard toolbar.

GUI Control toolbar


Toggle the display of the GUI Control toolbar.

Trajectory toolbar
Toggle the display of the Trajectory toolbar.

Timeline
For information about the Timeline and the menu see chapter "Timeline
control bar" on page 133.

Data info 1 2 3
Toggle the display of the Data info windows (maximum 3 windows) for
showing and plotting 2D, 3D, 6DOF, analog and force data in both preview
and file mode. For detailed information see chapter "Data info window"
on page 167.

Project View
Toggle the display of the Project view window on the left side of the main
window see chapter "Project view" on page 62.

Messages
Toggle Messages window. For detailed information see chapter "Mes-
sages window" on page 179.

Status Bar
Toggle the display of the Status bar at the bottom of the main window.

Terminal
Open the terminal window for the QTM scripting API.

Trajectory Editor
Open the Trajectory Editor window.



Trajectory Overview
Open the Trajectory Overview window.

Full Screen
Enable/disable full screen mode.

File Information
Opens a window showing information about the capture, see chapter "File
Information" below.

Real-Time Client Information


Opens a window showing information about currently connected real-
time streaming clients and a log of real-time command and response com-
munication.

3D view/2D view
3D view window menu or 2D view window menu depending on the act-
ive View window, see chapter "3D View window menu" on page 131
respectively "2D view window menu" on page 103.
File Information

The File Information window shows the following information about the cap-
ture.



Start of capture as date and time according to the computer clock.

Computer timestamp at start of capture in seconds and ticks from when
the computer was started.

NOTE: The time stamp is recalculated when trimming the file.

WARNING: The time stamps indicate the start of the capture according to
the QTM software. The time stamps may not exactly correspond to the first
frame of the capture and it is discouraged to use them for synchronization
with data from other devices connected to the same or another synchronized
computer.

QTM version used for the capture.

Capture frequency used for marker tracking in Hz.

Analog devices used for the capture and their respective capture fre-
quencies.

Tracking information per camera, including number of points, tracking
residuals and average distance to the tracked markers.

Play
The Play menu contains the following items:

Play/Stop
Switch between play and stop. The file will be played in the direction it
was played the last time.

Play

Stop

Play backwards



Go to frame...
Enter a frame number in the Timeline Parameters dialog, see chapter
"Timeline parameters" on page 135.

First frame / Last frame

Next frame / Previous frame

Forward 5 frames / Back 5 frames

Forward 20 frames / Back 20 frames

Forward 100 frames / Back 100 frames

Next event / Previous event


Go to next/previous event in the file. Only available if there are events in
the file.

Play with Real-Time Output


The file is played with RT output until you press Stop. This feature can be
used to test the RT protocol and RT plugins as it gives full control of what
is sent. The speed of the real time process can be controlled with the Play-
back speed bar.

Capture
The Capture menu contains the following items:

Capture
Start the motion capture, see chapter "Capturing data" on page 566. If
used without preview the cameras will start in the same mode as last pre-
view or measurement.

NOTE: Changes to Stop capture during a capture.



Calibrate
Open the Calibration dialog that is used to start the calibration, see
chapter "Calibration dialog" on page 545.

Refine calibration
Open the Refine calibration dialog, see chapter "Refine calibration" on
page 550.

Advanced calibration (beta)


Open the Calibration dialog for advanced calibration, see chapter
"Advanced calibration (beta)" on page 552.

NOTE: This is a beta feature which can be used to improve calibration
quality over the standard calibration. Advanced calibration further
refines the factory linearization parameters for optimal 3D tracking.
Note that longer calibration times and adjusted wand movement might be
required.

Refine Advanced calibration (beta)


Open the Refine calibration dialog for advanced calibration.

Calibrate Twin System


Start a Twin system calibration, see chapter "Performing a Twin cal-
ibration" on page 519.

Reprocess
Reprocess the active capture file, see chapter "Reprocessing a file" on
page 601.

Recalculate forces
Recalculate the force data in the active capture file, see chapter "Cal-
culating force data" on page 703.

Linearize camera
Start linearization of a camera, see chapter "Linearization procedure and
instructions" on page 486.



AIM
The AIM menu contains the following items. For information on automatic iden-
tification see chapter "Automatic Identification of Markers (AIM)" on page 624.

Apply model
Apply the current AIM model to the active capture file.

NOTE: The current AIM model is found on the AIM page in the Pro-
ject options dialog.

Generate model
Generate an AIM model from the active capture file.

Generate from multiple files...


Generate an AIM model by selecting multiple files.

Skeleton
The Skeleton menu contains the following items. For more information on skel-
eton calibration and tracking, see chapter "Tracking of skeletons" on page 671.

Calibrate skeletons
Create skeletons from T-pose in current frame and solve. The skeleton
definition will be added to or updated in the Skeleton solver page in the
Project Options and applied to the current file or real time preview.

Solve Skeletons
Apply skeleton solver to the currently open file.

Identify trajectories using skeleton (SAL)


Label trajectories using SAL, for more information see chapter "How to
use SAL" on page 700.



Display Animation Marker Set Guide
Open PDF with information about the Qualisys Animation Marker Set,
including the markers and their placement, T-pose requirements and the
use of extra markers.

Display Sports Marker Set Guide


Open PDF with information about the Qualisys Sports Marker Set,
including the markers and their placement, calibration pose requirements
and the use of extra markers.

Display Traqr VR Set Guide


Open PDF with information about the Qualisys Traqr VR Marker Set,
including the markers and their placement, T-pose requirements and the
use of extra markers.

Display Claw Marker Set Guide


Open PDF with information about the Qualisys Claw Marker Set, includ-
ing the markers and their placement, calibration pose requirements and
the use of extra markers.

Display Full Fingers Marker Set Guide


Open PDF with information about the Qualisys Full Fingers Marker Set,
including the markers and their placement, calibration pose requirements
and the use of extra markers.

Tools
The Tools menu contains the following items:

Project options
Open a dialog with settings for the QTM software, see chapter "Project
options dialog" on page 217.

Analyze Trajectory
Open dialog to analyze trajectories, see chapter "Analyze" on page 153.

Window
The Window menu contains the following items:



New window

2D
Open a new 2D view window.

3D
Open a new 3D view window.

Video
Open a new 2D view window displaying only the DV/webcam video
cameras.

Window layouts
See chapter "Window layouts" on page 82.

Layout 1, ... , Layout 5


Activate a saved window layout 1 to 5.

Default File Layout


Activate the default file layout.

Default Capture Layout


Activate the default capture layout.

Save as

Layout 1, ... , Layout 5


Save the current window layout as Layout 1 to 5.

Default File Layout


Save the current layout as default layout for file mode.

Default Capture Layout


Save the current layout as default layout for capture mode.

Help
The Help menu contains the following items:



Help topics
Open the QTM help window.

Display keyboard shortcuts


Show a dialog with all of the commands and their keyboard shortcuts.

Open documentation folder


Open the documentation folder that contains for example the PDF ver-
sion of the QTM manual and a Getting started manual.

Login
Log in to your Qualisys customer account for access to online processing.

Start Qualisys Remote Support


Start the Qualisys Remote Support program for remote support over the
Internet. Please contact Qualisys AB at [email protected] first to book a
time. When the program is started, support can connect to your computer
and you can also chat with support.

Check for updates


If the computer has an internet connection, QTM will check if there are
updates for QTM or any installed input devices available. If so, inform-
ation and a download screen will be presented.

About Qualisys Track Manager


View the user name, license number, version number, RT protocol and
loaded modules of the installed QTM software. For information on how to
enter a plug-in license see chapter "Adding licenses" on page 55.

Toolbars

Standard toolbar

The Standard toolbar contains the following icons, from left to right:



New capture file

Open capture file

Save capture file

Open the Project options dialog

Undo

Redo

Open a new 2D view window

Open a new 3D view window

Open a new 2D view window displaying only the DV/webcam video cam-
eras

Playback toolbar

The Playback toolbar contains the following icons, from left to right:
Go to the first frame

Go back twenty steps

Go back one step

Play reverse

Stop

Play forward

Go forward one step

Go forward twenty steps



Go to the last frame

Playback speed

NOTE: Move the Playback speed bar, which is logarithmic, or write the
percentage of the speed directly in the Playback speed box. Double-click
on the bar to reset the speed to 100%.

Capture toolbar

The Capture toolbar contains the following icons, from left to right:
Start capture

NOTE: Changes to (Stop capture) during a capture.

Calibrate the camera system

Reprocess the file

Recalculate the force data

Linearize camera

Add event

GUI Control toolbar

The GUI Control toolbar contains icons for toggling GUI elements in the 3D view. To
change the appearances of the GUI elements open the 3D view settings page
in the Project options dialog. The toolbar contains the following icons, from
left to right:



Marker Labels

Bones

Force Arrows

Force Plates

Grid

Covered Volume

Calibrated Volume

Bounding Box

Camera View Cones

Camera Rays

Follow Selected Markers

Skeletons

AIM toolbar

The AIM toolbar contains the following icons, from left to right:
Apply AIM model (in RT mode this button restarts AIM)

Generate AIM model

Calibrate skeletons

Analyze trajectory

Reload scripts



Trajectory/Views toolbar

The Trajectory toolbar contains the following icons, from left to right:
Show/hide Data info window 1

Show/hide Data info window 2

Show/hide Data info window 3

Show/hide Project View

Show/hide Messages window

Show/hide Status bar

Show/hide Terminal

Show/hide Labeled trajectories window

Show/hide Unlabeled trajectories window

Show/hide Discarded trajectories window

Show/hide Trajectory Editor window

Show/hide Trajectory Overview window

Keyboard shortcuts

Menu shortcuts
The following menu commands can be accessed through keyboard shortcuts.
Workflow shortcuts

Ctrl + N
New capture file.



Ctrl + M
Open the Start capture dialog.

Ctrl + Shift + Alt + C


Open the Start calibration dialog.

Ctrl + Shift + R
Open the File reprocessing dialog.

F9
Apply AIM model to a capture file or restart AIM and 6DOF calculation in
preview and capture mode.

F10
Calibrate skeletons.
Capture file shortcuts

Ctrl + O
Open capture file.

Ctrl + S
Save capture file.

Ctrl + Shift + S or F12


Save as a new capture file.

Ctrl + F4
Close capture file.
Editing shortcuts

Ctrl + Z
Undo the last editing of the trajectories or bones.

Ctrl + Y
Redo the last action that was undone.

Ctrl + E
Add a manual event.

Del
Delete selected bones or "degrade" selected trajectories:



Labeled trajectories: move the selected trajectories/parts to Uniden-
tified trajectories window.

Unidentified trajectories: move the selected trajectories/parts to Dis-


carded trajectories window.

Discarded trajectories: permanently delete the selected tra-


jectories/parts.

Ctrl + Shift + T
Trim a capture file to the current measurement range.
Display and window shortcuts

Ctrl + W or Ctrl + ,
Open the Project options dialog.

Ctrl + D
Toggle the display of the Data info window.

Ctrl + R
Toggle the display of the Project view window.

Ctrl + T
Toggle the display of the Trajectory Editor window.

Ctrl + 1, ... , Ctrl + 5


Activate a saved window layout.

Ctrl + Shift + 1, ... , Ctrl + Shift + 5


Save the current window layout.

F1
Open the QTM help window.

F11 or Shift + Alt + Enter


Enable/disable full screen mode.

2D/3D view shortcuts


The following commands can be accessed in the 2D and 3D views.



2D view shortcuts

3
Switch to 3D view.

Shift + M
Activate the Marker mask tool.

Ctrl + Click on camera button


Focus on the selected camera in the 2D view.
3D view shortcuts

2
Switch to 2D view.

P
Toggle 3D view projection between orthogonal and perspective.

Ctrl + Alt + Arrow


Switch between orthogonal viewpoints

C
Center on the selected marker in the 3D view window.

L
Move the selected trajectories/parts to the Labeled trajectory info win-
dow.

U
Move the selected trajectories/parts to the Unidentified trajectory info
window.

Del
Delete selected bones or "degrade" selected trajectories:
Labeled trajectories: move the selected trajectories/parts to Uniden-
tified trajectories window.

Unidentified trajectories: move the selected trajectories/parts to Dis-


carded trajectories window.

X
Split the selected trajectories after the current frame.



Ctrl + Alt
Quick identification method, see chapter "Manual identification of tra-
jectories" on page 620.

Shift + B
Activate the Create bones sequence tool.

Shift + C
Activate the Center trajectory tool.

Shift + X
Activate the Cut trajectory trace tool.

B
Create bones between all selected labeled trajectories.

NOTE: Only available in the 3D view.

Left-click
Select a trajectory.

Alt + Left-click
Select only the current part of the trajectory.

Shift + Left-click
Select multiple trajectories.

Ctrl + Left-click
Select multiple trajectories, possible to de-select trajectories.

Shift + Alt + Left-click


Select the current part of multiple trajectories.

Ctrl + drag
Scrubbing feature (3D view window). Hold control key, press left mouse
button anywhere in the empty 3D space and drag to scrub through the
measurement's time line.

Shift + Mouse wheel


Trace range zooming feature. In the 3D view window, hold the shift key
and use the mouse wheel to increase or decrease the trace range.



F8, Shift + F8
Use selected trajectories to create a new rigid body definition, based on
average of frames (F8) or on the current frame (Shift + F8).

Trajectory info window shortcuts


The following commands are available in the Trajectory info window.
C
Center on the selected marker in the 3D view window.

J and Shift + J
Jump in time to the next unidentified trajectory or part, respectively, and
center on it in the 3D view window.

S
Swap the selected parts of two trajectories.

W
Swap the part of the current frame in two selected trajectories.

X
Split the selected trajectories after the current frame.

Shift + F8
Define a rigid body with the current frame of the selected trajectories.

F8
Define a rigid body with an average of data in all frames of the selected
trajectories.

L
Move the selected trajectories/parts to the Labeled trajectory info win-
dow.

U
Move the selected trajectories/parts to the Unidentified trajectory info
window.

D
Move the selected trajectories/parts to the Discarded trajectories win-
dow.



Del
Delete selected bones or "degrade" selected trajectories:
Labeled trajectories: move the selected trajectories/parts to Uniden-
tified trajectories window.

Unidentified trajectories: move the selected trajectories/parts to Dis-


carded trajectories window.

Discarded trajectories: permanently delete the selected tra-


jectories/parts.

Ins
Add a new label in the Labeled trajectories window.

F2
Rename the selected trajectory.

Alt + Up
Move the trajectory up in the list.

Alt + Down
Move the trajectory down in the list.

Ctrl + A
Select all trajectories in the current Trajectory info window.

Up and Down arrows


Move in the list of trajectories. It can be used together with Shift to select
trajectories.

Ctrl + Space
Toggle the selection of a trajectory.

Right and Left arrows


Expand respectively collapse a trajectory’s list of parts.

Playback keys
The following keys can be used to play the file:
Space
Play forward and Stop.



Ctrl + Shift + Space
Start playing with real time output

Ctrl + drag
Scrubbing feature (3D view window). Hold control key, press left mouse
button anywhere in the empty 3D space and drag to scrub through the
measurement's time line.

Ctrl + G
Go to frame.

Right Arrow
Go forward one step.

Shift + Right Arrow


Go forward five steps.

Ctrl + Right Arrow


Go forward twenty steps.

Ctrl + Shift + Right Arrow


Go forward one hundred steps.

Left Arrow
Go back one step.

Shift + Left Arrow


Go back five steps.

Ctrl + Left Arrow


Go back twenty steps.

Ctrl + Shift + Left Arrow


Go back one hundred steps.

Home
Go to the first frame.

End
Go to the last frame.

Page Down
Go to next event.

Page Up
Go to previous event.



Project options shortcuts
The following keys can be used to navigate in the Project options dialog and in
other similar dialogs:
Return
OK

Space
Click the selected button or toggle the selected option.

Up and Down arrows


Step through the alternatives in selected radio buttons.

Tab
Step through the settings. Use shift to step backwards.

Ctrl + Tab
Step out of the settings page to the options buttons at the bottom of the
dialog. Use shift to step backwards.

Ctrl + Up or Down arrow


Step through the pages in the options list.

Ctrl + Right or Left arrow


Expand respectively collapse an options branch, e.g. Processing.

Trajectory Editor shortcuts


The following keyboard shortcuts and mouse gestures can be used in the Tra-
jectory Editor window.

NOTE: There may be overlapping shortcuts for the 3D/2D view. Make
sure that the Trajectory Editor window has focus when using the below
shortcuts.

Keyboard shortcuts

Ctrl + A (select all)


Select complete horizontal range.



Delete (delete)
Delete selected range.

Alt + Delete (delete spikes)


Delete spikes in current selection.

F (fill gap(s))
Fill selected gap(s) using current fill type.

Shift + F (fill unfilled gap(s))


Fill only unfilled selected gap(s) using current fill type.

S (smooth)
Smooth the data in the selected frame range using the current smoothing
settings.

Shift + S (smooth spikes)


Smooth the spikes in the selected frame range using the current smooth-
ing settings.

Ctrl + K (next gap)


Select next gap.

Ctrl + J (previous gap)


Select previous gap.

Shift + K (next spike)


Select next spike.

Shift + J (previous spike)


Select previous spike.

C (center current frame)


Center visual range on current frame.

Z (zoom to selection)
Zoom to the currently selected range.

X (zoom to extents)
Zoom horizontal and vertical axis to data extents.

H (zoom horizontal)
Zoom horizontal axis to data extents.

V (zoom vertical)
Zoom vertical axis to data extents.

A (auto zoom vertical)
Toggle zoom vertical axis automatically.

B (auto zoom horizontal)


Toggle zoom horizontal axis automatically when selecting trajectories.

G (zoom to gap)
Toggle automatic zoom to selected gap.

T (time mode)
Toggle time units in seconds/frames.

Q, Shift + Q
Traverse the trajectory list down and up, respectively.
Mouse gestures

Left mouse button - double click


Set current frame.

Left mouse button - drag


Select range.

Right mouse button - click


Open Trajectory Editor window menu.

Right mouse button - drag


Pan.

Middle mouse button - click


Toggle split/merge view.

Ctrl + Left mouse button - drag


Time line scrubbing: move the Current Frame marker forwards or back-
wards.

Shift + Left mouse button - drag


Horizontal zoom to selected area.

Alt + Left mouse button - drag


Box zoom to selected area.

Mouse wheel
Horizontal zoom.

Shift + Mouse wheel
Horizontal pan.

Ctrl + Mouse wheel


Vertical pan.

Trajectory Overview shortcuts


The following mouse gestures can be used in the Trajectory Overview window.
Zooming

Ctrl + mouse wheel (horizontal zoom)


Zoom in/out relative to the current mouse cursor location.

Shift + left mouse button - drag (horizontal zoom to selected range)


Hold down SHIFT and click and drag in the chart to select the range which
will be zoomed into.

Ctrl + left mouse button - double click in the chart


Show the whole measurement. In case the time line is cropped, the
cropped area will be shown.
Panning

Right mouse button - drag


Pan horizontally.
Timeline

Left mouse button - click on time line (set current frame)


Click on the time axis to move the Current Frame marker there. You can
also hold down the mouse button and drag.

Ctrl + left mouse button - drag anywhere in the chart (time line scrubbing)
Move the Current Frame marker forwards or backwards.
Scrolling

Mouse wheel
Scroll the trajectory list vertically.

Selection

Left mouse button - double click on a label or a bar (select trajectory)


Select a trajectory. The trajectory will be selected in the Trajectory Window, 3D View window and the Trajectory Editor (if not locked).
Layout

Left mouse button - click and drag the separator between the trajectory labels column and the chart
Change the width of the trajectory labels column.

Left mouse button - double click on the separator between the trajectory labels column and the chart
Reset the width of the trajectory labels column to match the longest
name.

Plot window shortcuts


The following mouse gestures can be used in the Plot window.
Zooming

Scroll wheel (plot area)


Zoom in/out relative to the current mouse position.

Scroll wheel (axis area)


Zoom in/out horizontally (X Axis) or vertically (Y Axis) relative to the current mouse position.

Click and drag


Zoom to selected area.

Double click (plot area)


Reset to data limits.

Double click (axis area)


Reset to horizontal (X Axis) or vertical (Y Axis) data limits.
Panning

Right-click and drag (plot area)


Panning in both X and Y directions.

Right-click and drag (axis area)
Panning horizontally (X Axis) or vertically (Y Axis).
Legend

Hover
Hover with the mouse over a label to highlight the data series.

Selection
Click on a label to show or hide the data series.
Time line

Click and drag (time cursor)


Click and drag the time cursor to move the current frame.

Click and drag (plot range edges)


Click and drag the plot range edges to modify the selected measurement
range.

Dialogs
The following chapters contain a short description of the main dialogs in QTM.

QTM dialogs
The dialogs in QTM where different settings are specified are described in the chapters where they are used. The essential dialogs are:
Project options dialog
This is the main dialog in QTM with the settings for the hardware and the
processing of data, see chapter "Project options dialog" on the next page.

Calibration dialog
This dialog is used to start a calibration, see chapter "Calibration dialog"
on page 545.

Calibration results dialog


This dialog shows the results of a calibration, see chapter "Calibration results" on page 558.

Recalibration settings dialog
This dialog is used to recalibrate a capture file, see chapter "Recalibration"
on page 563.

Start capture dialog


This dialog is used to start a capture, see chapter "Start capture" on
page 569.

File reprocessing dialog


This dialog is used to reprocess a capture file, see chapter "Reprocessing
a file" on page 601.

Project options dialog


The configuration of software and hardware settings in QTM is accessed via
Project options on the Tools menu. The settings are described in detail in the
chapters: "Input Devices" on the next page, "Processing" on page 320, "GUI" on
page 415 and "Miscellaneous" on page 427. You can get a summary of the cur-
rent settings under the Camera system settings heading on the Camera sys-
tem page.
There are four options at the bottom of the Project options dialog.
Click OK to save the settings and apply them to the current measurement. It will also close the dialog.

Click Apply to save the settings and apply them to the current measurement without closing the dialog.

Click Cancel to close the dialog without saving the options.

Click Help to open the QTM help window.

NOTE: Most of the settings will only affect a measurement if QTM is still
in the preview mode, i.e. before the measurement has started. To change
settings on a file you must use the reprocessing dialog instead, see
chapter "Reprocessing a file" on page 601.

Project options

Input Devices

The Input Devices page lists the available devices in QTM. Double-click on an
entry to go to the settings for that device. The list contains the following information:
Enabled
The Enabled column contains the checkbox to enable the device and also a status light for the device. A green circle means that the device is connected, a yellow circle means that the device status is unknown, and a red circle means that the device is disconnected.

NOTE: Devices where the connection status is unknown are always displayed in the list.

Name
The Name of the device.

Category
The Category of the device. The different categories are:
Camera System
Qualisys systems.

Analog Boards
Analog boards supported by QTM.

NOTE: The analog board settings are saved with the serial
number of the board. So the analog board will be visible in the
project even if you disconnect the board from the computer or
it is not active in Instacal.

AV Devices
Any video camera connected to the computer, for example via a
Blackmagic Design card or a webcam.

Force Plates
Digitally integrated force plates.

Instrumented Treadmills
Digitally integrated instrumented treadmills.

EMGs
Digitally integrated EMG systems, see "Wireless EMG systems" on
page 804

Eye Trackers
Digitally integrated eye trackers, see chapter "Eye tracking hardware
in QTM" on page 864.

Manufacturer
The Manufacturer column contains a link to the website of the device's
manufacturer. If the manufacturer is unknown there is a link to Google.

There are four options at the bottom of the list:


Add Device
Integrated devices that can be added to the Input Devices list.

Remove Device
You can remove any disconnected (red circle) device.

Refresh
Use the Refresh button to refresh the status of the devices.

NOTE: The refresh button does not work on analog boards when
they are added or changed in Instacal. In that case you need to
restart QTM.

Download device drivers


The Download device drivers link opens a website with current drivers
for Delsys, AMTI and SMI.

Camera system

The Camera system branch of the options tree consists of hardware settings
for the cameras. On the Camera System page you can locate the system, set
the Marker capture frequency and the Real time frequency, and obtain an
overview list with the status of the Camera system settings.
The Camera system page contains the following subcategories: Cameras, Calibration and Synchronization.
Marker capture frequency

The Marker capture frequency is the capture rate that will be used in a
marker measurement. The frequency can be set to an integer between 1 Hz
(frames per second) up to the maximum frequency of the current camera type.
For an overview of maximum capture frequencies per camera, see chapter
"Qualisys camera sensor specifications (marker mode)" on page 926.
Real time frequency

The Real time frequency is the frequency that is used for the real-time processing. It applies both in Preview mode and Capture mode. For more information on real-time measurements see chapter "Real-time streaming" on
page 590. There are two options for the Real time frequency.
Marker capture frequency
With this option the real-time frequency will be the same as set with Marker capture frequency. This means that the cameras will capture at that frequency and QTM will process as many frames as it can.

Reduced real time frequency


With this option the real time processing is set to run at a fixed frequency other than the Marker capture frequency. This can be useful when capturing at a high frequency: the real time frequency can then be set to a lower rate so that the data is easier to view in preview. You can also set a lower real time frequency during capture to optimize the real time output. During capture it is recommended to use a reduced frequency that is a divisor of the capture rate (see the example below).

The reduced frequency cannot be set higher than the Marker capture
frequency setting. This is to ensure that you are not using an exposure
setting that is too high for the real-time frequency.
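For illustration only (this is not part of QTM), the following Python sketch lists the reduced real time frequencies that divide a given marker capture frequency evenly, which is what the recommendation above amounts to:

def divisor_frequencies(capture_rate_hz: int) -> list[int]:
    # Reduced real time frequencies that divide the capture rate evenly,
    # so every real time frame coincides with a captured frame.
    return [f for f in range(1, capture_rate_hz + 1) if capture_rate_hz % f == 0]

print(divisor_frequencies(300))
# [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 25, 30, 50, 60, 75, 100, 150, 300]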

NOTE: If the analog board uses the External sync option, the analog frequency can become too low in real time mode (less than 20 Hz) when reducing the real time frequency, because the ratio between the camera frequency and the analog frame rate stays the same. If this happens, increase the real time frequency so that the analog frame rate becomes higher than 20 Hz.

Camera system settings

Under the Camera system settings heading, the settings for the current camera system are displayed. Every entry is marked with a notification icon. There are three different notification icons, which have the following meaning:

The specified setting is within the range of the normal settings.

The specified setting is configured in a way that differs from the normal
setting. Make sure that it is correct.

The specified setting is configured in a way that can ruin the measurement if it is not correct. Make absolutely sure that the setting is correct. Some settings may actually damage the camera system, see chapter "External timebase" on page 278.

Click on an entry to go directly to the corresponding page in the Project options dialog. By right-clicking on an entry, a menu with two options appears: Change and Reset to default(s). Click Change to go directly to the corresponding page. Click Reset to default(s) to reset that setting to the default value.
Locate System

The currently located system is listed under the Camera system heading. To
locate the system use the Locate system button. QTM will open the Finding
camera system dialog and start looking for the system.
QTM can store the default system for the project to check if the camera system
is changed. Activate the Show warning when camera system differs from
default option to enable the default system check. Then save the current system as the default system with Set as Default System. QTM will give a warning
when a camera or sync unit has been added or removed from the system. Note
that when the default system check is enabled then you cannot start preview if
cameras are missing. In that case you either need to set a new default system
or disable the check.

Finding camera system

QTM will scan the Ethernet connections for Qualisys systems. When QTM has
found the camera systems that are connected to the Ethernet ports, it will
report the configuration of the camera systems in the Finding camera system(s) dialog.

NOTE: If one or more devices have old firmware, a dialog appears for
applying a necessary firmware update for using the system with the
installed version of QTM. For more information see chapter "Firmware
update when locating system" on page 470.

NOTE: If more than one camera system is found, select the camera
system that will be used from the list. Only one camera system can be
used at a time.

Click on System info to see detailed information about the selected system.
For Arqus and Miqus systems the Auto order button can be used to automatically order the cameras in the 2D View window.

Camera system information

When the scanning is finished, it is possible to view some camera information.


The Camera system information dialog is opened with the System info push
button on the Finding camera system(s) dialog.
The Camera system information dialog shows information about the individual Qualisys devices. This is the same information as Camera information in the 2D view window menu.

Auto order

When a camera system is selected, it is possible to automatically order the cameras by pressing the Auto Order button. When the ordering is finished, the Auto Order button is renamed to Reverse Order. For more information about the ordering of cameras, see "Automatic ordering of the cameras" on page 481.

NOTE: Auto ordering is not supported for Oqus cameras.

Cameras

The Cameras page contains all of the camera settings for Qualisys cameras. It
is often easier to use the tools in the 2D view and the Camera settings sidebar
if you want to change settings that are used more often, see chapters "2D view
toolbar" on page 89 and "Camera settings sidebar" on page 91.

There are two lists on the page: to the left there is a list with the cameras in the
current camera system and to the right is a list with all the settings for the selected cameras.
The camera list has the following columns.
ID
The number of the camera shown in the 2D view window and on the camera display.

NOTE: The camera with an M after it is the master camera.

Type
The type of camera.

Serial
The serial number of the camera.

IP address
The IP address for the camera.

On the Cameras page you can set individual settings for the cameras. Select
the cameras that you want to change the settings for in the camera list. You can
use Ctrl and Shift to select multiple cameras. The settings list will be updated
depending on the cameras that are selected. If multiple cameras are selected and a setting has been set individually, its value is shown in red. When changing such a value, all the selected cameras will be set to the same value. Use the Select all button to select all of the cameras. Only the global settings will be shown in the settings list if none of the cameras are selected.
The settings list contains the settings for the selected cameras. All of the settings marked with * are global. The settings are described in the chapters below.
Check the Show description option to get a short description of the selected
setting.
Use the Reset settings button to reset all of the camera settings to the default
values.

NOTE: When mixing the different types of cameras, all global settings
will be limited to the lowest value of camera type limits. However, when
the settings are set individually it is possible to set the value within the
camera type limits.

Camera settings

Camera Mode

The Camera Mode setting switches the camera mode for the selected cam-
eras. The available modes are Marker, Marker Intensity and Video. The
modes can also be changed during preview in the 2D view. Changing the
modes in the 2D view will update the Camera Mode setting.
When starting a measurement, the cameras will start in the mode selected with
the Camera Mode setting. However, since changes made in the 2D view update the setting, it is usually the same as in the last preview.

Marker settings

Capture rate

The Capture rate is the frequency that is used during a marker measurement.
The setting is global for all cameras in Marker mode.
It is the same setting as Marker capture frequency on the Camera system
page, which means that if the setting is changed it is updated on both pages.
The possible capture range is shown to the left of the setting. The range is changed
depending on the camera types within the system, as well as the size that is set
with Image size. For an overview of the maximum capture frequencies per cam-
era model at full size image, see "Qualisys camera sensor specifications
(marker mode)" on page 926.

Exposure time

The Exposure time setting changes the exposure time for marker mode. In marker mode the exposure time and flash time are the same, because there is no point in exposing the image longer than the flash. The setting can be set individually for each camera.
Increase this setting if the markers are not visible. Decrease the exposure if you
have extra reflections. The Range shown to the left of the setting is the range
that can be used with the current capture rate. The maximum value is a tenth
of the period (1/capture rate) or at most 1000 ms in marker mode.

NOTE: For Qualisys underwater cameras exposure times longer than


one tenth of the period can be used. Avoid the use of long exposure
times when testing the cameras in air because they may overheat.
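As a rough illustration (a sketch outside QTM, using only the tenth-of-the-period rule stated above), the upper end of the exposure range can be estimated from the capture rate:

def max_marker_exposure_us(capture_rate_hz: float) -> float:
    # One tenth of the period, expressed in microseconds.
    period_us = 1_000_000.0 / capture_rate_hz
    return period_us / 10.0

print(max_marker_exposure_us(100))  # 1000.0 us at 100 Hz
print(max_marker_exposure_us(500))  # 200.0 us at 500 Hz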

This setting is the main option for changing the amount of light in the image, because it lets you change the light input without touching the cameras. Especially for the Oqus 3- and 5-series it is good to have the aperture fully open and change the exposure time instead.
For tips on how to set the exposure see chapter "Tips on marker settings in
QTM" on page 483.

Marker threshold

The Marker threshold sets the threshold intensity level for markers in
Qualisys cameras. The setting can be set individually for each camera.
The level can be set between 5 and 90 % of the max intensity level; the default value is 17. Every pixel value that is brighter than the specified value will be calculated as a marker.

When you lower the threshold, more pixels will be considered to be a marker. If you set the threshold too low there might be strange effects, including no markers or very long markers. On the other hand, too high a threshold can also mean that the camera cannot calculate any markers.
For more tips on marker settings, see chapter "Tips on marker settings in QTM" on page 483.

Marker mode

Under Marker mode, use the Type setting to select the marker type that is used. The setting is global for all cameras. The options for this setting are:
Passive
Use this setting for any marker with reflective material. This is the default
option.

Active
Use this setting for Traqrs, the active 500 mm calibration wand, or short
range active markers (SRAM) with sequential coding.

Passive + Active
In this mode the camera will capture both passive and sequence coded
active markers. This mode can be used if you need to add some tem-
porary markers to the subject and do not want to add active markers.
However, if you mix the passive and active markers all the time you lose
some of the advantages of both types.

Long range spherical active markers (or reference markers)


Any of the long range active markers and reference markers, including the old 40 mm active marker.

Untriggered active markers


Use this mode for active markers that are lit constantly. In this mode the
strobe is turned off to minimize unwanted reflections.

Inverted passive markers
This mode is only available for the 3+-series. It inverts the marker detection so that dark areas are calculated as markers instead of the bright areas.

Marker ID range

When Type is set to Active or Passive + Active, the setting ID range becomes
available. The ID range setting sets the sequence length that is used to define
unique blinking patterns that are used to identify the markers. The options are:
Standard (1-170)
Standard ID range of 170 uniquely defined markers using a sequence
length of 21 frames. This is the default option.

Extended (1-740)
Extended ID range of 740 uniquely defined markers using a sequence
length of 41 frames. Use this range if you have more than 170 active mark-
ers. This option is only supported for the Active and the Naked Traqr in
combination with Arqus or Miqus cameras. The Traqr units need to be cor-
rectly configured using the Traqr configuration tool.

For information about the use of passive and active markers, see chapter
"Passive vs Active markers" on page 529. For information about Qualisys active
marker solutions, see chapter "Active marker types" on page 1000.

NOTE: The old type of close range active marker is not supported in QTM
2.5 or later. Please contact Qualisys AB if you use this type of active
marker.

Image size

With Image size it is possible to change the pixels that are active when you cap-
ture marker data. The setting can be set individually for each camera. Use the
maximize button to reset the selected cameras to the maximum image size.
The image size is specified in Left and Right, Top and Bottom, where the Left and Right values can only be specified in certain steps depending on the camera type. The steps for the current model are displayed next to the settings. The maximum size depends on the camera type and the selected sensor mode.
By reducing the image size it is possible to increase the capture frequency. The frequency range will be updated automatically when you change the size. The image size is shown with a red rectangle in the preview 2D view window. The
cameras will still capture markers outside the rectangle in preview, but when
you make a measurement that data is discarded. In the 2D view of a file the
aspect ratio of the camera view is adapted to the reduced image size.

NOTE: The red rectangle representing the image size is not drawn linearized. This means that with a wide angle lens it is best to turn off the Show linearized data option on the 2D view settings page to see the true positions of the mask and image size.

NOTE: Notes on image size for specific camera models:
l The image size of Arqus, Miqus, 2c, 5+, 6+ and 7+ series cameras is
only limited in the y-direction when increasing the capture rate.
However when increasing the capture rate on the Camera settings
sidebar then the image ratio is kept. You can then increase the
width of the image size by dragging on the red rectangle.
l The 16 extra pixels of the 3+-series are not active pixels and will be
displayed as a black band, if you zoom in on the right side of the
image.
l The x size of the 3-series camera must be at least 160 pixels. The x size of the 3+-series camera must be at least 144 pixels in Standard mode and 72 pixels in High-speed mode.
l The image size of the 5-series camera can only be changed in the y-
direction. This is because the frame rate on this sensor will not be
increased when changing the number of pixels in the x-direction.

Marker masking

With the Marker masking option, masks can be added to the camera image.
The masks will be shown as green areas in the Marker and Marker intensity
mode. Any detected markers within the masked areas are discarded from the
camera output. For information on how to use the marker masking, see
chapter "How to use marker masking" on page 537.
Use the button on the Auto marker masking option to automatically add marker masks to the selected cameras. You can select several cameras when applying the auto marker mask, but it is important to remove any real markers from the volume. The cameras must be in RT/Preview mode to use this feature. For more information, see chapter "How to use auto marker masking" on page 538.
The marker mask is set individually for each camera. Select one camera and
add a mask by clicking the plus button on the Number of masks line. The
maximum number of masks per camera is 20 for Arqus and Miqus cameras
and 5 for Oqus cameras. Use the minus button on the line to remove a mask.
Modify the values for the mask in the list.
Left (x-min)
The start pixel of the masking area in X-direction in camera coordinates.

Right (x-max)
The end pixel of the masking area in X-direction in camera coordinates.

Top (y-min)
The start pixel of the masking area in Y-direction in camera coordinates.

Bottom (y-max)
The end pixel of the masking area in Y-direction in camera coordinates.

Use current masks in measurements


Enable the use of the marker masks in the measurements. Turn this option off to check that the masks are correct; the masks will then be grey in the Marker and Marker intensity mode and you can see the markers behind the mask.

NOTE: Notes on masks:
l Masks are applied on-camera, which means that masked markers
are not recorded in QTM and cannot be restored. If you need to
apply masks to a file, you can use software masks instead, see
chapter "How to use software marker masks" on page 611.
l The origin of the camera coordinates is the upper left corner of the
2D view and X-direction is horizontal and Y-direction is vertical.
l The pixels can be converted to subpixels by a multiplication of 64.

l Marker masks are not drawn linearized. This means that with a wide angle lens it is best to turn off the Show linearized data option on the 2D view settings page to see the true positions of the mask and image size.

Marker circularity filtering (Oqus)

For Oqus cameras the option to use marker circularity filtering is available. The
Marker circularity filtering options are used to filter non-circular markers in
the 2D data. The default is that the filtering is not Enabled. For more inform-
ation about the filtering see chapter "Marker circularity filtering (Oqus only)" on
page 541.
The marker filtering is done in the camera according to the options below.
Enabled
To turn on the filtering, check the Enabled checkbox. Which markers are filtered out depends on the Circularity level option.

Circularity level
The Circularity level option defines which markers are filtered out as too
non-circular. These markers are then processed according to the Non-cir-
cularity marker settings on the 2D preprocessing and filtering page.
The option has five levels: All markers, Low, Medium, High and Very high. The default value is Medium. When set to All markers, all markers
are considered to be non-circular and all segments are therefore, if possible, sent to QTM. The levels Low to Very high correspond to a circularity relation of 20 to 80% between x- and y-size.

NOTE: There can be too many markers considered as non-circular in the camera. In that case there is a red warning in the upper left corner of the 2D view, and all the markers that the camera has not been able to process are considered to be OK.
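For illustration only: the sketch below shows one plausible way to express the relation between the x- and y-size of a 2D marker (the smaller size divided by the larger). How the camera actually computes and compares this value against the Low to Very high levels is not specified here, so treat it as an assumption.

def circularity_relation_percent(x_size: float, y_size: float) -> float:
    # Assumed definition: smaller axis divided by larger axis, in percent.
    return 100.0 * min(x_size, y_size) / max(x_size, y_size)

print(circularity_relation_percent(480, 300))  # 62.5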

Marker limits
The Marker limits settings decide which reflections are detected as markers by the cameras. When Use default settings is set to True, no marker limits are used and QTM will detect all reflections as markers.

Set the Use default settings to False to set other Marker limits settings. The
setting can be set individually for each camera. The Marker limits settings are
applied on-camera, which means that discarded markers are not recorded in
QTM and cannot be restored.
There are three discrimination settings:
Marker size
Smallest
The Smallest parameter controls the smallest detectable size of a
marker (in subpixels) for each camera in every frame. Any marker
smaller than this will be discarded from the camera output.
This option might be useful to screen out tiny non-marker reflec-
tions or to assure a minimum size of the markers seen by the cam-
era.
The minimum value of this parameter is 128 and the maximum is
the value of the Largest parameter.

Largest
The Largest parameter controls the largest detectable size of a
marker (in subpixels) for each camera in every frame. Any marker lar-
ger than this will be discarded from the camera output.
This option might be useful to screen out reflections of other cam-
eras, which tend to be large. It is also useful on other large non-
marker reflections.
The minimum value of this parameter is the value of the Smallest
parameter and the maximum is 60000.

NOTE: The Smallest and Largest parameters are expressed in sub-


pixels, i.e. the sizes depend on the distance between marker and
camera. Check the 2D data in the Data info window to see the size
of the markers you want to remove.
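To relate these subpixel values to pixel sizes, the following sketch (not part of QTM) uses the factor of 64 subpixels per pixel mentioned in the marker masking notes above:

SUBPIXELS_PER_PIXEL = 64  # conversion factor stated in the marker masking notes

def pixels_to_subpixels(size_px: float) -> float:
    return size_px * SUBPIXELS_PER_PIXEL

def subpixels_to_pixels(size_subpx: float) -> float:
    return size_subpx / SUBPIXELS_PER_PIXEL

print(pixels_to_subpixels(5))    # 320: a 5 pixel wide marker in subpixels
print(subpixels_to_pixels(128))  # 2.0: the minimum Smallest value in pixels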

Max number of markers per frame


The Max number of markers per frame parameter controls the max-
imum number of markers transmitted by each camera on a per frame
basis. The value can be specified with any integer between 1 and 10000.
When there are more markers in a frame than the value of this para-
meter, some will be skipped. Since markers are calculated from the top of
the image and downward, markers will be skipped from the bottom of the
camera’s FOV. For example, if there are 10 real markers within the FOV of
a camera and the Max number of markers per frame parameter has
been set to 5 markers, it is just the 5 topmost markers that will be cal-
culated and transmitted to the measurement computer.

Exposure delay

The Exposure delay setting shifts the exposure of a camera compared to other
cameras in the system. This can be used when the flash of a camera disturbs
another camera in the system. However, because the time of the exposure will be different within the camera system, the 2D position that corresponds to the first exposure group is predicted by the 3D tracker. The prediction of the 2D position works best when using a higher frame rate. For more information see
chapter "Delayed exposure to reduce reflections from other cameras" on
page 534.
To activate the exposure delay, select Camera group or Advanced on the
Exposure delay mode setting. When the delayed exposure is activated Using
delayed exposure is displayed in the Status bar. The delayed exposure setting
is displayed in each camera's 2D view, e.g. expgrp: 1 when the camera is in
exposure group 1.

The recommended mode is Camera group because then the delay is cal-
culated automatically. First select the cameras that you want in the group from
the camera list to the left. Then select one of the five groups in the Camera
groups setting. E.g. if you only have two groups you should use group 1 and 2.
QTM will then automatically delay the exposure time of the groups so that the
delay for a group is the same as the longest exposure time in the group before.
This means that you can use any exposure time for the cameras and the delay
will always be correct.

The Advanced mode should only be used if you are completely sure about
what you are doing. The delay is then set with an absolute time in the Delay
time setting. The delay time will be the same even if you change the exposure
time of a camera in the system, which means that you must then manually
change the Delay time for the cameras to keep the correct delays to reduce
reflections.

NOTE: Notes on exposure delay for specific camera models:
l The Arqus A5 camera is more sensitive to light outside of the expos-
ure time, compared to other Arqus cameras. It means that the
exposure delay can remove reflections from other cameras, but you
usually need to use marker masks to remove direct light from other
cameras.
l Delayed exposure does not work for the Oqus 5-series camera.

Sensor mode

The Sensor mode option can be used to reduce the resolution to get a higher
capture rate while keeping the same field of view. The option is only available
for camera models that have more than one sensor mode. The setting can be
changed individually on each camera and also for Marker and Video mode sep-
arately. The sensor mode setting can also be changed on the Camera settings
sidebar. For an overview of available sensor modes per camera model, see
"Qualisys camera sensor specifications (marker mode)" on page 926.

Video settings

Capture rate

The Capture rate on the Video settings is only used for video capture and can
be set independently of the marker capture rate. The setting is also individual
for the cameras so that you can capture at different video capture frequencies.
The maximum capture rate can be increased by reducing the Image size. The
current capture range is shown next to the capture rate. In the Project options
dialog the Image size must be changed first to get a higher frequency.
However, in the Camera settings sidebar the Image size will be reduced auto-
matically when choosing a higher frequency.

For an overview of maximum capture rates for each camera model, see
"Qualisys video sensor specifications (in-camera MJPEG)" on page 927 when
using in-camera MJPEG compression and "High-speed video" on page 960 for
uncompressed video.

NOTE: It is possible to capture short video sequences with Qualisys cam-


eras that are not configured for high-speed video. The capture rate is
then limited to 30 Hz, even if you use a reduced image size.

Exposure time

The Exposure time setting changes the exposure time for video mode. The set-
ting can be set individually for each camera.
The setting sets the exposure time in microseconds. The maximum exposure
time is up to 60 microseconds less than the period time, i.e. 1/capture rate.

Flash time

The Flash time setting changes the flash time for video mode. The setting can
be set individually for each camera.
The setting sets the flash time in microseconds. The maximum flash time is always limited by the exposure time, but there are also other limitations. First of all, the flash time cannot be longer than 2 ms. The other limitation is that it can be at most one tenth of the period time of the capture rate. For example, if the capture rate is 200 Hz the maximum flash time is 500 μs.
The IR flash does not contribute much when you capture video. Therefore, it is
often a good idea to set the Flash time to its minimum of 5 μs.
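The two limits can be combined as in the following sketch (not part of QTM), which reproduces the 200 Hz example above; note that the exposure time may limit the flash time further:

def max_flash_time_us(capture_rate_hz: float) -> float:
    # At most 2 ms, and at most one tenth of the period time.
    period_us = 1_000_000.0 / capture_rate_hz
    return min(2000.0, period_us / 10.0)

print(max_flash_time_us(200))  # 500.0 us, as in the example above
print(max_flash_time_us(30))   # 2000.0 us, capped by the 2 ms limit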

Gain

With the Gain option the intensity of the video image can be increased. Select
the desired gain from the drop-down list. The available values are 1, 2, 4, 8 and
16 (for older camera models the maximum gain is 4). The default value is 1
except for the Oqus 2c-series and Miqus Video where it is 4, or it is set auto-
matically when using auto-exposure. Use the Gain option if the exposure time and aperture are already at their maximum.
All cameras, except the Oqus 3 and 5 series, have analog gain before the image capture and therefore the image quality is not decreased to the same degree with increasing gain. The number of gray scales in the image is also the same
for different gain settings.
The Oqus 3 and 5-series have digital gain, which means that the intensity of the
image is just multiplied with the desired gain. The result is that image quality
and number of gray scales are decreased.

NOTE: The Gain option is available on the Camera settings sidebar for
all cameras except the Oqus 3- and 5-series.

Image size

With Image size it is possible to change which part of the image is captured in video mode. The setting can be set individually for each camera. Use
the maximize button to reset the selected cameras to the maximum image
size.
The selected image size will be shown with a red square in the video preview
window. The frequency range will be updated automatically when you change
the size. The camera will still capture video outside the rectangle in preview,
but when you make a measurement the image is cropped.
The image size is specified in Left and Right, Top and Bottom, where the Left
and Right values can only be specified in certain steps depending on the cam-
era type. The step size for the current model is displayed next to the settings.

For an overview of the maximum image size for each camera model, see
"Qualisys video sensor specifications (in-camera MJPEG)" on page 927 and
"High-speed video" on page 960.

NOTE: Notes on image size for specific camera models:


l In the uncompressed mode of Oqus 2c, 5+, 6+ and 7+ series cam-
eras the image size is only limited in the y-direction when increasing
the capture rate. However, when increasing the capture rate on the
Camera settings sidebar, the image ratio is kept constant. You can
then increase the width of the image size by dragging on the red
rectangle.
l The image size of the Oqus 5 series camera can only be changed in
the y-direction. This is because the frame rate on this sensor will not
be increased when changing the number of pixels in the x-direction.
l The x size of the Oqus 3 series camera must be at least 160 pixels.
The x size of the Oqus 3+ series camera must be at least 144 pixels
in Standard mode and 72 pixels in High-speed mode.
l The In-camera MJPEG mode for the Oqus 5+-series has a larger
maximum image size because it compresses the image in the cam-
era.

Image resolution

The image resolution will change the quality of the images that are captured by
Qualisys cameras according to the options below. The setting can be set indi-
vidually for each camera.
Resolution
Set the number of pixels that are captured, e.g. with 1/4 resolution every
other line and column is deleted. Choose the resolution for the camera
from the options below.

Full - Use all of the pixels

1/4 - Use every second pixel in x and y direction

1/9 - Use every third pixel in x and y direction

1/16 - Use every fourth pixel in x and y direction

1/36 - Use every sixth pixel in x and y direction

1/64 - Use every 8th pixel in x and y direction

1/256 - Use every 16th pixel in x and y direction

NOTE: To optimize the memory usage there is a column of black pixels


to the right in the image. This is especially visible at 1/64 or lower res-
olution.

NOTE: Notes on image resolution for specific camera models:


l The setting is not available for the Oqus 2c, 5+ and 7+-series cam-
eras. If you want to reduce the image size on those cameras you
have to use the Sensor mode setting, which has the advantage of
also increasing the maximum capture rate. For more information,
see chapter "Sensor mode" on page 246.
l The Oqus 4-series camera cannot handle full resolution for video
images, therefore the maximum image resolution is 1/4.

Video compression

The Video compression option can be used to reduce the video file size. By
default the Mode is set to In-camera MJPEG for all cameras that support that
option. You can set the Compression quality of the MJPEG compression to
change the quality and file size of the video; the default is 50. Increasing the quality will lead to larger images and in some cases this means that not all of
the video images can be sent during the capture. These frames will be fetched
after the capture is finished.

NOTE: For Miqus Video, the Video compression setting is not available.
Miqus Video cameras always stream video using In-camera MJPEG.

The default mode on the other cameras is None, i.e. uncompressed images with maximum quality. If the file is uncompressed it may not play at the correct
speed in most external programs.
Switch the Mode setting to Software to specify a codec on the computer for
compression of the video.

Choose a codec from the list on the Compression codec line to activate it. The
codecs in the list are the ones that are installed on your computer.
The codecs are then grouped in Recommended codecs and Other codecs. For
more information about the recommended codecs, see chapter "Recom-
mended codecs" on page 583.

NOTE: If none of the recommended codecs is installed on the computer,


there are no alternatives under the Recommended codecs heading.
Download links for the recommended codecs are available at
https://www.qualisys.com/info/recommended-codecs/.

If available, the settings for a codec are opened with the button on the Configure codec line. The settings depend on the codec; please refer to the codec instructions for help on the settings.
Click on the button on the Codecs recommended by Qualisys line to go to
a web site with links to codecs.

Color/Grayscale

The Color/Grayscale option is only available if you have connected an Oqus 5-


series camera with a color sensor. For the color cameras you can select
between the following options for the color interpolation.
Grayscale
Capture the raw black and white image. The image will be checkered
because of the color filter in front of the sensor. This pattern cannot be
removed because the filter is integrated with the sensor.

Color
Use bilinear color interpolation to get a color image.

NOTE: If you want to use the color interpolation on 5-series with


the Image resolution option, then you must select 1/9 or 1/36 for
the color interpolation to work. The other options delete the color
information from the sensor.

NOTE: It is important to notice that the color interpolation is time con-


suming and therefore the RT frequency will be much lower with color
interpolation.

NOTE: In a measurement the color interpolation is done after the video


has been fetched from the camera, so it does not change the maximum
time that you can capture video.

Auto exposure

The Auto exposure option is available on Miqus Video and Oqus 2c cameras.
With auto exposure, the exposure time and gain are controlled automatically so that the image is bright enough. When the camera is using one of the predefined presets the auto exposure is always on. For the Custom preset it must be turned on with the Auto exposure option.
When turned on there are two additional settings: Auto exposure compensation [EV] and Auto exposure area. The Auto exposure compensation
option controls how bright the image will be. Increase the value to have a
brighter image and decrease it to get a darker image. This option is also avail-
able in the Camera settings sidebar in preview.
By default the whole image area is used to calculate the brightness of the
image. The area used in the brightness calculation can be set with the Auto
exposure area option. In this way the camera can calculate the brightness on
the part of the image that is most useful. This can be used for example when
there is a bright window in the image. It is easiest to define the area with the
Auto Exposure tool in preview, see chapter "2D view toolbar" on page 89.

Color temperature

The Color temperature setting is only available for the Oqus Color video (2c)
camera. With it you can adapt the color temperature used for the white balance
in the video. If you use the wrong option the colors may be very wrong, so it is
important to test the different options to find the best one.
Daylight
This option uses a color temperature that is most suitable for daylight. It
works best outdoors, but also works well if you have large windows.

Office lighting (Cool white fluorescent)


This option is for bluer types of light. It is usually not the best option if you have mixed types of lighting, for example if there are windows.

Home lighting (Incandescent)


This option is for more yellow types of light, like light bulbs. It is usually not the best option if you have mixed types of lighting, for example if there are windows.

No color correction
For this option the color temperature is not adjusted at all. It can for
example work well in situations with mixed types of light.

Sensor mode

The Sensor mode option can be used to reduce the resolution to get a higher
capture rate while keeping the same field of view. The option is only available
for camera models that have more than one sensor mode. The setting can be
changed individually on each camera and also for Marker and Video mode sep-
arately. The sensor mode setting can also be changed on the Camera settings
sidebar. For an overview of available sensor modes per camera model, see
"Qualisys video sensor specifications (in-camera MJPEG)" on page 927 and
"High-speed video" on page 960.

Active filtering

The Active filtering mode is used to improve capturing in daylight conditions.


For marker cameras, it is used both in marker and video mode. For Miqus
Video and Hybrid cameras, active filtering is only applied to the marker mode
and any reduced image size is only displayed in Marker or Intensity mode.
The Active filtering mode can be activated individually on each camera. However, it is usually best to activate it on all cameras in the system; but if, for example, only a few cameras have a lot of sunlight then you can activate Active filtering only for those cameras. For more information about active filtering
see chapter "Active filtering for capturing outdoors" on page 539.
There are two settings for the Active filtering mode:
Continuous (Recommended)
The Continuous setting is the default and recommended filtering setting.
When activated, the camera captures an extra image before the actual image, with the exact same settings as the actual image except that there is no IR flash. The extra image is then subtracted from the actual image so that the background light is suppressed. This helps in, for example, daylight conditions where the background light is much brighter than indoors.

NOTE: The maximum capture rate is limited to about 50% of the camera's normal maximum capture rate when Active filtering is activated. This is because two images are captured for every frame.

Single
With the Single setting the camera only captures an image before starting
a measurement. As for the Continuous setting, the extra image uses no IR flash but is otherwise the same as the actual image. This setting can be
used if you have a very static setup.

NOTE: Active filtering is not supported on the Oqus 5 camera.

Lens aperture

The Lens aperture setting is only available for Qualisys cameras with a motor-
ized lens. The limits of the aperture depend on the mounted lens. In most
cases it is recommended to have the aperture around 4, for more tips on aper-
ture and focus, see chapter "Tips on setting aperture and focus" on page 481.
It is best to change this setting in preview from the Camera settings sidebar in
the 2D view, so that you can check the brightness and focus of the image dir-
ectly.

NOTE: Once focus and aperture have been set for the cameras in a fixed
camera system it is possible to disable lens control using the Qualisys
Firmware Installer. This way the current lens settings will be fixed. For
more information, see "How to use Qualisys Firmware Installer (QFI)" on
page 471.

Lens focus distance

The Lens focus distance setting is only available for Qualisys cameras with a
motorized lens. The limits of the setting are 0.5 m and 20 m. For more tips on
aperture and focus, see chapter "Tips on setting aperture and focus" on
page 481.
It is best to change this setting in preview from the Camera settings sidebar in
the 2D view, so that you can check the focus of the image directly.

NOTE: Once focus and aperture have been set for the cameras in a fixed
camera system it is possible to disable lens control using the Qualisys
Firmware Installer. This way the current lens settings will be fixed. For
more information, see "How to use Qualisys Firmware Installer (QFI)" on
page 471.

2D view rotation

The 2D View Rotation setting defines the rotation of the camera in the 2D
view window. The available options are 0, 90, 180 and 270. You can select mul-
tiple cameras in the camera list to change rotation for all of the selected cam-
eras at the same time.

NOTE: The setting does not apply to camera views in a QTM file. To
change the rotation in a file you must use the Rotate view option on the
2D view window menu.

Start delay

The Start delay option sets the delay in μs for the camera system. When
enabled the camera system will delay the start in relation to the time that the
master camera receives the start command from QTM. This delay is required
on the Main system in a multiple video system setup.

The default start delay is 40000 μs and is needed to make sure that all of the
cameras in the system receive the start time from the master camera.

NOTE: The Start delay option is not in use if the cameras are syn-
chronized with an external timebase.

Linearization

The Linearization page contains information about the linearization of each


camera, see chapter "Linearization of the cameras" on page 485. You can also
use the checkboxes to include or exclude cameras for tracking.

Camera linearization parameters

Under the Camera linearization parameters heading there is a list of all the
linearization files of the connected cameras. In the list you can manage the lin-
earization files and select whether a camera will be used for tracking or not.
Each camera is delivered with its own linearization file (*.lin) stored in its
memory. The file name includes the serial number of the camera. When con-
necting a system for the first time in a project, the linearization files are loaded
into the project and downloaded to the C:\ProgramData\Qualisys\Linearization
folder of the computer.
The linearization files that are currently loaded in the project are used as the
intrinsic calibration parameters of the cameras.

The following information is displayed for each linearization file:
Camera
The ID of the camera in the current system.

Serial #
The serial number of the camera.

Lin-file
Name of the linearization file.

Focal Length
The focal length reported in the linearization file.

Created (date)
Date of creation for the linearization file.

Deviation (CU)
The deviation reported in the linearization file. This is usually a number
between 2-4 subpixels (CU).

Managing the linearization files


There are three ways to load the camera’s linearization file into the project. The
recommended method is to Load from cameras.

1. Load from file
Select a camera in the list, click the Load from file button, and select a lin-
earization file (*.lin).

WARNING: With this method, there is no check that the file cor-
responds to the serial number of the camera, so make sure to select
a file that matches the serial number of the camera.

2. Load from cameras


Click on Load from cameras. A dialog appears and QTM starts to load
the linearization files from the Qualisys cameras into the project.
3. Load from folder
Click on Load from folder. A dialog appears where you can browse for
the folder containing linearization files (*.lin) for the cameras.

If the files are present in the Linearization folder under C:\ProgramData\Qualisys\, just click OK. Otherwise, locate the correct folder and then click OK. When you click OK, QTM will look for linearization files that match the serial numbers of the cameras, and choose the ones with the last date in the file name. If it finds the files in the folder, the following dialog is shown and the correct files are installed.
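The selection logic described above (match the camera serial number, then take the file with the latest date) could look roughly like the sketch below. It is not QTM's implementation, and the file name pattern "<serial>_<date>.lin" assumed here is only a hypothetical example.

from __future__ import annotations
from pathlib import Path

def pick_lin_file(folder: str, serial: str) -> Path | None:
    root = Path(folder)
    if not root.is_dir():
        return None
    # Keep only *.lin files whose name contains the camera serial number.
    candidates = [p for p in root.glob("*.lin") if serial in p.stem]
    # Assume the date is the last underscore-separated part of the file name.
    return max(candidates, key=lambda p: p.stem.split("_")[-1], default=None)

print(pick_lin_file(r"C:\ProgramData\Qualisys\Linearization", "12345"))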

When you right-click on a camera, you get a context menu with the following
options:

Load linearization file ...


Load a linearization from a file into the project, see Load from file above.

Upload linearization file to camera


Upload the file that is currently loaded in the project to the camera. You
can use this option to replace the linearization file that is currently stored
on the camera with a better one.

WARNING: Uploading replaces the linearization file that is cur-


rently stored in the camera, which will affect future tracking in other
projects.

NOTE: When using the Advanced calibration method, the linearization


files in the project are replaced with new linearization files.

Selecting and deselecting cameras for tracking


The camera is used if the checkbox in the Camera column is selected. Deselect the checkbox to exclude a camera from tracking.

NOTE: The camera will still capture data during a measurement even if it
is deactivated. Therefore, it can be included again in the measurement by
reprocessing the file, see chapter "Reprocessing a file" on page 601.
However, if a camera has been deactivated during calibration, the cal-
ibration must be reprocessed first, see chapter "Recalibration" on
page 563.

Calibration

The Calibration page contains calibration settings required to perform a correct calibration. The layout and available options depend on the Calibration type. For all calibration types it is very important that the calibration data is
entered correctly in order to obtain high quality motion capture data. For a
detailed description on how to perform a calibration see chapter "Calibration of
the camera system" on page 543.

Calibration type

Select the calibration type that will be used. The supported calibration types
are Wand calibration and Fixed camera calibration, see the chapters below.

Wand calibration

The wand calibration method requires two calibration objects to calibrate the
system. One is a stationary L-shaped reference structure with four markers
attached to it. The stationary L-structure (called reference object below) defines
the origin and orientation of the coordinate system that is to be used with the
camera system. The other calibration object is called the calibration wand. It consists of two markers located a fixed distance from each other. This object is moved in the measurement volume to generate data to determine the locations and orientations of the cameras. For more information see chapter
"Wand calibration method" on page 547.

Calibration kit

Define the calibration kit you are using under the Calibration kit heading. The
calibration kit is used for scaling and locating the coordinate system in the
measurement volume. Two objects are needed to calibrate the system: a ref-
erence structure and a wand.

NOTE: The calibration objects are part of the measurement equipment


and should be treated with care. A scaling error derived from a damaged
calibration object will propagate throughout the whole measurement and
analysis.

Calibration kit type


By choosing the Calibration kit type in the drop-down box the size of
the L-shaped reference structure is specified. The calibration algorithms
will then find the reference markers when the calibration recording is
made.
The calibration kit is set to --- Select kit --- when creating a new project
with default settings.
The alternatives refer to the length and material of the wand that is used
in combination with the L-shaped structure. The following settings are
available for Calibration kit type:
l Active wand kit 500 mm

l Active wand kit 1011 mm

l Wand kit 110 mm

l Wand kit 120 mm

l Wand kit 300 mm

l Wand kit 300 mm (carbon fiber)

l Wand kit 600 mm (carbon fiber)

l Wand kit 750 mm

l Kit defined below


The requirements for the configuration of a custom reference object
(L-frame) are:
l The distance A should be shorter than the distance C.

l The distance B should be less than half the distance C.

For detailed information about specific calibration kits, see chapter


"Qualisys calibration kits" on page 995.

Exact wand length


Enter the distance between the centers of the reflective markers on the
reference wand in Exact wand length. It has been measured with high
accuracy and can be found on a plate on the wand.

Reference object definition


The Reference object definition is only used if Kit defined below is
selected as the Calibration kit type. The positions of the centers of the
reflective markers on the L-shaped reference object must then be defined
to specify the reference frame markers. The positions are defined as the
distance from the corner marker (origin marker) to the other markers, see
figure in the Project options dialog. When standard kit types are used
the Reference object definition will show the distances for the selected
kit type.

NOTE: The default origin of the coordinate system for the 300 mm
and 600 mm carbon fiber kits, as well as the 1011 active wand kit, is
at the corner where the frame rests on the floor. For the other kits
the default origin is at the center of the corner marker, for inform-
ation on how to translate the origin see chapter "Translating origin
to the floor" on page 556.



Coordinate system orientation and translation

Under the Coordinate system orientation heading the coordinate system of


the motion capture can be customized by choosing the way the X-, Y- and Z-
axes are orientated in the measurement volume. The coordinate system of the
subsequent motion captures will be the same as that used for the reference
structure.
Axis pointing upwards and Long arm axis are the settings which decide the
directions of the axes. Select the axis that you want for each setting to get the
desired coordinate system, see the figure next to the settings to understand
how the axes are orientated.

Maximum number of frames used as calibration input

The Maximum number of frames used as calibration input setting limits


the number of frames used in the calibration process. The default value is 6000
frames. If the number of frames in the calibration file is larger than this setting,
the frames will be distributed evenly across the whole measurement.
Increase this value if you have a large volume, especially if it is an extended
volume where not all of the cameras can see the calibration reference
structure. To test this, make a long calibration and try different values of
Maximum number of frames to see how it affects your calibration result. You
can increase this value to as many frames as you like, at the expense of longer
processing times for the calibration.
For more information about the calibration time and the frequency that are
used, see chapter "Calibration dialog" on page 545.
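
As an illustration of the even distribution, the following Python snippet is a minimal sketch and not part of QTM; it only shows how a limited number of frames could be picked evenly from a longer calibration recording. The function name and the exact selection rule are assumptions.

import numpy as np

def select_calibration_frames(total_frames, max_frames=6000):
    # Return the frame indices used as calibration input (sketch).
    if total_frames <= max_frames:
        return np.arange(total_frames)
    # Spread max_frames indices evenly across the whole measurement.
    return np.round(np.linspace(0, total_frames - 1, max_frames)).astype(int)

# Example: a 120 s calibration at 100 Hz (12000 frames) uses roughly every
# second frame, while shorter files are used in full.
print(len(select_calibration_frames(12000)))  # 6000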

Apply coordinate transformation

With Apply coordinate transformation you can translate and rotate the
global coordinate system to any desired position. Select the checkbox and then
click Define to set the coordinate transformations on the Transformation
page, see chapter "Transformation" on page 259.



Fixed camera calibration

On the Calibration page for Fixed camera calibration you should enter the
data from the survey measurement. If you cannot see this Calibration page
change the Calibration type option to Fixed camera calibration. For more
detailed information on fixed camera systems, contact Qualisys AB about the
QTM - Marine manual. It includes detailed descriptions of the camera
installation, survey measurement, fixed camera calibration, validation and the
use of 6DOF bodies in marine applications.
Use the options Save definition and Load definition to save and load,
respectively, the data for the Fixed camera calibration. The default folder is the
project folder.

NOTE: The first time you enter the survey data it must be entered manu-
ally.

Reference marker locations

Under the Reference marker locations heading you should enter the survey
data of the reference marker positions. Use the Add marker and Remove
marker options to add or delete reference marker locations. Add the markers



in the physical order from left to right. This will make it much easier to enter
the markers seen by each camera. Double-click the X, Y and Z locations of each
marker to edit them.

NOTE: All of the markers’ locations must be entered to make a suc-


cessful Fixed camera calibration.

Camera locations and markers seen by each camera in order from left
to right

Under the Camera locations and markers seen by each camera in order
from left to right heading you should enter the survey data of the camera pos-
itions. Use the Add camera option to add a new camera at the end of the list.
The cameras must be entered in the same order as in the camera system. It is
not possible to rearrange the cameras after they have been added, only to
remove a camera with Remove camera. Double-click the column to enter the
following data:
Location X, Location Y and Location Z
The survey measurement data of the camera.

Mock-up cylinder length


The length of the cylinder that was used on the camera dummy when mak-
ing the survey measurement.

NOTE: This length is the horizontal distance between the plate of


the camera dummy and the front side of the cylinder.

Markers seen (l to r)
The markers seen by the camera. Enter them in order from left to right as
seen by the camera and separate them with commas (the numbers refer
to the first column in the Reference marker locations list).

NOTE: QTM uses the top markers in the 2D view window as ref-
erence markers.



IMPORTANT: All of the cameras must be entered to make a successful
Fixed camera calibration.

Apply coordinate transformation

With Apply coordinate transformation you can translate and rotate the
global coordinate system to any desired position. Select the checkbox and then
click Define to set the coordinate transformations on the Transformation
page, see chapter "Transformation" below.

Transformation

The Transformation page contains the settings for defining a new global
coordinate system. The two changes that can be made to the coordinate sys-
tem are Translate origin and Rotate coordinate system. By changing these
parameters you can move and turn the coordinate system to any pos-
ition and orientation. The change is always related to the original position and
orientation of the calibration coordinate system. In the case of the Fixed cam-
era calibration this means the origin of the survey measurement.



IMPORTANT: A new calibration file is saved if you apply a new trans-
formation on the current calibration. The saved calibration only includes
calibration data and no 2D data. Therefore you must always go back to
the first calibration file if you need to reprocess the calibration.

To activate the change you must select the checkbox of the respective setting.
Translate origin (X(mm), Y(mm), Z(mm))
Enter the new position of the origin of the coordinate system (in mm). The
direction of the parameters (X, Y and Z) is always related to the original
coordinate system of the calibration.

Rotate coordinate system (Roll, Pitch, Yaw)


Enter the new rotation of the coordinate system (in degrees). The rota-
tions always refer to the original coordinate system and they are always
applied from left to right. To which axes the rotations are applied
depends on the Euler angles definitions on the Euler angles page, see
chapter "Euler angles" on page 392.

Use the Rotate axis to line or Fetch rigid body buttons to define the
rotation from a measured line or a rigid body, see chapters "Rotate axis to
line" on the next page and "Transform coordinate system to rigid body
(floor calibration)" on page 262.



NOTE: Notes about rotations:
l If the Euler angles are changed after a transformation, the trans-
formed coordinate system will have the same orientation as before
but the rotation parameters on the Transformation page will
change to reflect the new definition.
l The rotation angles are used to calculate a rotation matrix, which is
then used to transform the coordinate system. When the resulting
rotation matrix is converted back to rotation angles again, the angles are
not necessarily the same. This means that the angles that you have
entered can change after you have applied them to the coordinate
system. E.g. if you just enter a pitch of 100 degrees (using the Qualisys
standard) this will result in the angles roll = 180°, pitch = 80°, yaw = 180°.
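
The example above can be verified numerically. The sketch below is plain Python, not QTM code, and it assumes that the Qualisys standard angles are applied as a yaw-pitch-roll (Z-Y-X) rotation sequence; it only shows that a pure 100° pitch and the angle set roll = 180°, pitch = 80°, yaw = 180° describe the same rotation matrix.

import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rotation(roll, pitch, yaw):
    # Assumed angle convention: R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    r, p, y = np.radians([roll, pitch, yaw])
    return rot_z(y) @ rot_y(p) @ rot_x(r)

# Entering only a pitch of 100 degrees...
R1 = rotation(roll=0, pitch=100, yaw=0)
# ...describes the same rotation as roll = 180, pitch = 80, yaw = 180.
R2 = rotation(roll=180, pitch=80, yaw=180)
print(np.allclose(R1, R2))  # True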

IMPORTANT: If you use a custom bounding box on the 3D Tracking


page, or if you are using a force plate, these will be moved with the trans-
formation and the positions must therefore be re-entered.

Rotate axis to line

With the Rotate axis to line function a line can be used to define the direction
of one of the axes. This can for example be useful to define a vertical or hori-
zontal axis if the floor is not level enough. Follow this procedure to use the func-
tion:

1. Make a measurement with two static markers that define the line that you
want. It is important that the markers are as static as possible, because an
average is used to define the line. It is also important that the file uses the
current calibration.

2. Keep the file open and go to the Transformation page in the Project options
dialog. If the Transformation page is not active, go to the Calibration page and
check the Apply coordinate transformation box.



3. Activate the Rotate coordinate system option on the Transformation
page.

4. Click on the Rotate axis to line button to open the Rotate axis to line
dialog.

5. Select the axis that you want to define the rotation for.

6. Then select the trajectory that the Axis is pointing from and the trajectory
that the Axis is pointing to.

7. Click OK to calculate the rotation and show the result on the Trans-
formation page.

8. A new calibration file will be saved in the Calibration folder after you click
OK in the Project options dialog.

NOTE: Notes on Rotate axis to line:


l This method only affects the rotation of the coordinate system, not
the position.
l The rotation of the unselected axes is also affected.
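
As an illustration of what this function computes, the sketch below (plain Python, not QTM code) shows one common way to derive a rotation matrix that turns a chosen axis, here Z, so that it points along the line between two averaged marker positions. The function name and marker positions are assumptions; QTM's internal implementation may differ.

import numpy as np

def rotation_aligning_axis_to_line(p_from, p_to, axis=(0.0, 0.0, 1.0)):
    # Rotation matrix that maps `axis` onto the direction from p_from to p_to.
    axis = np.asarray(axis, dtype=float)
    target = np.asarray(p_to, dtype=float) - np.asarray(p_from, dtype=float)
    target /= np.linalg.norm(target)
    v = np.cross(axis, target)   # rotation axis (unnormalized)
    c = np.dot(axis, target)     # cosine of the rotation angle
    if np.isclose(c, -1.0):
        raise ValueError("Vectors are opposite; choose the rotation axis explicitly")
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    # Rodrigues' formula for aligning two unit vectors.
    return np.eye(3) + vx + vx @ vx / (1.0 + c)

# Example: averaged positions (in mm) of the two static markers.
R = rotation_aligning_axis_to_line([0, 0, 0], [10.0, 0.0, 990.0])
print(np.round(R @ np.array([0.0, 0.0, 1.0]), 3))  # points along the measured line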

Transform coordinate system to rigid body (floor calibration)

The position and rotation of a rigid body can be used to define the global
coordinate system. This can for example be used as a floor calibration. Follow
this procedure to use the function:



1. Make a measurement with a rigid body or track the rigid body in preview
mode. Make sure that you define the rigid body coordinate system according
to your requirements. For example for a floor calibration, follow these steps
to create the rigid body:

a. Place 4 markers on the floor close to the corner of the measurement


volume.

b. Define a rigid body from these four markers.

c. Use the Translate and Rotate options on the 6DOF Tracking page to
achieve the desired definition. Use the Align the body using its points
option to define a plane, see chapter "Rotate body" on page 352.

d. Check that the new rigid body definition is correct. It is usually easiest
to do this in preview mode.

2. Have preview open and make sure that the rigid body is tracked.

3. Go to the Transform page in the Project options dialog. If the Transform


page is not active go to the Calibration page and check the Apply coordin-
ate transformation box.

4. Activate the Translate origin and/or Rotate coordinate system options on


the Transformation page. For a floor calibration you usually only need to
specify the rotation.

5. Enter the Rigid Body name.

6. Click on the Fetch rigid body button to update the values under Translate
origin and/or Rotate coordinate system. If the coordinate system already
included an earlier transformation, the values will be relative to the untrans-
formed coordinate system.

7. Click OK or Apply to start using the transformation.



8. In the Coordinate system dialog, choose whether the new transformation
should be applied to the current calibration (Yes) or only to following cal-
ibrations (No).
When choosing Yes, a new calibration file with suffix "-transformed" will
be saved in the Calibration folder and loaded as current calibration into
the project.

NOTE: To apply the transformation to an existing file, you must


reprocess the file with the new calibration.

Current calibration

The Current calibration page displays the calibration that is used by QTM.
Open the current calibration with the Open button. With the Load other
option you can open a dialog and load another calibration file. A calibration file
can only be loaded when it includes all cameras that are currently included in
the project.
The Load other option can be used to merge calibration files, see chapter
"Merge calibration files" on page 565.



If you want to reprocess files with a new calibration, it is often best to change it
in the Reprocessing or Batch processing settings, see chapter "Reprocessing
a file" on page 601 and "Batch processing" on page 605.
The Calibration results are the same as shown after the calibration, see
chapter "Calibration results" on page 558.

Calibration quality

The Calibration quality page contains settings for how to detect if the cameras
need to be calibrated. The check can be performed in two different ways.
Residual settings
This will test in the 3D tracker if the cameras have a Residual in the File
information on the View menu that is higher than the residual limit.
The default value for the residual test is 3 mm. It also tests if too few of
the captured 2D markers are used by the 3D tracker, i.e. the number of
Points. If a camera is considered to be uncalibrated, a warning is shown
after the tracking of the file.



Show calibration quality warnings when retracking too
Use this setting if you want to check the calibration quality when you
retrack a file.

Time settings
This test only checks for how long a calibration is considered new. The
calibration will still be used in a measurement; it is just a visual warning.
When the time set with A calibration is new for has passed, the triangle
at the bottom right corner turns yellow. Then, when the time A new cal-
ibration is recommended after has passed, the triangle changes to
orange.
Synchronization

The Synchronization page contains all of the timing related settings.


The Synchronization page contains two lists. The left pane contains a list of
devices. When a Camera Sync Unit is included in the system, it will be the only
device in the list. When using a system with Oqus cameras, the list shows all
Oqus cameras in the system. The right pane contains all the settings for the
selected devices.
The device list has the following columns.
ID
The number of the camera shown in the 2D view window and on the cam-
era display.



NOTE: The camera with an M after it is the master camera.

Type
The type of the camera or device.

Serial
The serial number of the camera or device.

Ip-address
The IP address for the camera or device.

NOTE: When a Camera Sync Unit is included in the system, it will be the
only device in the list. The synchronization settings of any Oqus cameras
included in the system are not displayed.

The settings list contains the settings for the selected devices. You can use Ctrl
and Shift to select multiple devices, or the Select all button to select all
devices. All of the settings marked with * are global. If multiple cameras are
selected and a setting has been set individually, its value will say Differs.
Changing such a value will set all the selected cameras to the same setting. If
there is no synchronization device connected, only the Wireless/software
trigger option is present.
Check the Show description option to get a short description of the selected
setting.

Wireless/software Trigger

The Wireless/software trigger settings contain all the settings for triggering
the Qualisys system using a wireless trigger, the keyboard or RT client applic-
ations. Possible input:



Qualisys wireless trigger
Use the Trig button for starting and/or stopping captures and use the
Event button for creating events

Keyboard
Use space for starting captures. The space key cannot be used for stop-
ping captures or creating events. In addition, the PageDown key can be
used for starting and stopping measurements, and the PageUp key for
setting events.

NOTE: Alternatively, the keyboard shortcut Ctrl+M can be used for


stopping an ongoing capture (independent of wireless trigger set-
tings).

RT Client application
For example mobile apps and plugins for Matlab and Labview.

UDP start or stop packet


Start or stop a capture when receiving a UDP start/stop packet, see
chapter "UDP start/stop" on page 270.

WARNING: The wireless trigger or keyboard cannot be used to trigger


external equipment (e.g. EMG devices). Simultaneous triggering of the
camera system and external devices require the use of an external trigger
connected to a trigger port.

The following settings are available:


Function
Select the wireless trigger function from the drop-down list.
Start and stop capture disabled
The Wireless trigger or keyboard is not used for starting or stopping
captures.

Start capture
The start of a capture is delayed until a trigger event is received.



Stop capture
An incoming trigger event will stop the capture.

Start and stop capture


Both the start and stop of the capture is controlled by the trigger
event.

NOTE: Notes on trigger function:


l The trigger settings of the Wireless trigger and the trigger
ports are complementary, e.g., when the function of multiple
trigger ports and the wireless trigger is set to Start capture,
any of these trigger events will start a capture.
l When choosing Stop capture or Start and stop capture the
capture will also stop if the end of measurement time is
reached, unless you have checked the Continuous cap-
ture option in the Start capture dialog.
l The stopping behavior of RT client applications may deviate
from the Function setting.

Start/stop on UDP packet


Select whether to listen for UDP start/stop packets.
Listener disabled
Disable the start and stop on UDP packets.

Listener enabled
Enable starting or stopping a capture when receiving a
UDP start/stop packet.

Listen for Main QTM instance


Enable start on the camera start time UDP packet. When using this
setting QTM does not start or stop when receiving the UDP
start/stop packet. Use this setting for Agent systems in a multiple
video system setup.

Event color
Color associated with this type of event. Click in the value field to pick a
color.



Event text
Text label associated with this type of event. Click in the value field to edit.

UDP start/stop

QTM supports starting and stopping captures via a UDP start/stop protocol.
QTM also broadcasts UDP start and stop messages for every capture, which
can be used to start and stop captures on external devices that support the
UDP start/stop protocol.
To enable external devices to control captures in QTM via UDP start/stop, fol-
low these steps:

1. Make sure the computer running QTM and the controlling device are on
the same local area network.
2. Make sure that QTM and the controlling device use the same port for
UDP communication. In QTM the Capture Broadcast Port can be set on
the Real-Time output page under the Project Options.
3. In the Wireless/Software trigger settings, set Function to the desired
action, for example Start and Stop capture if you want to start and stop
the capture using UDP start and stop.
4. Enable the Start/Stop on UDP packet option.

The next time you start a capture, QTM will wait for a trigger or stop the cap-
ture depending on the chosen trigger function. If you want QTM to do an auto-
matic series of captures controlled via UDP start/stop, make sure the Batch
capture option is checked in the Start Capture dialog.
The UDP start/stop packets sent by QTM have the following XML format.



XML start packet:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>


<CaptureStart>
<Name VALUE="measurement name"/>
<DatabasePath VALUE="measurement directory"/>
<Delay VALUE="0"/>
<PacketID VALUE="increasing number"/>
<HostName VALUE="computer name"/>
<ProcessID VALUE="process id (to be able to more easily ignore own
udp broadcast messages)"/>
<Notes VALUE=""/>
<Description VALUE=""/>
<Timecode VALUE=""/>
</CaptureStart>

XML stop packet:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>


<CaptureStop RESULT="SUCCESS/FAIL/CANCEL">
<Name VALUE="measurement name"/>
<DatabasePath VALUE="measurement directory"/>
<Delay VALUE="0"/>
<PacketID VALUE="increasing number"/>
<HostName VALUE="computer name"/>
<ProcessID VALUE="process id (to be able to more easily ignore own
udp broadcast messages)"/>
</CaptureStop>

UDP packet parameters


QTM sends start packet

Name
Filename without the .qtm ending

DatabasePath
Name of the folder if automatic saving is enabled in the Start capture dia-
log

Delay
Not in use

PacketID
Unique ID



HostName
Computer name

ProcessID
PID for QTM process

Notes, Description, Timecode


Not in use
QTM sends stop packet

RESULT
l SUCCESS
The capture is ended without any issues
l FAIL
The capture ended with an error
l CANCEL
The capture has been canceled

Name
Filename without the .qtm ending

DatabasePath
Name of the folder if automatic saving is enabled in the Start capture dia-
log

Delay
Not in use

PacketID
Unique ID

HostName
Computer name

ProcessID
PID for QTM process
QTM receives start packet

Name
Sets the QTM file name used for the capture



HostName
Used in combination with ProcessID to ignore messages from the local
QTM

ProcessID
Used in combination with HostName to ignore messages from the local
QTM

DatabasePath, Delay, PacketID, Notes, Description, Timecode


Not in use
QTM receives stop packet

Name
Sets the QTM file name used for the capture

HostName
Used in combination with ProcessID to ignore messages from the local
QTM

ProcessID
Used in combination with HostName to ignore messages from the local
QTM

RESULT, DatabasePath, Delay, PacketID


Not in use
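
To make the protocol more concrete, here is a hedged Python sketch of an external controller broadcasting a CaptureStart packet to QTM. It is not an official Qualisys client; the port number and all field values are placeholders and must be adapted to your setup (the port must match the Capture Broadcast Port set on the Real-Time output page).

import socket

BROADCAST_PORT = 8989  # placeholder; must match the Capture Broadcast Port in QTM

start_packet = (
    '<?xml version="1.0" encoding="UTF-8" standalone="no"?>'
    '<CaptureStart>'
    '<Name VALUE="my_measurement"/>'
    '<DatabasePath VALUE=""/>'
    '<Delay VALUE="0"/>'
    '<PacketID VALUE="1"/>'
    '<HostName VALUE="controller-pc"/>'
    '<ProcessID VALUE="12345"/>'
    '</CaptureStart>'
)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.sendto(start_packet.encode("utf-8"), ("255.255.255.255", BROADCAST_PORT))
sock.close()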

Trigger ports

The Trigger port(s) settings contain all the settings for triggering the Qualisys
system using one or more external trigger devices.
For Oqus systems the use of an external trigger device requires a splitter cable
connected to the control port of one or more of the cameras. The Trigger port
settings apply globally to the control ports of all cameras within the system.



For Miqus systems a Camera Sync Unit (CSU) is required, see chapter "Camera
Sync Unit: front side" on page 950. The trigger port settings can be controlled
individually for the two trigger inputs Trig NO and Trig NC.
The Trigger input on Oqus cameras and the Trig NO (normally open) input on
the CSU can be used with the Qualisys trigger button. The Trig NC (normally
closed) input on the CSU requires an active trigger device for producing a
"high" level (5 Volt).
For details on how to connect an external trigger device see chapter "How to
use external trigger" on page 492.
The following settings are available:
Function/Trig NO: Function/Trig NC: Function
Select the external trigger option from the drop-down list.
Start and stop capture disabled
The external trigger device is not used for starting or stopping cap-
tures. However, you can still use the external trigger to create
events.

Start capture
The start of a capture is delayed until an external trigger event is
received.

Stop capture
An incoming trigger event will stop the capture.

Start and stop capture


Both the start and stop of the capture is controlled by the trigger
event. For this option a Minimum time between start and stop is
specified, since the two signals otherwise could be confused, see
below.



NOTE: Notes on trigger function:
l The trigger settings of the Wireless trigger and the trigger
ports are complementary, e.g., when the function of multiple
trigger ports and the wireless trigger is set to Start capture,
any of these trigger events will start a capture.
l When choosing Stop capture or Start and stop capture the
capture will also stop if the end of measurement time is
reached, unless you have checked the Continuous cap-
ture option in the Start capture dialog.
l When choosing Stop capture or Start and stop capture
events cannot be created with the external trigger.

TTL signal edge/Trig NO: TTL signal edge/Trig NC: TTL signal edge
Select the trigger edge to Positive (rising edge), Negative (falling edge) or
Any edge (rising or falling). The setting applies to all start, stop and event
signals arriving at this trigger port.

NOTE: The Qualisys trigger button connected to an Oqus Trig in


connector or the CSU Trig NO input will produce a negative signal
edge when pressing the button, and a positive signal edge when
releasing the button. The button is not debounced though, so pos-
itive edges may be produced even when pressing the button.

Event color/Trig NO: Event color/Trig NC: Event color


Color associated with this type of event. Click in the value field to pick a
color.

Event text/Trig NO: Event text/Trig NC: Event text


Text label associated with this type of event. Click in the value field to edit.

Generate events
Activate this option to generate events with the external trigger. This
option applies to all trigger ports, and it is activated by default.



Hold-off time
Hold-off time from the last event until a new event can be generated with
the external trigger. This helps to avoid the creation of spurious events
due to contact bounces when pressing or releasing the button. The hold-
off time is by default 100 ms (range 5–1000 ms).

Delay from signal to start/stop [µs]


Delay (in microseconds) from trigger start event to first camera frame.
The delay must be larger than or equal to 20000 μs (20 ms). The stop
delay is twice as long as the configured start delay.

Minimum time between start and stop [s]


Minimum time between a start and a stop signal from an external trigger
device. This time interval applies to signals coming in at any of the trigger
inputs (not including wireless/software trigger). The range is 1–1000
seconds.

Event port (Camera Sync Unit)

The Event port settings contain all the settings for creating events using an
external trigger device connected to the event input of the Camera Sync Unit.
The following settings are available:
TTL signal edge
Select the trigger edge to Positive (rising edge), Negative (falling edge) or
Any edge (rising or falling).

NOTE: The Qualisys trigger button connected to the event input will
produce a negative signal edge when pressing the button, and a pos-
itive signal edge when releasing the button. The button is not
debounced though, so positive edges may be produced even when
pressing the button.



Event color
Color associated with this type of event. Click in the value field to pick a
color.

Event text
Text label associated with this type of event. Click in the value field to edit.

NOTE: When using IRIG as external timebase source, the Event port set-
tings are ignored, since the IRIG time code signal needs to be connected
to the Event input on the Camera Sync Unit.

Pretrigger

It is possible to use a pretrigger mode, so that actions that occurred before a
trigger start event can be captured. The pretrigger mode is activated by set-
ting Capture pretrigger frames to Yes. The pretrigger settings apply globally
to the system. In pretrigger mode the cameras continuously update their buffer
memories with a predefined number of frames until a trigger start event
occurs. After the trigger start event, the pretrigger frames will be sent to the
measurement computer. The trigger start event can be generated by an
external trigger device, a wireless or software trigger, or by pressing the space
key on the keyboard.
The following settings are available:
Capture pretrigger frames
Set to Yes for using pretrigger mode

Pretrigger time [s]


Time to buffer data (in seconds) before the trigger start event.

Pretrigger marker frames


Number of marker frames to buffer before the trigger start event.

Pretrigger video frames


Number of video frames to buffer before the trigger start event.



NOTE: The number of video frames can differ between cameras
dependent on their video capture rate, which can be set indi-
vidually. If the number of pretrigger video frames is different when
multiple cameras are selected, it will be indicated red.

Trigger start event color


Color used to label the trigger event

Trigger start event text


Text label for the trigger start event
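
Assuming that the time-based and frame-based settings describe the same buffer, the number of buffered frames is simply the pretrigger time multiplied by the respective capture rate. A minimal sketch with example values (not QTM code):

marker_rate_hz = 200     # example marker capture rate
video_rate_hz = 25       # example video capture rate (can differ per camera)
pretrigger_time_s = 2.0

pretrigger_marker_frames = int(pretrigger_time_s * marker_rate_hz)  # 400
pretrigger_video_frames = int(pretrigger_time_s * video_rate_hz)    # 50
print(pretrigger_marker_frames, pretrigger_video_frames)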

Pretrigger is also supported in combination with the following external devices:


l Analog board USB-2533

For using pretrigger in combination with the USB-2533 analog board the use of
an external trigger is required, see chapter "Measurement with analog capture
while using pretrigger" on page 493 for details on how to connect the external
trigger device. The pretrigger option is not supported in combination with
any other external devices, such as digital force plates, eye trackers, EMG
devices and A/V devices.

WARNING: Make sure that the time between the start of the capture
and the trigger event is long enough to capture all the pretrigger frames,
i.e. check that the Status bar says "Waiting for trigger". Otherwise there
will not be enough pretrigger frames to collect from the buffer.

External timebase



Activate the External timebase with Yes on the Use external timebase set-
ting. The external timebase setting applies to all cameras in the system,
because it controls the capture rate of the cameras with an external signal.
There are several ways to synchronize camera frames to an external timebase.
For more information on how to use external timebase, see chapter "How to
use external timebase" on page 494 .
The external time base should be connected to the corresponding input con-
nector of the Camera Sync Unit, see chapter "Camera Sync Unit: front side" on
page 950.
When using an Oqus camera for synchronization the external timebase can
either be connected via the Sync in connector of the splitter cable (see chapter
"Control connections" on page 973), or the Sync in/Video in or Trig in/SMPTE
in connector of the Oqus Sync Unit (see chapter "Oqus Sync Unit" on page 967).
It is recommended that you connect the external timebase signal to the master
camera.

NOTE: It is important to use a stable signal source to achieve the best


possible synchronization.

NOTE: Qualisys video cameras can capture at a different frequency (a


divisor) compared to the marker cameras, see chapter "Using Qualisys
video with External timebase" on page 498.

The following settings are available:


Use external timebase
Activate the use of an external timebase with Yes.

Signal source (Camera Sync Unit)


A Camera Sync Unit can be synchronized with the following signals. Make
sure that the signal source is connected to the corresponding input con-
nector on the Camera Sync Unit.
SYNC
Use a TTL signal to synchronize the system.



SMPTE
Use a SMPTE signal to synchronize the system.

NOTE: The Use SMPTE timestamp option is activated auto-


matically when selecting SMPTE as signal source.

GENLOCK
Use a video signal (black burst) to synchronize the system.

IRIG
Use an IRIG signal to synchronize the system.

NOTE: IRIG cannot be used when there are any Oqus cameras
included in the system.

Internal 100 Hz
The system is synchronized with an internal 100 Hz signal. This
option will be automatically selected if 100 Hz continuous has been
selected as Synchronization output mode on one of the syn-
chronization output ports, see chapter "Synchronization output" on
page 285.

Signal source (Oqus)


Oqus cameras can be synchronized with the following signals, where the
corresponding settings differ somewhat between the different signals:
Control port
Connect a TTL signal to the Sync in connector of the splitter cable or
the Oqus Sync Unit. This is the standard method of synchronizing.

IR receiver
Send an IR signal which is received by the IR receiver on the front of
the camera.

SMPTE
Use a SMPTE signal to synchronize the system. You need to use the
Oqus Sync Unit to convert the signal, see chapter "Using Oqus sync



unit for synchronization" on page 509.

NOTE: The Use SMPTE timestamp option is activated auto-


matically when selecting SMPTE as signal source.

Video Sync
Use a video signal (black burst) to synchronize. You need to use the
Oqus Sync Unit to convert the signal, see chapter "Using Oqus sync
unit for synchronization" on page 509.

Internal 100 Hz
The system is synchronized with an internal 100 Hz signal. This
option can only be selected if 100 Hz continuous has been selected
as Synchronization output on one of the cameras, see chapter
"Synchronization output" on page 285.

NOTE: This option is needed for the Twin system frame syn-
chronization option, in which case it is selected automatically.

Signal mode
Select the way to synchronize the camera frame to the signal.

NOTE: Not available for all choices of signal source.

Periodic
In this mode the system locks on to the signal of the external
timebase. The capture frequency can be set as a multiplier/divisor.
This is the recommended setting for periodic signals, see chapter
"Using External timebase for synchronization to a periodic TTL sig-
nal" on page 494.

Non-periodic
In this mode every single frame that is captured must be triggered
by a signal from an external timebase source. This means that the



frame rate is controlled by the external timebase source. To prevent
the QTM software from indicating a timeout error because no frames
are captured, there is a Max expected time between two frames
[s] setting. Set it to the largest possible interval between two con-
secutive signals from the external timebase source.

NOTE: Notes on non-periodic signal mode:


l The non-periodic sync cannot be faster than 120 Hz. This
is because of network delays described in the Delay
from signal to shutter opening setting below.
l The minimum delay from signal to shutter opening is
19000 μs in non-periodic mode.

Frequency multiplier/divisor
Set the multiplier or divisor used to multiply or divide the frequency of
the incoming sync signal.

Frequency tolerance in ppm of period time


Set the tolerance for jitter of external periodic signals in ppm of the
period time. The period time of the external signal should not differ by
more than the specified tolerance for the system to lock on to the frequency.

NOTE: Not available for the SMPTE signal.

Use nominal frequency


Activate the use of nominal frequency. When activated QTM will calculate
the capture frequency based on the nominal frequency of the external
timebase so that the relationship is always correct. This is recommended
so that you can easily compare the data to that of an external system. If
you do not use the nominal frequency, QTM will export the capture fre-
quency with one decimal. In that case, the capture frequency may slightly
deviate from what you expect due to the difference in clock speed
between the Qualisys system and the external signal generator.



Nominal frequency
Enter the nominal frequency of the external timebase signal.

NOTE: When using SMPTE as signal source the nominal frequency


value is read from the SMPTE frequency setting under Timestamp.

TTL signal edge


Specify the signal edge used for synchronization, either negative (falling
edge) or positive (rising edge).

Delay from signal to shutter opening


Specify the delay of camera exposure relative to the synchronization sig-
nal in microseconds. For the Periodic mode the delay is 0 by default and
can be set to any value. For SMPTE as a signal source the minimum delay
is 100 μs.
For the Non-periodic mode the default and minimum time is 19 ms. This
is because there must be enough time for the trigger packet to reach all
of the cameras in the system, which means that there is always a delay
between the signal and the actual capture. If it is set too short, not all of
the cameras will receive the TCP/IP message and the synchronization will
be lost. To avoid having to consider the delay, set it to the same value as
the period of your signal. In this way you will not get an image for the
very first trigger event, but for all following triggers the capture will be
made at the same time as the synchronization signal.

Max expected time between two frames [s]


This setting is only active for the Non-periodic mode. Then it decides the
maximum time QTM waits for the next frame, i.e. the longest possible
time between two signals from the external timebase.



CAUTION: The camera system can be damaged if the hardware setup is
not done properly. Make sure that the resulting camera frequency never
exceeds the maximum frequency of the cameras in the current mode.
The frequency should be limited to 97.5% of the maximum frequency at
full image size. For an overview of maximum frequencies per camera, see
"Qualisys camera sensor specifications (marker mode)" on page 926.

Timestamp

QTM has the option to add timestamps to camera frames for synchronization
with external signals or devices. The timestamp is displayed in the Timeline
control bar, see chapter "Timeline control bar" on page 133.
The following settings are available:
Use timestamp
Check to add a timestamp to the camera frames.

Timestamp type
Select type of timestamp from the drop down menu. The options are:
SMPTE
Time code used for audio and video synchronization. This requires
an Oqus or Camera Sync Unit to convert the SMPTE signal. For more
information about using SMPTE, see chapters "Using Oqus sync unit
for synchronization" on page 509 and "Using SMPTE for syn-
chronization with audio recordings" on page 512.

NOTE: The supported SMPTE frequencies are 24 Hz, 25 Hz


and 30 Hz, without dropped frames.

IRIG
Time code standards by the Inter-Range Instrumentation Group.
This requires a Camera Sync Unit. The IRIG standards currently



supported are IRIG B (1 Hz) and IRIG A (10 Hz).

NOTE: Notes on IRIG:


l The IRIG signal should be connected to the Event input
port on the Camera Sync Unit. The IRIG signal should be
IRIG A or B, DCLS type.
l IRIG cannot be used when there are any Oqus cameras
included in the system.

Camera time
Time of the exposure in seconds.nanoseconds. When used without
external clock master, the reference time is the time at which the
master camera was started. When using PTP synchronization with a
GPS-based external clock master the reference time is 1 January
1970. For more information about the use of an external clock mas-
ter, see chapter "How to use PTP sync with an external clock master
(Camera Sync Unit)" on page 501.

Timestamp frequency
Select one of the supported SMPTE or IRIG frequencies from the drop
down menu.

Synchronization output

The Camera Sync Unit (CSU) has three synchronization outputs, Out 1, Out 2
and Measurement time. The outputs Out 1 and Out 2 can be configured to
get a customized synchronization output.
For Oqus systems, each camera has a synchronization output, which can be
accessed via a splitter cable connected to the control port. The Syn-
chronization output can be controlled to get a customized synchronization
output from the cameras. Additional modes for the Synchronization output for
Oqus cameras are Camera frequency – Shutter out, Measurement time and
Wired synchronization of active markers. The settings apply to individual
cameras so that there can be different synchronization outputs within the
same system. Select the cameras in the device list to the left. The syn-
chronization signal is sent as soon as the camera is measuring, i.e. both during
preview and capture, with the exception of measurement time, which is only

PROJECT OPTIONS 285


sent during capture, and 100 Hz continuous, which is always sent. For a
description of how to use the sync output see chapter "Using Sync out for syn-
chronization" on page 505.

NOTE: When a Camera Sync Unit is included in the system, QTM only dis-
plays the Synchronization settings of the Camera Sync Unit. The syn-
chronization settings of Oqus cameras are not displayed.

NOTE: If you want to use a sync output signal that is faster than 5000 Hz
with an Oqus system, you must use the master camera as sync device.
The master camera is displayed with an M next to the ID in the camera
list to the left.

There are several different sync modes, which are described below. The image
below is a description of the different sync output settings.

The following modes are available. The settings depend on the chosen mode.

Camera frequency - Shutter out (Oqus)



This is the default setting for the Oqus cameras. In this mode a pulse is sent for
each frame captured by the camera and the pulse has the same length as the
exposure time.
Set the TTL signal polarity to change the polarity of the signal. Negative
means that the sync out signal will be high until the first frame when it will go
low.

Frequency multiplier/Frequency divisor

Use this mode to set the frequency as a multiplier/divisor of the camera cap-
ture frequency. The multiplication factor/divisor is controlled with the Mul-
tiplier/Divisor setting. The current Sync output properties (Frequency [Hz],
Period time [µs] and Pulse duration [µs]) are shown below the setting both
for marker mode and video mode (if applicable), because the marker and video
capture rates can be different. The Sync output frequency will change if any
of the capture rates are changed. The displayed period time is rounded to the
nearest microsecond, but the number of pulses will always be correct com-
pared to the camera frequency.

NOTE: The maximum Multiplier is 1000 and the maximum Output fre-
quency is 100000 Hz.

NOTE: The video capture rate is individual per camera, which is indicated
by a red number if it differs. Select only the camera that Sync out is con-
nected to in order to find out the frequency in video mode.



The Duty cycle setting controls how long the pulse will be in percent of the
current Period time. The actual Pulse duration is displayed for the marker
mode and (if applicable) video mode, respectively.

NOTE: The pulse starts at the start of each period, i.e. you cannot apply
an offset to the pulse. However, by changing the Duty cycle and the TTL
signal polarity you can get an edge at any time between two frames.

Set the TTL signal polarity to change the polarity of the signal. Negative
means that the sync out signal is high until the start of the first frame when it
will go low. Each signal will then be synchronized with its corresponding cap-
ture frame which depends on the Multiplier/Divisor setting, see image above.
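
To illustrate how the displayed properties relate, the sketch below (example values, not QTM code) computes the sync output frequency, period time and pulse duration from the camera capture rate, the multiplier and the duty cycle.

camera_rate_hz = 200.0    # marker capture rate (example)
multiplier = 4            # use a divisor instead to divide the frequency
duty_cycle_percent = 50.0

output_frequency_hz = camera_rate_hz * multiplier      # or camera_rate_hz / divisor
period_time_us = 1_000_000.0 / output_frequency_hz     # rounded in the QTM display
pulse_duration_us = period_time_us * duty_cycle_percent / 100.0

print(output_frequency_hz, period_time_us, pulse_duration_us)  # 800.0 1250.0 625.0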

Independent frequency

Use this mode to set the frequency independently of the camera capture fre-
quency. The Output frequency can be set between 1 and 100000 Hz. When
changing the frequency the Period time [µs] will then update to its cor-
responding value.
The Duty cycle setting controls how long the pulse will be in percent of the
current Period time. The actual Pulse duration (in μs) is displayed below.
Set the TTL signal polarity to change the polarity of the signal. Negative
means that the sync out signal is high until the start of the first frame when it
will go low. The first pulse will always be synchronized with the camera capture.
The following pulses will be dependent on the relation between the camera cap-
ture rate and the Output frequency.

Measurement time (Oqus)



Use this mode to get a pulse that lasts for the whole capture time. The output
will go low/high at the start of a capture and not go high/low until the end of
the last frame. Measurement time applies only to captures, not to preview.
Set the TTL signal polarity to change the polarity of the signal. Negative
means that the sync out signal is high until the start of the first frame when it
will go low.

NOTE: For the Camera Sync Unit, measurement time is not available as a
mode for the Out 1 and Out 2 ports. Instead, it has a separate Meas-
urement time output with the same options.

100 Hz continuous

Use this mode to output a continuous 100 Hz TTL signal. It is sent even when
the system is not measuring so that external equipment can lock on to this sig-
nal. In Oqus systems, only one camera can have this option activated.

NOTE: Notes on 100 Hz continuous mode:


l When External timebase is activated it is automatically changed to
the Internal 100 Hz option. In this scenario, the device which is set
to 100 Hz continuous output will be used as a timebase.
l This option is required for the Twin system frame synchronization
option, in which case it is selected automatically.
l The 100 Hz continuous output is not available on Oqus 1, 3 and 5
series cameras.

System live time

Use this mode to get a pulse that lasts for the whole preview or capture time.
The output will go low/high at the start of a measurement and not go high/low
until the end of the last frame. System live time applies to both preview and
captures.



Set the TTL signal polarity to change the polarity of the signal. Negative
means that the sync out signal is high until the start of the first frame when it
will go low.

Wired synchronization of active markers (Oqus)

Use this mode to synchronize the active markers via a wire connected from
Sync out on the camera to the Sync in connector on the driver. The signal is
pulsed so it cannot be used for any other synchronization.

NOTE: This mode is only available if you are using active markers.

Controlled by Analog board

The camera synchronization output is controlled by the settings for an analog


board, see chapter "Sample rate" on page 292.

Measurement time (Camera Sync Unit)

Use the Meas. time output on the Camera Sync Unit to get a pulse that
lasts for the whole capture time. The output will go low/high at the start of a
capture and not go high/low until the end of the last frame. Measurement
time applies only to captures, not to preview.
Set the TTL signal polarity to change the polarity of the signal. Negative
means that the sync out signal is high until the start of the first frame when it
will go low.



Analog boards

The analog boards that have been installed in the measurement computer are
listed on the Analog boards page, for information about analog boards see
chapter "How to use analog boards" on page 747. For instructions on how to
install an analog board, see "How to use analog boards" on page 747.

NOTE: If more than one board is installed make sure that the syn-
chronization cable is connected to all of the analog boards.

The list of the analog boards contains six columns: #, Board type, Channels,
Used, Samples/s and Range. The list shows an overview of analog boards that
are checked as Enabled on the Input Devices page.
The # column contains the number of the analog board in Instacal. The Board
type column shows the type of the analog board, for a list of boards that are
compatible with QTM see "How to use analog boards" on page 747.



The four last columns contain the settings for the analog board. To configure
an analog board select the board on the list. Then the analog board appears
below the Analog boards branch of the options tree. Click the name to go to
the page with settings for the analog board, see chapter "Analog board set-
tings" below.
Analog board settings

The page of each analog board contains the following settings.

Sample rate

The Sample rate options control the frequency of the analog board. The
sample rate is most often set to a multiple of the camera frequency, because it
is easier to handle the data that way and it is, for example, a requirement for
the C3D file. The default setting is one sample per camera frame, i.e. the same
sample rate as for the camera system. What sample rate to use depends on
what you want to measure; for example, for force data 500 to 1000 Hz is often
enough.



The actual sample rate, in samples per second, for the analog board is shown
in the text boxes to the right. When the External sync option is used with an
Oqus camera as synchronization source, the sample rate is displayed both for
marker mode and video mode. For all other options only the sample rate in
marker mode is displayed.

NOTE: The minimum analog sample rate is 20 Hz, which is a limitation


set by the analog driver.
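
As a small illustration with example values (not QTM code), the analog sample rate follows from the camera frequency and the number of samples per camera frame, which keeps the two rates in an integer ratio as required for the C3D file.

camera_rate_hz = 200       # marker capture rate (example)
samples_per_frame = 5      # e.g. a suitable value for force data

analog_rate_hz = camera_rate_hz * samples_per_frame   # 1000 samples/s
assert analog_rate_hz >= 20, "the analog driver requires at least 20 Hz"
print(analog_rate_hz)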

There are then two options for synchronizing the analog board to the camera
system: External sync and Simultaneous start.
External sync (frame sync) (default if available)
The analog board is frame synchronized so that the drift between the cam-
era system and the analog data is eliminated. This is the recommended
setting for a Qualisys system with a USB-2533 analog board. You can use
any frequency for the analog board. For instructions on how to connect
the sync out signal from an Oqus camera or a Camera Sync Unit to the
analog board see chapter "Connection of analog board" on page 752.

There are two options for the External Sync:


Camera selection
Select the camera where the Control port splitter is connected. The
splitter can be connected to any camera in the system but you must
specify which it is, because the sync out signal will be changed on
the individual camera. The specified camera will automatically be set
to a multiple of the capture rate and the Multiplier setting is auto-
matically updated with the Sample rate option.
The frequency can be different on separate boards, but you then
need to connect the boards to different cameras.



NOTE: If the Reduced real-time frequency is used during
preview, then the analog frequency during preview may differ
from the ones reported on this page. The ratio between the
camera frequency and the analog frequency set on the Syn-
chronization page will however be the same, so you can cal-
culate the analog frequency if necessary.

Use advanced synchronization


When the Use advanced synchronization option is activated the
analog frequency is controlled directly from the Synchronization
page. This means that the Sample rate option is disabled and you
have to set the frequency directly. The advantage is that you can use
the divisor and independent frequency modes so that you can in
fact get any possible frequency.
Settings
To change the sync output frequency click on Settings to go
directly to the Synchronization settings for the selected cam-
era. There you can use any option for Synchronization output
that you like, see chapter "Synchronization output" on
page 285.

Simultaneous start
The analog board is just started on the first frame of the camera system,
which means that there can be a small drift between the cameras and ana-
log data.

NOTE: The sample rate can be different on separate boards.

Board settings

Under the Board settings heading on the page for the analog board, there are
settings for the range of the analog board.
Use the Range options to select Bipolar range (both negative and positive
voltage) or Unipolar range (only positive voltage) and to select the voltage
range that is used for the analog capture. The default settings are Bipolar and
± 10 Volts.



Force plate control

Under the Force plate control heading there is information about the setup of
the control of the force plate. Click More to go to the Force plate control set-
tings page.

Compensate for analog offset and drift

Under the Compensate for analog offset and drift heading there are set-
tings for a filter which removes offset and drift. Activate the filter with the
Remove offset and Remove drift options. The current compensation is dis-
played in the Channels list, see chapter "Channels" on page 297.
Remove offset
Removes an offset from the analog data. This setting can be used to
remove the voltage level of an unloaded force plate. The offset com-
pensation can either be calculated from the measurement or from an
acquired list. The offset can be removed in RT/preview and in a file. When
removed in a file the offset value cannot be changed, only turned on and
off, so it is important to set the correct settings before capturing the file.
However, if none of the options are selected you can still turn on the com-
pensation in a file from the Data info window; the compensation will
then be calculated from the start of the measurement.
Calculate offset from the measurement
Use the Remove offset from and Number of samples to use
options to define the analog samples that are used to calculate the
offset. Activate offset compensation in real-time with Remove off-
set in real-time.
Remove offset from
Set which part of the measurement to use when calculating the
offset in the measurement. The default is Start of the meas-
urement, but it can be changed to use the End of the meas-
urement if there is noise at the beginning of the
measurements.



NOTE: For this offset setting it is important that the
samples used in the file do not include any valid analog
data. E.g. do not stand on the force plate at the start of
the measurement since this data will be equal to 0 N if
Remove offset is activated. In case the calculated off-
set is higher than 5 V, QTM will give you a warning, see
chapter "Analog offset warning" on page 753. However
you can always remove the offset compensation manu-
ally in the file from the Data info window, see chapter
"Analog data information" on page 174.

Number of samples to use


The offset is defined as an average of the number of samples
that are defined in the Number of samples to use option.

Remove offset in real-time


Activate the Remove offset in real-time option to remove the
analog offset during RT/preview as well. Only activate this if
you are using the RT data. The offset will always be calculated
from the beginning of the preview so it is important that this
analog data is correct. For example, if you zero a force plate
amplifier after you have started the RT/preview, the data will be
wrong, because the offset was calculated on the signal
from the unzeroed amplifier. The data will however be OK
when you capture the file, because then QTM calculates a new
offset from the file.

Acquire list of offset values


The Use values in list below option can be used if you have a static
offset that you can acquire in real-time mode.
Use values in list below
The offset is saved before the measurement and displayed in
the Channels list below. This option is not available in com-
bination with Remove drift.
Acquire the offset during RT/preview with the Read current
values button. Make sure that the analog signal is correct
when you press Start in the Acquire offsets dialog. Because
the offset is then stored, the analog signal can include valid
data in the beginning of RT/preview and in a file. E.g. you can
stand on the force plate when the measurement starts.
The offset is valid for the number of minutes specified next to
the option. How long you can use it depends on the drift of
the analog signal. The analog warning below is displayed when
you start a measurement and the time has expired.

Remove drift
Calculates a slope from the first and the last samples of the meas-
urement. The number of samples used to calculate how much to remove
from the analog data is defined in the Number of samples to use option.
This slope is then subtracted from the analog data. This setting can be
used to remove an output drift of a force plate that slowly changes the
voltage level during the measurement.
The drift compensation is only applied on a file and cannot be used with
the options Read offset from end of measurement and Use values in
list below, because those imply that the analog data is not correct at
the beginning of the measurement.

NOTE: When the drift filter is activated it is important that the


samples in the beginning and end of a file do not include any
valid analog data. E.g. do not stand on the force plate at the start
and end of the measurement since this data is used to calculate the
drift. However you can always remove the drift compensation manu-
ally in the file from the Data info window, see chapter "Analog data
information" on page 174.

Channels

The channels of the analog boards are listed under the Channels heading.



To use a channel select the corresponding check box in the Channel no
column. The selected analog channels must be in successive order. For
example, if only channel no 3 and channel no 7 are used all channels in
between must also be selected. Double-click on the Channel name column to
rename the channel.
An individual delay in ms can be compensated for on each analog channel, for
example it can be used with Delsys Trigno EMG that has a known delay of 48
ms. Double click on the Delay (ms) column to edit the delay of that channel.
The delay is specified in ms, but the actual delay will be in the closest number
of analog samples to that time. The compensation means that the analog
samples on the channel are translated to match when the signal actually
happened; for example, with a 48 ms delay a signal that comes at 1.000 s
will be translated to 0.952 s.

NOTE: The delay is not applied in real-time processing.
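
The translation of delayed samples can be illustrated with a short sketch (example values, not QTM code): the configured delay in ms is rounded to the nearest whole number of analog samples and the samples are shifted back in time accordingly.

analog_rate_hz = 1000
delay_ms = 48.0   # e.g. the known delay of a Delsys Trigno EMG

delay_samples = round(delay_ms / 1000.0 * analog_rate_hz)   # 48 samples at 1000 Hz
sample_time_s = 1.000                                        # when the sample arrived
actual_time_s = sample_time_s - delay_samples / analog_rate_hz
print(actual_time_s)  # 0.952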

To edit the other settings right-click for the menu below.

The menu contains the following options.


Capture data from this channel
Use this channel, the same as selecting the checkbox.

NOTE: This option applies to all selected channels, i.e. the line is
colored blue.

Don't capture data from this channel


Do not use this channel, the same as deselecting the checkbox.

NOTE: This option applies to all selected channels, i.e. that line is
colored blue.

Select all
Use all of the channels

Unselect all
Use none of the channels. Sometimes the fastest way to select a new
range of channels.

Change channel name...


Open a dialog to change the channel name.

Change channel delay...


Open a dialog to change the channel delay.

Apply offset/drift compensation


Use this option to apply offset and/or drift compensation. However, you
can always turn off the compensation in a file from the Data info
window.

NOTE: This option applies to all selected channels, i.e. that line is
colored blue.

Do not apply offset/drift compensation


Use this option to not apply the offset and/or drift compensation.
However you can always turn on the compensation in a file from the Data
info window.

NOTE: This option applies to all selected channels, i.e. that line is
colored blue.

The current settings under the Compensate for analog offset and drift
heading is displayed in the last three columns. The different settings are
described below.

If no compensation is activated all three columns display ---. This means
that no compensation is applied in RT/preview or in a file. However, you
can turn on the compensation in a file from the Data info window. The
compensation will then be calculated from the Use the first and last option.

If the compensation is activated with the option Use the first and last,
then the Offset and Drift columns say Calculated. The offset
compensation will be applied in both RT/preview and in a file and is
calculated from the first samples, which means that it is important that
there is no signal on the analog channel at the start of RT/preview or
at the beginning of a file.
The drift compensation is, however, only applied in the file, since it isn't
calculated for RT/preview. For the drift compensation it is important that
there is no analog signal at the beginning and the end of the file.

If the compensation is activated with the option Use values in list below,
then the last read value is displayed in the Offset column. The offset
compensation will be applied in both RT/preview and in a file. However,
since the offset values have already been stored, there can be an analog
signal at the beginning of RT/preview and a file. Start RT/preview and use the
Read current values button to update the offset values for the channels
where the compensation is activated.
The Drift column displays ---, because the drift cannot be compensated
for in this mode.

When the compensation of a channel is deactivated by the Do not apply
offset/drift compensation option, then the Comp. offset/drift column
displays No. The compensation is then not applied, but it can be activated
in a file from the Data info window.

Force plate control settings

The Force plate control settings page contains the following settings for the
control of Kistler force plates.

NOTE: These settings are only valid for the Kistler force plates. The Kist-
ler force plates can, however, be externally controlled regarding Oper-
ate/Reset and charge amplifier range settings.

Analog board type

The analog board type used for force plate control. Currently, only Qualisys analog
interface can be selected.
Qualisys analog interface
This is the default option that is used with all of the analog boards com-
patible with QTM.

Kistler 5695A DAQ (deprecated)


Deprecated since QTM version 2024.1.

Set the number of force plates and their ranges with the settings described in
chapter "Force plate control list" on the next page.

Force plate auto-zero

Use the Force plate auto-zero settings to control when to auto-zero/reset the
force plates connected to the analog board.
There are two options to control the auto-zeroing.
On measurement start
When enabled the force plates are auto-zeroed at two operations in QTM:
just before the Start capture dialog is opened before a capture and in a
batch capture just before QTM starts Waiting for next meas-
urement/trigger.

On preview start
When enabled the force plates are auto-zeroed at two operations in QTM:
new file and changing a setting in Project option during preview.

There is also an option to auto-zero the plates while QTM is in preview mode.
Just right-click in a Data info window that displays force data and select Zero
all force plates. It is important to use this zeroing at least every hour if both
auto-zeroing options have been disabled.

Force plate control list

The list of force plate controls sets the controls and ranges of Kistler force
plates connected via the USB-2533 analog board. Up to four force plates can be
controlled via the USB-2533 analog board.
When connecting the force plates it is important to connect them in the same
order to the analog channels as they are specified in the list. This is because
the order is used to match the force plate numbers on the Force plate list on
the Force data page, see chapter "Force data" on page 360. For more
information on how to connect the force plates to the board see chapter
"Connecting Kistler force plates" on page 790.
A new Kistler plate is added with the Add option. The force plate control
settings can then be edited or removed by selecting the plate and using
the Edit or Remove option, respectively. Adding a plate also means that an
Operate/Reset signal will be sent to the force plate.

The Force plate control dialog contains the following settings:
Z Range/XY Range
Specify the range for the vertical (Z) and horizontal (XY) range of the force
plate. The selected ranges determine the magnitude of the forces that can
be measured with the force plate. The ranges of the amplifier can be dif-
ferent for the vertical and horizontal force components.

NOTE: The Kistler scaling factors will automatically be changed to
the correct setting if the Selected by forceplate control option is
used, see chapter "Kistler scaling factors" on page 371.

Amplifier
Select the type of amplifier used for the force plate, Internal or External.

Time-constant filter
The Time-constant filter is useful for long measurements of static forces,
for regular measurements the Time-constant filter should not be used.

For further information about the range settings and the time-constant filter,
see the manual for the Kistler force plate.

AV devices

The external video devices that are selected under Input Devices are listed
under the AV Devices node. This includes video captured with Blackmagic
Design cards and DV/webcam cameras; for more information on how to use
them see chapters "Video capture with Blackmagic Design cards" on page 899
and "External video devices in 2D view" on page 100.
To use a video device go to the Input Devices page and select the check box in
the Enabled column. Then the video will be displayed in the 2D view window.
To open a 2D view window with just the video cameras you can click on the
Video button in the toolbar.
When DirectShow video cameras are connected (for example web cameras),
the AV Devices node can be expanded, giving access to a list of available video
properties (resolution, color space and frequency).

NOTE: QTM supports simultaneous video recording from several video
devices, but there can be some problems with the capture of video data if
there are too many cameras. E.g. the frame rate of the video devices may
be lowered and sometimes the video is not captured at all; it depends a
lot on how fast the computer is.

AV device settings

In the AV devices properties tab the video properties (resolution, color space
and frequency) can be selected for DirectShow video cameras, for example web
cameras.

Force plates
The available force plates are listed in the settings tree to the left under the
Force plates heading.
For detailed information about the settings of the respective force plate types,
see the following chapters.

AMTI Digital force plates

The settings for AMTI Gen5, OPT-SC (Optima Signal Conditioner) and AccuGait
Optimized are divided between the capture settings and the processing set-
tings. The settings are the same for Gen5, OPT-SC and AccuGait Optimized so in
the following descriptions we only refer to Gen5. On the AMTI Gen5 page there
are only settings for the capture. There is one page for each connected AMTI
Gen5 amplifier, the name of the page will be the same as specified for the plate
on the Force data page. The default name includes the model and serial num-
ber of the connected force plate.
For AMTI force plates with the new calibration chip the only processing setting
that is needed is the location, see chapter "Force plate location" on page 382. If
you have an old AMTI force plate where you have to input the calibration mat-
rix manually then that needs to be added in the AMTI-Netforce program.
QTM will then automatically read the settings file for that program. For a
description of how to connect the force plates see chapter "Connecting AMTI
Digital force plates" on page 756.

The Amti Gen5 heading contains information about the force plate that is con-
nected to the amplifier. The information includes the Model and the Serial
numbers.

The only setting under the Sync settings heading that can be set on the AMTI Gen5
is the Sync source option, which decides which camera the synchronization
cable is connected to. Then click on Advanced to change the frequency of the
force plate. You will go to the Timing Advanced page with just the Sync source
camera selected. Change the Synchronization output setting to the desired
frequency, see chapter "Synchronization output" on page 285. It is
recommended to use the Camera frequency multiplier option to set the
frequency, because, for example, when exporting to C3D the force frequency
must be a multiple of the marker capture rate.
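
The arithmetic behind that recommendation is simple; a small illustrative check
(not part of QTM):

    def force_rate_from_multiplier(marker_rate_hz, multiplier):
        # With the Camera frequency multiplier the force rate is by
        # construction a whole multiple of the marker capture rate.
        return marker_rate_hz * multiplier

    def is_multiple_of_marker_rate(marker_rate_hz, force_rate_hz):
        # C3D export expects the force rate to be a multiple of the marker rate.
        return force_rate_hz % marker_rate_hz == 0

    print(force_rate_from_multiplier(200, 6))        # 1200 Hz
    print(is_multiple_of_marker_rate(240, 1000))     # False, 1000 is not a multiple of 240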

NOTE: If there is an analog board in the system, that is also frame syn-
chronized, then it is recommended to use the same camera as syn-
chronization source for the analog board and the AMTI Gen5. Then you
can use the Sample rate option on the Analog board page to control the
frequency on both of the boards.

NOTE: The buffer of the AMTI Gen5 is 16 samples. Therefore, if the
number of analog samples is not a multiple of 16 there will be some empty
force plate samples at the end of the measurement.

The following information is displayed on the page:


Relative capture rate
The sample factor compared with the marker capture rate, which is the
same as the multiplier used for the Sync output.

Capture rate (marker)


The resulting AMTI Gen5 capture rate when using a camera in marker
mode as sync source.

Capture rate (video)


The resulting AMTI Gen5 capture rate when using a camera in video mode
as sync source.

Use the Force plate auto-zero settings to control when to auto-zero/reset the
force plate. There are two options to control the auto-zeroing.

On measurement start
When enabled the force plates are auto-zeroed at two operations in QTM:
just before the Start capture dialog is opened before a capture and in a
batch capture just before QTM starts Waiting for next meas-
urement/trigger.

On preview start
When enabled the force plates are auto-zeroed at two operations in QTM:
new file and changing a setting in Project option during preview.

NOTE: There is also an option to auto-zero the plates while QTM is in
preview mode. Just right-click in a Data info window that displays force
data and select Zero all force plates. It is important to use this zeroing
at least every hour if both auto-zeroing options have been disabled.

Arsalis

The Arsalis device settings are managed via the Arsalis settings page. Once the
Arsalis force plates are set up correctly, the force calculation can be defined
under the force plate settings, see chapter "Force plate settings" on page 362.
For information about how to connect and set up Arsalis force plates for use
with QTM, see chapter "Connecting Arsalis force plates" on page 759.

The Arsalis page contains three buttons to communicate with the force plates
and a list with settings for the located force plates.
Reset Settings
Reset all of the Common settings to their default values.

Locate Force Plates


Locate the force plates that are currently activated in the 3D-Forceplate
software.

Zero Force Plates


Zero the force data for all of the currently connected force plates.

The settings list contains a top section with common settings and a section with
individual settings for each force plate.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.

Integration version
The version number for the integration.

IP Address
Enter the IP address for the computer running 3D-Forceplate soft-
ware.

Port Number
Enter the port number used by the Local server in the 3D-Forceplate
software.

Frequency
Enter the frequency for the force plates. The plates support the
following frequencies: 100, 200, 250, 400, 500, 1000, 2000. Make sure
that you select one that can be evenly divided by the camera frame
rate.

Trigger Mode
Set the trigger mode for the force plates. The default value is start
only. This is the recommended mode since you then get a synchronized
start with the camera system. Note that you must make sure that the
trigger options are correct in the 3D-Forceplate software.
The other available options are as follows. The none option doesn't
have any hardware synchronization. The stop only option will stop
on the signal and the both option will both start and stop on the
signal. There is currently no way to configure the sync signal from the
cameras in a way that is compatible with the last two options.

Enable Sync Out Signal


Enable a sync signal on the SyncOut port. The signal is configured in
the 3D-Forceplate software.

Individual settings and information for each force plate


The individual settings are only displayed after locating the force plates.
Note that all of these settings are imported from the 3D-Forceplate soft-
ware and cannot be changed in QTM.
Name, Manufacturer, Product, Model
Information about the force plate.

Width, Length, Height, Width Between Transducers, Height Between Transducers
Force plate dimensions.

Rotation X, Rotation Y, Rotation Z, Position X, Position Y, Position Z
Rotation and Position of the force plate.

GRF Range X, GRF Range Y


Ground reaction force ranges for X and Y.

Channels
The channels captured from the force plates with their respective
unit and frequency. Note that the last four channels are either 1 or
0.
Force X, Force Y, Force Z (N, frequency)

Moment X, Moment Y, Moment Z (Nm, frequency)

Trigger (frequency)

Aux (frequency)

Zero (frequency)

Sync (frequency)

Bertec corporation device

The Bertec device settings are managed via the Bertec Corporation Device set-
tings page. Once the Bertec force plates are set up correctly, the force cal-
culation can be defined under the force plate settings, see chapter "Force plate
settings" on page 362.
For information about how to connect and set up Bertec force plates for use
with QTM, see chapter "Connecting Digital Bertec force plates" on page 764.

The Bertec page contains the following buttons to communicate with the force
plates and a list with settings for the located force plates.
Restore Default Settings
Restore settings to their default values.

Synchronize Settings
Synchronize changed settings to the Bertec device. Synchronize Settings
should be used when changing the Frequency setting.

Detect Plates
Detect the force plates that are currently connected to the computer.

Zero Plates
Zero the connected Bertec devices.

The settings list contains a top section with common settings and a section with
individual settings for each Bertec device.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.

Integration version
The version number for the integration.

Frequency
Enter the frequency for the force plates. Make sure that it matches
the frequency for Sync out on the Synchronization page.

Force Plate Surface Offset (mm)


Enter the thickness of the cover on the force plates, for example
when using a carpet.

Autozero
Check to automatically re-zero the force plates at the start of pre-
view or when opening the capture dialog. Autozero applies only
when the measured vertical force is below 20-40 N.

Individual settings and information for each Bertec device


The individual settings are only displayed after locating the Bertec device.
Note that all of these settings are imported from the Bertec amplifier and
cannot be changed in QTM.
Name, Manufacturer, Product, Model
Information about the Bertec device.

Width, Length, Height


Device dimensions.

Rotation X, Rotation Y, Rotation Z, Position X, Position Y, Pos-
ition Z
Default Rotation and Position of the force plate. Note that the loc-
ation must be set for each force plate on the Force data page.

Channels
The channels captured from the Bertec device with their respective
unit and frequency.
Force X, Force Y, Force Z (N, frequency)

Moment X, Moment Y, Moment Z (Nm, frequency)

Kistler Force Plates

The Kistler device settings are managed via the Kistler Force Plates settings
page. Once the Kistler devices are set up correctly, the force calculation can be
defined under the force plate settings, see chapter "Force plate settings" on
page 362.
For information about how to connect and set up Kistler digital force plates for
use with QTM, see chapter "Connecting Kistler digital force plates" on page 768.

The Kistler Force Plates page contains the following buttons to communicate
with the force plates and a list with settings for the force plates included in the
configuration.

Reset Settings
Reset settings to their default values.

Sync Settings
Synchronize changed settings to the Kistler device. Sync Settings should
be used when changing the Frequency setting.

Zero Offsets
Manually zero the connected Kistler force plates.

NOTE: The force plates are automatically re-zeroed when starting a
preview or when opening the capture dialog.

The settings list contains a top section with common settings and a section with
individual settings for each force plate.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.

Integration version
The version number for the integration.

DataServer version
The version number of DataServer.

Frequency
Enter the frequency for the force plates.

Individual settings and information for each Kistler force plate


The individual settings are displayed for the Kistler force plates included
in the configuration. Note that all of these settings are imported from the
Kistler configuration file and cannot be changed in QTM.
Name, Manufacturer, Product, Model
Information about the Kistler force plate.

Width, Length, Height


Force plate dimensions.

Rotation X, Rotation Y, Rotation Z, Position X, Position Y, Pos-
ition Z
Default Rotation and Position of the force plate. Note that the loc-
ation must be set for each force plate on the Force data page.

Shear range, Vertical range


Range settings used for plates connected via the Kistler 5695 DAQ,
as defined in the configuration file.

Shear/Vertical ranges
Select the force ranges for Kistler digital force plates.

Channels
The channels captured from the Kistler force plate with their respect-
ive unit and frequency.
Force X, Force Y, Force Z (N, frequency)

Moment X, Moment Y, Moment Z (Nm, frequency)

Instrumented treadmills
The treadmills that are selected as input device are listed under the Instru-
mented Treadmills category. Currently, the h/p/cosmos-Arsalis Gaitway-3D
treadmill is supported in QTM. Use the Add Device button to add the Gaitway-
3D treadmill to the Device list. The settings are managed via the Gaitway-3D set-
tings page, see "Gaitway-3D" below.
Gaitway-3D

The Gaitway-3D device settings are managed via the Gaitway-3D settings page.
Once the Gaitway-3D device is set up correctly, the force calculation can be
defined under the force plate settings, see chapter "Force plate settings" on
page 362.
For information about how to connect and set up a Gaitway-3D treadmill for
use with QTM, see chapter "Connecting a Gaitway-3D instrumented treadmill"
on page 797.

The Gaitway-3D settings page contains the following sections.
Gaitway-3D
Connection settings and information about the connected device. The
local server address of the Gaitway-3D data stream must be specified in
the IP-address field. Press the Connect button to connect to the Gaitway-
3D data stream.

Sample rate
Select the sample rate in Hz of the Gaitway-3D data stream from the Fre-
quency drop down menu.

Sync settings
Check the Simultaneous start option for synchronizing the Gaitway-3D
data with QTM captures.

Zero force plate


Press the Zero force plate button to set the zero level of the force plate
when unloaded.

EMGs
The available integrated EMGs are listed in the settings tree to the left under
the EMGs heading. The settings pages are specific for the used EMG devices.
For detailed information about supported devices, see "Wireless EMG systems"
on page 804.

Gloves
The available motion glove devices are listed in the settings tree to the left
under the Gloves heading.
Manus Gloves

The Manus Gloves settings are managed via the Manus Gloves settings page.
For information about how to connect and set up MANUS gloves for use with
QTM, see chapter "Connecting Manus gloves" on page 889.

The Manus Gloves page contains the following buttons and a list with settings
for the gloves.
Restore Default Settings
Restore the settings to their default values.

Synchronize Settings
Synchronize changed settings with the device.

The settings list contains a top section with common settings and a section
with the connected gloves.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.

Integration version
The version number for the integration.

IP
Enter the IP address for the Manus device.

Model Type
Skeleton type used for the glove bindings.

Gloves
Gloves are indicated with their serial number and contain information
about the glove type and data channels.

Generics
The available integrated generic devices are listed in the settings tree to the left
under the Generics heading.
h/p/cosmos treadmill

The h/p/cosmos treadmill settings are managed via the hpcosmos treadmill
settings page. For information about how to connect and set up a h/p/cosmos
treadmill for use with QTM, see chapter "Connecting the h/p/cosmos treadmill"
on page 912.

The h/p/cosmos page contains the following buttons and a list with settings for
the treadmill.
Restore Default Settings
Restore the IP setting to its default value.

Synchronize Settings
No action for the h/p/cosmos treadmill integration.

The settings list contains a top section with common settings and a section with
the channels for the treadmill.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.

Integration version
The version number for the integration.

IP
Enter the IP address for the treadmill.

Channels
The channels captured from the treadmill, with their respective unit and
frequency.
speed
The speed of the treadmill in m/s.

elevation
The elevation of the treadmill in degrees.

heart rate
The heart rate in beats per minute.

Processing

The Processing branch of the options tree contains options for actions that can
be performed on the 2D data. The options have an effect on the 2D data in
real-time (preview), during a measurement and after the measurement; for
information see chapters Processing data and "Real-time streaming" on
page 590. For batch processing a similar processing page is used, see chapter
"Batch processing" on page 605.
The following actions can be performed:
l Auto backup (only Capture action), see chapter "Auto backup" on
page 572.
l Process every frame (only Real time action). For more information, see
chapter "Outline of how to use real time" on page 596.
l Store real-time data (only Capture action), see chapter "Store real-time
data" on page 572.
l Pre-process 2D data

l Track each frame


l 3D

l 2D (only Capture action)

NOTE: 3D and 2D tracking are mutually exclusive and can therefore
not be used at the same time.

l Merge with Twin Slave (only Capture action)

l Gap-fill the gaps (only Capture action)

l Apply the current AIM model

l Solve skeletons

l Apply SAL

l Apply glove data

l Calculate 6DOF

l Calculate force data

l Calculate gaze vector data

l Export to TSV file (only Capture action)

l Export to C3D file (only Capture action)

l Export to Matlab file (only Capture action)

l Export to AVI file (only Capture action)

l Export to FBX file (only Capture action)

l Export to JSON file (only Capture action)

l Export to TRC file (only Capture action)

l Export to STO file (only Capture action)

l Start program (only Capture action)

The actions can be defined separately for Real-time and Capture. Select each
action that will be used from the processing steps list. The settings for each
option can be reached by clicking on the corresponding page in the tree to the
left.
The following processing pages are not associated with actions, but the options
affect real-time output and the representation of rotations:
l Real-Time output

l 6DOF analog export

l Euler Angles

2D Preprocessing and filtering

The 2D Preprocessing and filtering page contains settings for the pre-
processing of 2D data before 3D tracking.
Non-circular marker settings (Oqus)

Use the Non-circular marker settings to define how to handle the segment
data that is sent from the Oqus cameras. The cameras will only send segment
data when the Marker circularity filtering option is enabled on the Cameras
page, see "Marker circularity filtering (Oqus)" on page 234. How many markers
have segment data depends on the Circularity threshold on that page.

NOTE: There can be too many markers considered as non-circular in the
camera. In that case there is a red warning in the upper left corner of the
2D view and all the markers that the camera has not been able to check
are considered to be OK.

The options for handling non-circular markers with segment data are:

Correct center point of non-circular markers
With this option non-circular (partially hidden) markers with segment data
are corrected, if possible, before they are used by the 3D tracker. The cen-
ter point will be more exact for partially hidden markers than the marker
calculated by the camera. The correction works best if the marker size is
between 500 and 1500 subpixels. If the marker size is below 320 subpixels
then the marker is not considered for correction since it is too small to
find a good center point for.

NOTE: If the Marker circularity filtering is not enabled on the Cameras
page, then the options are greyed out and the message No marker
segment data available is shown. It is the same when reprocessing
files that don't have any segment data.

Discard non-circular markers


With this option non-circular markers with segment data are discarded so
that they are not used by the 3D tracker. This mode is faster than the cor-
rection options, so it can be used in some cases when the processing time
is important.

For more information about how to use the non-circular marker settings see
chapter "Marker circularity filtering (Oqus only)" on page 541 and "How to use
circularity filter (Oqus only)" on page 610.
Filtering

The Filtering option applies software marker masks and size filtering to the 2D
data. The option is used on all of the cameras in the measurement. Markers
that have been filtered are indicated by a rectangular frame in the camera view
in the 2D view window.
Software marker masks (only available when reprocessing a file)
Enable software marker masks in reprocessing, see chapter "How to use
software marker masks" on page 611.

Minimum marker size


Set the minimum size for the 2D marker data in subpixel units to be used
in the 3D tracking. For more information about marker size filtering see
chapter "How to use marker size filter" on page 613.

Maximum marker size
Set the maximum size for the 2D marker data in subpixel units to be used
in the 3D tracking.

3D Tracking

The Tracking page contains settings for the 3D tracking of the motion capture
data. The 3D tracker uses the 2D data of the cameras in the system to calculate
trajectories in a 3D view, see chapter "3D tracking measurements" on page 614.
3D Tracker parameters

The 3D tracking function uses a small buffer (4 - 10 frames) of previous
positions of a trajectory to predict the next location of the trajectory’s marker
in 3D space. The function uses a very general algorithm with just a few tracking
parameters that adjust the equations to specific tracking problems. The
contribution of individual cameras to the estimated 3D position of a marker is
weighted so that rays closer to the center of the sensor are weighted higher
than rays closer to the edges.
The behavior of the 3D tracker can be controlled by the following parameters.

Prediction error

The Prediction error parameter specifies the maximum distance (in mm)
between a predicted position of a trajectory and a captured point that is
allowed for it to be assigned to that trajectory. The parameter therefore
provides a margin of error with which a 3D point may deviate from the
mathematically calculated next position. Since real-world data cannot be
expected to exactly fit an equation, this provides a mechanism for dealing
with the unpredictability of the real changes in a marker’s trajectory. The
example above shows the 3D point (the red ball) and its predicted next
position (the black cross). The blue sphere is the volume in which the next
3D point is accepted as part of this trajectory.

The default value of the Prediction error is 25 mm. If the Prediction
error is larger than half of the closest distance between two markers in
the measurement, there is a possibility of trajectory cross-over or
swapping between the two trajectories. Therefore, if erratic or jumpy motions
within single trajectories are seen, it is likely that the parameter has
been set too high. On the other hand, if the value is set too small, excessive
division of trajectories (high segmentation) will occur, resulting in many
more trajectories than markers. The reason is that the distances caused by
noise and acceleration might be greater than that allowed for by the
Prediction error parameter. Therefore, if high segmentation or gaps in
the data are experienced it is likely that the parameter is set too low.
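
As a conceptual sketch of the gating that the Prediction error performs (the
actual predictor in QTM uses a longer buffer and is more elaborate; this is
illustrative only):

    import numpy as np

    def within_prediction_error(recent_positions, candidate, prediction_error_mm=25.0):
        # recent_positions: the last few 3D positions (in mm) of a trajectory.
        # candidate: a captured 3D point that may continue the trajectory.
        p = np.asarray(recent_positions, dtype=float)
        if len(p) >= 2:
            predicted = p[-1] + (p[-1] - p[-2])   # simple constant-velocity prediction
        else:
            predicted = p[-1]
        distance = np.linalg.norm(np.asarray(candidate, dtype=float) - predicted)
        return distance <= prediction_error_mm

    # A marker moving ~5 mm per frame is accepted, a 40 mm jump is not.
    print(within_prediction_error([[0, 0, 0], [5, 0, 0]], [10.5, 0.5, 0]))  # True
    print(within_prediction_error([[0, 0, 0], [5, 0, 0]], [50, 0, 0]))      # False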

Maximum residual

The 3D tracking function uses the Maximum residual parameter
together with the Prediction error to control the continuation of already
started trajectories. The Maximum residual sets a limit (in mm) to the
distance from the final location of the 3D point within which all inter-
secting camera rays are considered to belong to that 3D point. The
example above shows three camera rays where the intersections are not
in the same point, but they are all within the Maximum residual (d) of
the 3D point (the red ball). The parameter is shown as a circle but in real-
ity it is of course a sphere around the 3D point.

The default value of the Maximum residual is 6 mm. The value for the
parameter can usually be set to 2 - 5 times the average residual values in
the calibration result for the camera system. A too large value will slow
down the calculation and will tend to produce merged 3D points, resulting
in defective trajectories. On the other hand a too small value will probably
cause ghost markers and high segmentation, resulting in many more tra-
jectories than markers.

Minimum trajectory length

The Minimum trajectory length parameter defines the minimum number
of frames for a trajectory. The default value is 2 frames, which is also
the minimum value since a trajectory that is only 1 frame long is not really
reliable.

Increase the number of frames if you have a lot of short extra trajectories.
When those are removed the AIM process will also work better. However,
it is a good idea to check that the extra trajectories are not caused by
something else, e.g. a bad calibration or something reflective.

Minimum ray count per marker

The Minimum ray count per marker parameter defines the minimum
number of cameras required for the 3D tracking of a marker. The default
value is 2, which imposes no limitation to the 3D tracking.

Increasing this value adds constraints to the 3D tracker by requiring that
at least the set number of cameras must be involved in the reconstruction
of a 3D marker. This can be useful in situations in which tracking artifacts
in the form of ghost markers occur. It is recommended to keep this value
as low as possible since higher values may result in additional trajectory
gaps and a decrease of the effective capture volume.

Ray length limits

The ray length limits represent the minimum and maximum distance
between a marker and a camera to be used for 3D tracking. The units are
in meters.

When the option Automatically calculate ray length limits is checked,
the ray length limits are automatically calculated based on the current
calibration. When this option is unchecked, you can manually set the
minimum and maximum ray lengths.

NOTE: For fixed calibrations the automatic minimum ray length is
always set to 0.5 m.

Rays

Camera tracking rays are a mapping between 2D data of the cameras and the
3D trajectories, which can be shown in the 3D view window, see chapter "Rays
in 3D views" on page 130.
To show the rays, they need to be stored during the 3D tracking stage of the
processing. To store the rays, make sure that the option Store is enabled.

NOTE: When the Store option is enabled, the processing will take longer
and the file size of the QTM file will be larger.

Auto join

As soon as a marker is completely hidden during the measurement, even if it is
just for one frame, a new trajectory will be started by the tracking function. The
Auto join function examines the trajectories after the tracking to see if any can
automatically be joined. The Max frame gap specifies the number of frames
allowed between two trajectories that will be qualified for joining; the matching
number of milliseconds at the current frequency is displayed next to the
option. The Auto join function then uses the tracking parameters to decide if
two trajectories can be joined.

NOTE: The Auto join function is enabled by default with a value of 10
frames, and it can be used to make the identification easier.

NOTE: Auto join only joins two trajectories so that they are one in the tra-
jectory info windows. If you want to fill the gap between the two tra-
jectories with data you need to use the gap-fill function, see chapter
"Trajectories" on page 338.

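The milliseconds shown next to the Max frame gap option follow directly from
the capture frequency, and the coarse gap criterion can be sketched as below
(illustrative only; the final join decision in QTM also uses the tracking
parameters):

    def gap_in_ms(gap_frames, capture_rate_hz):
        # Milliseconds corresponding to a gap of gap_frames frames.
        return 1000.0 * gap_frames / capture_rate_hz

    def qualifies_for_auto_join(last_frame_a, first_frame_b, max_frame_gap=10):
        # Coarse criterion only: the gap between the end of one trajectory and
        # the start of the next must not exceed Max frame gap.
        gap = first_frame_b - last_frame_a - 1
        return 0 <= gap <= max_frame_gap

    print(gap_in_ms(10, 200))                 # 50.0 ms at 200 Hz
    print(qualifies_for_auto_join(120, 128))  # gap of 7 frames -> True
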
Bounding box restricting 3D data

The bounding box is the volume in which the trajectories will be calculated. The
use of a bounding box can be helpful to reduce the amount of unneeded data
by discarding 3D data which falls outside the volume of interest. The bounding
box is displayed as a white outlined box in the 3D view; the display is controlled
on the 3D view settings page.
When enabling the bounding box, the 3D data is restricted to the volume
spanned by the bounding box. The limits of the bounding box can be
defined in the text edit fields as distances (in mm) in each direction from the
origin of the global coordinate system. The default behavior is that 3D points that
fall outside the bounding box are discarded, i.e. the option Reevaluate track-
ing solution if 3D point is outside bounding box is unchecked. By checking
the option Reevaluate tracking solution if 3D point is outside bounding
box the 3D tracker will try to find alternative solutions for 3D points if the initial
solution lies outside the bounding box. This can be useful for small systems, for
example with two cameras.
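
Conceptually the bounding box test is a per-axis comparison against the entered
limits; a minimal sketch (limit values are examples only):

    def inside_bounding_box(point, x_limits, y_limits, z_limits):
        # point: 3D position in mm in the global coordinate system.
        # Each limits argument is a (min, max) pair in mm.
        x, y, z = point
        return (x_limits[0] <= x <= x_limits[1] and
                y_limits[0] <= y <= y_limits[1] and
                z_limits[0] <= z <= z_limits[1])

    # A point 3 m up is discarded when the box only extends 2.5 m in Z.
    print(inside_bounding_box((100, -200, 3000),
                              (-2000, 2000), (-2000, 2000), (0, 2500)))  # False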

NOTE: If you use a bounding box and then translate or rotate the global
coordinate system, the bounding box will also be moved. This may result
in the bounding box being moved outside the wanted measurement volume.
Therefore the bounding box must be corrected manually after the
transformation.

Auto range

The Auto range option automatically sets the measurement range so that it
only includes frames that have 3D data. The empty frames are not deleted just
placed outside the measurement range. Increase the measurement range on
the Timeline control bar if you want to use more frames, see chapter "Timeline
control bar" on page 133.

2D tracking

The 2D tracking page contains settings for the 2D tracker. The 2D tracker uses
the data of just one camera to calculate trajectories in a plane in the 3D view
window, see chapter "2D tracking of data" on page 618.

Tracking settings

Under the Tracking settings heading you can select the camera that will be
tracked with the Camera to track option. Choose the camera from the drop-
down list. You can only 2D track one camera at a time.
The option Turn on track filtering and remove unidentified tracks is very
useful to limit the number of trajectories in the 2D tracking output. It filters the
data so that short identified tracks and the 2D markers that have not been iden-
tified are not displayed as trajectories. Use the option Minimum length of
identified tracks to set the minimum number of frames for an identified track
in the 2D data to be transferred to a 3D trajectory.

IMPORTANT: If you turn off the filter the unidentified markers in the 2D
data will be transferred to a 1 frame long trajectory. This means that the
data will be very fragmented if the 2D tracker has left a lot of markers
unidentified.

Auto join

As soon as a marker is completely hidden during the measurement, even if it is
just for one frame, a new trajectory will be started by the tracking function. The
Auto join function examines the trajectories after the tracking to see if any can
automatically be joined. The Max frame gap specifies the number of frames
allowed between two trajectories that will be qualified for joining; the matching
number of milliseconds at the current frequency is displayed next to the
option. The Auto join function then uses the tracking parameters to decide if
two trajectories can be joined.

NOTE: The Auto join function is enabled by default with a value of 10
frames, and it can be used to make the identification easier.

NOTE: Auto join only joins two trajectories so that they are one in the tra-
jectory info windows. If you want to fill the gap between the two tra-
jectories with data you need to use the gap-fill function, see chapter
"Trajectories" on page 338.

2D to 3D settings

The 2D to 3D settings heading contains settings for how the 2D data is
displayed in the 3D view window.
Distance to measurement plane
Set the distance from the camera sensor to the measurement plane in
mm. The measurement plane will be the same as the green grid in the 3D
view window. With this setting you can get the distances in the 3D view
window to match the real distances. You must measure the distance your-
self, but for most applications the accuracy of the distance does not have
to be better than in cm.

Map positive x axis of the sensor to


Select the axis that will map to the positive x axis of the sensor, i.e. to the
right if looking in the direction of the camera. Choose the axis from the
drop-down list.

Map positive y axis of the sensor to


Select the axis that will map to the positive y axis of the sensor, i.e. down-
wards in the camera view. Choose the axis from the drop-down list.

Axis pointing upwards


Select the axis that will be pointing upwards. Choose the axis from the
drop-down list. With this setting you can move the camera in the 3D view
so that it looks from different directions, i.e. from the side, from above or
from below.

Twin System

The settings on the Twin System page are used to enable and control the Twin
system feature. This feature enables a Twin master system to control a Twin
slave system on another computer. It can for example be used to combine an
underwater system with one above water. For more information about the
Twin system feature, see chapter "Twin systems" on page 514.
To enable the Twin master functionality select the Enable Twin Master option
on the system that you want to use as Twin master. The Twin System settings
are then used to control the Twin slave computer and the merging of data.

NOTE: You do not need to change any settings related to Twin systems
on the Twin slave system.

To enable the automatic transfer of the Twin slave file you must select the
Merge with Twin Slave option on the Processing page.
The two systems must also be calibrated with a twin calibration, for inform-
ation about the settings see chapter "Twin System Calibration" on page 336.
Twin Slave System

The Twin Slave System settings control which slave system is controlled by the
Twin master system.

Use the Find Slaves button for QTM to search for any available QTM programs
on other computers. QTM must be running on the other computer and be
connected via Ethernet (directly or via the LAN) to the Twin master computer.
Only computers with a located camera system will show up in the list. Select
the computer that will be the Twin slave from the list below the Address setting.

It is possible to enter an address manually in the Address setting if the Find
Slaves option fails. The Port and the Password settings only need to be
changed if they have been modified in QTM on the Twin slave system.
Capture Frequency

The Capture Frequency settings control the frequency and synchronization of
the Twin master and slave systems. The settings that are set for the two
systems will override any other settings on the two systems. However, the timing
settings will not change until you start a new measurement on the Twin master
with New on the File menu. The following settings can be set:
Frequency
The frequency for the two systems must always be set. The Twin slave fre-
quency cannot be higher than the Twin master. It is recommended to use
the same frequency or a divisor of the Twin master frequency for the Twin
slave, but it can be any frequency.

IMPORTANT: If you do not use the same frequencies for the two
systems then the data of the slave system is interpolated. When you
use a divisor of the twin master frequency, then the interpolated
data is just filled between the actual 3D data. If the twin slave fre-
quency is not a divisor of the twin master frequency, then QTM will
interpolate all of the slave data so that it matches the twin master
frequency.

Enable Frame Sync


When enabled the frame sync will control the External timebase settings
of the two systems so that they are frame synchronized with each other.
The use of Frame Sync is recommended. For information about how to
connect the synchronization signal between the systems see chapter
"How to use frame synchronized twin systems with separate volumes" on
page 514.
You need to select the Frame Sync Master Source that will send the 100
Hz continuous signal on its Sync out connection. The system with the
Frame sync master source will be set to the Internal 100 Hz option for
external timebase. The other system will be set to using the Control port
or SYNC option. The Multiplier and Divisor options of the systems will be
set to numbers displayed next to the Frequency setting.

NOTE: The 100 Hz continuous output is not available on Oqus 1, 3
and 5 series cameras; the use of Frame sync requires at least one
other type of Oqus camera or a Camera Sync Unit.

Frame Sync Master Source (only available with the Frame sync option)
Select the device that you want to use as frame sync master. This can be a
sync output on the Camera Sync Unit or an Oqus camera. Devices with
multiple sync out ports (e.g. Camera Sync Unit) will additionally let you
specify which port to use (currently, only the Out 1 port on the Camera
Sync Unit is supported for Twin). The Frame sync master source can be in
either the Twin master or the Twin slave system.
Trajectory Conflicts

If there are two labeled markers with the same names in the twin master and
twin slave file, then QTM will have to know how to handle the trajectories. QTM
can either Merge trajectories or Rename trajectories with extension. If the
trajectories are merged then the twin slave trajectory will be added to the twin
master trajectory and if there are any overlaps the twin master data is used.
When renaming the trajectories then the extension _slave is added to all of the
labeled twin slave trajectories.

NOTE: This option will only be used if the Twin slave file contains labeled
trajectories. Most of the time it is recommended to not identify the data
on the slave file and instead apply AIM to the merged data.

NOTE: If the volumes are completely separated there will be no overlaps
in the merged data.

Twin System Calibration

The Twin System Calibration page contains settings for the twin calibration of
the two systems and describes the Translation and Rotation that is used for
transforming the twin slave data to the twin master coordinate system. For
information on how to perform a twin system calibration see chapter "Per-
forming a Twin calibration" on page 519.
The result of the last twin calibration is displayed below the transformations:
Calibration time
The processing time of the current twin calibration. Since it is only the
processing time that is displayed, you need to check which files were used
to be sure exactly what the calibration is based on, see chapter "Twin
System Calibration dialog" on the next page.

Average Wand Length (mm)


The average wand length of the current twin calibration.

Wand Length Error RMS (mm)


The wand length error of the current twin calibration.

Click on Reset Calibration if you want to zero the twin calibration data. Click
on Calibrate to change the twin calibration, see chapter "Twin System Cal-
ibration dialog" below.

Twin System Calibration dialog

The Twin system calibration dialog is opened with the Calibrate button on
the Twin system calibration page. From the dialog you can control and
change the current twin calibration. It can either be changed by updating the
measurements used for twin calibration or manually.
Manual calibration
Enter the translation and rotation of the twin calibration. If you have
made a twin calibration from two files then it is the result that is
displayed. However, you can also enter the numbers manually, but it is not
recommended if you want the best accuracy. Click on Calibrate to update
the twin calibration with the manually entered data.

Calibrate using two measurements


These settings display the current files used for the twin calibration. You
can click Browse to change the files or change the Wand length. Click on
Calibrate to update the twin calibration.

Trajectories

The Trajectories page contains the Gap Fill Settings.


Gap Fill Settings

Under the Gap Fill Settings heading there are options for the gap fill func-
tionality.
Set the Max frame gap option to select how many frames can be gap
filled without preview. The trajectories will only be gap filled if the gap between
the two parts is less than the Max frame gap. The matching number of
milliseconds at the current frequency is displayed next to the option.
Select the Default interpolation type to decide whether Polynomial or Linear
gap fill is used by default. The Polynomial gap fill uses two frames before
and after the gap to calculate a third degree polynomial. If the gap starts or
ends with a trajectory part which consists of one frame, then the polynomial
gap fill will use the next available frame to calculate the polynomial. If there is
no other trajectory part then the polynomial gap fill is calculated using just that
one frame.
The Linear gap fill calculates the line between the start and end frame of the
gap.
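
To illustrate the two interpolation types (a sketch only; the exact polynomial
fit used by QTM is not specified here), linear gap fill is a straight
interpolation between the frames on either side of the gap, while the polynomial
gap fill fits a third degree polynomial through the neighboring frames:

    import numpy as np

    def linear_gap_fill(p_before, p_after, n_missing):
        # Linearly interpolate n_missing 3D positions between two known frames.
        t = np.linspace(0.0, 1.0, n_missing + 2)[1:-1]   # exclude the known endpoints
        return (1 - t)[:, None] * p_before + t[:, None] * p_after

    def polynomial_gap_fill(known_frames, known_points, missing_frames):
        # Fit a third degree polynomial per coordinate through the known frames
        # (e.g. two frames before and two after the gap) and evaluate it in the gap.
        known_frames = np.asarray(known_frames, dtype=float)
        known_points = np.asarray(known_points, dtype=float)
        filled = np.empty((len(missing_frames), 3))
        for axis in range(3):
            coeffs = np.polyfit(known_frames, known_points[:, axis], deg=3)
            filled[:, axis] = np.polyval(coeffs, missing_frames)
        return filled

    # Linear fill of a 3-frame gap between (0, 0, 0) and (40, 0, 0):
    print(linear_gap_fill(np.array([0.0, 0.0, 0.0]), np.array([40.0, 0.0, 0.0]), 3))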

The gap fill function is applied to trajectories either via the Trajectory info
window menu or as a processing step. When gap fill is performed as a
processing step, it is tried on every trajectory.
If gap fill is applied from the Trajectory info window menu, it is only the selec-
ted trajectories that are gap filled and the setting is retrieved from the Max
frame gap setting saved in the file. To change the gap settings you need to
reprocess the file with the new gap fill settings under the Trajectories pro-
cessing options.

AIM

The AIM page contains settings for the Automatic Identification of Markers
(AIM) function, for information about AIM see chapter "Automatic Identification
of Markers (AIM)" on page 624.
AIM models

Under the AIM models heading there are two lists with AIM models. The
Applied models list contains the models that are currently in use by QTM. There
can be several models in this list and then QTM will try to apply them all.
Remember to Remove (moves the model to the Previously used models list)
the unnecessary models from this list, otherwise the AIM application might fail.
Add a saved model to the list with Add model. If you want to apply the same
AIM model to multiple actors, you can set the number of actors in the Nr To
Apply column.

The Previously used models list contains all the models that have been used
by QTM. If you want to use a model again click on Add to move the model to
the Applied models list. Remove a model from the list by clicking on Remove
from list; it will only remove the model from the list, the AIM model file is not
deleted.
AIM model application parameters

Under the AIM model application parameters heading you can set the fol-
lowing settings that adjust the application of the AIM model.
Relative marker to marker distance tolerance
Change the permitted marker to marker distance relative to the model
marker to marker distance. By default any marker that is within ±30 % of
the model's marker to marker distance will be tried in the model
application.

Use the physical IDs of active markers (if present in both file and
model)
This setting only applies to files with active and passive markers in the
same file. When activated AIM will use the active marker IDs to match the
markers with the AIM model, which means that you will aid the AIM model
when identifying the rest of the markers. The AIM model must have been
created with active markers placed on the correct positions in the
AIM model.

Keep the current labels as a starting point (this setting is reset


every time)
Use this setting to help AIM with the identification in a file. When activ-
ated AIM will keep the identified trajectories as they are and try to find a
solution for the rest of the trajectories that matches these.

NOTE: The setting is inactivated after each time it is used, because
to be able to use it you must manually identify markers before
applying AIM.

Use random trajectory color for each AIM file


Use this option to assign a random color to the trajectories for each
applied AIM model. This can be helpful to distinguish trajectories when
using the same AIM model multiple times.

Skeleton solver

The Skeleton Solver page is used to manage the skeleton definitions used in
the project. For more information on skeleton tracking, see chapter "Tracking
of skeletons" on page 671.
Marker Label Mapping
Mapping of markers associated with a skeleton. See chapter "Skeleton
marker label mapping" on page 681 for how to use a custom marker map-
ping.

Skeleton template
Select template for global parameters used by the skeleton calibration.
See chapter "Skeleton template" on page 694 for how to use the Skeleton
template.

Skeletons
List of skeletons loaded in the project. The columns are:

Name
Name of the skeleton, corresponding to the used prefix of the tra-
jectory labels.

Scale factor (%)


Scale factor of the skeleton in percent used for streaming and
exporting skeleton data. The scale factor is applied inversely. For
example, for a scaling factor larger than 100% the skeleton data is
scaled down. For more information about the use of the scale factor,
see chapter "Scale factor" on page 693.

The skeleton list can be managed using the following buttons.


Remove: Remove the selected skeleton.

Load: Load skeleton from QTM skeleton file.

Save: Save selected skeleton to QTM skeleton file. If you select mul-
tiple skeletons, each skeleton will be saved in a separate file.

Skeleton solver parameters


Options for the skeleton solver.
Marker count threshold
Set the threshold for how many Segment markers that needs to be
identified in a skeleton, for the skeleton solver to calculate skeleton
data. The percentage is calculated for all Segment markers in a skel-
eton. For more information about the use of the marker count
threshold, see chapter "Marker count threshold" on page 701.

SAL

Skeleton Assisted Labeling (SAL) uses the skeleton segment markers to identify
unlabeled trajectories or parts that can be associated with them. It requires a
solved skeleton to be able to identify trajectories. The unidentified trajectory
part that is closest to a missing segment marker and fulfills the below criteria is
added to the corresponding labeled trajectory.
SAL uses the following marker to segment marker distance criteria for labeling
unidentified trajectory parts.
Claim threshold
Required closeness of a marker to a segment marker. At least one frame
of an unidentified part must be within the claim threshold of a segment
marker in order to be labeled as the corresponding trajectory. The default
value is 20 mm. Use a lower value when markers can be close to each
other for example when solving fingers.

Disqualification threshold
The maximum tolerated distance of a marker to a segment marker. If any
frame of a claimed part is beyond the disqualification threshold, it will be
disqualified as a solution. This setting prevents wrong unidentified
markers that happened to be close to a missing segment marker at some
instant from being accepted as a solution. The default value is 200 mm.

For more information about SAL see chapter "How to use SAL" on page 700.
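
A minimal sketch of the two criteria (illustrative only; the distances are
per-frame distances in mm between the unidentified part and the segment marker):

    def sal_accepts_part(distances_mm, claim_threshold=20.0,
                         disqualification_threshold=200.0):
        # At least one frame must be within the claim threshold, and no frame
        # may be beyond the disqualification threshold.
        claimed = any(d <= claim_threshold for d in distances_mm)
        disqualified = any(d > disqualification_threshold for d in distances_mm)
        return claimed and not disqualified

    print(sal_accepts_part([35.0, 18.0, 22.0]))   # True: one frame within 20 mm
    print(sal_accepts_part([15.0, 250.0]))        # False: a frame beyond 200 mm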

Glove

The glove processing step allows data from motion gloves to be applied to skel-
etons within QTM. For more information about currently supported motion
gloves, see chapter "How to use motion gloves" on page 889.
The glove processing step settings dialog is used to create bindings which asso-
ciate a glove to the skeleton its data will be applied to. To create a new binding,
select an available glove in the bottom row of the bindings grid and select the
associated skeleton.
When running this processing step, all segments in the skeleton with names
that match the data sent from the glove will have the rotation data from the
glove applied to them. This requires that the target skeletons have hand hier-
archies which match that of the glove.
The use of motion gloves is natively supported by the Qualisys Animation skel-
eton. For more detailed instructions, see the chapters for the respective motion
gloves under "How to use motion gloves" on page 889.

6DOF tracking

The 6DOF Tracking page contains the 6DOF Tracker parameters and the list
of Rigid bodies. The 6DOF tracker uses this information to calculate the pos-
ition and rotation from the 3D data, see chapter "6DOF tracking of rigid bodies"
on page 649.
It is also possible to export the 6DOF tracking output to another computer and
as an analog signal, see chapters "Real-Time output" on page 387 and "6DOF
analog export" on page 388, respectively.
6DOF Tracker parameters

Specify the global tracker parameters for the 6DOF tracking under the 6DOF
Tracker parameters heading. More 6DOF tracker parameters, in particular
Min. markers, Max. residual and Bone tolerance, are available for the indi-
vidual rigid body definitions, see chapter "Rigid bodies" on the next page.
Reidentify all body markers
Enable this option to reidentify the markers for all rigid bodies using the
Rigid body definitions and settings for reprocessing. Only available
when reprocessing a file, for more information see chapter "Calculating
6DOF data" on page 660.

NOTE: Rigid bodies that are parts of AIM models are not affected
by this option.



Do not require that the whole body is visible before identifying it
the first time
Enable this option to remove the requirement that all markers of the rigid
body must be visible to identify the body the first time. This can help the
6DOF tracker to find a solution in case only a part of the body is visible.

Calculate missing markers in rigid bodies
Activate this option to calculate virtual trajectories for lost 6DOF markers in
RT and in files. For more information, see chapter "Virtual markers calculated
from 6DOF data" on page 662.

For information on 6DOF tracking see chapter "6DOF tracking of rigid bodies"
on page 649.
Rigid bodies

The Rigid bodies list contains the definition of the 6DOF bodies. The bodies are
used by the 6DOF tracking to find the measured rigid bodies in a motion cap-
ture. The list consists of the following columns. In case a column refers to sep-
arate items for the rigid body definition or its points, respectively, this is
indicated by the / separator.
Rigid bodies:
Label
The Label column contains the name of the rigid body and its points.
Double-click on the name of the rigid body or the points to edit them. The
points can have any name, however if the same name is used in another
6DOF body or an AIM model then you need to follow the instructions in
chapter "How to use 6DOF bodies in an AIM model" on page 669.

Enabled
Enable or disable calculation of 6DOF data for rigid bodies with the check
box in the Enabled column. Disabled rigid bodies will appear as "Dis-
abled" in the Data Info window and count for the indexing of rigid bodies
in the real-time stream.

Color
The color of the rigid body is displayed in the X / Color column on the
same row as the name of the rigid body. Double-click on the color to open
the Color dialog where any color can be selected. The color is used in the
3D view window for the markers of the rigid body and for its name.



NOTE: The 3D trajectories automatically have a slightly brighter
color than the 6DOF body and therefore it will look like the markers
have two colors.

Origin and Orientation
The coordinate system that the 6DOF body data refers to is displayed in
the Y / Origin and Z / Orientation columns on the same row as the name
of the rigid body. Double-click on either the origin or the orientation set-
ting to open the Coordinate system for rigid body data dialog, see
chapter "Coordinate system for rigid body data" on page 354.

Min. markers
Specify the minimum number of markers required for 6DOF tracking of
the rigid body.

Max. residual
Specify the maximum residual accepted for 6DOF tracking of the rigid
body.

Bone tolerance
The Bone tolerance (in mm) is the maximum allowed difference between the
lengths of corresponding bones in the rigid body definition and the meas-
ured rigid body. E.g. if the Bone tolerance is specified to 5.0 mm and the
current bone in the rigid body definition is 100.0 mm, then the measured
separation between two markers must be in the range of 95.0 - 105.0 mm
for the tracker to accept the possibility that the calculated markers may
be the pair specified for the rigid body.

The default value of the Bone tolerance is 5 mm. Increase the value of
the parameter if the 6DOF tracking cannot find the body. Decrease the
value of the parameter if a body is found but the orientation or something
else is wrong.

The effect of the Bone tolerance differs slightly between RT and in files.
In RT the marker that is outside the tolerance will be unidentified and the
6DOF body will be calculated from the remaining markers. In a file the
automatic 6DOF tracker will discard the whole trajectory that is wrong
and then calculate the 6DOF body from the other trajectories. However if
you identify the trajectories manually and then just Calculate 6DOF, then
there will be no 6DOF data in frames where a marker is outside the tol-
erance.
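The acceptance range in the example above corresponds to a simple check (a sketch of the rule as described, not QTM's implementation):

```python
# Sketch of the bone tolerance check (not QTM's actual code).
def bone_within_tolerance(defined_length_mm, measured_length_mm, tolerance_mm=5.0):
    return abs(measured_length_mm - defined_length_mm) <= tolerance_mm

print(bone_within_tolerance(100.0, 103.2))  # True: within 95.0 - 105.0 mm
print(bone_within_tolerance(100.0, 106.1))  # False: outside the range
```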

Filter
Select the filter for smoothing 6DOF data. The default is No filter. The fil-
ter is applied both in real time and in a capture. For more information
about smoothing 6DOF data, see chapter "Smoothing 6DOF data" on
page 356.

Mesh
Object file of 3D mesh associated with rigid body. Double-click on the
mesh setting of the rigid body to open the Mesh Settings dialog, see
chapter "Rigid body Mesh Settings dialog" on page 358.

Points:
X, Y, Z
The X, Y and Z columns contain the coordinates of the points in reference
to the local origin. Double-click on the coordinates to edit them.

Virtual
Select this option to make a point in the 6DOF body virtual, see chapter
"Virtual markers calculated from 6DOF data" on page 662.

Id
The ID of the trajectory in case the point is associated with a sequentially
coded active marker.

The options that are used to edit the rigid bodies or their points are described
below:
Translate
With Translate the local origin of the 6DOF body definition can be moved
to any place in reference to the points of the body, which means that the
rotation center of the body is changed. The local origin is also the origin of
the coordinate system that represents the 6DOF body in the 3D view.
Click Translate to open the Translate body dialog, see chapter "Trans-
late body" on page 350.

Rotate
With Rotate the pitch, roll and yaw of the local coordinate system is
changed. This will change the orientation of the local coordinate system in
reference to the global coordinate system. I.e. it changes the rotation of
the rigid body where its roll, pitch and yaw are zero in reference to the
global coordinate system. Click Rotate to open the Rotate body dialog,
see chapter "Rotate body" on page 352.

Edit color
Open the Color dialog where any color can be selected. The color is used
in the 3D view window for the markers of the rigid body and for its name.

NOTE: The 3D trajectories automatically have a slightly brighter
color than the 6DOF body and therefore it will look like the markers
have two colors.

Coordinate system
Change the definition of the local coordinate system, see chapter
"Coordinate system for rigid body data" on page 354.

Reset rotation
This will reset the orientation of all the rigid bodies in the list. Reset
means that the local coordinate systems will be aligned to the global
coordinate system and all the angles will therefore be zeroed.

NOTE: The angles may differ from zero after reset if another ref-
erence system than the global coordinate system is defined in the
Coordinate system for rigid body data dialog.

Add body
Add a new body to the Rigid bodies list. The new body will be empty and
called ’New Body #1’, ’New Body #2’ and so on.

Remove body
Remove the selected body from the Rigid bodies list.

Add point
Add a point to the selected rigid body.

Remove point
Remove the selected point.



Edit point
Edit the selected point. Use Tab and Shift+Tab to go to the next and
the previous coordinate, respectively.

Edit label
Edit the selected label (rigid body or point).

Acquire body
Acquire the rigid body definition from the current marker positions in
RT/preview mode, see chapter "Acquire body" on page 356.

Load bodies
Loads bodies to the Rigid bodies list from an XML file.

NOTE: Load bodies will overwrite any other bodies in the list.

Save bodies
Save the bodies and all of the individual options in the Rigid bodies list to
an XML file. Specify the name and the folder and click Save. The file can
be edited in a text editor, e.g. Notepad. The xml format is the same as
used for the RT protocol, see 6DOF xml parameters.

NOTE: Make sure that all of the bodies for the measurement are in
the same file, since Load bodies overwrites the bodies in the list. If
you want to combine the rigid bodies from two or more different
files you can copy-paste them into a single file.

Translate body



The Translate body dialog contains the following ways to translate the
local origin:
To local coordinates (in mm)
Specify the translation of the local origin in the X, Y and Z direction of the
local coordinate system. E.g. if the local origin is translated 1 mm in the X
direction all of the points’ X coordinates will be 1 mm less than before.

To the current position of this rigid body


Move the local origin to current position of the selected rigid body in the
list. This will zero the position of the body in the list if its position is
referred to the current rigid body.

To the geometric center of the body (the average of the body points)
Move the local origin to the geometric center of all the points in the rigid
body definition. The geometric center can be seen as the center of mass
of a body with the same weight at all of the points.

To point in the body


Move the local origin to one of the points in the rigid body definition.
Enter the number of the point that is used as local origin.

So that point ... in the body has local coordinates (in mm)
Move the local origin so that one of the points in the rigid body definition
has a desired position. Enter the number of the point and the position in
X, Y and Z direction (local coordinate system).

To the spherical centroid of the body (body should be a sphere)


Move the local origin to the spherical centroid of all the points in the rigid
body definition. The points should be approximately configured in a
sphere. The estimate of the centroid is based on a least square fit of the
points.



Rotate body

The Rotate body dialog contains the following ways to rotate the local coordin-
ate system:
Rotate the system
Rotate the local coordinate system clockwise around one of the axes,
when looking in the positive direction. Choose the angle of rotation either
in Degrees or in Radians. Then select which axis to rotate around: X, Y or
Z.

NOTE: The rotation is not affected by the Euler angles definition but will
always be the same.

Align the body using its points


Define the rotation of the local coordinate system with three or four
points from the rigid body definition. You can use 0 to signify the origin of
the local coordinate system if you want to use it as one of the points. See
detailed instructions for using this method below.



Rotate as this rigid body
Rotate the local coordinate system to current orientation of the selected
rigid body in the list. This will zero the orientation of the body in the list if
its orientation is referred to the current rigid body.

Follow these steps to set the rotation with the "Align the body using its points"
method:

1. First define one of the axes (choose which one from the drop-down list)
by making it parallel to a line from one point to another; enter the numbers
of the points in the body definition. The direction depends on the order of
the points, so that the axis always points in the direction from the first
point to the second.

NOTE: The first point does not need to coincide with the origin of
the rigid body.

2. Then define the direction of a second axis; choose which one from
the second drop-down list. The following options are available for defin-
ing the second axis:
a. Intersect point
The intersect option means that the second axis will point in the dir-
ection of the specified point. However, since the axes must be ortho-
gonal, the second axis will actually intersect the projection of the
point on the coordinate plane orthogonal to the first axis. The name
of the orthogonal plane is displayed in the dialog.
The example below displays the second axis defined by the pro-
jection of point 3 on the orthogonal plane defined by the line from
point 1 to point 2.



b. Parallel to the line from point
The parallel option means that the second axis will point in the dir-
ection from the first specified point to the second, as projected onto
the plane orthogonal to the first axis. The name of the orthogonal
plane is displayed in the dialog.
The example below displays the second axis defined by the pro-
jection of the line from point 3 to point 4 on the orthogonal plane
defined by the line from point 1 to 2.

Coordinate system for rigid body data

To describe the position and orientation of the 6DOF body its data must be
referred to another coordinate system. By default the data is referred to the
position and orientation of the global coordinate system. However, with the set-
tings on the Coordinate system for rigid body data dialog you can refer the
local coordinate system to the following alternatives of coordinate system ori-
gin and orientation.



Use the global coordinate system
The position is referred to the origin and orientation of the global coordin-
ate system.

Use the coordinate system of this rigid body


The position is referred to the current position and orientation of another
rigid body. This means that if the reference body moves the 6DOF body
data will change even if that body is stationary. Select the body from the
drop-down list.

NOTE: Using another body as reference will increase the noise in
the 6DOF data, especially if there is a long distance between the two
bodies. This is because a small rotation error in the reference body
will result in much larger noise in the rigid body.

NOTE: If the reference 6DOF body cannot be tracked the 6DOF
body will disappear in the Data info window. However the 6DOF
data is always saved and displayed in the 3D view window so that if
the file is reprocessed with another reference the 6DOF body will
appear again.

Use this coordinate system (relative to the global coordinate system)


The position is referred to a stationary point defined in the global coordin-
ate system. Define the position in mm in the three directions (X, Y and Z)
and orientation in degrees for Roll, Pitch and Yaw.

NOTE: Roll, pitch and yaw is the Qualisys standard, but if the Euler
angle definition is changed on the Euler angles page the new set-
tings will be used in this dialog.

With Get position and Get orientation the current position or ori-
entation is acquired, which means that the data will be zeroed for the cur-
rent position of the rigid body.
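As background to the Use the coordinate system of this rigid body option above, expressing one body's pose relative to another is a standard rigid-transform operation. The sketch below illustrates the general principle only; it is not the QTM implementation:

```python
# Express the pose of body B relative to reference body A (general rigid-body
# math, not QTM code). R_* are 3x3 rotation matrices, p_* positions in mm.
import numpy as np

def relative_pose(R_a, p_a, R_b, p_b):
    R_rel = R_a.T @ R_b            # orientation of B expressed in A's frame
    p_rel = R_a.T @ (p_b - p_a)    # position of B expressed in A's frame
    return R_rel, p_rel

R_a, p_a = np.eye(3), np.array([100.0, 0.0, 0.0])
R_b, p_b = np.eye(3), np.array([100.0, 250.0, 0.0])
print(relative_pose(R_a, p_a, R_b, p_b)[1])  # [  0. 250.   0.]
```

A small rotation error in the reference body's orientation scales with the distance between the two bodies, which is why the noise increases with separation, as noted above.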



Acquire body

With Acquire body a rigid body definition can be acquired from preview mode.
Place the rigid body with the markers in the measurement volume and open a
new file with New on the File menu. Open the 6DOF Tracking page in the Pro-
ject options dialog and click Acquire body to open the Acquire body dialog.

Specify the number of frames to collect with the Frames to acquire setting.
Click Acquire to start the acquisition. The points of the rigid body definition are
then calculated from the average of each marker’s position in these frames.
The Stop option can be used to cancel the collection before all frames have
been captured.
To see that the 6DOF tracking can find the body, change to 6DOF tracking on
the Processing page and click Apply. The body should appear in the 3D view.

NOTE: The measurement must be done on a stationary rigid body with
at least four markers and the body cannot be flat.

NOTE: It is a good idea to place the body so that the orientation of the
desired local coordinate system is aligned with the global coordinate sys-
tem. It is also a good idea to place the desired origin of the local coordin-
ate system in the origin of the global coordinate system. Another way to
easily define the local origin of the body is to use an extra marker placed
at the desired location of the local origin. After acquiring the body
coordinates, use the Translate body dialog to translate the local origin to
the location of the extra marker. Then delete the extra marker from the
body definition with Remove point.

Smoothing 6DOF data

It is possible to smooth 6DOF data in QTM. The smoothing can be applied both
in real time and in a capture. Smoothing of 6DOF data can be useful if you need
to stabilize noisy data. The smoothing is applied to both position and ori-
entation.



The smoothing algorithm includes two types of smoothing. The first type is
"Holt-Winters" double exponential smoothing, taking into account the trend of
the data. The second type is a jitter reduction filter, operating within a specified
radius. In addition, there is a prediction parameter that can be used to project
the trend ahead to compensate for lag. The parameters are:
• Data smoothing factor of the double exponential filter s, with a value
between 0 and 1.

• Trend smoothing factor of the double exponential filter c, with a value
between 0 and 1.

• Prediction parameter p, corresponding to the number of frames the trend is
projected ahead.

• Position jitter radius rp in mm.

• Orientation jitter radius ro in degrees.
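As background, a double exponential (Holt-Winters) filter with a trend term and a prediction step can be sketched as follows for a single position component. This is an assumption-based illustration of how s, c and p interact (with s = 0 meaning no smoothing in this sketch); it is not the filter code used by QTM, and it leaves out the jitter reduction radii and the orientation handling:

```python
# Illustrative double exponential smoothing with trend projection (assumption,
# not QTM's filter). Larger s and c give stronger smoothing in this sketch.
def smooth_series(values, s=0.25, c=0.25, p=0):
    level, trend = values[0], 0.0
    output = [level]
    for x in values[1:]:
        prev_level = level
        level = (1.0 - s) * x + s * (level + trend)           # smoothed value
        trend = (1.0 - c) * (level - prev_level) + c * trend  # smoothed trend
        output.append(level + p * trend)                      # project trend p frames ahead
    return output

print(smooth_series([0.0, 1.0, 2.0, 10.0, 4.0, 5.0], s=0.5, c=0.5, p=1))
```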

A disadvantage of smoothing is that it may introduce side effects. Smoothing
and jitter reduction may introduce lag, whereas trend compensation may lead
to overcompensation or overshoot. The smoothing effect of the filter, including
the side effects may depend on the characteristics of the movement, as well as
the capture rate. This should be taken into account when choosing the smooth-
ing parameters.
QTM includes presets with useful combinations of smoothing parameters for
various situations. The presets can be selected individually per rigid body. The
available filter presets are:
No filter
No smoothing. This is the default option.

Multi-purpose
Light smoothing and jitter reduction. This preset is suitable for most situ-
ations which require effective smoothing of light noise, while keeping side
effects to a minimum. The filter parameters are: s=0.25, c=0.25, p=0, rp=5,
ro=5.



High stability
Strong smoothing and jitter reduction. This preset is suitable when you
need stable data in challenging tracking situations. In case of high accel-
erations or impacts filtering may lead to noticeable side effects. The filter
parameters are: s=0.5, c=0.5, p=1, rp=25, ro=25.

Static pose
Effective jitter reduction for immobile objects. Application of this preset to
moving rigid bodies may lead to noticeable lag for small displacements
and rotations within the specified radius. The filter parameters are: s=0,
c=0, p=0, rp=10, ro=10.

The selected presets in the rigid body definitions in the Project Options will be
applied in real time and in subsequent captures when Calculate 6DOF is
checked in the Processing options. It is also possible to reprocess captures
with different presets.
The effect of the smoothing filter can be visually inspected in the 3D view win-
dow by comparing the markers of the rigid body overlay in the 3D view window
with the measured marker positions. For more detailed comparison of the
6DOF data, you can define the same rigid body twice, one with filter and the
other one without.
Rigid body Mesh Settings dialog

The Rigid Body Mesh Settings dialog is used to associate a mesh with a rigid
body and to inspect and modify the mesh object settings. For more information
about how to use Rigid body meshes see chapter "Rigid body meshes" on
page 667.
The dialog contains the following settings:



Filename
A drop-down list of mesh objects in the Meshes folder of the project. Copy
the .obj file, and any associated .mtl or image file, manually to the Meshes
folder to use the mesh in the project. The path to the Meshes folder can
be set on the Folder options page, see chapter "Folder options" on
page 427. For information about obj files and the features supported, see
chapter "Compatibility of meshes" on page 426.

Position
The 3D translation of the object relative to the global coordinate system.
The values of X, Y and Z are specified in mm units.

Rotation
The 3D rotation of the object relative to the global coordinate system. The
values for Roll, Pitch and Yaw are specified in degrees around the X, Y
and Z axes, respectively.

Scale
The scale factor of the mesh object.

Opacity
The opacity of the mesh object from 0 (transparent) to 1 (opaque). An .obj
file can include an opacity option for individual faces; these will be
rendered as transparent even if the Opacity option is set to 1.

The buttons:
Reset
Remove the link to the file and reset the Position, Rotation and Scale
values.

OK
Accept the settings and close the dialog.

Cancel
Discard the settings and close the dialog.

Apply
Review the changes directly in the 3D View window while in Preview
mode.



Force data

The Force data branch of the options tree contains settings for the force data
calculation on the installed force plates. For a correct force calculation the fol-
lowing settings for the force plate must be specified: the dimensions, the cal-
ibration factors, the gain and the location in the coordinate system of the
motion capture.
For information about how to use the force data see chapter "Force data cal-
culation" on page 703.
General settings

The setting Coordinate system for force/COP display and export defines in
what coordinate system the force data is displayed in the Data info window
and exported.

NOTE: The C3D export only includes analog data and force plate para-
meters, so it is therefore not changed by this setting.

The default value is Local (Force plate), which means the local coordinate sys-
tem of the force plate with the origin in the middle of the force plate.



You can change the orientation to World (Lab) and then all of the force data
will be in the global coordinate system of the measurement.
Force plates

Under the Force plates heading on the Force data page the force plates are
managed for which the force data will be calculated. This applies to all types of
integrated force plates, namely digitally integrated force plates and instru-
mented treadmills, and force plates connected via an analog board.
Use the Add plate option to add a new force plate to the list. Right-click on the
force plate to open a menu where you can Change name and Remove plate.
The AMTI digital plates are created automatically and cannot be removed. They
can however be renamed and then the same name is used for it on the Input
Devices page.
To enable force plates in QTM, select the check box next to the force plate
name in the Calculate force column.
Select a force plate and click Edit plate to open the settings for that plate, or
double-click on it. The available settings depend on the force plate type, see
chapter "Force plate settings" on the next page. For more information about
force plates see chapter "How to use force plates" on page 756.
To remove a force plate from the list, select it and click the Remove Plate but-
ton.
The Define Plates button is used to automatically define force plates for spe-
cific digital integrations, e.g., Arsalis, Gaitway-3D, Bertec, Kistler, or any other
custom QDevice integrations.

NOTE: The force plates that are activated will be shown in the 3D view
window even if there is no analog data.



Force plate settings

The settings for each force plate are found under the corresponding page. For
example, settings for force plate 1 on the Force plate 1 page.

Force plate type

Under the Force plate type heading there are a drop-down box for the force
plate type. The available force plate types depend on which force plate integ-
rations you have installed.
The Calibration and Settings options depend on the force plate type. For the
digital integrated force plates the type cannot be changed and the Calibration
and Settings options are controlled automatically. For more information about
the digital integrations see chapter "Digital force plate integrations" on
page 756.
For analog integrated force plates, the Calibration and Settings options
depend on the force plate type. In addition, generic settings are available for
connecting custom analog force plates or for force data imported from C3D
files. For a detailed description, see the below chapters.

AMTI calibration and settings (6 channels)

The following settings apply to AMTI analog force plates with 6 analog output
channels.



AMTI force plate calibration parameters
Select the AMTI force plate type and click Calibration under the Force plate
type heading to go to the AMTI force plate calibration parameters dialog. It
contains the settings for dimensions and calibration factors of the AMTI force
plate, see example in image below.

Force plate dimensions


Under the Force plate dimensions heading you should enter the dimension
parameters (in mm) for the AMTI force plate. The parameters can be found in
the user manual of the AMTI force plate.

Inverted sensitivity matrix


The Inverted sensitivity matrix is used to calibrate the force plate. Enter the
values of the inverted sensitivity matrix (in SI-units) under the Inverted sens-
itivity matrix heading. For old AMTI plates the inverted sensitivity matrix was
called calibration matrix. The values can be found in the manual of the AMTI
force plate or can be loaded with Load from file from the diskette, which is
attached to the manual of the AMTI force plate.



NOTE: The file contains the Sensitivity matrix, which is then converted to
the inverted matrix when imported into QTM.

AMTI force plate settings


Select the AMTI force plate type and click Settings under the Force plate type
heading on the Force plate page to open the AMTI force plate settings dia-
log.

There are four settings on the dialog: Analog board, Channel, Excitation
Voltage and Gain, see example in image above.
Select the analog board where the force plate is connected from the Analog
board drop-down list.
With Channel each signal is combined with its respective analog channel.

NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.

The Excitation Voltage is set to the bridge excitation voltage of each
channel in the AMTI amplifier.



NOTE: The drop-down list gives you standard values, but you can
also type any value for the setting.

The Gain is set to the gain of each channel in the AMTI amplifier.

NOTE: The drop-down list gives you standard values, but you can
also type any value for the setting.

For more information about these settings see the manual of the AMTI force
plate.

AMTI portable calibration and settings (8 channels)

The following settings apply to AMTI portable analog force plates with 8 analog
output channels.

AMTI 8-channel force plate calibration parameters
Select the AMTI 8 Channel (Portable) force plate type and click Calibration
under the Force plate type heading to go to the AMTI 8-channel force plate
calibration parameters dialog. It contains the settings for dimensions and cal-
ibration factors of the AMTI portable force plate.



Force plate dimensions
Under the Force plate dimensions heading you should enter the dimension
parameters (in mm) for the AMTI portable force plate. The parameters can be
found in the user manual of the AMTI portable force plate.
Old type of force plate
Activate the Unit before SN 232 option, if you have a force plate with
serial number lower than 232. These force plates have a "zero" level of 2.5
V. When calculating the force in a file, the data is always zeroed with the
first 10 frames of the measurement. This is because the error is quite
large and because of the 2.5 V offset you cannot use the standard option
to remove the offset.

Calibration matrix
The Calibration matrix is used to calibrate the force plate. Enter the values of
the calibration matrix (in SI-units) under the Calibration matrix heading. The
values can be found in the manual of the AMTI force plate or can be loaded
with Load from file from the diskette, which is attached to the manual of the
AMTI portable force plate.

NOTE: The file contains the Sensitivity matrix, which is then converted to
the inverted matrix when imported into QTM.

AMTI 8 Ch force plate settings


Select the AMTI 8 Channel (Portable) force plate type and click Settings under
the Force plate type heading on the Force plate page to open the AMTI 8 Ch
force plate settings dialog.



First you must select the analog board where the force plate is connected
from the Analog board drop-down list.

Then associate each signal from the force plate with its respective analog
channel with the Channel settings.

NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.

For more information about the signals see the manual of the AMTI portable
force plate.

Bertec calibration and settings

The following settings apply to Bertec analog force plates.



Bertec force plate calibration parameters
Select the Bertec force plate type and click Calibration under the Force plate
type heading to go to the Bertec force plate calibration parameters dialog.
It contains the settings for dimensions and calibration factors of the Bertec
force plate.

Dimensions
Under the Force plate dimensions heading you should enter the dimension
parameters for the Bertec force plate. The parameters can be found in the user
manual of the Bertec force plate.

Calibration matrix
The Calibration matrix is used to calibrate the force plate. Enter the values of
the calibration matrix, which is found in the manual of the Bertec force plate,
under the Calibration matrix heading. Bertec often just supplies the six values
of the diagonal in the matrix. The values can also be loaded with Load from
file from the diskette, which is attached to the manual of the Bertec force plate.



Bertec force plate settings
Select the Bertec force plate type and click Settings under the Force plate
type heading on the Force plate page to open the Bertec force plate set-
tings dialog.

There are three settings on the dialog: Analog board, Channel and Gain, see
example in image above.
Select the analog board where the force plate is connected from the Ana-
log board drop-down list.

With Channel each signal is combined with its respective analog channel.

NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.

The Gain is set to the gain of each channel as selected on the Bertec amp-
lifier.

For more information about these settings see the manual of the Bertec force
plate.



Kistler calibration and settings

The following settings apply to Kistler analog force plates, and the legacy Kistler
DAQ Type 5698A/B integration (deprecated in QTM 2024.1).

Kistler force plate calibration parameters

For information on how to connect a Kistler force plate see chapter "Con-
necting Kistler force plates" on page 790.
Select the Kistler force plate type and click Calibration under the Force plate
type heading to go to the Kistler force plate calibration parameters dialog.

Force plate dimensions


Under the Force plate dimensions heading you should enter the length, width
and transducer origin parameters for the Kistler force plate. These parameters
can be found in the documentation that is delivered with the Kistler force plate.
If you cannot find these values, please refer to Kistler support.

NOTE: Enter the absolute values of the parameters, that is for the h
value write 45 when the Kistler manual says -45.



Kistler COP Correction
The COP correction is a method developed by Kistler to improve the accuracy
of the COP calculation. According to Kistler the error can be reduced by 3-5
times. The method is implemented in QTM from the paper "Improving COP
Accuracy with Kistler Force Plates", contact Kistler for more details.
Activate the Kistler COP correction with the Use COP correction checkbox.
Select the force plate model from the Force plate model list, the method is
only available for the force plates in the list.

NOTE: The coefficients are force plate model (type number) specific. If in
doubt, please contact Kistler.

NOTE: The coefficients apply only if the force plate is mounted on a rigid
foundation according to Kistler specifications.

Kistler scaling factors


The scaling factors of the Kistler plate must be entered under the Kistler scal-
ing factors heading. The scaling factors ensure that the force data is scaled cor-
rectly. Follow these steps to use the correct scaling factors.

NOTE: If you update from a version earlier than QTM 2.7 all of the
ranges will get the value of the currently entered range. Enter the correct
values for all of them so that you can switch ranges more easily.

• How to find the scaling factors differs between internal and external
amplifiers.
Internal amplifier
For a Kistler force plate with internal amplifier enter all of the scaling
factors that are found in the calibration certificate matrix of each
force plate.



External amplifier
For a Kistler force plate with external amplifier you must calculate
the scaling factors, see instructions below.
• Then select the range with the Current XY range and Current Z range
options:
Selected by forceplate control
The default option is Selected by forceplate control, which means
that the range is controlled by the ranges set on the Force plate
control settings page.

NOTE: The force plates will be ordered by the analog chan-


nels that they are connected to, so that the one with the first
channels will use the ranges of the first plate on the Force
plate control settings page.

Range 4 - Range 1
You can select the range manually if you don't use the analog board
to control the force plate ranges. Range 4 will give you the maximum
force range and range 1 will give you the highest sensitivity, i.e. min-
imum force range. It is important to use the same range as is used
by the force plate to get correct forces; you can use different ranges
for the XY and Z settings.

Calculating scaling factors with external amplifier


For a Kistler force plate with external amplifier, the scaling factors need to
be calculated from the data found in the calibration certificate matrix of each
force plate; click on Info for more information. Calculate the scaling factors for
all of the ranges with the help of the formulas in the Kistler scaling factor
info dialog. The force plate sensitivity parameters and the value of the ranges
are found in the Kistler calibration sheet.



Maximum measurable force
Under the Maximum measurable force heading the maximum measurable
forces, for the specific settings, are shown in N for the X/Y and Z directions.
Click Recalculate to recalculate the maximum forces when a setting has been
changed.

Kistler force plate settings


Select the Kistler force plate type and click Settings under the Force plate
type heading on the Force plate page to open the Kistler force plate set-
tings dialog.

First you must select the analog board where the force plate is connected
from the Analog board drop-down list.

Then associate each signal from the force plate with its respective analog
channel with the Channel settings.



NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.

For more information about these settings see the manual of the Kistler force
plate.

Generic 6 ch (c3d type-1)

Force plate calibration parameters

Select the Generic 6 ch (c3d type-1) force plate type and click Calibration under
the Force plate type heading to go to the Force plate calibration para-
meters dialog. It contains the settings for dimensions of the Generic 6 ch (c3d
type-1) force plate.

NOTE: The main use of the Generic 6 ch (c3d type-1) force plate is for
handling of imported c3d files. It is recommended to use the vendor spe-
cific force plate types when capturing data.

NOTE: For information about c3d type-1 plates refer to the c3d.org web-
site.



Force plate dimensions
Under the Force plate dimensions heading you should enter the dimension
parameters (in mm) for the Generic 6 ch (c3d type-1) force plate. The para-
meters can be found in the user manual of the force plate.

Force plate settings


Select the Generic 6 ch (c3d type-1) force plate type and click Settings under
the Force plate type heading on the Force plate page to open the Force
plate settings dialog.

First you must select the analog board where the force plate is connected
from the Analog board drop-down list.

Then associate each signal from the force plate with its respective analog
channel with the Channel settings.

NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.



For more information about the signals see the manual of the force plate.

Generic 6 ch (c3d type-2)

Force plate calibration parameters

Select the Generic 6 ch (c3d type-2) force plate type and click Calibration under
the Force plate type heading to go to the Force plate calibration para-
meters dialog. It contains the settings for dimensions of the Generic 6 ch (c3d
type-2) force plate.

NOTE: The main use of the Generic 6 ch (c3d type-2) force plate is for
handling of imported c3d files. It is recommended to use the vendor spe-
cific force plate types when capturing data.

NOTE: For information about c3d type-2 plates refer to the c3d.org web-
site.

Force plate dimensions


Under the Force plate dimensions heading you should enter the dimension
parameters (in mm) for the Generic 6 ch (c3d type-2) force plate. The para-
meters can be found in the user manual of the force plate.



Force plate settings
Select the Generic 6 ch (c3d type-2) force plate type and click Settings under
the Force plate type heading on the Force plate page to open the Force
plate settings dialog.

First you must select the analog board where the force plate is connected
from the Analog board drop-down list.

Then associate each signal from the force plate with its respective analog
channel with the Channel settings.

NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.

For more information about the signals see the manual of the force plate.



Generic 8 ch (c3d type-3)

Force plate calibration parameters

Select the Generic 8 ch (c3d type-3) force plate type and click Calibration under
the Force plate type heading to go to the Type 3 Force plate calibration
parameters dialog. It contains the settings for dimensions of the Generic 8 ch
(c3d type-3) force plate.

NOTE: The main use of the Generic 8 ch (c3d type-3) force plate is for
handling of imported c3d files. It is recommended to use the vendor spe-
cific force plate types when capturing data.

NOTE: For information about c3d type-3 plates refer to the c3d.org web-
site.

Force plate dimensions


Under the Force plate dimensions heading you should enter the dimension
parameters (in mm) for the Generic 8 ch (c3d type-3) force plate.
The parameters can be found in the user manual of the force plate.



COP correction
Under the COP correction heading you can enable COP correction with the
Use COP correction option. Enter the Coefficients for the correction poly-
nomial.

Force plate settings


Select the Generic 8 ch (c3d type-3) force plate type and click Settings under
the Force plate type heading on the Force plate page to open the Force
plate settings dialog.

First you must select the analog board where the force plate is connected
from the Analog board drop-down list.

Then associate each signal from the force plate with its respective analog
channel with the Channel settings.



NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.

For more information about the signals see the manual of the force plate.

Generic 6 ch with matrix (c3d type-4)

Force plate calibration

Select the Generic 6 ch with matrix (c3d type-4) force plate type and click Cal-
ibration under the Force plate type heading to go to the Force plate cal-
ibration parameters dialog. It contains the settings for dimensions of the
Generic 6 ch with matrix (c3d type-4) force plate.

NOTE: The main use of the Generic 6 ch with matrix (c3d type-4) force
plate is for handling of imported c3d files. It is recommended to use the
vendor specific force plate types when capturing data.

NOTE: For information about c3d type-4 plates refer to the c3d.org web-
site.



Force plate dimensions
Under the Force plate dimensions heading you should enter the dimension
parameters (in mm) for the Generic 6 ch with matrix (c3d type-4) force plate.
The parameters can be found in the user manual of the force plate.

Calibration matrix
The Calibration matrix is used to calibrate the force plate. Enter the values of
the calibration matrix (M) (in SI-units) under the Calibration matrix heading.
The values can be found in the manual of the force plate. The calibration matrix
must fit the formula F = M * A.
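As an illustration of the formula (assuming A holds the six analog channel values and F the resulting force and moment components, which is the common convention for type-4 plates), the calculation is a plain matrix-vector product:

```python
# Sketch of applying a type-4 calibration matrix (assumption: A = analog
# signals, F = calibrated force/moment components). Not QTM code.
import numpy as np

M = np.eye(6)                                         # placeholder 6x6 calibration matrix
A = np.array([0.12, -0.03, 1.45, 0.01, 0.02, -0.05])  # analog channel values
F = M @ A                                             # F = M * A
print(F)
```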

Force plate settings


Select the Generic 6 ch with matrix (c3d type-4) force plate type and click Set-
tings under the Force plate type heading on the Force plate page to open the
Force plate settings dialog.

First you must select the analog board where the force plate is connected
from the Analog board drop-down list.



Then associate each signal from the force plate with its respective analog
channel with the Channel settings.

NOTE: On the Analog board (...) page the channel names can be
renamed to match the signal names.

For more information about the signals see the manual of the force plate.

Force plate location

To be able to view the force vectors in the same coordinate system as the
motion capture, the location of the force plate must be specified. The settings
are the same for all force plate types and are done under the Force plate loc-
ation heading on the Force plate page. The force plate location will be visu-
alized as a purple square in the 3D view window.
The three buttons have the following functions:
Generate
Automatically generate the force plate location from a capture of mark-
ers at the corners of the force plate, see chapter "Generate force plate
location from a capture" below.
Use default
Use default force plate location. This method is only available for the
Gaitway-3D instrumented treadmill.
View/Edit
Manual review and possibility to edit force plate corner positions, see
chapter "Manual revision and specification of force plate location" on
page 386.

Generate force plate location from a capture

For automatic generation of the force plate location, place a marker on top of
the four corners of the force plate. The markers do not need to be placed
exactly on top of the corners, however, it is important that the markers are
placed symmetrically for a correct estimation of the center of the force plate.
Follow these steps for automatic generation of the force plate location:



1. Make a 1 second motion capture of these markers.

2. Identify the markers, you can give them any label, and keep the capture file
open.

3. Open the Force plate page and click Generate. QTM tries to identify the
corners of the force plate by comparing them to the width and length of
the plate (as entered in the Force plate calibration parameters dialog).
A dialog box like the one below is displayed:

4. Click OK to open Load measured force plate location dialog, see below,
and select one of the solutions found by QTM. Click Cancel to select the
markers’ locations manually, see further down. Try to make sure that the
orientation is correct. It is recommended to make a test measurement
after the location process to see that the orientation is correct. If the force
arrow is pointing downwards, you can use the Rotate 180 degrees option
in the Force plate location dialog, shown below.



5. When spherical markers are used, the level of the force plate in the ver-
tical direction will be the centers of the markers. If the level of the force
plate should be exactly in level with the actual force plate, use the Adjust
for marker offset setting in the Load measured forceplate location
dialog. You can also click View/Edit in the Force plate settings dialog
and adjust the vertical position.



NOTE: If QTM cannot find a solution among the labeled trajectories
or if you click Cancel in the Automatic identification dialog,
almost the same dialog will appear but you must select each corner
manually, see below. In this case it is especially important to test
that the force plate location is correct before making the meas-
urements.



Manual revision and specification of force plate location

To manually specify the location click View/Edit. Enter the X, Y and Z coordin-
ate (in mm) of the four corners of the force plate. The coordinates should be in
the coordinate system of the motion capture (lab coordinates).
Make sure that the orientation is correct. Use the internal coordinate system of
the force plate, shown in the dialog, to determine the orientation of the force
plate corners. Most force plates have the positive y axis in the direction away
from the connectors of the force plate.

COP (Center Of Pressure) threshold

The Center Of Pressure (COP) is the position of the center of the pressure on
the force plate. It is calculated by QTM from the data of the pressure sensors in
the corners of the force plate.
The COP (Center Of Pressure) threshold heading on the Force plate page
contains settings for the COP threshold. When the Z component of the force is
below the threshold level, the force vector will not be shown in QTM. The
COP will still be calculated and is accessible in the file.
Select the Activate check box to activate the COP threshold filter. Enter the Z
axis COP threshold level in Newton to disable the visualization of the force
vector and COP. The Z axis is in the force plate coordinate system, which means
that it is always the vertical force which is used in the filter. This is because the
horizontal forces can be very small but still correct.



Force plate settings status window

At the bottom of the Force plate page there is a Settings status window,
which shows the status of the settings of the force plate. It uses the same noti-
fication icons as under the Camera system settings heading on the Camera
system page. The settings that are shown are Dimensions, Scaling factors
(Calibration for the AMTI force plates), Location and Channels.

Real-Time output

With the real-time output function any tracked data (3D or 6DOF) and also ana-
log and force data can be sent to another computer via TCP/IP. The RT server is
always running so that a program can acquire the RT data.
The settings that can be changed for the RT output are the TCP/IP and OSC port
numbers. Uncheck the option Use default port numbers to set other port
numbers. The first four ports are grouped together, so that their port numbers
are changed with the Base Port number, which by default is 22222.
The Capture Broadcast Port (default 8989) is used for receiving and broad-
casting UDP start and stop packets. For more information, see chapter "Wire-
less/software Trigger" on page 267.



You can also activate real time client control with the Allow client control
option. Activate the option to enable RT clients to control QTM (master mode).
If you set a Password, that password must be sent from the RT client for it to
run in master mode. The RT clients can always connect to QTM and receive the
RT output (slave mode), which is the only mode that is available if Allow client
control is deactivated.

NOTE: Only one RT client at a time can be in master mode to control
QTM.
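As an example of a client receiving the RT output, the sketch below uses the Qualisys Python SDK (the qtm-rt package, a separate download not covered by this manual) to connect to the RT server and stream 6DOF frames. The calls shown are based on the SDK's own examples and should be read as an illustration, not as a reference:

```python
# Minimal RT client sketch using the Qualisys Python SDK ("qtm-rt" package).
# Package and API names are assumptions based on the SDK, not on this manual.
import asyncio
import qtm_rt

async def main():
    connection = await qtm_rt.connect("127.0.0.1")  # QTM running on this machine
    if connection is None:
        print("Could not connect to the QTM RT server")
        return

    def on_packet(packet):
        header, bodies = packet.get_6d()
        print(f"Frame {packet.framenumber}: {len(bodies)} rigid bodies")

    await connection.stream_frames(components=["6d"], on_packet=on_packet)
    await asyncio.sleep(5)                  # receive data for a few seconds
    await connection.stream_frames_stop()
    connection.disconnect()

asyncio.run(main())
```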

6DOF analog export

The 6DOF analog export page is only available if an analog output board (PCI-
DAC6703) is installed in the measurement computer. With the analog export
the information about 6DOF bodies’ positions can be used in feedback to an
analog control system. Select the Enable analog 6DOF output option to
enable the output of the board. The output will continue as long as the option
is selected and the 6DOF body is being tracked.



The list on the 6DOF Analog export page contains the selected signals and
their respective settings. Use Add value or double-click in the empty area to
add a new signal, see chapter "Analog channel settings" below. Use Edit value
or double-click on the signal to change the settings of the selected signal. With
Remove value the selected signals are removed from the list.
Before using the analog export you should set the Range calibration. Use the
Test output option to test the output of the analog board.
Analog channel settings

When clicking on Add value or Edit value the Analog channels settings dia-
log is displayed. In the dialog the following settings can be set:
Signal
The data in QTM that is used for the output signal. For each 6DOF body on
the 6DOF bodies page there are seven available outputs: X, Y, Z, Roll,
Pitch, Yaw and Data available. Data available shows whether the 6DOF
body is visible or not.

NOTE: The rotation angles will change if you change the Euler
angles definitions.

Channel
The channel on the analog output board that will be used for the signal.
Each channel can only have one signal assigned to it.



Input min/Input max
The minimum and the maximum value (mm or degrees) of the input data,
respectively. If the input data is equal to or smaller than the Input min, the
output of the channel is equal to the Output min. If the input data is equal
to or larger than the Input max, the output of the channel is equal to the
Output max. A sketch of this mapping is included after this list.

NOTE: For the three rotation angles the maximum input ranges
depend on the ranges of the respective angle.

Output min/Output max
The minimum and the maximum value (V) of the output on the channel,
respectively.

NOTE: Data available has two positions, Available and Not avail-
able, instead of the input and output settings. Set the value in V
which will be output on the channel depending on whether the 6DOF
body is seen or not.
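A sketch of the clamped, linear input-to-output mapping implied by the settings above (an interpretation for illustration, not QTM code):

```python
# Sketch of the analog export mapping (assumption, not QTM code).
def analog_output(value, in_min, in_max, out_min, out_max):
    value = max(in_min, min(in_max, value))           # clamp to the input range
    fraction = (value - in_min) / (in_max - in_min)   # 0..1 within the input range
    return out_min + fraction * (out_max - out_min)   # output voltage

# Example: map an X position of -2000..2000 mm onto -10..10 V.
print(analog_output(500.0, -2000.0, 2000.0, -10.0, 10.0))   # 2.5 V
print(analog_output(3000.0, -2000.0, 2000.0, -10.0, 10.0))  # 10.0 V (clamped)
```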

Test output

When clicking on Test output the following dialog is opened.

In the dialog four tests can be performed to test the output of the channels:



Absolute minimum voltage
The output of all channels is set to the minimum value of the analog
board. This value should be entered in the Analog output range cal-
ibration dialog.

Absolute maximum voltage
The output of all channels is set to the maximum value of the analog
board. This value should be entered in the Analog output range cal-
ibration dialog.

Voltage
The output of all channels is set to the specified voltage.

Signal % of signal range
The output of all channels is set to the specified percentage of the chan-
nel’s specified output range. If the channel is not used the output will be 0
V.
Range calibration

In the Analog output range calibration dialog the range of the specific board
is entered to calibrate the output of the channels. The maximum and minimum
values can be measured with the Test output option.



Euler angles

On the Euler angles page you can change the definition of the Euler angles
used in QTM. The definition applies to all places in QTM where Euler angles are
used, displayed or exported, including:
• Rotation of the global coordinate system

• The orientation of rigid bodies (6DOF)

• The orientation of skeleton segments



NOTE: Any changes of the Euler angle definition apply directly to the pro-
ject or any files that are opened within the project. Files do not need to
be reprocessed when changing the Euler angle definition since these set-
tings are not stored in the file.

Select Euler angle definition

By default QTM uses the Qualisys standard, which is described in the chapter
"Rotation angles in QTM" on page 663. The definition can also be seen when
Qualisys standard is selected as the grayed settings under the Definition of
rotation axes heading, see screen dump above.
Use the Custom setting if you want another definition of the rotation angles.
Then, define the rotation angles by choosing the type of Euler axes and set the
rotation order and angle conventions, as described below.
Definition of custom rotation axes

The following settings are used to define custom Euler angles.

Type of Euler axes

Select the type of Euler axes from the following options:



Local (rotated) rotation axes
The rotations will be made around the axes of the local coordinate sys-
tem, which is most common for Euler angle representations. The first rota-
tion will be around the corresponding axis of the reference coordinate
system. The second rotation will be around the second rotation axis of
the rotated local reference system, i.e., after the first rotation has been
applied. Similarly, the third rotation will be around the third axis of the
rotated local reference system, after the first and second rotation.

Global (fixed) rotation axes
The rotations will be made around the axes of the reference coordinate sys-
tem. The default value is the global coordinate system. When calculating
the rotations this means that the first or second rotations will not affect
the axes of the following rotations.

NOTE: For rigid bodies, the reference system may be different from the
global coordinate system, dependent on the chosen reference coordinate
system, see chapter "Coordinate system for rigid body data" on page 354.
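The practical difference between local (rotated) and global (fixed) rotation axes is the order in which the elementary rotation matrices are composed. The sketch below is a generic illustration with placeholder axes and angles; it does not reproduce the Qualisys standard definition:

```python
# Generic illustration of local (intrinsic) versus global (extrinsic) rotation
# axes (not QTM code). Angles and axis order are placeholders.
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

a1, a2, a3 = np.radians([30.0, 20.0, 10.0])

# Local (rotated) axes: each rotation is about the already-rotated frame.
R_local = rot_x(a1) @ rot_y(a2) @ rot_z(a3)

# Global (fixed) axes: each rotation is about the fixed reference frame,
# which reverses the multiplication order of the same elementary matrices.
R_global = rot_z(a3) @ rot_y(a2) @ rot_x(a1)

print(np.allclose(R_local, R_global))  # False in general: the conventions differ
```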

Rotation order and angle conventions

Define the rotation order and angle conventions for the custom Euler angle
definition.
First rotation axis
Define the first rotation axis. This can be any of the three axes.
Angle range
Select the angle range of the first axis. It can be either -180° to 180°
or 0° to 360°.

Second rotation axis


Define the second rotation axis. This cannot be the same as the First rota-
tion axis.



NOTE: Because of the definition of the Euler angles it is not pos-
sible to choose the range for the second axis. E.g. an angle of -10° is
equal to 350°, which means that the range would be split into 0° to 90°
and then 270° to 350°; see more about the angles below.

Third rotation axis
Define the third rotation axis. This cannot be the same as the Second
rotation axis. However it can be the same as the First rotation axis.
Angle range
Select the angle range of the third axis. It can be either -180° to 180°
or 0° to 360°.

Positive rotation
For each axis you can define the direction of Positive rotation. It can be
either Clockwise or Counterclockwise when seen along the positive dir-
ection of the axis.

Name
For each axis you can set a new name. This name will then be used every-
where in QTM.



IMPORTANT: To get an unambiguous definition of the rotation there
have to be two additional definitions used on the rotation angles. These
two definitions are set by QTM, but the second also depends on how you
define the Euler angles.

1. The rotation angles are always applied with the first rotation first and
so on.

2. The rotation angles have limitations to the ranges of the angles. The
first and third angle is always defined between -180° and 180° or 0° to
360°. But the middle angle depends on the Euler angle definition:

First and third rotation around different axes
The second rotation is always defined between -90° and 90°
(there are singularities at -90° and 90°).

First and third rotation around the same axis
The second rotation is defined between 0° and 180° or between 0°
and -180°, depending on the order of the axes and the positive rota-
tion (there are singularities at 0° and 180°, respectively 0° and
-180°). Check the text at the bottom of the Euler angles page to
find out which it is for the current definition.

TSV export

On the TSV export page there are settings for the TSV export. For information
about the TSV export see chapter "Export to TSV format" on page 711.
Data type

Under the Data type to export headings you can select which data types to
export by checking the items. Each data type is exported as a separate file with
a suffix added to the file name. The following types of motion data can be selec-
ted:
2D data
Export the 2D data of the capture file.

3D data
Export the 3D data of the capture file.

6D data
Export the 6DOF data of the capture file.

Analog data
Export the analog data of the capture file.

Force data
Export the force and moment data of the capture file.

Eye tracker data


Export eye tracker data of the capture file.

Skeleton data
Export skeleton data of the capture file.
General settings

The settings under the General export settings heading are applied to all TSV
exports. The settings are:
Include TSV header
Include the TSV header with information about the capture file in the
exported file.

Export time data for every frame


Include two columns with frame number and measurement time for each
frame in the exported TSV file. The time data will be entered in the first
two columns of the file. If the SMPTE timecode exists, it will be added as a
third column after the measurement time.

NOTE: For Non-periodic external timebase the timestamps are not
guaranteed to be correct, because the actual frequency of the
measurement is unknown.

Write column headers


Include a header over each column of data, which describes the data in
that column, e.g. for 3D data the three different positions (Toe X, Toe Y
and Toe Z).

Include events
Include the events in the 2D, 3D or 6DOF TSV file.

Null data string


Set the data string for empty data frames in a trajectory, e.g. ‘0’, ‘-‘ or
‘NULL’. Do not write the quotation marks. The null data string can be
empty, i.e. it does not need to contain a character.

Skeleton Data Reference Frame
Specify if the skeleton data is exported in global or local coordinates.
2D settings

The setting under the 2D Settings heading is only applied to 2D data. The
Export linearized 2D data setting is enabled by default. Disable the setting to
export raw unlinearized 2D data.
3D Settings

The settings under the 3D Settings heading are only applied to 3D data. The
settings are:
Include type information per frame
Include a column for each trajectory with the trajectory type per frame.

Exclude unidentified trajectories


Exclude unidentified trajectories from the exported file.

Exclude empty trajectories


Exclude empty trajectories without data (e.g. a new label) from the expor-
ted file.

Exclude non-full frames from beginning and end where any of the
labeled trajectories are not found
Exclude completely empty frames of the labeled trajectories, in the begin-
ning and the end of the measurement.
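
A TSV file exported with these settings is easy to read in other software. The
sketch below is a minimal example (not part of QTM) that loads a 3D export with
pandas, assuming the TSV header was excluded and Write column headers was
enabled, so the file is a plain tab-separated table; the file name is a
hypothetical example.

    import pandas as pd

    # Read a 3D TSV export (no TSV header, column headers enabled).
    data = pd.read_csv("capture_3d.tsv", sep="\t")   # hypothetical file name

    print(data.columns.tolist())   # frame/time columns plus one X/Y/Z column per label
    print(data.head())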

C3D export

On the C3D export page there are settings for the C3D export. For information
about C3D export, see chapter "Export to C3D format" on page 727.
3D Data

The settings under the 3D Data heading are only applied to 3D data. The set-
tings are:
Exclude unidentified trajectories
Exclude unidentified trajectories from the exported file. If unidentified tra-
jectories are included they will not have a name in the C3D file.

Exclude empty trajectories


Exclude empty trajectories without data (e.g. a new label) from the expor-
ted file.

Exclude non-full frames from beginning and end where any of the
labeled trajectories are not found
Exclude completely empty frames of the labeled trajectories, in the begin-
ning and the end of the measurement. This setting overrides the selected
measurement range.

Convert to Y-axis upwards
Convert the 3D data so that the Y-axis points upwards. This option is only
applicable if the Z-axis is pointing upwards. This option is not applicable if
the file contains force plate data.
Label Format

Under the Label Format heading you can change the format of the C3D file
with the following two settings:
Following the C3D.org specification (short label)
Use the C3D.org specification, which uses short labels.

De facto standard (full labels)


Use full labels, i.e. the full names for the trajectories and analog channels
will be included in the file.
Event Output Format

Under the Event Output Format heading you can change the format of the
C3D file with the following two settings:
Following the C3D.org specification (use original start time)
Use the C3D.org specification, which uses the original start time. This is
the default value and is required when using the C3D file in Visual3D
2020.8.3 or later.

Qualisys standard (use cropped start time)

Event times are specified relative to the cropped start time.
Parameter Groups

The SUBJECTS parameter group is used to define multiple subjects or objects in


a C3D file. Enable the Write SUBJECTS parameter group option to include this
information in the C3D export.
The SUBJECTS group is generated as follows:

1. Trajectory prefixes (text before first underscore) are collected for all non-
rigid-body markers. Each unique prefix is associated with a SUBJECT (e.g.
skeletons).
2. For each enabled rigid body, the longest common prefix of its point
labels is extracted and associated with a SUBJECT.

NOTE: This requires that the points in the rigid body definition start
with the same text. The prefix of a rigid body does not need to be
separated with an underscore, but it may contain one or several
underscores.

The option is disabled by default to allow for using underscores in labels. When
the option is enabled, make sure that all labels start with an actual prefix.
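
The grouping rule can be illustrated with a small sketch (an illustration only,
not QTM's implementation; the labels used are hypothetical examples):

    import os.path

    def subjects_from_labels(trajectory_labels, rigid_body_point_labels):
        subjects = set()
        # 1. Non-rigid-body markers: the text before the first underscore is the prefix.
        for label in trajectory_labels:
            if "_" in label:
                subjects.add(label.split("_", 1)[0])
        # 2. Rigid bodies: the longest common prefix of the body's point labels.
        for points in rigid_body_point_labels:
            prefix = os.path.commonprefix(points)
            if prefix:
                subjects.add(prefix)
        return subjects

    print(subjects_from_labels(
        ["Anna_LKnee", "Anna_RKnee", "Bo_LKnee"],
        [["Table_front", "Table_back"]],
    ))
    # {'Anna', 'Bo', 'Table_'} - the rigid body prefix may keep a trailing underscore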
Units

Specify the length unit used for the C3D export. The alternatives are meters
(m), centimeters (cm) and millimeters (mm, default). This setting applies to all
length data, including 3D trajectories and force plate positions.

Matlab file export

The Matlab file export page contains settings for the export to MAT files. For
information about MAT file export see chapter "Export to MAT format" on
page 729. Select which data types to export with the Data type to export
options.

The following data can be selected.
3D data
All of the 3D data will be included, both labeled and unidentified. Use the
Exclude unidentified trajectories option to only export the labeled tra-
jectories to the MAT file.

6D data
Include 6DOF data.

Skeleton data
Include skeleton data. Use the Skeleton Data Reference Frame option
to specify if the skeleton data is exported in global or local coordinates.

Analog data
All of the analog data will be included, both data from analog boards and
EMG.

Force data
Include force data.

Eye Tracker
Include eye tracker data.

Timecodes
Include timestamps for each camera frame, according to Timestamp set-
tings.

Events
Include events.

The following export options are available.


Exclude unidentified trajectories
Check this option to exclude unidentified 3D trajectories from the export.

Skeleton Data Reference Frame


Select the reference frame for the export of skeleton data.
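
To inspect an exported MAT file in Python, a minimal sketch is shown below (not
part of QTM; the file name is a hypothetical example, and no field names are
assumed since the structure depends on the selected data types):

    from scipy.io import loadmat

    mat = loadmat("capture.mat")   # hypothetical file name
    # List the top-level variables, skipping the metadata entries that loadmat adds.
    print([key for key in mat if not key.startswith("__")])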

AVI Export

The AVI Export page contains settings for the export of 3D or 2D views to AVI
files. The export can be done either as a processing step or manually from the
File/Export menu. For more information see chapter "Export to AVI file" on
page 739.

NOTE: If you export an AVI file manually the current setting of the pro-
cessing step will also change. Therefore it is recommended that you save
the View that you want to use in a processing step so that you can return
to that setting.

Window settings

The Window settings control the input to the AVI file, i.e. the view that is used
for the export. The settings consist of a list of active and saved views (3D or
2D). Select the view that you want to use in an export, then set the output
with the Video settings below. That view will then be saved as Previous set-
tings in the list and used in the export until you select another view. It is
important to note that if you make an AVI export from the File menu, the new
settings will be used in the export directly after a capture. Therefore it is recom-
mended that you save any views that you want to be able to use again with the
Save View option.
The list has the following columns:
Name
The name of the view depends on whether it has been saved or not.
The active views, i.e. those views that are open in QTM, are called Window
1 and so on. Then there is a saved view with the last AVI export made by
QTM, it is called Previous settings. The name of these two cannot be
changed and you cannot delete the view from the list.
The views that you have saved with the Save View option can have any
name. Double-click on the name to change it. The saved views can also be
deleted by right-clicking on the view and select Delete.

Type
The type is either 3D or 2D. A 2D view that only displays video images is
still a 2D type of view. In addition, the type can be either Active or Saved.

NOTE: If you are saving an Oqus video to an AVI file, e.g. with a 3D
overlay, then the video image will be linearized. I.e. the same
parameters that are used to correct the marker data are applied to
the video data to correct for the lens distortion. Therefore the pixels
will not match exactly with the original video image. The lin-
earization can be turned off with the Show linearized data option
on the 2D view settings page in Project options.

Size
The size is the x and y size of the view in pixels.

View count
The View count displays the number of cameras displayed in a 2D view. A
number within parentheses means that that number of views has a
3D overlay.

Below the list there are three options.

2D view text
The 2D view text option toggles whether the text, e.g. camera number, is
displayed in the AVI export.

2D view borders
The 2D view borders option toggles whether the borders around a cam-
era view are included in the AVI export.

Save View
The Save View option saves the selected view so that you can use it later.
The view is then copied and the type changed to Saved. The name can be
changed directly in the Name column.
Video settings

The Video settings control the output of the AVI export. The following options
are available:
Width
The width of the video in number of pixels. The width can also be changed
by resizing the Preview window.

NOTE: The width can be larger than the display size.

Height
The height of the video in number of pixels. The height can also be
changed by resizing the Preview window.

NOTE: The height can be larger than the display size.

Frame rate
The frame rate in Hz of the video file. The rate is down-sampled if you use
a frame rate lower than the marker or video capture in the view. I.e. if you
enter 30 Hz and the file is captured at 120 Hz, then the video file will con-
tain every fourth frame.

NOTE: For just playback on a computer it is sufficient to use 30 Hz.

Playback speed
The Playback speed option controls the speed of the file in % of the ori-
ginal speed, so that the file can for example be played in slow motion. E.g.
if you have a file captured at 120 Hz and a Frame rate for the AVI export
of 30 Hz, then you can use a Playback speed of 25% to get all of the cap-
tured frames in the video file at a quarter of the original speed.
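
The relation between capture rate, export frame rate and playback speed can be
expressed as a small calculation (a sketch of the arithmetic above, not a QTM
setting):

    capture_rate = 120.0   # Hz, marker/video rate of the file
    export_rate = 30.0     # Hz, Frame rate setting of the AVI export

    # To keep every captured frame in the exported video, slow the playback by the same ratio.
    playback_speed = export_rate / capture_rate * 100.0
    print(f"Playback speed: {playback_speed:.0f}%")   # 25%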

Preset
The Preset option is for using standard video output in the 16:9 format.
The available presets are:
Custom
The Preset option is set to Custom as soon as any of the Video set-
tings does not match the other three presets.

480p - 30 Hz
The video output is set to 480p at 30 Hz. This means a size of
854*480 pixels and using progressive scanning.

720p - 30 Hz
The video output is set to 720p at 30 Hz. This means a size of
1280*720 pixels and using progressive scanning.

1080p - 30 Hz
The video output is set to 1080p at 30 Hz. This means a size of
1920*1080 pixels and using progressive scanning.

Use window dimensions


The Use window dimensions setting makes the size of the video the
same as the window that is selected in the Window settings.

Use window aspect ratio


The Use window aspect ratio setting makes the ratio of width and
height the same as the window that is selected in the Window settings.

Codec
Use a codec to reduce the file size of the video. For information about
recommended codecs, see chapter "Recommended codecs" on page 583.

There are two options for the codecs:
Codec settings
Click on Codec settings to open the settings for currently selected
codec.

Quality
For some codecs you can set the Quality directly without opening
the settings.

Preview
Display a preview of how the exported window will look, showing the
data of the current frame. The size of the video window can be changed
by resizing the preview window. However, if the Use window dimensions
setting is active then the size is locked.

Marker/Video frequency
The marker and video frequency of the current camera settings are dis-
played at the bottom of the page. The video frequency that is displayed is
only the highest of the used frequencies.
If you are exporting an AVI file from a QTM file then the marker/video fre-
quency displays the frequencies of that file.

The settings of the previous export are always saved, which means that if you
change settings for an export from the File menu, those settings will also be
used for an export directly after a file capture. The settings are not saved if you
use the Export view to AVI option on the 3D and 2D view window menus.

FBX export

On the FBX export page there are settings for the FBX export. For information
about the FBX export, see chapter "Export to FBX file" on page 742.
The following settings are available for FBX export.
File type

File type
Choose if the exported data is in ASCII format or in binary format.
Export a separate file for each skeleton and rigid body
Check this option to export a separate file for each skeleton and
rigid body. The name of the skeleton or rigid body is added to the
file name as a suffix, separated by an underscore.
Exported data

Opticals
Export labeled trajectory data.
Actors (MotionBuilder)
Export Actor(s) that can be used for IK solving in MotionBuilder. See
MotionBuilder documentation for more information. Requires that

skeleton data has been calculated and that the Opticals option is
enabled.

Rigid bodies
Export 6DOF data from rigid bodies as single segment models.
End bones
Enable or disable the export of an end bone for the exported rigid
bodies (checked by default). The end bone can be included for better
visualization in animation software.

Skeletons
Export skeleton data.
The data is scaled according to the Scale factor used in the Skeleton
solver settings of the file. To change the Scale factor, you need to repro-
cess the file with the modified skeleton solver settings, see chapter
"Reprocessing a file" on page 601.
Characters
Export Character(s) that can be used for retargeting in third party
software. A character provides a mapping from QTM's skeleton struc-
ture to a predefined structure, defining which segment is the head,
the legs, the arms and so on. This in turn can be used to drive 3D
models with a slightly different skeleton structure. Requires that the
Skeletons option is enabled.

Root naming
Select the root name used for skeletons. The options are Reference
(default) and root. Alternatively, the user can specify a custom name.

Cameras
Export poses of calibrated cameras.

Timecode
If enabled, SMPTE timecodes (when available) will be used as timestamps
for any exported data.
Naming convention

Select the naming convention of labels for the export. The options are:

FBX (default): labels exported as defined in QTM.

Maya: labels are converted if needed so that they do not contain spaces
or other symbols that are not supported in Maya.

JSON export

On the JSON export page there are settings for the JSON export. For more
information about the JSON export, see chapter "Export to JSON file" on
page 743.
The following data types can be exported.
3D data
All of the 3D data will be included, both labeled and unidentified.

6D data
Include 6DOF data.

Analog data
All of the analog data will be included, both data from analog boards and
EMG.

Force data
Include force data.

Eye tracker data


Include eye tracker data.

Timestamps
Include timestamps for each camera frame, according to Timestamp set-
tings.

Skeleton data
Include skeleton data. Use the Reference Frame option to specify if the
skeleton data is exported in global or local coordinates.

Events
Include events.

Camera information
Include information about the cameras.
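
A JSON export is straightforward to load in other software. A minimal sketch is
shown below (not part of QTM; the file name is a hypothetical example and no
keys are assumed, since the structure depends on the selected data types):

    import json

    with open("capture.json") as f:   # hypothetical file name
        data = json.load(f)

    # Show the top-level structure of the export.
    print(list(data) if isinstance(data, dict) else type(data).__name__)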

TRC export

On the TRC export page there are settings for the TRC export. For more inform-
ation about the TRC export, see chapter "Export to TRC file" on page 744.
The TRC export contains the labeled 3D trajectories.

STO export

On the STO export page there are settings for the STO export. For more inform-
ation about the STO export, see chapter "Export to STO file" on page 745.

Start program

Start an external program using the command "Action Argument(s)". This can
be useful for further automatic processing of an exported file.
Action
Specify the program to be started.

Argument(s)
Specify additional arguments to be added to the command. The following
parameters are available:
$p: Path of the current capture

$f: File name of the current capture
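
The sketch below (not part of QTM) illustrates how the $p and $f parameters
expand into the final command line; the program path, capture path and file
suffix are hypothetical examples.

    action = r"C:\Tools\convert.exe"     # hypothetical Action
    arguments = r"$p\$f.tsv"             # hypothetical Argument(s)

    path = r"D:\Projects\Demo\Data"      # $p: path of the current capture (example value)
    file_name = "trial01"                # $f: file name of the current capture (example value)

    command = action + " " + arguments.replace("$p", path).replace("$f", file_name)
    print(command)   # C:\Tools\convert.exe D:\Projects\Demo\Data\trial01.tsv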

TIP: To open a capture in Excel, make sure that Export to TSV file is
selected and use the Action and Argument(s) fields as below.

GUI

The GUI page contains display related settings. All of the GUI settings are
applied both in RT and on files. On the GUI page there are settings for the
Screen update and for Close previous measurement file. Under the GUI
main page there are pages for 2D View Settings, 3D View Settings and Static
Mesh Objects.
The Screen update options can be used to optimize the performance of the
computer. Most of the time 15 Hz is enough for the screen update as the eye
cannot register movements faster than 25 Hz.
Real-Time mode
Set the frequency for the screen update during real time (preview).
Uncheck the check-box next to the setting to disable the GUI during pre-
view. The screen update rate for Video can be set independently.

Capturing
Set the frequency for the screen update during capturing. Uncheck the
check-box next to the setting to disable the GUI during capture. The

screen update rate for Video can be set independently.

File playback
Set the frequency for the screen update during file playback.

Status bar
Set the frequency for the update of the numbers in the status bar.

Show Real-Time Latency


Enable the display of the latency in the status bar, for more information
see chapter "Real time latency" on page 595.

The 2D/3D view presets options are used to control and save presets for the
2D and 3D GUI.
Preset drop-box
Select the preset from the Preset drop-box. There are two standard pre-
sets: QTM default and QTM High Contrast. These can be used to restore
the default settings, respectively to change the colors to improve the
visibility in sunlight.
In addition, there are 8 user presets that can be saved with any 2D view
and 3D view settings.

Save
To save a preset select one of the 8 user presets in the Preset drop-box
and then click on Save. All of the settings on the 2D view settings and 3D
view settings pages are saved to the preset.

Rename
Rename the currently selected preset.

The Plot options are used to control some plot window settings.
Default Real-Time Plot Size
Default data buffer duration when plotting data in Preview mode, used
when creating a new plot or applying a Windows layout while in Preview.

The Measurement file close options are used to control how QTM closes files
automatically.
When opening another file
The current file is closed when opening another file (Open in the File
menu).

When beginning a new measurement
The current file is closed when starting a new measurement (New in the
File menu).

Use the Reset hidden message boxes button to display all message boxes
that have been hidden. E.g. the message box about the force plate position
after a calibration.

2D view settings

The 2D view settings page contains settings for objects that are shown in the 2D
view window. The settings are saved with the project and are applied both in
the RT/preview and on an opened file. You can use the Reset settings button if
you want to reset all of the settings to default.
Show linearized data
Toggle whether linearization is applied to the marker and video data in
the 2D views. The unlinearized data is the original data, while the

linearized data is the one that is used for 3D tracking. Therefore it is best
to use linearized data when looking at the 3D overlay, because otherwise
the 3D positions will not match the 2D positions.

NOTE: Marker masks and the red rectangle representing the image
size are not drawn linearized. This means that with a wide angle lens it
is best to turn off the Show linearized data option to see the true
positions of the mask and image size.

Show ray information


Enable the display of the trajectory color for the marker in the 2D views,
i.e. display which markers are used for a trajectory.

Enable audio playback


Enable audio playback for captured video from DV-cam and webcam.

Background color
Select the Background color of the marker views.

Marker color
Select the Marker color in the 2D views.

Hardware marker mask color


Select the Hardware marker mask color in the 2D views.

Software marker mask color


Select the Software marker mask color in the 2D views.

Marker display
All markers have the same size
Select whether the 2D markers are displayed with their actual size or
with the same size.

Marker size
Select the marker size in subpixels when All markers have the
same size is set to Yes.

3D overlay opacity [percent]


Set the opacity for the 3D overlay. The default is 50. A value of 0 makes the
overlay invisible, while 100 makes it almost fully opaque, which means that
the 2D data or video is hardly visible.

3D overlay elements
Select the 3D elements that will be displayed in 3D overlay. The following
elements can be controlled in the 3D overlay.
Markers, Marker traces, Bones, Grid, Axes, Cameras, Force
plates, Force arrow, Force trace, Rigid bodies, Volumes, Bound-
ing box, Gaze vector, Gaze vector trace, Skeletons, Static mesh
objects.

3D view settings

The 3D view settings page contains settings for objects that are shown in the 3D
view window and also for how to operate the 3D view. The settings are saved
with the project and are applied both in the RT/preview and on an opened file.
You can use the Reset settings button if you want to reset all of the settings to
default. Only the more complex settings are explained below; for an explanation
of the other settings, please check the description in the Project options dia-
log.

NOTE: The display of some of the 3D view elements can be changed dir-
ectly on the GUI Control toolbar.

Axes
Display settings for the global axes in the 3D view.
Show axes, Length of the axis [mm]

Background
Display setting for the background in the 3D view.
Background color

Bones
Display settings for the bones in the 3D view.
Show bones, Show AIM structure bones, Bone thickness [mm],
Bone default color

The Show AIM structure bones option toggles the display of the
AIM structure bones used to create an AIM model.

Cameras
Display settings for the cameras in the 3D view.
Show cameras, Show camera IDs

Crosshair at rotation center


Display settings for the crosshair shown when zooming and translating in
the 3D view.
Show crosshair, Crosshair color, Crosshair size [pixels]

Force vector
Display settings for the force vector in the 3D view.
Show force vector, Force vector color, Scale factor [mm/N],
Show force trace
The Scale factor option sets the size of the force arrow in relation
to the force in N.
Activate the force trace, also called force butterfly or Pedotti dia-

gram, with the Show force trace option. The force trace is shown
for the active measurement range.

Force plates
Display settings for the force plate in the 3D view.
Show force plates, Force plate color,

Gaze vector
Display settings for the Gaze vector of eye-trackers.
Show gaze vector, Gaze vector color, Gaze vector length [mm],
Show gaze vector trace and Gaze vector trace color.

Grid
Display settings for the grid in the 3D view.
Show grid, Show grid measure, Grid color, Automatic size, Grid
length [m], Grid width [m], Vertical offset [mm], Distance
between each gridline [mm], Number of subdivisions, Thicker
lines on outside and center
The Automatic size option will make the grid approximately the
same size as the lab area, i.e. the size is set by the camera positions.

Markers
Display settings for the markers in the 3D view.
Default unidentified marker color, Default labeled marker
color, Use global marker size setting, Marker size (mm), Show
marker labels, Show marker traces, Traces line style, Show tra-
jectory count, Show labeled trajectory information, Show
unidentified markers

The Use global marker size setting option will decide whether to
use the same marker size on all files or individual settings for each
file. Disable the option to use the marker size that was used when a
file was saved.

The Show trajectory count option will display the number of selec-
ted trajectories and the total number of trajectories in the current
frame at the bottom right corner of the 3D view window.

The Show labeled trajectory information option shows the status
of the labeling at the current frame in the upper right corner of the
3D view window.

OpenGL
OpenGL format selection mode
The default values will give the best quality of graphics. However if
there are any problems with the graphics first try to disable the anti-
aliasing, then try one of the other modes: Use options below or
Use explicit index.

Use full scene anti-aliasing if available


Anti-aliasing smooths the edges of the graphics, but it can also slow
down the playback. This feature is not available on graphic boards
where the driver doesn't support OpenGL 2.0.

OpenGL double-buffering (only available for Use options Below)


The default setting is to Use double buffering. But to reduce the
workload on the graphic board you can try Use front buffer only or
Use single buffer only.

Use HW acceleration (only available for Use options Below)


Disable the acceleration if the graphic board has no built-in memory.

OpenGL format number (only available for Use explicit index)


Set the index for the OpenGL format, however some indexes might
not work at all. To see a list of the available formats click on the
button.

Rays
Display settings for camera tracking rays in the 3D view.
Enable camera rays
Show/hide camera rays

Length of ray excess [mm]


Additional length of ray beyond the calculated intersection point.

Rigid bodies
Display settings for the rigid bodies in the 3D view.

Show rigid bodies, Show coordinate system axes, Length of the
coordinate system axes [mm], Show rigid body points as mark-
ers, Show rigid body labels, Show rigid body wireframe, Show
rigid body mesh
Activate the display of the rigid body points in the 6DOF definition,
with the Show rigid body points as markers option. The markers
are calculated from the 6DOF definition and therefore they will be
displayed even if the corresponding marker is hidden.

Skeletons
Display settings for skeletons in the 3D view.
Show skeletons, Show skeleton labels, Skeleton color, Skeleton
thickness, Show segment labels, Show segment coordination
axes, Show segment markers, Show segment constraints, Con-
straint tolerance range [%], Segment constraint color

Static mesh objects


Display settings for static mesh objects in the 3D view.
Show static mesh objects

Volumes
Display settings for the display of volumes in the 3D view, for more inform-
ation see chapter "Volumes in 3D views" on page 125 and "Camera view
cones in 3D views" on page 128.
Show covered volume, Cut covered volume at floor level, Cam-
eras required to consider volume covered, Show calibrated
volume, Enable camera view cones, Length of camera view
cones [m], Show bounding box

Static mesh objects

The Static Mesh Objects page contains a list of static 3D mesh objects that are
shown in the 3D view window in the project. Mesh objects are Wavefront 3D
object files.
Static meshes in this list are associated with and saved in the project. The 3D
meshes are not stored in QTM files, which means that the whole project must
be shared to include the meshes when opening a file.
The static meshes for a project will be shown in the 3D view window inde-
pendent of the file or measurement being displayed. Use the Show static
mesh objects option on the 3D View Settings page to disable the display.

NOTE: Using really large meshes can slow down the rendering in QTM.

The buttons can be used to manage the list or access and edit the settings for
the objects:

Add
Add a new object to the Mesh Objects list. Choose an .obj file in the
Static Mesh Settings dialog that opens and set the properties.

NOTE: When adding a mesh, you may need to change the scale
factor since there is no standard for how to represent the size in the
.obj file.

Edit
Open the Static Mesh Settings dialog for inspecting or modifying the
Mesh Object settings.

Remove
Remove the selected mesh object from the list.
Static Mesh Settings dialog

The Static Mesh Settings dialog is used to inspect and modify the static mesh
object settings.
The dialog contains the following settings:
Filename
A drop-down list of mesh objects in the Meshes folder of the project. Copy
the .obj file, and any associated .mtl or image file, manually to the Meshes
folder to use the mesh in the project. The path to the Meshes folder can
be set on the Folder options page, see chapter "Folder options" on
page 427. For information about obj files and the features supported, see
chapter "Compatibility of meshes" on the next page.

Position
The 3D translation of the object relative to the global coordinate system.
The values of X, Y and Z are specified in mm units.

Rotation
The 3D rotation of the object relative to the global coordinate system. The
values for Roll, Pitch and Yaw are specified in degrees around the X, Y
and Z axes, respectively.

Scale
The scale factor of the mesh object.

Opacity
The opacity of the mesh object from 0 (transparent) to 1 (opaque). An .obj
file can include an opacity option for individual faces; these will be
rendered as transparent even if the Opacity option is set to 1.

The buttons:
Reset
Remove the link to the file and reset the Position, Rotation and Scale
values.

OK
Accept the settings and close the dialog.

Cancel
Discard the settings and close the dialog.
Compatibility of meshes

A mesh file consists of a large number of faces and each individual face can be
associated with materials and textures, but not all of the features applied to a
face are rendered in QTM. Complex .obj files may not be fully supported, in
which case those faces and textures are not rendered. Some known limitations are:
l The material is only rendered for the front side of a face. If a material doesn't
show, the reason can be that it is applied to the back side of the face.

l Only four faces can be included on one row in the .obj file. If there are more
than four faces on a row then none of them are rendered.

l The paths to the material and texture files are sometimes written as absolute
paths in the .obj and .mtl file. So if the material or texture is not applied to
the mesh, check that the paths are relative and the file names are correct.

Miscellaneous

Folder options

The Folder options page contains the settings for the locations of the following
file types saved by QTM.
The Project files locations are different for each project.
Calibration folder
The Calibrations folder is by default set to a folder called Calibrations in
the Project folder. It can be changed to another folder with the Browse
option, for example if you want all of the calibrations in different projects
to be saved in the same folder. However, then you must change it manu-
ally in each project.

AIM models folder
The AIM models folder is by default set to a folder called AIM models in
the Project folder. It can be changed to another folder with the Browse
option, for example if you want all of the AIM models in different projects
to be saved in the same folder. However, then you must change it manu-
ally in each project.

Meshes folder
The Meshes folder is by default set to a folder called Meshes in the Pro-
ject folder. It can be changed to another folder with the Browse option,
for example if you want to use the same meshes in different projects.
However, then you must change it manually in each project.

The Global directories are used by all projects.


Default path for new projects
The default path is used when you create a new project. Change it with
the Browse option, if you do not want the Documents folder to be the
default folder to save the projects in.

Temporary video storage


The Temporary video storage is used for storage of the temporary video
files of Qualisys video and external video cameras. It can be changed to
another folder with the Browse option, which can be useful if the actual
file will be stored on another hard-drive than the temporary file.

Auto save location


The auto save location is used for storage of the temporary auto saved
captures in case Auto backup is enabled under Capture actions on the
Processing page. It can be changed to another folder with the Browse
option, which can be useful if the actual file will be stored on another
hard-drive than the temporary file.

Linearization files folder


The camera linearization files are stored in a folder that can be accessed
by all users of the computer.

Startup

The options on the Startup page changes how projects are loaded when QTM
starts. The default is that no project is loaded automatically and that you have
to select a project from the Open project dialog, see chapter "Manage pro-
jects" on page 72.
However, if you want to load a project automatically, then activate the option
Automatically load a project when QTM starts and select one of the fol-
lowing two options.
The most recent project
Loads the project that was last opened in QTM.

This project
Loads a selected project when QTM starts. Select the project with the
Browse option.

Events

The Events page contains settings for Event shortcuts. For information on
how to set and edit events see chapter "How to use events" on page 706.

Event shortcuts

The Event shortcuts options are for creating default event names that can be
easily accessed in the Add events dialog when creating an event, see chapter
"Adding events" on page 706. The shortcuts are enabled in the dialog with the
option Show shortcut list when creating events.
Use the Add Shortcut and Remove Shortcut buttons to add/remove shortcuts
to/from the list. The Event name and Color can be edited in the list.

Scripting

The Scripting page contains the settings for the terminal and the use of script
files in QTM. For more information about the QTM Scripting Interface, see
chapter "QTM Scripting Interface" on page 1017.
Terminal settings
Language
Choose the language used in the terminal window. The choices are
Python or Lua.

Text color
Define the color of the text used in the terminal window.

Script files

The script files section contains a list of script files that are loaded when
starting the project. Use the check box to activate or deactivate the script
files.
Add
Open a file dialog for adding script files to the list. You can use mul-
tiple select for adding multiple scripts at once.

Remove
Remove a script file from the list.

System setup

System hardware

Qualisys camera types


Qualisys cameras are available for a wide range of capture applications and
environments. The main camera platforms are Arqus, Miqus and Oqus.
For an overview of the currently available camera platforms and types, refer to
the Qualisys website: https://www.qualisys.com/cameras/
For detailed information about features and specifications, refer to chapter
"Qualisys technical reference" on page 926.
Marker cameras

Qualisys offers a variety of marker cameras for Arqus, Miqus and Oqus sys-
tems. For an overview and specifications of the various types and models, see
"Qualisys camera sensor specifications (marker mode)" on page 926.
The marker cameras calculate the positions and sizes of the detected markers
on the camera, which then are sent to the computer over the camera network.
This leads to a great reduction of information, which allows the cameras to be
connected in a daisy chain. The edges of the markers are detected by using an
intensity threshold. The center and size of the markers are calculated in sub-
pixels, a subdivision of each pixel into 64 units per dimension.
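
A worked example of the sub-pixel resolution (the coordinate values are
hypothetical):

    # A 2D marker centre reported in sub-pixels, 64 sub-pixels per pixel and dimension.
    x_subpixels, y_subpixels = 32000, 16032

    x_pixels = x_subpixels / 64   # 500.0
    y_pixels = y_subpixels / 64   # 250.5
    print(x_pixels, y_pixels)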
The following features are available for Qualisys marker cameras:
Sensor mode
Most marker cameras have the possibility to use different sensor modes.
For an overview of available sensor modes per camera model, see
"Qualisys camera sensor specifications (marker mode)" on page 926. The
available sensor modes are:
High-resolution mode: in high-resolution mode the full resolution
of the sensor is used. This is the default sensor mode.

High-speed mode: in high-speed mode one fourth of the pixels is
used, allowing for increased capture rates at full field of view.

Marker size filtering


Size filtering can be used to filter markers that are smaller or larger than a
set minimum or maximum threshold.

Marker masking
Marker masking can be used to block marker detection at selected areas
of the sensor. For more information, see chapter "Marker masking" on
page 536.

Circularity filtering (Oqus only)


Filtering of markers based on their circularity level. This feature is sup-
ported by all Oqus marker cameras. For more information, see chapter
"Marker circularity filtering (Oqus only)" on page 541.

Exposure delay
Exposure delay can be used to improve marker detection in setups with
opposing cameras by shifting the exposure of one or more cameras rel-
ative to other cameras in the system. For more information, see chapter
"Delayed exposure to reduce reflections from other cameras" on
page 534.

Active filtering
Active filtering can be used to improve marker detection in environments
with high background illumination. Active filtering works by capturing
each image twice to remove the background light. This significantly
increases the ability to capture passive markers in an outdoor envir-
onment. Active filtering is supported by all camera types, except the Oqus
3 and 5-series cameras. For more information, see chapter "Active fil-
tering for capturing outdoors" on page 539.

Marker types
All marker cameras support the use of different types of markers, namely
passive and several types of active markers. For more information, see
chapter "Choice of markers" on page 529.

Video mode
All marker cameras can be used in video mode. This can be helpful, for
example for pointing the cameras when setting up the system. Most

cameras have the possibility to stream compressed video images. For an
overview, see "Qualisys video sensor specifications (in-camera MJPEG)" on
page 927.
Video cameras

Qualisys offers a variety of cameras for recording video.


The main types of video cameras are:
l Video cameras for streaming video, see chapter "Streaming video" below.

l A hybrid camera that can be used both as a marker camera and a color video
camera, see chapter "Miqus Hybrid" on the next page.

l High-speed video cameras for recording of buffered high-speed video


sequences, see chapter "Oqus high-speed video camera" on page 436.

For more information on how to use Qualisys video cameras, see chapter
"Qualisys video capture" on page 574.

Streaming video

Streaming video can be recorded with Miqus Video, Miqus Video Plus, or the
Oqus color video camera (Oqus 2c). When using streaming video, compressed
video data is sent to QTM during the capture, allowing for long video captures.
For an overview of the sensor specifications of all cameras supporting stream-
ing video, see chapter "Qualisys video sensor specifications (in-camera MJPEG)"
on page 927.
For more information on capturing streaming video, see chapter "Capture
streaming video" on page 576.

Miqus Video

The Miqus Video series is a dedicated video camera for capturing of MJPEG com-
pressed video. The following types of video cameras are available: Miqus VM
(monochrome), Miqus VC and Miqus VC+ (color). The Miqus Hybrid camera can
also be used as a color video camera. The Miqus Video series is configured with
a white strobe and a built-in filter that blocks the IR light for a better image
quality. The camera is always configured for video using In-camera MJPEG com-
pression for effective streaming. For information about maximum video

capture frequencies at different presets, see chapter "Maximum capture rate
for streaming video" on page 578. For detailed specifications, see "Miqus Video
specifications" on page 942.

NOTE: On newer Miqus Video cameras, one of the LEDs in the strobe is
infrared, so that it can be used to trigger the active calibration kit in a
video-only system.

It is possible to use up to three Miqus Video cameras in a single Qualisys sys-


tem. For the use of more than three Miqus Video cameras a special setup is
required, see chapter "Connecting a Miqus Hybrid or Video system for mark-
erless mocap" on page 452.

Oqus color video camera (2c-series)

The Oqus color video camera (2c-series) is a dedicated video camera for cap-
turing of MJPEG compressed video. It is configured with a clear glass to let in
the visible light. There is a filter on the lens that filters out the IR light for a
better image, and the strobe is white.
The camera is always configured for capturing streaming video (MJPEG) by
default. When switching off the In-camera MJPEG compression, it can also be
used as a high-speed video camera. For more information, see chapter "Cap-
ture high-speed video" on page 579.

Miqus Hybrid

The Miqus Hybrid camera is a two-in-one camera which can be used for both
marker tracking and color video recording. The dual functionality of the camera
makes it especially useful for the exploration of markerless applications.
In Marker mode, the camera uses the near IR strobe (850 nm) for the illu-
mination of markers. The resolution and frame rate are similar to the Miqus M3
camera.
In Video mode, the specifications are similar to the Miqus Color Video camera.

IMPORTANT: On the use of Miqus Hybrid:
l The Miqus Hybrid camera is designed for indoor use. In Marker
mode, the camera is more sensitive to ambient light compared with
dedicated marker cameras. In Video mode colors may become
washed out due to infrared light from the sun.
l It is discouraged to use the Miqus Hybrid cameras in Video mode
together with other cameras in Marker mode. This will lead to
washed out colors due to the near IR illumination from the other
cameras.

Oqus high-speed video camera

The high-speed video version of an Oqus camera is adapted to capture full-
frame, full-speed, full-resolution high-speed video. In this configuration the
camera is therefore equipped with a large buffer memory and a clear front
glass to get the best possible performance out of the image capture.
The clear front glass is mounted so that all wavelengths can be captured by
the camera. The normal dark front glass is an IR filter that removes the visible
light. However, the high-speed version is also delivered with a removable IR fil-
ter on the lens, which is important to mount when the camera is in marker
mode, because the data is improved when the visible light is removed to
increase the contrast between the background and the marker. For instruc-
tions on how to get access to the lens see "How to change strobe unit" on
page 970.
For more information on how to capture high-speed video, see chapter "Cap-
ture high-speed video" on page 579.
Cameras for special environments

Qualisys provides cameras that are adapted for special environments.

Weather protected cameras

Qualisys provides weather protected cameras that are specially adapted for out-
door use or use in industrial environments. The housing, including connectors
and cabling, is IP67/NEMA 6 classified, making the whole system water- and
dust resistant.

For the currently available models, see the Qualisys website:
https://www.qualisys.com/weatherproof-motion-capture/
Additional features for improving outdoor measurements are:
l The sun filter, a narrow band pass filter that effectively filters out the sun
light while maximizing the visibility of the IR reflections of the markers. The
sun filter is included with all Arqus protected cameras.

l Active filtering, the possibility to subtract background illumination from the


reflected light of the markers.

l An extended operating temperature range of -15° to 45°C for Arqus pro-


tected cameras.

l The possibility to use long-range active markers.

Underwater systems

Qualisys provides the possibility to measure under water (e.g. in indoor ocean
basins used for ship scale model testing or ordinary basins used for water
sports) by using specially modified cameras. The camera housing is IP68 clas-
sified and pressure tested to 5 bar (40 m depth) and corrosion-protected for
use in salt water tanks or chlorinated swimming pools. Weight and volume are
balanced to give the camera neutral buoyancy for easy handling in water.
Qualisys underwater cameras are equipped with a special strobe with high
power cyan LEDs. These LEDs are not limited to a flash time of 10% of the period
time as the regular strobe. Therefore the exposure time can be set to almost
the period time. The long exposure times are needed to get enough light in the
water. Because the water absorbs more light than air, it also means that the
measurement distance is more dependent on the exposure time.
The cameras are connected with underwater cables to a connector unit on
land. One connector unit can be used to connect up to three cameras. For
more than three cameras the respective connector boxes can be chained
together or connected via an Ethernet switch.
For the currently available models, see the Qualisys website:
https://www.qualisys.com/cameras/underwater/.

NOTE: The FOV is changed by the refraction index of water (1.33). For
example, a lens with 40º FOV is reduced to 31º FOV when used under-
water.
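
The reduction quoted in the note can be approximated with Snell's law, assuming
a flat port (a sketch of the arithmetic, not a QTM calculation):

    import math

    fov_air = 40.0    # degrees, lens FOV in air
    n_water = 1.33    # refractive index of water

    half_angle_water = math.asin(math.sin(math.radians(fov_air / 2)) / n_water)
    fov_water = 2 * math.degrees(half_angle_water)
    print(f"{fov_water:.0f} degrees")   # about 30 degrees, close to the 31 degrees quoted above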

CAUTION: When using an underwater camera above water the strobe
unit can easily overheat. Do not have the camera connected for more
than a couple of minutes when testing the camera out of water. Do not
have the camera in preview or capture longer than 30 seconds and do
not use exposure times longer than 1000 ms.

MRI compatible cameras

Qualisys provides cameras, including cabling and accessories adapted for use
inside MRI rooms. A minimal camera system of only three cameras can be suf-
ficient to track hand and finger movements outside the scanner bore. A larger
six camera system can provide the possibility to look into the bore, for example
for monitoring the movements of the head during a scan.
The housing is made of die-cast aluminium, steel fasteners are replaced by
brass screws and the calibration L-frame and wand are made from glass fiber.
The camera and cables are completely shielded to keep electromagnetic emis-
sions to a minimum, enabling cameras to operate just a meter away from
the scanner without causing artifacts in the MRI image. The camera system is
connected with the computer in the control room via optical Ethernet com-
munication. The power supply is outside the room and connected via a filter
through the MRI room panel.
For the currently available models, see the Qualisys website:
https://www.qualisys.com/cameras/arqus-mri/

Qualisys accessories
Qualisys offers a range of accessories for motion capture applications, includ-
ing:

l Markers

l Marker accessories (clusters, tape, spare parts)

l Calibration kits

l Traqr - Active and passive rigid bodies

l Sync units, analog interfaces, triggers, etc.

l Mounting equipment

l Computers

l Motion capture suits

For an overview of the currently available accessories, refer to the Qualisys web-
site: https://www.qualisys.com/accessories/
For technical specifications of selected products (calibration kits, active mark-
ers), see chapter "Qualisys accessories specifications and features" on
page 995.
Traqr Configuration Tool

The Traqr Configuration Tool is used for the configuration of the Active Traqr
and the Naked Traqr.
The Traqr Configuration Tool is downloaded from your Dashboard on the
Qualisys website. For more information about how to use the tool see its
manual, which can be found on the Help page in the tool.

Setting up the capture space


The following chapters contain information about planning and setting up the
capture area and mounting the cameras.

Camera positioning
Cameras must be mounted firmly on tripods or other stable structures, which
isolate the camera from movements or vibrations of any sort.

3D motion capture

To capture 3D data the camera system must consist of at least 2 cameras. The
guidelines below can be used to set up a camera system.
l The best possible setup for a 3D motion capture system is to position it so
that all cameras can see the L-shaped reference structure during the cal-
ibration, see chapter "Calibration of the camera system" on page 543.

NOTE: The cameras can be positioned so that just two of the cam-
eras are able to see the calibration reference object. The rest of the
cameras must then overlap each other's field of view (FOV) to be
able to calibrate the system. For this setup QTM will automatically
use the Extended calibration method, see chapter "Extended cal-
ibration" on page 549.

l To reconstruct 3D data, at least two cameras must see each marker dur-
ing the measurement. Therefore, it is best to position the cameras so that
as many cameras as possible see each marker during the measurement.
l The angle of incidence between any two cameras should ideally be more
than 60 degrees, but at least 30 degrees. The accuracy of the 3D data cal-
culated from only two cameras placed at less than 30 degrees can
degrade below usable levels (see the sketch after this list).
l In order to avoid unwanted reflections, position the cameras so that every
camera’s view of flashes from other cameras is minimized. E.g. put the
cameras above the measurement volume so that the cameras have an
angle of about 20 degrees in relation to the floor.
l Obviously the cameras must also be positioned so that they view the
volume where the motion of the measurement subject will occur. Mark
the volume by putting markers in the corners of the base of the meas-
urement volume. Then make sure that all cameras can see all of the mark-
ers by looking at a 2D view window in preview mode. Preferably, the
cameras should not see much more than the measurement volume.
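
A minimal sketch for checking the angle of incidence between two cameras for a
point in the volume (camera and marker positions are hypothetical example
values, given in metres in the global coordinate system):

    import numpy as np

    cam_a = np.array([3.0, 0.0, 2.5])    # camera positions (example values)
    cam_b = np.array([0.0, 3.0, 2.5])
    marker = np.array([0.5, 0.5, 1.0])   # a point in the measurement volume

    v_a = cam_a - marker
    v_b = cam_b - marker
    cos_angle = np.dot(v_a, v_b) / (np.linalg.norm(v_a) * np.linalg.norm(v_b))
    angle = np.degrees(np.arccos(cos_angle))
    print(f"Angle of incidence: {angle:.0f} degrees")   # about 92 degrees here, well above 30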
2D motion capture

In 2D motion capture just a single camera is needed, which is positioned per-
pendicular to the plane that is captured.

Connecting the system
There are three main Qualisys system platforms, Arqus, Miqus and Oqus, each
with several types of cameras or other devices. All devices are compatible and
can be connected and combined in many ways. The following chapters provide
guidelines for how to set up different system configurations.
Before connecting a Qualisys camera system, make sure that the QDS (Qualisys
DHCP server) is running on the computer and that the network interface set-
tings are correct, see "QDS" on page 462 and "Network card setup" on
page 461, respectively.
For further information about optimizing the camera setup and settings, see
chapter "Optimizing the camera settings" on page 479.

Connecting the cameras


Technical notes

It is possible to combine Qualisys devices in a system in many ways. However,


there are some technical limitations related to the type of cables used, the
power requirements of the cameras and the bandwidth. The technical aspects
of connecting different types of cameras are summarized below.

Power and camera cable requirements

Inter-chain mixing
Cameras using the same cable type can be mixed together in a chain,
without the need of a network switch. To mix cameras of different cable
types you need to use a switch. For an overview of cable types, see the
Table below.

Power kit capacity


Due to varying power consumption, the number of cameras a power kit
can power depends on the camera type and cable length. For an overview
of the maximum number of cameras per power kit and the maximum
total cable length per chain, see the Table below.

Maximum chain sizes in homogeneous systems
The number of cameras that can be connected in a single daisy chain is
limited. The maximum chain size per camera type is indicated in the Table
below. To build a camera system with a larger number of cameras you
need to use multiple chains, which are joined together with a network
switch.

Maximum total cable length


The maximum total cable length per power supply is limited for Arqus and
Miqus systems. The maximum length depends on the type of power sup-
ply used. For the revision 2 (R2) power supply, the maximum cable lengths
are displayed in the Table below. For the revision 1 (R1) power supply, the
maximum total cable length is 50 m for Arqus and 120 m for Miqus sys-
tems.

NOTE: Older Miqus cameras are not compatible with the R2 power
supply. Please contact Qualisys support for more information.

Camera type        Cable type   Max. power kit capacity   Max. chain size   Max. total cable length [m]
Arqus              A            5                         20                75
Arqus Protected    B            5                         20                75
Miqus              A            10                        20                140
Miqus Video        A            10 (1)                    3 (2)             140
Miqus Hybrid       A            10 (1)                    3                 140
Oqus               C            5                         15                -

(1) Even though a power supply can power up to 10 Miqus Video/Hybrid cameras, it is
recommended to use one power/data chain per three cameras.
(2) When combined with marker cameras, the maximum number of Miqus Video cameras
is 2 per chain.

NOTE: You can add a Camera Sync Unit to the chain beyond the
maximum power kit capacity or maximum chain size.

Mixing Arqus and Miqus

Arqus and Miqus cameras can be mixed in the same chain. Due to differences
in power consumption, one Arqus camera can be interchanged with two Miqus
cameras and vice versa. Supported combinations per power kit (see also the
sketch after this list):
l 4x Arqus + 2x Miqus

l 3x Arqus + 4x Miqus

l 2x Arqus + 6x Miqus

l 1x Arqus + 8x Miqus
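
A small helper sketch for checking a planned chain against this rule (the budget
of 10 Miqus-equivalents per power kit is inferred from the combinations listed
above and is an assumption, not a specification):

    def chain_fits_power_kit(n_arqus, n_miqus, budget=10):
        # One Arqus counts as two Miqus according to the rule above.
        return 2 * n_arqus + n_miqus <= budget

    for arqus, miqus in [(4, 2), (3, 4), (2, 6), (1, 8), (5, 1)]:
        ok = chain_fits_power_kit(arqus, miqus)
        print(f"{arqus}x Arqus + {miqus}x Miqus: {'OK' if ok else 'too much'}")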

Adding Miqus Hybrid or Miqus Video cameras

Power consumption wise, a Miqus marker camera can be swapped out for a
Miqus Hybrid or Miqus Video camera. However, due to bandwidth require-
ments it is recommended to limit the number of video cameras in a chain to a
maximum of two video cameras.
If more video cameras are needed, they should be added in a separate chain
and joined through a network switch.

IMPORTANT: If you need more than three video cameras, the faster 10
Gigabit switch is required.

Connecting a Qualisys system through an Ethernet switch

For most systems the recommended setup is a daisy chain configuration.


However, Qualisys systems can also be connected through an Ethernet switch.
The use of a switch allows for more flexibility in setting up and combining
Qualisys devices. The switch should have a bandwidth of at least 1 Gigabit. For
systems with more than three Miqus video cameras a 10 Gigabit switch is
required. The Ethernet cables must be Cat 6A or better.
To setup the system with an Ethernet switch, follow these instructions:

1. Check that the computer has an Ethernet card with at least the same
bandwidth as the switch.
2. Connect the Ethernet card with the switch. You can use any of the ports
on the switch.
3. Use the standard network card settings as described in the chapter "Net-
work card setup" on page 461.
4. Connect the respective daisy chains of Qualisys devices to one port each.

IMPORTANT: Do not connect the different daisy chains to each other. It will disrupt the communication and the switch will be of no use.

The use of a switch is required in the following cases:


Large camera systems
When the number of cameras in a system exceeds 20 for Arqus and
Miqus or 15 for Oqus, the daisy chains should be split up into smaller
chains.

More than two Qualisys video cameras in a system


When there are more than two video cameras in a Qualisys system a
switch is needed for optimizing the data communication. For Oqus high-
speed cameras, the fetching time can be greatly reduced by connecting a maximum of 2-3 cameras per port.

Systems with more than three Miqus Video or Miqus Hybrid cam-
eras
This type of setup requires a 10 Gigabit switch with a maximum of three
cameras per port.
Connecting an Arqus system

An Arqus system consists of the following components:


Arqus cameras
Arqus cameras of one or several types.

Camera cables
The number of camera cables is the same as the number of cameras.

One or more Power kits


A power kit consists of a power supply, a power injector and a host (Eth-
ernet) cable. One power kit can power up to 5 Arqus cameras.



Camera Sync Unit (optional)
A Camera Sync Unit is required for synchronization with external equip-
ment.

Gigabit Ethernet switch (optional)


The use of a switch is required for systems with more than 20 Arqus cam-
eras.

The Arqus system is easy to connect. The backside of each camera contains two
Data/Power ports, see chapter "Arqus camera: back side" on page 934. In a
basic setup with up to 5 cameras, the cameras are connected to each other by
camera cables in a daisy chain configuration. For the maximum number of cam-
eras and the maximum cable length per power supply, see chapter "Power and
camera cable requirements" on page 441.
The first camera or Camera Sync Unit is connected via a Power injector to the
power supply and the computer as follows:

1. Connect the power supply to the Power port on the power injector.

2. Connect one end of the host cable to the Data port on the power
injector and the other end to the Ethernet port of the computer.
3. Connect one end of the camera cable to the Camera port on the power
injector and the other end to a Power/Data port of the first camera or
the Camera Sync Unit.
For larger systems, start a new power chain when the maximum of 5 cameras
per power chain is reached. Connect the first camera of the new power chain
(camera 6, 11, 16) via a Power injector as follows:

1. Connect the power supply to the Power port on the power injector.

2. Connect one end of the host cable to the Data port on the power
injector and the other end to a Power/Data connector of the previous
camera.
3. Connect one end of the camera cable to the Camera port on the power
injector and the other end to a Power/Data connector of the first camera
in the next power chain.
For systems with more than 20 cameras, the use of a switch is required. The
system can then be subdivided in chains with up to 20 cameras. The chains are
connected with their respective host cables to a switch, which in its turn is con-
nected to the computer.



The Camera Sync Unit can be placed anywhere in the chain, but in most cases
it will be practical to have it close to the computer.
When the cables have been connected correctly, the indicator LEDs at the
Arqus data/power ports will indicate the status of the power and data con-
nection. For more information, see chapter "Arqus camera: back side" on
page 934.

The Arqus startup sequence

Before you connect the Arqus camera system, make sure that the QDS
(Qualisys DHCP server) is running and that the network interface settings are
correct. This is needed for the cameras to receive an IP address from QDS to
communicate with other Qualisys cameras and the host computer. For more
information, see "QDS" on page 462 and "Network card setup" on page 461.
The general Arqus startup sequence is as follows (total duration about 50 s):

1. Connect the power to the cameras.

2. Booting of the cameras. The amber LED ring is lit during booting.

3. The camera receives an IP address.


If the camera does not receive an IP address the amber LED ring pulses.
Possible reasons for this might be that the system is not connected to the
computer, or that QDS is not running. For instructions on how to search for
the error, refer to chapter "Troubleshooting connection" on page 1022.
4. The cameras synchronize to the master camera. The status LED blinks
green during synchronization. When synchronization is completed it lights
solid green.

Setting aperture and focus

The Arqus A12 with standard lens option has a motorized lens, which can be
controlled from QTM via the Lens Control interface in the Camera Settings
sidebar in the 2D View window, see "Camera settings sidebar" on page 91.



IMPORTANT: When using motorized lenses, it is important to keep
some space between the strobe and the lens, otherwise the strobe may
block the lens when changing focus.

For Arqus cameras with a manual lens, you need to extend the strobe mech-
anics to get access to the focus and aperture rings of the lens. The strobe mech-
anics can be shifted as follows:

1. Press and hold the Strobe unlock button at the backside of the Arqus
camera.

2. Shift the strobe mechanics outwards to expose the lens for adjustment.

3. Shift the strobe mechanics inwards again when done. You may have to
fine adjust the position of the strobe mechanics so that it locks into the
dimples on the strobe rails.

IMPORTANT: The strobe should always be in closed position during measurement to achieve the best strobe light distribution, and to make sure that the strobe does not block the peripheral view of the lens.

For general recommendations on setting aperture and focus, see chapter "Tips on setting aperture and focus" on page 481.



Connecting a Miqus system

A Miqus system consists of the following components:


Miqus cameras
Miqus cameras of one or several types.

Camera cables
The number of camera cables is the same as the number of cameras.

One or more Power kits


A power kit consists of a power supply, a power injector and a host (Eth-
ernet) cable. One power kit can power up to 10 Miqus cameras.

Camera Sync Unit (optional)


A Camera Sync Unit is required for synchronization with external equip-
ment.

Gigabit Ethernet switch (optional)


The use of a switch is required for systems with more than 20 Miqus cam-
eras.



The Miqus system is easy to connect. The backside of the camera contains two
Data/Power ports, see chapter "Miqus camera: back side" on page 945. In a
basic setup with up to 10 cameras, the cameras are connected to each other by
camera cables in a daisy chain configuration. For the maximum number of cam-
eras and the maximum cable length per power supply, see chapter "Power and
camera cable requirements" on page 441.
The first camera or Camera Sync Unit is connected via a Power injector to the
power supply and the computer as follows:

1. Connect the power supply to the Power port on the power injector.

2. Connect one end of the host cable to the Data port on the power
injector and the other end to the Ethernet port of the computer.
3. Connect one end of the camera cable to the Camera port on the power
injector and the other end to a Power/Data port of the first camera or
the Camera Sync Unit.
For larger systems, start a new power chain when the maximum of 10 cameras
per power chain is reached. Connect the first camera of the new power chain
(camera 11) via a Power injector as follows:

1. Connect the power supply to the Power port on the power injector.

2. Connect one end of the host cable to the Data port on the power
injector and the other end to a Power/Data connector of the previous
camera.
3. Connect one end of the camera cable to the Camera port on the power
injector and the other end to a Power/Data connector of the first camera
in the next power chain.
For systems with more than 20 cameras, the use of a switch is required. The
system can then be subdivided in chains with up to 20 cameras. The chains are
connected with their respective host cables to a switch, which in its turn is con-
nected to the computer.
The Camera Sync Unit can be placed anywhere in the chain, but in most cases
it will be practical to have it close to the computer.
When the cables have been connected correctly, the indicator LEDs at the
Miqus data/power ports will indicate the status of the power and data con-
nection. For more information, see chapter "Miqus camera: back side" on
page 945.



The Miqus startup sequence

Before you connect the Miqus camera system, make sure that the QDS
(Qualisys DHCP server) is running and that the network interface settings are
correct. This is needed for the cameras to receive an IP address from QDS to
communicate with other Qualisys cameras and the host computer. For more
information, see "QDS" on page 462 and "Network card setup" on page 461.
The general Miqus startup sequence is as follows (total duration about 50 s):

1. Connect the power to the cameras.

2. Booting of the cameras. The amber LED ring is lit during booting.

3. The camera receives an IP address.


If the camera does not receive an IP address the amber LED ring pulses.
Possible reasons for this might be that the system is not connected to the
computer, or that QDS is not running. For instructions on how to search for
the error, see chapter "Troubleshooting connection" on page 1022.
4. The cameras synchronize to the master camera. The status LED on the
front of the camera blinks green during synchronization. When syn-
chronization is completed it lights solid green.

Setting aperture and focus

All Miqus cameras are equipped with a manual lens. You need to extend the
strobe mechanics to get access to the focus and aperture rings of the lens. The
strobe mechanics can be shifted as follows:

1. Unlock the strobe mechanics with the Lock lever.

2. Shift the strobe mechanics outwards to expose the lens for adjustment.



3. Shift the strobe mechanics inwards again when done and lock it with the
lock lever. You may have to fine adjust the position of the strobe so that
the lock lever locks into the dimples on the strobe rails.

IMPORTANT: The strobe should always be in closed position during measurement to achieve the best strobe light distribution, and to make sure that the strobe does not block the peripheral view of the lens.

For general recommendations on setting aperture and focus, see chapter "Tips on setting aperture and focus" on page 481.
Connecting a Miqus Hybrid or Video system for markerless mocap



Miqus Hybrid or Miqus Video cameras can be used in systems for markerless
mocap. A markerless mocap system consists of the following components:

Miqus Hybrid or Miqus Video cameras
Miqus Hybrid or Miqus Video cameras or a combination of both.

Camera cables
The number of camera cables is the same as the number of cameras.

One or more Power kits


A power kit consists of a power supply, a power injector and a host (Eth-
ernet) cable. One power kit can power up to 10 Miqus cameras. However,
since the maximum number of Miqus Hybrid or Miqus Video cameras per
chain is limited to three, it is more practical to use one power supply per
chain of three cameras.

Camera Sync Unit (optional)


A Camera Sync Unit is required for synchronization with external equip-
ment.

10 Gigabit Ethernet switch


The use of a 10 Gigabit switch is required for systems with more than
three Miqus Video or Miqus Hybrid cameras.

Desktop Markerless Computer


A specially configured computer is needed with a 10 Gigabit network
adapter and a disk configuration optimized for handling large amounts of
video data.

A system with Miqus Hybrid or Miqus Video is connected in the same way as a
Miqus system. However, the maximum chain size is limited to three cameras.
The chains must be connected via a 10 Gigabit switch to the Desktop Mark-
erless Computer.
The maximum number of cameras in a system that can be used at Full FOV and
full capture rate depends on the computer processor. For specifications, see
the table below.



Processor                                  Max number of Miqus         Max number of Miqus
                                           Video cameras               Video Plus cameras
                                           (full FOV and frame rate)   (full FOV and frame rate)

Intel Core i9 10900K                       16                          Not tested
Intel Core i9 11900K                       24                          Not tested
Intel Core i9 12900K and newer             40                          Not tested
AMD Ryzen 9 9950X                          40                          27
Intel Core i9 11950H (Laptop)              12                          Not tested
Intel Core i9 12900H and newer (Laptop)    36                          24

NOTE: If the number of cameras exceeds the maximum, one or more video cameras may black out during a recording. However, it is still possible to do recordings by reducing the capture rate, image size or resolution. Alternatively, you can split up the system into multiple
interconnected systems, see chapter "Setting up multiple video systems"
on page 525.



Connecting an Oqus system

An Oqus system consists of the following components:


Oqus cameras
Oqus cameras of one or several types.

Hybrid Power/Data cables


Cables carrying both power and data between cameras.

Data cables
Cables carrying data between cameras.

Host cable
Cable carrying data between first camera in the chain and computer or
switch.

One or more power supplies


One power supply can power up to five Oqus cameras.



Trigger/Sync splitter (optional)
A Trigger/Sync splitter cable can be connected to the Control port of an
Oqus camera for synchronization with external devices. It is also possible
to add a Camera Sync Unit to the system. However, this requires the use
of an Ethernet switch.

Gigabit Ethernet switch (optional)


The use of a Gigabit Ethernet switch is required for systems with more
than 15 Oqus cameras or if you have more than two Oqus 2c or Oqus
high-speed video cameras.

The Oqus system is easy to connect. The connectors are unique and cannot be
connected to the wrong ports. Further, the connector color matches that of the
port. The DATA connector can be connected to any of the two DATA ports, and
the POWER connector can be connected to any of the two POWER ports, so it
does not matter on which side you put the connector. For more information on
the connectors, see "Oqus camera connectors" on page 965.

NOTE: When the cables have been connected correctly the LEDs on the
back of the Oqus will be lit. The EXT LED will be lit green and the ACT
LEDs will be blinking.

For larger systems, start a new power chain when the maximum of 5 cameras
per power chain is reached. One of the POWER ports of the first camera of the
new chain should be connected to a power supply. The DATA ports of the last
camera of the previous chain and the first camera of the new chain are con-
nected with a Data cable.

NOTE: For Oqus systems larger than 15 cameras and for systems with
many high-speed cameras, the performance can sometimes be improved
with a Gigabit Ethernet switch and then connect the cameras in shorter
daisy-chains, see "Connecting a Qualisys system through an Ethernet
switch" on page 444.



Oqus startup sequence

Before you connect the camera system, make sure that the QDS (Qualisys
DHCP server) is running and that the network interface settings are correct.
This is needed for the cameras to receive an IP address from QDS to com-
municate with other Qualisys cameras and the host computer. For more inform-
ation, see "QDS" on page 462 and "Network card setup" on page 461.
The general Oqus startup sequence is as follows.

1. Connect the power supply and the green LED on the front will blink twice.

2. After a few seconds a startup bar is shown on the camera display.

If the bar stops at two-thirds then the Oqus is waiting for an IP-address.
The reason is probably either a missing connection to the computer or
that QDS is not running, for instructions on how to search for the error
see "Troubleshooting connection" on page 1022.
3. When the camera has an IP-address, the display will show a clock symbol together with the letter M or S. The Oqus will first synchronize to the other cameras; during that process the clock is blinking and there is a spinning plus sign instead of the letter M or S. Wait until the clock has stopped blinking and the display shows M or S.

The M or S on the display stands for Master and Slave, respectively. It only indicates which camera is sending the synchronization pulse to the other cameras.



Setting the aperture and focus

The Oqus 7+ with standard lens option has a motorized lens, which can be con-
trolled from QTM via the Lens Control interface in the Camera Settings side-
bar in the 2D View window, see "Camera settings sidebar" on page 91.
For Oqus cameras with a manual lens, you need to extend the strobe mech-
anics to get access to the focus and aperture rings of the lens.

1. Turn the strobe mechanics counterclockwise to expose the lens for adjust-
ment.

2. When done, turn the strobe back (clockwise) to close it.

IMPORTANT: The strobe should always be in closed position during measurement to achieve the best strobe light distribution, and to make sure that the strobe does not block the peripheral view of the lens.

For general recommendations on setting aperture and focus, see chapter "Tips on setting aperture and focus" on page 481.

Setup Oqus system for wireless communication (deprecated)

The Oqus system can run with a wireless communication from the camera sys-
tem to the computer. The camera uses the 802.11b/g@54mbps standard.
However the communication speed can be reduced depending on the signal
strength or if there are many other wireless networks.



Setting up Oqus for wireless communication requires that the wireless adapter
of the computer is set up as a hosted network. This requires a computer run-
ning Windows 7. This feature is no longer supported for Windows 10 and
higher. The configuration requires a special version of QDS; contact Qualisys support (support@qualisys.com) for more information.

NOTE: From QTM 2019.3, wireless connection of an Oqus system is no longer supported by the included version of QDS. Please contact Qualisys support at support@qualisys.com for a special version of QDS if you need to connect an Oqus camera to a wireless network.

Connecting a mixed system

All Qualisys devices are compatible and can be combined in a single system.
When mixing camera types, the global camera settings will default to the lowest
of the camera limits. This means for example that the capture rate will be lim-
ited to 183 Hz at full field of view when Miqus M5 is combined with Miqus M3.
However, individual settings can always be set within the limit of the camera
type.
Devices with the same cable type can be connected in a daisy chain, for
example Arqus and Miqus cameras with cables of type A, see "Power and cam-
era cable requirements" on page 441. The total number of cameras that can be
connected to a power supply depends on the number of Arqus and Miqus cam-
eras, see chapter "Mixing Arqus and Miqus" on page 443.



For combining devices with different cable type the use of a Gigabit Ethernet
switch is required. For example, for mixing Arqus or Miqus cameras with Oqus
cameras, the respective systems are connected in their own daisy chain con-
figurations. The chains are then connected with host cables to the Ethernet
switch, see chapter "Connecting a Qualisys system through an Ethernet switch"
on page 444 for more information.



Network configuration
Network card setup

It is recommended to use a computer with two network interfaces, one reserved for the Qualisys camera system and one for an office network or inter-
net. This could be either two Ethernet interfaces or one Ethernet and one
WLAN. If you only have one interface it should be dedicated to the Qualisys
camera system.

IMPORTANT: Do not connect the Qualisys camera system to wired USB to Ethernet adapters. The communication cannot be guaranteed on the
adapters and there can be communication errors when capturing data.

The recommended setup is to use a static IP address on the Qualisys camera network; other setups are possible but not recommended by Qualisys.
You can setup the network card through QDS in the following alternative ways:



l Use the QDS Configuration wizard to setup the network interface, see
chapter "Network configuration wizard" on page 465.
l Set up the network interface manually using the QDS Advanced dialog or
via the Windows network adapter settings.
When setting up the network interface manually, take note of the following
guidelines:

1. Before changing the network configuration you can save the current con-
figuration in QDS, so that you can easily restore the current configuration.
This is recommended when you only have one network card.
2. Make sure to set the Address type to Static Address.

3. Use the Autoconfig button in the QDS Advanced dialog to automatically set an available IP address and corresponding Subnet mask. When manually setting them, make sure of the following (a small validation sketch follows this list):
a. The IP address must start with 192.168., for example 192.168.254.1. It is important to check that the first three numbers are not already used on an existing interface.
b. Use Subnet mask 255.255.255.0.

4. Make sure that the Enable QDS operation for this network option in
the QDS Advanced dialog is checked.
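
As a purely illustrative aid (not part of QTM or QDS), the Python sketch below encodes guideline 3 using the standard ipaddress module: the camera network address should lie in 192.168.0.0/16, use the 255.255.255.0 mask, and should not share its first three numbers with an existing interface. The function name and the example addresses are hypothetical.

# Hypothetical check of a manually chosen static address, not part of QTM/QDS.
import ipaddress

def check_camera_network_address(candidate, existing_interface_ips):
    """Check a manually chosen static address against the guidelines above."""
    addr = ipaddress.IPv4Address(candidate)
    if addr not in ipaddress.IPv4Network("192.168.0.0/16"):
        return False  # guideline 3a: the address must start with 192.168.
    subnet = ipaddress.IPv4Network(candidate + "/255.255.255.0", strict=False)
    for ip in existing_interface_ips:
        if ipaddress.IPv4Address(ip) in subnet:
            return False  # first three numbers already used on another interface
    return True

print(check_camera_network_address("192.168.254.1", ["192.168.0.10", "10.0.0.5"]))  # True
print(check_camera_network_address("192.168.0.1", ["192.168.0.10"]))                # False
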
QDS

QTM comes with a DHCP server called QDS (Qualisys DHCP Server), which dis-
tributes IP addresses to the Qualisys cameras. An IP address for each camera is
required to be able to communicate with them over the Ethernet network. QDS
will be installed automatically with QTM and it must be running at all times, to
provide the cameras with IP addresses at startup. The DHCP server will only
give IP addresses to Qualisys cameras so it will not disturb your computer net-
work.

QDS menu

To open the QDS menu, right-click on the QDS icon in the status toolbar.



The QDS menu contains the following options:
Configuration wizard
The configuration wizard can be used to setup your network interface for
Qualisys cameras, to configure an Oqus camera for wireless operation
and to setup a virtual wireless access point, see "Network configuration
wizard" on page 465.

Advanced
Advanced configuration of network interfaces on the computer, see
"Advanced" on page 467.

Network configurations
Using this sub-menu you can save or load network configurations. Click
on Save to save the current configuration, and click on Load to load a saved configuration.

Reboot all cameras


Reboot all Qualisys cameras that have an IP-address.

Camera utilities
QDS can control the Qualisys cameras with the following commands. The
commands will be sent to all cameras, but only acted upon by the camera
models that support the command.
Green on
Switch on green LEDs on the LED ring.

Green off
Switch off green LEDs on the LED ring.

Green pulsing
Pulse green LEDs on the LED ring.



Amber on
Switch on amber LEDs on the LED ring.

Amber off
Switch off amber LEDs on the LED ring.

Amber pulsing
Pulse amber LEDs on the LED ring.

Display on
This is the default mode for the Arqus and Oqus cameras. The dis-
play is on when you use the camera, but goes into sleep mode if not
used for 2 hours.

Display off
In this mode the camera display is turned off for Arqus and Oqus
cameras. The front LED is also turned off for all Qualisys cameras.
Display and LEDs do not turn on unless you enable them with Dis-
play on or reboot the cameras.

Camera blocklist
The camera blocklist is used to block QDS from distributing
IP addresses to specific cameras, see "Camera blocklist" on
page 469.
Enabled
Enable the camera blocklist.

Edit
Open the mac_block.txt file in the default text editor.

Show IP address on camera display


Switch the display to show the last number in the IP-address of the
camera. The Oqus cameras also show the last three numbers in the
serial number.

Show ID on camera display


Switch the display to show QTM ID of the camera. This is the default
for cameras that have been connected to QTM.

About QDS
Information about QDS.



Start QDS automatically
Option for starting QDS automatically on computer startup.

Shut down QDS


Shut down QDS.

Network configuration wizard

The QDS network configuration wizard will guide you through the different
steps to setup the network for Qualisys cameras. If you run the wizard you do
not need to follow the instructions for network card setup in chapter "Network
card setup" on page 461. Follow the steps below.

1. Double click on the QDS icon in the Windows task bar or click on Con-
figuration wizard in the QDS menu to start the wizard. This will open the
Wizard with the Choose network connection page.
2. The list will show all enabled network interfaces on the computer. Select
the interface that you want to use with Qualisys cameras and click Next.

a. If there are more than 4 network interfaces available, use the Prev 4
conn and Next 4 conn buttons to navigate through the list.
b. Click on More to see information about the network interface. The
More info window shows the current settings of the network interface. This is the same information as is shown in Advanced, see "Advanced" on the next page.

NOTE: The wizard will not configure all interfaces, for example a
network interface that is already connected to an internal network
and has received an IP address will not be configured because it is
considered to have a running DHCP server. However, any dis-
connected interfaces can be configured by the wizard.

3. The wizard shows how it will change the selected interface. You can save
the current network setup with the Save button for backup. Click Next to
continue.

4. Click Finish to close the wizard.



Advanced

The Advanced settings dialog is opened with Advanced... in the QDS menu.
The QDS dialog contains settings for the enabled network interfaces on the
computer. These settings can be used instead of the QDS wizard or the Win-
dows network settings.
Select network connection to view/edit
Select the network that you want to edit in the dialog. The list is in the
same order as in Windows and if a network is disabled in Windows net-
work connections it will not be shown in the list.

Name
Current name of the network.



Connection type
Type of network: wired LAN or wireless LAN.

Status
Current status of the connection: Connected or Not connected.

Address type
Select the wanted address type between these two types:
Received through DHCP
The network will receive its IP address from a DHCP server. The
standard setting for many networks.

Static address
The network is set to a static IP address with the settings below. This
setting must be used for QDS to give IP addresses to Qualisys cam-
eras.

IP address
Current IP address of the network. The address can be changed when
Address type is set to Static address.

IP address mask
Current IP address mask of the network. The mask can be changed when
Address type is set to Static address.

Cameras requesting IP address on this interface since QDS started


Number of Qualisys cameras requesting IP address on the network.

QDS started
Time since QDS started.

Last camera IP address request


Time since last camera IP address request.

Enable QDS operation for this network connection


Activate QDS on the network, i.e. if QDS is disabled for the network,
Qualisys cameras connected on that network will not get any IP address
from this QDS. When networks with static IP addresses are disabled it will
be shown on the QDS icon.
Some of the networks with static IP addresses have been dis-
abled.



All of the networks with static IP addresses have been disabled.

Autoconfig
Use this button to configure the network interface for Qualisys cameras.

Camera blocklist

The camera blocklist is used for blocking QDS from distributing IP addresses to
specific cameras. Follow these steps to enable the blocklist.

1. Find the MAC addresses of the cameras that you want to block. For
example locate the system in QTM and use the System info option to get
information about all the cameras in the system. The MAC address is also
written on the label of all Qualisys cameras.
2. Open the mac_block.txt file with Camera utilities -> Camera Blocklist -
> Edit on the QDS menu.
3. Add the MAC addresses that you want to block in the list. '#' can be used
to comment a line in the file so that it is not used by the blocklist. There
can also be text after the MAC address to describe which camera it is (a small parsing sketch of this format follows the note below). For example:
C4:19:EC:00:0C:14 S/N 12345
C4:19:EC:00:0C:15 S/N 12346
# C4:19:EC:00:0C:16 S/N 12347
4. Enable the blocklist with Camera utilities -> Camera Blocklist - >
Enabled on the QDS menu.
5. Reboot the cameras. The cameras in the blocklist will not receive any IP
address.

NOTE: If you disable the blocklist or comment lines with a MAC address then the blocked cameras will receive an IP address without rebooting.
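
The mac_block.txt format shown in step 3 above is plain text: one MAC address per line, optional descriptive text after it, and '#' to comment out a line. The Python sketch below is not part of QDS; it only illustrates reading a file written in that format.

# Illustrative only, not part of QDS: read a mac_block.txt-style file and
# return the MAC addresses that are actively blocked.
def read_blocklist(path="mac_block.txt"):
    blocked = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # empty or commented-out entry, camera is not blocked
            blocked.append(line.split()[0])  # text after the MAC address is a description
    return blocked

# With the example file from step 3, this returns:
# ['C4:19:EC:00:0C:14', 'C4:19:EC:00:0C:15']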



QDS conflict

There will be a conflict for the IP addresses when two or more computers with
QDS are connected to the same camera system. In those cases the first QDS
that replies after the startup of a camera will give the IP address to the camera.
On the other computers the Qualisys DHCP server message below will be
shown and QDS operation will be disabled on that network. The QDS operation
can be turned on again with the Advanced option on the QDS menu. However,
make sure that the other computers are disconnected from the camera system
otherwise QDS operation will be turned off again at the camera startup.

Firmware update
QTM will detect automatically if the camera firmware needs to be updated or
downgraded in the following cases. The firmware update is done via the
Qualisys Firmware Installer (QFI). For more information about QFI, see
chapter "How to use Qualisys Firmware Installer (QFI)" on the next page.
Firmware update when locating system

The following dialog will appear when QTM has detected an old firmware when
you are locating the system on the Camera System page in the Project
options dialog.

Detailed info...
Open a dialog with information on why QTM needs to update the firm-
ware.



Cancel
Cancel the update. However, you cannot use the system with the current
QTM version until you have updated the firmware.

OK
Start the firmware update program Qualisys Firmware Installer (QFI).
Firmware update when starting preview

The following dialog will appear when QTM has detected an old firmware when
you start a preview with New on the File menu.

Detailed info...
Open a dialog with information on why QTM needs to update firmware.

Cancel
Cancel the update. However, you cannot open a new file until you have
updated the firmware.

OK
Start the firmware update program Qualisys Firmware Installer (QFI).
How to use Qualisys Firmware Installer (QFI)

The Qualisys Firmware Installer (QFI) is used to install firmware on Qualisys cameras. QFI is organized as a wizard. In a regular firmware upgrade, just click Next until the upgrade is finished; an upgrade may take a few minutes to perform. The steps are described below.

1. Start QFI. Usually this is done via the firmware upgrade dialog, see
chapter "Firmware update" on the previous page. Alternatively, locate the
QFI.exe program in QTM installation folder and double-click on it to start.
2. Click Next to start looking for Qualisys cameras connected to the com-
puter. The cameras must have started completely before you start loc-
ating the cameras.



3. Check that the number of located cameras is correct and click Next to pro-
ceed.

NOTE: In case there are multiple camera systems connected via separate Ethernet adapters, select the system to be upgraded.

4. Review the system information and check that the Firmware to be installed is correct.



a. Optionally, click on More info for more detailed information about
the cameras.
b. Optionally, click Advanced to review or change any advanced
options, see chapter "Advanced firmware installer settings" on
page 475.
c. Click Next to start the firmware installation.

5. Wait until all the steps (Uploading files, Programming camera(s), Waiting for
camera(s) to reboot) have finished.

6. Click Finish to close QFI.



NOTE: If QFI reports errors after the firmware upgrade, it is recom-
mended to run QFI again.



Advanced firmware installer settings

The Advanced Firmware installer settings dialog allows you to select or deselect options to be performed as part of the firmware installation.

CAUTION: Do not use these settings unless you are absolutely sure.

Upgrade firmware
Uncheck to not download the new firmware, which is useful if you only
need to modify PTP Mode or Lens Control.

PTP Mode
The PTP mode options are:



No change: Keep the current PTP mode. This is the default choice.

Use Qualisys PTP: This is the default PTP mode used for syn-
chronization of the Qualisys devices.

Use standard PTP: This PTP mode needs to be selected for PTP syn-
chronization of the Qualisys devices with an external clock master.
For more information, see chapter "How to use PTP sync with an
external clock master (Camera Sync Unit)" on page 501.

Lens Control
No change: Keep the current lens control mode. This is the default
choice.

Unlock settings: Enable focus and aperture control from QTM for
cameras with a motorized lens. This is the default mode.

Lock settings: Disable focus and aperture control from QTM. In this
mode, the communication with the lens is disabled and the lens con-
trol parameters are no longer shown in QTM. This setting can be use-
ful to fix the focus and aperture settings once they have been set to
their optimal values in a fixed camera setup.

TCP Keep Alive


No change: Keep the current setting. This is the default choice.

Enable: Enable TCP Keep Alive mode. When enabled the cameras
regularly check if the command channel is open when there is no
activity.

Disable: Disable TCP Keep Alive mode (default).

Allow firmware downgrade


No: Do not allow firmware downgrade (recommended)

Allow downgrade: Check to allow downgrade to a previous version of the firmware.

Below these settings is a list of all cameras in the system. Check the cam-
eras that you do not want to upgrade.



Starting up the system

Locating the camera system in QTM


Before the first use of the camera system, it must be located in QTM. Make sure
that the camera system hardware has been correctly installed, see chapter
"Connecting the system" on page 441.
The locating of the camera system in the QTM software is performed on the
Camera System page in the Project Options dialog, see chapter "Locate Sys-
tem" on page 222.

NOTE: When locating the system QTM will detect automatically if the
camera system has old firmware. The firmware must then be updated
before the system can be used. For more information see chapter "Firm-
ware update when locating system" on page 470.

Outline of how to locate the camera system

The steps below are just an outline of what should be done to automatically
connect the camera system to QTM.
Follow these steps to connect the camera system to QTM:

1. Switch on the camera system, wait for the cameras to start up properly
and start QTM.
2. Open the Project options dialog and go to the Camera System page.

3. Click Locate System. This will open the Finding camera system dialog.

4. Choose the camera system and click OK.

5. For Arqus and Miqus cameras there is the option to automatically order
the cameras using the Auto Order button.



NOTE: In case there are problems to connect the camera system, check
the troubleshooting list in chapter "Troubleshooting connection" on
page 1022.

Starting a preview
When opening a new file in QTM, i.e. starting a new measurement, the cameras start in Preview mode. This is done by pressing the New button in
the QTM toolbar ribbon or the File menu (keyboard shortcut Ctrl + N).

NOTE: The following terminology is used interchangeably for starting the camera system:
l Starting a preview

l Starting a new measurement

l Opening a new file

Once the cameras are in preview mode, QTM can stream data in real
time. Therefore, Preview mode is also referred to as real-time mode, RT/Preview mode, or live preview mode.

If the camera system has not been located yet, starting a preview will auto-
matically locate the camera system.
When starting the cameras for the first time after booting, it may
take some time before the cameras are ready. The status is indicated by the
Waiting for cameras dialog.

If no camera system is found, for example when the cameras are still booting,
QTM will wait for the cameras to start up.



You can press Cancel to interrupt the start up.

NOTE: In case not all cameras have finished booting QTM may not find
all cameras. It is recommended to locate the system first, see chapter
"Locate System" on page 222.

Optimizing the camera settings

Identifying and ordering the cameras in QTM


When the camera system is connected the cameras appear in QTM in arbitrary
order when starting a measurement for the first time. It can be helpful to order
the cameras in QTM so that the order corresponds to the physical setup. There
are several ways to achieve this.
Firstly, you will need to identify the cameras. The way cameras can be identified
depends on the camera type.
l Arqus and Oqus cameras show their camera number in QTM on the front
display.
l Arqus and Miqus cameras can be identified using the Identification tool,
which can be activated from the 2D View toolbar. For detailed inform-
ation, see chapter "Identifying the cameras with the identification tool" on
the next page.
l Another pragmatic way to identify a camera is to go to and point at it
while watching the live camera video feeds in the 2D view window. This is
easiest to do with two people.
Ordering the cameras can be done in two ways. For Arqus and Miqus cameras
you can use the Auto Order option when connecting the system. For more
information, see chapter "Automatic ordering of the cameras" on page 481.



It is also possible to order the cameras manually using the Reorder tool in the
2D view toolbar. Follow these steps:

1. Start a new measurement in QTM and open the 2D view window.

2. Make sure that all cameras are selected in the Camera selection toolbar.

3. Click on the Reorder tool on the 2D view toolbar to activate it.

4. You can now drag and drop individual camera image areas to their
desired positions. Repeat this until you are satisfied with the order of the
cameras.
Identifying the cameras with the identification tool

Arqus and Miqus cameras can be identified/located by using the Identification Tool in the 2D view toolbar. When the Identification Tool is active, the LED
ring of the cameras selected in the 2D view window will light green. Arqus and
Miqus cameras can be identified as follows.

1. Start a new measurement in QTM and open the 2D view window.

2. Click on the Identification Tool icon to activate it.

3. Select one or more cameras using the Camera selection bar. The LED
ring of the selected camera(s) will light green.

TIP: If you have multiple cameras selected, you can select a single
camera by double clicking on its image area. When you double click
again, the previous selection is restored.



Automatic ordering of the cameras

Arqus and Miqus cameras can be automatically ordered in QTM when locating
the camera system. For automatically ordering the cameras, follow these steps:

1. Locate the camera system (Locate System button under Project options
> Input devices > Camera System > Connection).
2. When all cameras have been detected, press the Auto Order button in
the Finding Camera System dialog.
3. When the auto ordering is finished, the Auto Order button changes name
to Reverse Order. The green LED rings of the cameras will flash in the
order of the found sequence. The first camera will light continuously. By
pressing Reverse Order you can change the order of the cameras, i.e.,
the last camera of the sequence becomes the first, etc.

Tips on setting aperture and focus


It is very important to set the aperture and focus correctly for your meas-
urement volume. Incorrect settings might lead to sub-optimal marker detection
by the cameras.
Aperture and focus are set on the lens of the camera. The way aperture and
focus are set depends on the camera and lens combination.
l For cameras with motorized lenses (Arqus A12 and Oqus 7+ with standard
lens option), the focus and aperture are controlled from QTM via the Lens
Control interface in the Camera Settings sidebar in the 2D View window,
see chapter "Camera settings sidebar" on page 91.
l For cameras with manual lenses, you will need to extend the strobe mech-
anics to get access to the focus and aperture rings on the lens. For inform-
ation on how to extend the strobe mechanics for the available camera
types, see the information on setting aperture and focus for the respect-
ive cameras in chapter "Connecting the system" on page 441.
It is recommended to use the following procedure for setting the aperture and
focus.



1. Select the camera in the 2D view window in QTM using the Camera selec-
tion bar or by double clicking on the image area.
2. Place a marker in the measurement volume at the position which needs
to be in focus. Use the same marker size as used for the actual meas-
urements.
3. Set the focus. The focus can be best set in Marker or in Video mode. It is
best to set the aperture f-value as low as possible (e.g. f/2) when focusing
since this makes differences in focus more visible.
a. In Marker mode, make sure that you can only see the marker that
you have placed in the volume. Then open a Data info window and
plot the x-size of the marker. Change the focus until the x-size is as
small as possible.
b. In Video mode, change the focus until the marker is as small and as
sharp as possible. Make sure that the Flash time is long enough in
the Camera settings sidebar. You can also zoom in and translate the
2D view for better detail.
4. Set the aperture. The recommended range of aperture is between f/2 and
f/8. The optimal setting depends on the following considerations.
l For more incident light, for example in large capture volumes, use
lower f-values from f/2 to f/2.8.
l For more focal depth, use higher f-values from f/4 to f/8. Use of
higher f-values can be helpful in smaller volumes where the relative
distance between markers and camera can vary, and there is
enough light reflected from the markers.
l Depending on the lens, use of the lowest f-values may lead to vign-
etting, which may become visible as a deformation of the markers in
the vicinity of the edges and corners on the sensor.
l Note that by increasing the f-number by one stop (e.g., from f/2.8 to f/4) the intensity of the incident light on the sensor is reduced by half (see the sketch after this procedure).
5. When you have set the focus and aperture correctly, switch to Marker
intensity mode in QTM to optimize the marker settings Exposure &
Flash time and Marker Threshold. For more information, see "Tips on
marker settings in QTM" on the next page.



6. For cameras with manual lenses, do not forget to return the strobe mech-
anics to the original position.
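
To see where the halving in step 4 comes from: for a fixed focal length, the light reaching the sensor scales with the inverse square of the f-number. The short Python sketch below is only a numeric illustration of that relation; it is not QTM functionality.

# Relative incident light between two aperture settings (proportional to 1/N^2).
def relative_light(f_from, f_to):
    """Factor by which the light on the sensor changes when going from f_from to f_to."""
    return (f_from / f_to) ** 2

print(relative_light(2.8, 4.0))  # ~0.49, i.e. roughly half the light (one stop down)
print(relative_light(2.0, 2.8))  # ~0.51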

NOTE: For cameras with motorized lenses, you may consider locking the
aperture and focus to the set values. This can be done using the Qualisys
Firmware Installer (QFI) by setting the Lens control option under
Advanced settings to Lock settings, see "How to use Qualisys Firmware
Installer (QFI)" on page 471. By locking lens control aperture and focus
will be locked to the current settings and the Lens control interface will
no longer be available in QTM. This can be helpful in fixed camera setups
in which focus and aperture need to be constant.

Tips on marker settings in QTM


The two important settings for marker calculation are Exposure time and
Marker threshold. The steps and suggestions below provide a basic method
for adjusting the settings. Please note that it is not possible to give more spe-
cific advice on how to adjust the settings, because of their dependence on the
context and their interrelation. There are also other settings that can help to
get a better measurement, e.g. Marker masking or Marker limits.
First of all make sure that the focus and aperture are correct, see "Tips on set-
ting aperture and focus" on page 481.
Exposure and threshold must be adjusted together, because changing one may require changing the other. Follow these steps to set exposure and
threshold settings.

1. Start a new measurement and stay in preview mode. Use the Camera set-
tings sidebar to change the settings.



2. Set the Exposure time at 200-300 microseconds and Marker Threshold
to 20. These are the default values and often it is a good starting point.
3. Start by looking at the marker intensity image. Click on the Marker
intensity button in the Camera settings sidebar to show the marker
intensity image. For more information see chapter "Video preview in
QTM" on page 584.
4. If the markers are not bright red in the marker intensity image, try increas-
ing the Exposure time until they are bright red. For example, with a
longer distance to the markers, you may need a longer exposure time.
l For higher frequencies it might not be possible to increase the expos-
ure so that the markers are bright red. However, as long as they are
brighter than the background the camera should be able to find
them by lowering the marker threshold.
l If there are extra reflections, you can try reducing the Exposure
time. Extra reflections are anything that is not a marker and has a
color different from blue. Green is the threshold level.
l Remember that you can set this setting individually for each camera,
see chapter "Camera settings sidebar" on page 91.
5. Switch back to Marker mode and check if the markers are visible. If they
are not, go back to Marker intensity mode and adjust the Marker
threshold value. It is not possible to give an exact value for the threshold
at a certain exposure, because each setup is different.



l Increase the threshold if the background looks light blue or even
green in the marker intensity image. A light blue background will
make it harder for the camera to calculate the markers.
l Decrease the threshold if the markers are not bright red in the
marker intensity image. For example, at short exposure times of 100
µs and lower the threshold needs to be low, usually around 10-15.
l Make sure that the marker calculation is stable at the selected
threshold. If the threshold gets too low there will be a lot of extra
reflections or no markers at all. A too high threshold will result in
small markers or missing markers.
l Remember that you can set this setting individually for each camera,
see chapter "Camera settings sidebar" on page 91.
6. Finally check that the markers are large enough to give a good accuracy.
Check that the marker size is at least 200 in the Data info window.

Linearization of the cameras

About camera linearization


A camera transmits coordinates in two dimensions for each marker’s center
point in the field of view of the camera. The locations of the center points
reflect the optical view of the sensor chip in the camera. Since this cannot be
done without distortion, the QTM software must have information about the
distortion of each camera in order to get correct 3D data. This correction is
referred to as “linearization” or “intrinsic calibration”.
The linearization takes into account the alignment of the lens relative to the
sensor (center point of the optical axis) and the optical distortion of the lens.
The linearization is therefore specific for each individual camera-lens com-
bination.
The linearization procedure of the camera generates a file that is stored in the
camera's memory, and downloaded to the computer the first time you use a
camera in QTM. The camera linearizations can be inspected in the Lin-
earization page in the Project Options, see chapter "Linearization" on
page 249.



The linearization procedure is implemented in QTM. Linearization of a camera
requires a special Qualisys linearization plate. For instructions on how to lin-
earize a camera, see chapter "Linearization procedure and instructions" below.
All Qualisys cameras are linearized at the factory and delivered with a valid lin-
earization file.

NOTE: A linearization plate is usually not included with a Qualisys motion capture system. Contact Qualisys support (support@qualisys.com) when a camera needs to be re-linearized, for example if you need to exchange the lens.

Linearization file warning

The linearization file in the camera is compared to the one used in QTM every
time that you start a new measurement. If the files do not match, the following
warning is displayed.

If you want to download the file from the camera click Yes, which is mostly
recommended if the file in the camera is more recent than the one in QTM.
Otherwise, click No. Optionally, check the box Do not show this message
again to avoid the warning for this file the next time you start a measurement.

Linearization procedure and instructions


Linearization concept

The linearization procedure is designed to collect data with the linearization plate in a variety of orientations across the image sensor of the camera. The
data is collected based on the following criteria:
l The 2D image is divided into 7 times 6 squares.

l Each square is divided into four orientation cells, visualized as triangles.



l Each orientation cell requires at least 20 valid observations of the lin-
earization plate within the required pose constraints.

l The angle of the plate to the plane of the sensor must be at least 10 degrees
to be accepted as a valid orientation.

l The size of the markers must be at least 200 subpixels.

The data for the calculations is automatically selected based on these criteria.
The user is guided by the feedback provided by QTM to facilitate the data col-
lection.
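
The bookkeeping behind this data collection can be pictured as a counter per orientation cell. The Python sketch below is a simplified illustration based only on the numbers listed above; it is not the actual QTM implementation, and the names used are hypothetical.

# Simplified illustration of the acceptance criteria above; not QTM's actual code.
MIN_OBSERVATIONS = 20      # valid observations required per orientation cell
MIN_PLATE_ANGLE_DEG = 10   # minimum angle of the plate to the sensor plane
MIN_MARKER_SIZE = 200      # minimum marker size in subpixels

# 7 x 6 squares, each divided into 4 orientation cells (triangles)
coverage = {(col, row, cell): 0 for col in range(7) for row in range(6) for cell in range(4)}

def add_observation(col, row, cell, plate_angle_deg, marker_size):
    """Count an observation only if it meets the pose and size constraints."""
    if plate_angle_deg >= MIN_PLATE_ANGLE_DEG and marker_size >= MIN_MARKER_SIZE:
        coverage[(col, row, cell)] += 1

def collection_complete():
    return all(count >= MIN_OBSERVATIONS for count in coverage.values())
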
Feedback during the linearization procedure

QTM provides the following feedback to guide the user during the linearization
procedure:
l At the start of the linearization, the 2D view turns red. The image is
mirrored to make it easier for the user to move the plate across the
area.
l The markers of the plate are colored when the plate is identified.
l If the markers are white, it may help to briefly hide the plate from
the camera and show it again.
l If QTM has difficulties to identify the plate, make sure to remove any
extra reflections and redo the linearization.



l The distance interval is indicated by the green vertical bar to the right of
the 2D view. A white diamond shows the current distance of the plate to
the camera, which should be within the green area.
l When you are too far or too close to the camera, a large white arrow
head in the center of the image area shows you if you need to move
farther or closer.
l During the data collection, the triangular orientation cells change color as
they are filled with valid observations, from red to orange to yellow to
green. A green cell indicates that a sufficient amount of data has been col-
lected for that pose.
l A white line pointing outwards from the center represents the cur-
rent orientation of the plate.
l If you need more observations for a triangle, point the plate towards
the triangle to collect data for that orientation.
l When the markers are too small, the error Too small markers is shown in the bottom left corner. In this case, try to move closer to the camera
within the required interval, or redo the linearization with modified expos-
ure settings.

TIP: Set a fixed, large marker size under Marker display in the 2D view
settings in case you have difficulties to see the position of the lin-
earization plate during the linearization procedure.

Linearization instructions

Preparations

1. Place the camera on a tripod. It is a good idea to place the camera side by
side with the computer screen so that you can easily follow the feedback
during the linearization procedure.
2. Connect the camera to the computer and start a preview in QTM. You can
have several cameras connected while linearizing, but you can only lin-
earize one camera at a time.



3. Set the focus and aperture as required for your capture volume. The
focus does not need to be adapted to the distance of the linearization
plate, unless your application requires that the cameras are very close to
the subject (less than 1 meter, or so). For further tips on focus and aper-
ture, see chapter "Tips on setting aperture and focus" on page 481.
4. Take off any rings, watches or other shiny objects before performing the
linearization. If needed, change the Marker threshold and Exposure
time so that there are no extra reflections when you stand in front of the
camera.

Performing the linearization

1. Click the Linearization button or click Linearize camera in the Capture menu. The following dialog will appear:

2. Select the number of the camera that you want to linearize with Camera
to linearize.

TIP: Select the camera to linearize as the only camera in the 2D view before starting the linearization. The camera number in the dia-
log will be automatically set to that camera.

3. Make sure that you are using the correct plate type. The standard plate
consists of 6x5 markers.
4. Enter the focal length of the current lens in Approximate focal length.
This ensures that the distance intervals are scaled correctly when per-
forming the linearization.
5. Click OK to start the linearization procedure.

6. The linearization procedure starts at the first distance interval, close to the camera.



7. Move the plate across the camera image while varying the orientation of
the plate.
l Use the feedback from QTM to make sure that you are at the correct
distance of the camera.
l Keep varying the position and the orientation of the linearization
plate until all triangles have turned green.
8. When the first distance interval is done, the image will be reset, and the
distance interval is changed to the next one, farther away from the cam-
era.
l Repeat the procedure at this distance, until all triangles have turned
green.
9. The linearization procedure will automatically stop. QTM proceeds to per-
form the calculations and the results are shown in the Linearization results
dialog.
10. In case you need to stop the linearization procedure before it has finished automatically, for example if you need to change exposure settings,
press the Esc button to quit.

NOTE: During the linearization procedure, the lens must not be touched.
However, the camera can be moved if needed.

Evaluating the linearization

When you are finished with the data collection, QTM will calculate the lin-
earization parameters. The results are presented in the Linearization results dia-
log.



The dialog shows if the linearization has passed or failed. Furthermore, the dia-
log shows the following metrics for the linearized camera:

Focal length
The calculated focal length of the lens in mm. Make sure that the value
corresponds to the lens specifications.

NOTE: For underwater cameras, the calculated focal length will be about 30% larger than specified due to refraction.

Residual
The residual represents the remaining error of the collected data after
application of the linearization in subpixels. The residual should be lower
than 5 subpixels for standard camera-lens combinations. For wide-angle
lenses, higher values may be acceptable.

Furthermore, the dialog shows information about the linearization file that has
been generated. The file name consists of the serial number of the camera, fol-
lowed by the date and time the linearization was performed. The linearization
procedure outputs three files:
.lin - The linearization file that is used by QTM.

.stat - This file contains the settings of the linearization and also some
statistics.

.qcl - This file contains the actual linearization measurement. It is only
named with date and time.

The files are saved in the Linearization folder under the Qualisys program data,
typically C:\ProgramData\Qualisys\Linearization. The .lin file is also uploaded to
the camera.
Click OK to close the Linearization results dialog.

Synchronization

Timing hardware
How to use external trigger

An external trigger can be used to trigger the start of the motion capture. For a
system with a Camera Sync Unit the trigger is connected to any of the trigger
ports, see chapter "Trigger ports" on page 273.
The delay between the trigger and capture start can be set; however, the min-
imum is 20 ms. Therefore it is recommended to use the Sync out signal when
synchronizing with other equipment. The delay for stopping on trigger is twice
as long as the configured start delay.
For Oqus the trigger is connected to a splitter cable on any Oqus camera in the
system. If the movement that is captured is very short the external trigger can
be used together with pretrigger so that the whole movement is captured.
On Oqus cameras, errors can occur if you send a trigger signal while the
camera is still setting up the measurement, that is, before Waiting for trigger
is shown in the Status bar. The problem can be solved by moving the trigger
button to the master camera.
Possible external trigger devices are for example:
l Trigger button

l Photo cell

l Other systems

How to use pretrigger

The pretrigger is used to collect frames before the arrival of a trigger event. The
pretrigger frames are saved in the camera buffer before the trigger event and
after the event the frames are sent to the measurement computer. When using
pretrigger it is not possible to know the exact delay between the trigger signal
and the first frame after the pretrigger frames, because the cameras are already
measuring and the trigger event can arrive at any time relative to the camera
frame cycle.
The trigger event can be sent to the camera either with an external trigger or
from a dialog in QTM if there is no external trigger.
When using analog boards with pretrigger the connection differs between the
boards, see below.
The pretrigger settings can be found on the Synchronization page in the Pro-
ject options dialog, see chapter "Pretrigger" on page 277.

Measurement with analog capture while using pretrigger

When analog data is collected while using pretrigger, an external trigger must
be used to start the capture. Activate the external trigger setting on the Syn-
chronization page in the Project options dialog. The external trigger is con-
nected to different connectors on the analog board depending on the analog
board type and camera type.
USB-2533
Use the normal setup with a Sync out signal from one of the Oqus cameras
connected to the Sync input of the analog device, see chapter "Con-
nection of analog board" on page 752. In addition, the external trigger sig-
nal should be connected to the first analog channel on the board. This can
be done by splitting the trigger input at the camera or sync unit with a T-
connector and connecting one end to the trigger button and the other end
via a BNC cable to the first analog channel on the board.

NOTE: It is important that the pretrigger buffer is full before you


press the trigger button. Otherwise you will get an error message
from the analog board.

NOTE: The synchronization of the analog data may not exactly coin-
cide with the start of the camera frames, as the time of the trigger
start signal may fall anywhere in a pretrigger camera frame. This res-
ults in a latency within one camera frame period, which varies
between captures.

How to use external timebase

The external timebase is used to get a required frame rate in the motion cap-
ture from an external device.
The Qualisys camera system can be synchronized with an external timebase for
drift-free synchronization with external devices. When synchronizing to a peri-
odic TTL signal the Qualisys system will lock the capture frequency to that of
the external signal source, see chapter "Using External timebase for syn-
chronization to a periodic TTL signal" below for detailed information.
It is also possible to use a time code signal as an external timebase source. Cur-
rently, two standards are supported, SMPTE (requires an Oqus or Camera Sync
Unit) and IRIG (requires a Camera Sync Unit). For more information, see
chapter "Using External timebase for synchronization to a time code signal" on
page 497.
An external timebase can also be used to trigger individual camera frames,
using non-periodic signal mode. For an example of how to synchronize the cam-
era system with burst signals, see chapter "External timebase with bursts of sig-
nals with constant period (cycle) time and with delays between the bursts" on
page 499.
The settings for external timebase are on Synchronization page in the Project
options dialog, see chapter "External timebase" on page 278.

Using External timebase for synchronization to a periodic TTL signal

The synchronization input of Qualisys camera systems can be used to syn-


chronize the frames of the camera capture to an external source. This means
that the camera frequency will depend on the input signal so that there is no
drift between the two systems.

When synchronizing a Qualisys system to a periodic signal, set the Signal Mode
under the External timebase options to Periodic. The synchronization signal
from the external hardware must be a periodic TTL signal between 0 and 5 Volt.
You can use a frequency multiplier or divisor to get the camera frequency that
you need. For detailed information about the External timebase settings, see
chapter "External timebase" on page 278.

The external timebase is connected to the Sync input port of a Camera Sync
Unit. For an Oqus system, you can also use the Sync in connector of a Trig-
ger/Sync splitter cable connected to the control port of one of the cameras.
Follow these instructions to use external timebase with a periodic signal.

Real-time

1. Make sure that the settings for the External timebase on the Syn-
chronization page in the Project options dialog are correct for your
setup.

CAUTION: Do not use External timebase settings that result in a


camera frequency higher than 0.975*maximum frequency at the full
image size.

2. The first time the periodic sync is activated, you must wait for the cameras to
synchronize after clicking New on the File menu. QTM will report EXT ? Hz
in the status bar for the camera frequency and the dialog below will appear.
The following times a measurement is started, the camera system will already
be locked on and real-time mode will start immediately.

3. When the camera is synchronized the current camera frequency will be repor-
ted in the right corner of the status bar. The frequency will also be fixed for
the camera system in the Camera settings sidebar and in the Project
options dialog, so that the exposure time settings are limited to the correct
values.

4. The synchronized samples can now be sent with the RT protocol to another
collection system.
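
Step 4 above mentions streaming the synchronized samples with the RT protocol. Below is a minimal sketch of a receiving client based on the examples in the Qualisys Python SDK (the qtm-rt package on PyPI); the module and call names are assumed from that SDK and may differ between versions, so treat this as an illustration rather than a reference implementation.

import asyncio
import qtm_rt  # Qualisys Python SDK; install with: pip install qtm-rt

def on_packet(packet):
    # Called for every streamed RT packet; here we just print the 3D marker count
    header, markers = packet.get_3d_markers()
    print("frame", packet.framenumber, "markers", len(markers))

async def main():
    connection = await qtm_rt.connect("127.0.0.1")  # IP address of the QTM computer
    if connection is None:
        return
    await connection.stream_frames(components=["3d"], on_packet=on_packet)
    await asyncio.sleep(10)  # receive frames for ten seconds

asyncio.run(main())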

Capture

The start of the capture needs to be known if there is an external system that
collects data independently of QTM. There are two ways to know the time of
the start.

1. Record the Sync out signal from the camera on the external system. If you
use an external trigger on both systems, the start of the capture is
synchronized with the first pulse in the pulse train after the external trigger
signal is sent. If the camera system is started without a trigger, the
start of the capture is instead after a very short pause in the pulse train,
between when you click Start in the Start capture dialog and when the camera
starts capturing.

2. Start the camera system and the external system with the external trigger sig-
nal. In the default mode, the start of the camera system is delayed 20 ms
from the trigger pulse, which means that the camera system starts on the fol-
lowing Sync in pulse. 20 ms is the minimum delay, but the delay can be set to
a higher value on the Synchronization page.
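
As an illustration of the timing in point 2 above, the sketch below estimates when the first camera frame occurs relative to the trigger pulse. It assumes that the external timebase pulses arrive at exact multiples of the period; the function and variable names are made up.

import math

def first_frame_after_trigger(trigger_time_s, start_delay_s, timebase_period_s):
    # The capture starts on the first external timebase pulse that follows the
    # configured start delay (minimum 20 ms) after the trigger pulse
    earliest = trigger_time_s + start_delay_s
    return math.ceil(earliest / timebase_period_s) * timebase_period_s

# 100 Hz external timebase (10 ms period), default 20 ms start delay,
# trigger arriving 3 ms after a timebase pulse
print(first_frame_after_trigger(0.003, 0.020, 0.010))  # ~0.03 s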

Using External timebase for synchronization to a time code signal

The Qualisys system can be synchronized to an external time code. The fol-
lowing standards are supported:
l SMPTE at frame rates of 24 Hz, 25 Hz, and 30 Hz, without dropped frames.

l IRIG A and B, Direct Current Level Shift (DCLS)

SMPTE

SMPTE time code is a set of cooperating standards to label individual frames of


video or film with a time code defined by the Society of Motion Picture and Tele-
vision Engineers. SMPTE time code signals can be generated by time code gen-
erators or external audio interfaces. SMPTE is commonly used for
synchronization in audio/video applications. Qualisys supports SMPTE stand-
ards at frame rates of 24 Hz, 25 Hz and 30 Hz, without dropped frames. For
more specific information about how to use SMPTE for synchronization of cap-
tures with audio recordings, see chapter "Using SMPTE for synchronization with
audio recordings" on page 512.
The use of SMPTE time code requires a Camera Sync Unit. Alternatively, for
Oqus systems an Oqus Sync Unit can be used. The SMPTE time code signal
should be connected to the SMPTE input port. The capture frequency can be
set to a multiple or a divisor of the SMPTE frame rate in the External timebase
settings on the Synchronization page.

CAUTION: Do not use External timebase settings that result in a cam-


era frequency higher than 0.975*maximum frequency at the full image
size.

NOTE: When using SMPTE as an external timebase source the shutter


delay (Delay from signal to shutter opening on the External timebase
page) should be at least 100 microseconds for a consistent number of
cameras frame per SMPTE frame. The shutter delay value is automatically
set to 100 μs when selecting SMPTE.

IRIG

Inter-Range Instrumentation Group time codes, commonly known as IRIG time


codes, are standard formats for transferring timing information. Atomic fre-
quency standards and GPS receivers designed for precision timing are often
equipped with an IRIG output. IRIG is commonly used in communication sys-
tems, data handling systems, military systems, telemetry systems, high speed
cameras and in machine vision. Qualisys supports the IRIG A and the IRIG B
standards of DCLS type (Direct Current Level Shift).
The use of IRIG time code requires a Camera Sync Unit. The IRIG time code sig-
nal should be connected to the Event input port on the Camera Sync Unit. The
capture frequency can be set to a multiple or a divisor of time code frequency
in the External timebase settings on the Synchronization page.

CAUTION: Do not use External timebase settings that result in a cam-


era frequency higher than 0.975*maximum frequency at the full image
size.

NOTE: IRIG cannot be used when there are any Oqus cameras included
in the system.

Timestamps

When using time code as external timebase source, time stamps are recorded
by default, so that each camera frame includes a time stamp. When the capture
rate is set to a multiple or divisor of the time code frequency, the first camera
frame of a capture may not correspond to the start of a time code period or
frame. For synchronization to an external signal you will need to look up a cam-
era frame at which the time code increased and take the offset into account.

Using Qualisys video with External timebase

All Qualisys video cameras can be used with External timebase, but only if the
video capture rate is a divisor of the marker capture rate. The video capture
rate also needs to be an integer, so not every divisor of the marker capture
rate can be used. The presets that can be used are activated automatically for
the current marker capture rate. For example, if the marker capture rate is
200 Hz, you can capture video at 200, 100, 50, 40, 25 and 20 Hz, corresponding
to the divisors 1, 2, 4, 5, 8 and 10.
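
The allowed video rates can be derived from the marker capture rate as in the sketch below. The divisor range is limited to 10 here to match the example above; which presets QTM actually activates is determined by the software, so this is only an illustration.

def valid_video_rates(marker_rate_hz, max_divisor=10):
    # Keep only divisors that give an integer video capture rate
    return [marker_rate_hz // d for d in range(1, max_divisor + 1)
            if marker_rate_hz % d == 0]

print(valid_video_rates(200))  # [200, 100, 50, 40, 25, 20]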

External timebase with bursts of signals with constant period (cycle)


time and with delays between the bursts

To use an external timebase with bursts of signals with constant period time
and with delays between the bursts follow the steps below:

1. First the possible time periods of the measurement system must be cal-
culated (see the example sketch after these instructions).
a. Calculate the smallest time interval t1 between two frames that can
occur during the measurement (i.e. the smallest time interval
between two pulses from the external timebase source that can
occur during the measurement).
b. Calculate the longest interval between two pulses from the external
timebase source that can occur during the measurement, multiply it
by 1.5 and call it t2.
c. Calculate the maximum frequency f as 1/ t1.

NOTE: The non-periodic sync cannot be faster than 120 Hz


due to network delays of the TCP/IP network.

d. Calculate the number of pulses n that will be generated during the


measurement (i.e. the number of frames that will be captured).
2. Set the Marker capture frequency in QTM to f.
If the Capture rate is set to a value that is lower than f, the cameras will
flash for a larger part of the time between two pulses, which might lead to
overheating of the cameras. It might also make the time available to cal-
culate marker positions in each frame too short, in which case some mark-
ers (those lowest in the image) might be lost.
3. Select External timebase (and Non-periodic for Oqus) on the Syn-
chronization page in the Project options dialog.
External timebase can be used together with an external trigger. If both
are used, the first frame will be captured on the first timebase pulse

following the trigger event of the external trigger.
4. Set the Max expected time between two frames option to t2 seconds.
If no frame has been received by QTM within t2 seconds, QTM assumes
that an error has occurred and aborts the measurement.
5. Start a new measurement. The frequency will be reported as EXT in the
main status bar, because it is unknown.
6. Enter the number of frames (n) under the Capture period heading in the
Start capture dialog.
It is important to specify the correct number of frames, so that the last
part of the measurement is not excluded. However, do not specify too many
frames: if the external timebase source stops firing pulses after the meas-
urement, the measurement would never finish. The best approach is to
specify a few extra frames and let the external timebase source fire a
little longer than necessary, so that all of the required frames are
guaranteed to receive a pulse from the external timebase source.

CAUTION: Make sure that the capture rate and exposure time stay
within the limits. Not doing so may lead to damage of the cameras, and in
that case the guarantee will be void.
l DO NOT overclock the cameras by setting the external timebase
rate higher than the camera specification. The maximum frame rate
should not exceed 0.975 times the maximum frame rate of a cam-
era at full FOV.
l The exposure must not exceed 1/10 of the period time of the
external timebase frequency.

NOTE: The tracking quality can become significantly poorer if you use a
timebase with a large difference between the smallest and the largest
delay between two frames, since the tracking assumes a constant time
period between two frames.
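
As a worked example of the calculations in step 1 (referenced from that step), the sketch below derives the timebase parameters for a hypothetical burst signal and checks the limits from the caution and notes above. All numbers are invented for the example.

# Hypothetical burst signal: pulses 10 ms apart within a burst,
# and at most 50 ms between the last pulse of one burst and the first of the next
shortest_interval_s = 0.010              # smallest time between two pulses
longest_interval_s = 0.050               # largest time between two pulses

t1 = shortest_interval_s
t2 = 1.5 * longest_interval_s            # value for Max expected time between two frames
f = 1.0 / t1                             # Marker capture frequency to set in QTM
n = 400                                  # expected number of pulses = number of frames

assert f <= 120, "non-periodic sync cannot be faster than 120 Hz"
max_exposure_s = t1 / 10                 # exposure must not exceed 1/10 of the period time
print(f"f = {f:.0f} Hz, t2 = {t2 * 1000:.0f} ms, "
      f"max exposure = {max_exposure_s * 1e6:.0f} us, frames = {n}")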

How to use PTP sync with an external clock master (Camera Sync Unit)

The Camera Sync Unit (CSU) can be used with any Qualisys camera system and
configured to synchronize with an external clock master using the standard PTP
protocol. Follow the steps outlined below to set up the camera system.

1. Start the correct version of the Qualisys Firmware Installer (QFI.exe), see
"How to use Qualisys Firmware Installer (QFI)" on page 471 for more inform-
ation.

2. After locating the system, click on the Advanced button.

3. Under Select PTP mode, select Use standard PTP. The Firmware
upgrade option can be deselected to speed up the process.

4. Click OK and continue installing the firmware.

5. Reboot the cameras manually to finish the change to standard PTP mode.

Connecting to an external clock master

The external clock master should be connected to the same Local Area Network
as the cameras. The following requirements apply to the external clock master:
l Standard: IEEE 1588:2008 (PTPv2)

l ipv4/udp

l Delay mechanism: End-to-End (E2E)

l Two-step clock

l Sync message interval: 1s

For adding the timestamp to the motion capture frames, activate the
timestamp option on the Synchronization page under Project Options and
set the type to Camera time, see chapter "Timestamp" on page 284.

NOTE: Oqus systems can use PTP sync without a Camera Sync unit, but it
requires extra configuration, see chapter "How to use PTP sync with an
external clock master (Oqus)" below.

How to use PTP sync with an external clock master (Oqus)

Oqus systems can be configured to synchronize with an external clock master


using the standard PTP protocol. Follow the steps outlined below to set up the
camera system.

1. Start the correct version of the Qualisys Firmware Installer (QFI.exe), see
"How to use Qualisys Firmware Installer (QFI)" on page 471 for more
information.
2. Click on the Advanced button.

3. Under Select PTP mode, select Use standard PTP. The Firmware
upgrade option can be deselected to speed up the process.

4. Click OK and continue installing the firmware.

For Oqus systems without a Camera Sync Unit, one of the cameras needs to be
manually set as system master. To manually set the system master, follow
these steps.

1. Start a telnet client and log in to the selected camera (login: oqus; password:
oqus)

2. Type the command forcetosystemmaster true and press enter.

3. Type the command reboot and press enter.

4. Wait for the camera to reboot.

5. Type the command exit and press enter to quit the telnet session.

NOTE: It is recommended to use the camera that is the master camera


when there is no external clock master connected to the system, so that
the system will work even if the external clock master is removed from
the system.

The following should be kept in mind for the system master:


l A camera that is PTP master is also system master.

l A camera that has been set to system master with the forcetosystemmaster
command will always be system master.

l There must be exactly one system master.

l If no system master is detected, it is not possible to start a measurement.


The following error message will be displayed.

Connecting to an external clock master

The external clock master should be connected to the same Local Area Network
as the cameras. The following requirements apply to the external clock master:

l Standard: IEEE 1588:2008 (PTPv2)

l ipv4/udp

l Delay mechanism: End-to-End (E2E)

l Two-step clock

l Sync message interval: 1s

For adding the timestamp to the motion capture frames, activate the
timestamp option on the Synchronization page under Project Options and
set the type to Camera time, see chapter "Timestamp" on page 284.

NOTE: For Oqus systems, the camera timestamp in exported files needs
to be converted as the number of bits in which the timestamp is stored is
limited to 48. It is recommended to add a Camera Sync Unit to the sys-
tem to avoid the need to convert the timestamps. Contact Qualisys sup-
port if you need more information about this conversion.

Synchronizing external hardware


How to synchronize external hardware

It is possible to synchronize other external hardware that is not directly integ-
rated in QTM. This can be achieved either by triggering the external hardware
from the cameras or sending a signal from the external hardware to the cam-
eras. Which option to use depends on the specification of the external hard-
ware, read the following chapters and contact Qualisys AB if you have any
questions.
The external hardware must be able to trigger on a TTL pulse or send out a TTL
pulse according to one of the following alternatives.

Using Sync out for synchronization

The recommended option is to use the Sync out signal from the cameras. This
is the same signal that is used for synchronizing the analog boards. The Sync
out signal is a pulse train with a TTL pulse that is only active when the camera
is measuring.

The signal can be modified to give you different outputs, see chapter "Syn-
chronization output" on page 285, but the default is to send a pulse for each
camera frame where the pulse has the same length as the exposure time. The
Sync out signal can also be set independently for each camera in the system,
which means that you can set different outputs for different cameras and
adapt the output to different hardware. The different outputs of the camera
are shown in the image below.

The signal can be used in two different ways:


Start the external capture on the first TTL pulse and use an internal
frequency
In this mode there will be a small drift between the external hardware
and the camera system. The size of the drift depends, among other things,
on the frequencies of the two systems and the accuracy of the
external equipment.
The external capture can also be stopped by the camera system if the Fre-
quency multiplier mode is used and the pulse duration is the same as
the period time. Then the signal will for example be low during the whole
measurement until it stops, see chapter "Synchronization output" on
page 285.

Frame synchronize the capture of the external hardware


In this mode there is no drift between the two systems because the Sync
out signal is synchronized with the camera frequency. For more inform-
ation on how to get different frequencies see chapter "Synchronization
output" on page 285.

NOTE: If you want to use a sync output signal that is faster than
5000 Hz on the Oqus camera, then it is recommended that you use
the connector on the Master camera. The master camera is dis-
played with an M next to the ID in the camera list to the left of the
settings.

In a measurement the pulse train for the default mode with the setting Shutter
out will look as the figure below.

1. Preview
During preview a pulse is sent on each preview frame.
2. Capture
Click on Capture in the Capture menu.
3. Start capture dialog open
When the Start capture dialog is open the pulse continues because the
preview is still being updated.
4. Start
Click on Start in the Start capture dialog.
5. Waiting for measurement
When the camera waits for the start of the measurement the sync output
signal is stopped. How long this period is depends mostly on two things.
l With external trigger this period continues until you press the but-
ton. Therefore we recommend that you use external trigger so that
you have time to initialise the measurement on the external device.
l Without external trigger the period is less than a second.

6. Measurement start
For the default Oqus setting the measurement starts on the negative edge
of the first pulse, which means that the external hardware must trigger on
the same edge.
7. Measurement
During the whole measurement a TTL pulse is sent for each frame.
8. Measurement stop
The measurement stops and the Sync out signal is high until the next pre-
view.

NOTE: In batch capture there are a couple of preview frames


between each measurement. These preview frames will be seen as
pulses on the Sync out signal and must be considered by the
external hardware.

Using Trig in for synchronization

Another way to synchronize is to use the External trigger input signal. This sig-
nal must be sent to the camera either by a button or by other hardware.
However, with this method there is a longer delay from the trigger event to the
start of the measurement. The signal that is used will look as the figure below.

1. Start capture dialog open

2. Start
Click on start in the Start capture dialog.
3. Waiting for measurement
The camera waits for the trigger event on the External trigger signal.

4. Measurement starts
The measurement starts with a short delay compared to the trigger event.
The delay is adjustable, but the minimum delay is 20 milliseconds.
5. Measurement

6. Measurement stop
The measurement can end in three different ways. Only one of them will
have a trigger signal on the External trigger input signal.
l The measurement comes to the end of the specified measurement
time.
In this case there is no pulse on the External trigger signal. The
external hardware measurement must be setup to measure for the
same time as QTM.
l The measurement is stopped manually in QTM.
In this case there is no pulse on the External trigger signal. The
external hardware measurement must be stopped manually as well,
the stop will therefore not be synchronized.
l The measurement is stopped with external trigger.
In this case there is a pulse on the External trigger signal and the
measurement will stop on the next frame. Because the stop pulse
can come at any time it is impossible to tell the delay until the meas-
urement stops.

Using Oqus sync unit for synchronization

By connecting the Oqus sync unit to the master camera in the Oqus system,
you can use an SMPTE signal to timestamp and synchronize the measurements.
The sync unit can also handle a video signal (black burst or tri-level) for syn-
chronization (also called genlock). For more information about how the syn-
chronization works in sound applications see chapter "Using SMPTE for
synchronization with audio recordings" on page 512.

The sync unit has the following opto-isolated connections.
Trig in/SMPTE in
This connection is used either as the standard Trig in connection or to con-
vert the SMPTE signal to a signal compatible with the Oqus camera. The
SMPTE signal can be used for both timestamp and synchronization. The
LEDs next to connector will indicate which input you are using in QTM.

NOTE: For the best detection of the SMPTE timecode use one of the
Analog outputs on the Motu device so that you can increase the
amplitude of the signal. Move the dial in the SMPTE program so that
it is pointing right to get a high enough amplitude. The SMPTE time-
code output on the MOTU device is often too low for the Sync unit
to get a stable detection of the SMPTE timecode.

Timestamp
To activate the SMPTE timestamp open the Synchronization page in
the Project options dialog. Then select Use SMPTE timestamp
below the SMPTE heading. Also make sure that you select the correct
SMPTE frequency, see chapter "Timestamp" on page 284.

The timestamp is displayed in the Timeline control bar. Because the
timestamp is sampled at, for example, 30 Hz, there can be several camera
frames per timestamp. This is indicated by the last number (sub-
frames) after '::' in the timestamp.

Synchronization
To activate the SMPTE synchronization open the Synchronization
page in the Project options dialog. Then select SMPTE as Signal
source below the SMPTE heading, see chapter "External timebase"
on page 278.
NOTE: Remember to select the correct SMPTE frequency under the
SMPTE heading.

Sync in/Video in
The connector can be used either for Sync in or Video in synchronization.
The LEDs next to connector will indicate which input you are using in
QTM.
Sync in
The Sync in connection is the standard connector for an external
timebase signal, see chapter "External timebase" on page 278
and "Using External timebase for synchronization to a periodic TTL
signal" on page 494. For example, the word clock of up to 48 kHz
from a sound sampling board can be connected to this connector
and then divided down to the desired frequency on the Synchronization
page in the Project options dialog. Make sure that you have set a
divisor that gives a camera frequency that works with the Oqus cam-
era.

NOTE: If you use SMPTE for timestamp and another external


signal for synchronization, both must be connected to the
same camera.

Video in
To activate the video synchronization open the Synchronization
page and select Video sync on the Control port setting, see chapter
"External timebase" on page 278. When the Video in option is selec-
ted the sync unit will decode a black burst or tri-level video signal so
that the Oqus camera can lock in on the signal. Just as for the stand-
ard sync in, you need a time reference to know when the meas-
urement starts; for example, you can use the SMPTE signal.

Sync out
This is the standard Sync out connection from the camera, see chapter
"Synchronization output" on page 285 and "Using Sync out for syn-
chronization" on page 505.
Using SMPTE for synchronization with audio recordings

The Qualisys system can be synchronized with audio recordings by using the
SMPTE timestamp via the Oqus or Camera Sync Unit. QTM has been tested with
the MOTU 828mk3, but in theory any equipment with an SMPTE signal can be
used. Check the following information to synchronize QTM with audio record-
ings. For information about the settings for external timebase see chapters
"External timebase" on page 278 and "Timestamp" on page 284.

1. It is important to enter the nominal frequency and the SMPTE frequency


on the Synchronization page in the Project options dialog. Otherwise,
the data cannot be compared correctly with the audio recordings.
2. For Oqus systems, make sure to connect the Oqus sync unit to the master
camera, the camera with an M in the display, see chapter "Using Oqus
sync unit for synchronization" on page 509.
3. If you want a sync output frequency from the camera system above 5000
Hz, you must use the master camera.
4. Synchronizing with Word clock or SMPTE
In theory the Word clock should be better to use as external timebase
than SMPTE. However, the SMPTE signal from the MOTU unit is stable
enough for the camera synchronization and it has the advantage that the
marker capture will start without a variable offset to the SMPTE time code
signal.
a. If you use SMPTE as the sync source then the start of the meas-
urements will be aligned with the SMPTE frames, see image below.

NOTE: If you use SMPTE as external timebase it is important
that you do not use a divisor so large that the resulting fre-
quency becomes lower than 2 Hz. This would result in the cameras
not being able to synchronize to the signal.

b. When using Word clock it is important that you also use the SMPTE
clock for reference of the start of the measurement. However, align-
ment of the camera frames relative to the SMPTE frames will not be
exact and there will be a variable delay of up to one camera frame.
For example at 200 Hz the maximum start offset is 5 ms, see image
below. This is because the camera can start at any time within an
SMPTE frame and then the camera frame will not be aligned with the
SMPTE signal.

5. It is recommended to use a multiple of the SMPTE signal as camera fre-


quency. This makes the data easier to compare to other systems. The
data is synchronized even if it is not a multiple, but then you will have to
take into consideration that the number of camera frames per SMPTE
frame will vary.
6. The camera frequency with external timebase cannot be increased bey-
ond the maximum frequency at full FOV when using reduced image size.
This is because the preview mode uses the full image size of the camera.

Audio recording with the MOTU 828mk3

To be able to synchronize the QTM capture with the audio data it is important
that the SMPTE time code is associated with the audio recording. This can be
done in the following three ways with the MOTU 828mk3.

1. When using the MOTU device on a Mac you can use a special setting
called Generate from sequencer in the MOTU SMPTE console. With this
option the SMPTE signal is generated based on the time position in the
MOTU Digital Audio Workstation (DAW) software used for recording
audio. However, this requires that you start the recording in the MOTU
DAW software before you can start preview in QTM and that the recording
must keep running while the cameras are running. Then it is important to
select the BWF format for the files in the DAW software so that the start of
the SMPTE time can be read from the WAV file header. For detailed inform-
ation about the BWF standard, see https://tech.ebu.ch/docs/tech/tech3285.pdf.
2. On Windows you must connect another SMPTE source to the MOTU unit
and use a program that supports ASIO synchronization (e.g. Cubase), if
you want to see the same SMPTE time in the program as in QTM and save
audio files in BWF format with the correct timestamp.
3. The third alternative, which works with any multitrack audio recorder, is to
record the SMPTE signal on a parallel audio track. In this case you must
decode the SMPTE signal to synchronize the audio files. For Matlab, a solu-
tion is available for pairing and cropping audio files to mocap files using a
SMPTE decoder. Contact Qualisys support if you are interested.

Combining multiple systems

Twin systems
The Twin system feature enables QTM on one computer to control a camera
system that is connected to another computer. With this feature you can cap-
ture data from two systems and then process them in the same QTM file. It is
for example useful if you have an underwater and above water system or if you
want to capture a small volume within the larger volume at a higher frequency.
For information about the settings for twin systems see chapter "Twin System"
on page 333.
How to use frame synchronized twin systems with separate volumes

Follow this procedure to use frame synchronized twin systems with separate
volumes, for example a system above water and a system underwater.

Requirements

1. Two Qualisys systems (Qualisys cameras of any type or mixed).

NOTE: For Oqus systems at least one camera must be from the 3+-,
4-, 5+-, 6+ or 7+-series if you want to use frame synchronization.

2. Synchronization devices or cables. Each system needs its own Camera
Sync Unit or Oqus control splitter.
3. A trigger button with BNC T-splitter and BNC cable.

4. Two computers with two Ethernet ports each.

Procedure

1. Set up the two camera systems and connect them to two separate com-
puters.
l It is important that there is a large plane that connects the two
volumes so that you can move the wand in the Twin calibration.
2. Calibrate the two systems separately.

3. Connect the free Ethernet ports of the two computers so that they can
communicate with each other. Use for example one of the alternatives
below:
a. Connect them both to the same internal network.

b. Set static IP addresses on the Ethernet connections and connect a


cable directly between the computers.
4. Select the system that is going to use the highest frequency as the Twin
Master system. Usually that is the above water system in an above/un-
derwater twin system.
5. Connect the Synchronization output (Out 1 on Camera Sync Unit or Sync
out on control splitter) of the device that is selected as Frame Sync
Master Source to the Synchronization input (SYNC on Camera Sync Unit
or Sync in on control splitter) of the other system.
6. Connect the same trigger button to Trig in (control splitter) or Trig NO
input (Camera Sync Unit) on both of the systems.
7. Open the Project options dialog for the Twin Master system.
a. Go to the Twin system page. For more information about the set-
tings, see chapter "Twin System" on page 333.

b. Select the option Enable Twin Master.
l Click on Find Slaves and select the computer from the list that
appears in the Address option.
c. Select Enable Frame Sync
i. Enter the desired frequencies of the Twin Master and Twin
Slave systems.
ii. Select the device that is the Frame Sync Master Source. It
should be the device generating the Sync output. Use the
upper drop-down box for the devices in the Twin master sys-
tem and the lower box for the devices in the Twin slave system.

NOTE: If you have a system with only Oqus 1, 3 or 5 series


cameras, frame synchronization is not supported. The systems
will only be start synchronized in this case.

d. Select Merge trajectories.

e. Click OK to close Project options.

8. The systems are now connected. You do not need to change anything on
the Twin slave system; all of the necessary settings are controlled by the Twin
master system.

9. Start the measurement with Ctrl+N on the Twin master system. Both sys-
tems will start up in preview. The first time you start a measurement you
will have to wait while the two systems synchronize to the 100 Hz signal.
10. The systems then need to be twin calibrated to align the coordinate sys-
tems. For more information on the twin calibration, see chapter "Per-
forming a Twin calibration" on page 519.
a. Click on Calibrate Twin System on the Capture menu.

b. Enter the Calibration capture time you want to use for the cal-
ibration. It must be long enough to move the wand twice in the
whole plane between the two systems.
c. Enter the Wand length that is used for the twin calibration. If you
enter 0, the wand length is calculated automatically, but then you must
make sure that both markers are in one of the camera systems at
the beginning of the measurement.
d. Click OK. The twin calibration is done at the frequency of the Twin
slave system, therefore the Twin master system might need to syn-
chronize to the 100 Hz signal again.
e. Start the twin calibration with the trigger button when you are
ready.
f. Then move the wand with one marker in each camera system. It is
extremely important to move the wand so that you cover as much as
possible of the plane between the two systems.
g. When finished you will get a result where it is important to check
that the wand length is close to what you expect.
11. Activate the processing steps that you want to use.
a. Make sure to always activate 3D tracking on both systems. On the
Twin slave system it is actually not necessary to activate anything
else.

b. Merge with Twin Slave must be activated on the Twin master.
Otherwise no data is transferred automatically.

c. Gap-fill, AIM and 6DOF are best activated only on the Twin master. They
can be activated on the Twin slave, but this can result in unnecessary dia-
logs that interrupt the merging process.
d. The export is of course best done from the Twin master system.

12. You can now start doing measurements controlled by the Twin master sys-
tem.
a. The saving settings in the Start capture dialog control the beha-
vior of both the Twin master and Twin slave systems. This means that
it is recommended to save the measurements automatically,
because then the Twin slave file is merged automatically with the
Twin master file. The Twin slave file is then saved both on the Twin
master computer and the Twin slave computer.
b. The Twin master file will contain the 3D data from both systems and
can be processed just as any QTM file. The data from the Twin slave
system will be labeled with the type Measured slave in the Tra-
jectory info windows. For information on how to work with the twin
files see chapter "Working with QTM twin files" on page 522.
c. Any video on the Twin slave system is not transferred to the Twin
master computer. Video files recorded with the slave system can be
added to the merged capture later by importing the video link via
the menu File > Import > Link to Video File.

NOTE: Video recordings on the slave system will only be
saved on the slave computer if automatic saving of the capture
is enabled on the master system.

NOTE: If you import a video from a Qualisys camera that is


included in the calibration of the slave system, the calibration
information will not be transferred. This means that you can-
not apply 3D overlay to the imported video.

Performing a Twin calibration

When using Twin systems it is necessary to specify the relations between the
coordinate systems of the Twin master and Twin slave systems. Therefore the
two systems must first be calibrated separately with wand calibration. The twin
calibration is then created to transform the Twin slave data to the coordinate
system of the Twin master. The relations are usually established in a Twin cal-
ibration procedure. However, the relation can also be entered manually if exact
alignment is not critical.

NOTE: If the two systems share any volume it is recommended to place


the L-frame for the calibration there. Then you do not need to transform
the Twin slave data at all, since the two systems share the same coordin-
ate system definition.

Preparations

To perform a Twin calibration you need to first make sure that you have made
the following preparations.

1. Set up the two systems so that the Twin master controls the Twin slave,
see chapter "How to use frame synchronized twin systems with separate
volumes" on page 514, and make sure that both systems are calibrated.
2. Make sure that you have a wand that reaches between the two volumes.
It is recommended to use one where you know the length, but you can
also use one of the systems to measure the wand during the twin

calibration.
3. Make sure that the plane between the two systems is as large as pos-
sible, since a larger plane will give a more accurate twin calibration.

Performing a twin calibration

Follow this procedure for the twin calibration:

1. Start a measurement with Ctrl+N on the Twin master system.

2. Click on Calibrate Twin System on the Capture menu.

3. Enter the Wand length and the number of seconds needed to at least
move the wand in the whole plane twice.

NOTE: If you don't have a wand length you can enter 0 and then
make sure that both markers are clearly visible in one of the sys-
tems when you start the twin calibration.

4. Click on Start and wait for the systems to synchronize. The twin cal-
ibration is always done at Twin slave frequency, so the Twin master sys-
tem might need to synchronize before you can start.
5. Press the trigger button to start the twin calibration and start moving the
wand.
6. Move the wand so that you have one marker in each volume. The twin cal-
ibration will be better if you fulfill these requirements:
a. Try to move it so that you cover the whole plane between the two
volumes.

b. Move the wand at least twice in the whole plane.

c. Make sure that the angle of the wand differs when you move it
around in the volumes.
7. When finished the Twin slave data is transferred automatically and then
you get the result of the calibration. Make sure that the wand length is
close to what you have entered and that the standard deviation is not
much higher than the deviations of the regular calibrations.

NOTE: The twin calibration fails if the measured wand length differs by
more than 1 % from the entered value (see the example after these
instructions). The reason could be that other markers disturb the twin
calibration. Open the files that are saved in the Calibration folder of the
project and check the 3D data. Delete any 3D data that does not belong to
the wand markers. Then reprocess the Twin calibration according to the
instructions below.

8. If you want to check the actual translation and rotation of the slave data,
open the Twin system page on the Twin master system and then click on
Calibration.
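
The 1 % acceptance rule mentioned in the note under step 7 can be expressed as a simple check; the function name and the numbers below are made up for illustration.

def twin_wand_ok(entered_length_mm, measured_length_mm, tolerance=0.01):
    # The twin calibration fails if the measured wand length differs by
    # more than 1 % from the entered value
    return abs(measured_length_mm - entered_length_mm) <= tolerance * entered_length_mm

print(twin_wand_ok(750.0, 754.0))  # True: 4 mm is within 1 % of 750 mm
print(twin_wand_ok(750.0, 760.0))  # False: 10 mm exceeds the 7.5 mm limit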

Reprocessing a twin calibration

It is possible to reprocess the Twin calibration from the Project options dialog
or in the reprocessing dialogs. Follow these steps to change the twin cal-
ibration:

1. Click on Calibration on the Twin system page to open the Twin System
Calibration page.
2. Then click on Calibrate to open the Twin system calibration dialog, see
chapter "Twin System Calibration dialog" on page 337.

3. In this dialog you can change the QTM files and wand length used for the
Twin calibration.
a. The twin calibration files are saved in the Calibration folder of the
current project. Open the file and retrack it if you think that errors in
the tracking caused problems with the twin calibration. After you have
retracked the file you need to reprocess the twin calibration.
b. You cannot reprocess the twin calibration if you do not have the files
created in the twin calibration. However if you make sure to save all
of your data in the same project then they are always saved in the
Calibration folder of that project.
4. Then click on Calibrate inside the Calibrate Using Two Measurements
heading to reprocess the twin calibration.

NOTE: You can also enter a new translation and rotation manually
under the Manual Calibration heading, but it is not recommended
if you want the best accuracy.

Working with QTM twin files

The complete data from a QTM twin system measurement is always stored in
two separate files. If you have used the Merge with Twin slave option in the
processing, the slave file automatically gets the same name as the master with
the ending _slave. When the twin slave file is merged with the twin master file,
only the 3D data of the twin slave is merged into the file. The twin
slave cameras are also displayed as greyed out cameras in the 3D view.

The merged twin slave 3D data is marked as Measured slave in the trajectory
info windows, but that is the only difference compared with the other data. You
can then identify and work with all of the 3D data as usual, for example apply
an AIM model to all the data. However if you need to retrack the data you need
to do that separately in the twin slave file, since the 2D data is not imported to
the master file.
Since only the 3D data is merged from the Twin slave the following processing
and input must be performed on the twin master.
l Video (Miqus, Oqus or external video devices)

l Analog data (Analog board or EMG)

l Calculate 6DOF

l Calculate force

l Export data

Merging Twin files in reprocessing

The data of two QTM files can always be merged in reprocessing if you have not
been using the Merge with Twin slave option or if you need to redo it because
the tracking of the Twin slave file has changed.

NOTE: All of the previous Twin slave data is deleted when you merge
twin slave data in reprocessing.

The merging is always done by processing the Twin master file and activating
the Merge with Twin slave option on the Processing steps list. You can edit
both which Twin slave file is used and the Twin calibration before you
reprocess the file. The twin calibration that is used in the reprocessing is shown
on the Twin system calibration page. You can change the twin calibration by
clicking the Calibrate... button, but you must make sure to click the correct Cal-
ibrate button (Manual or Using two measurements) in the Twin system cal-
ibration dialog to apply the change, for more information on the Twin
calibration settings see chapter "Twin System Calibration" on page 336.
If you use the Reprocess option on the Capture menu to merge the files,
the Twin slave file location is copied either from the measurement or the pro-
ject. However, you can edit the file location and the Twin calibration on the
Twin system page.
If you instead use the Batch process option on the File menu, which file is
used is directly controlled by where you copy the settings from.
processed file
The Twin slave file and all other Twin system settings are copied from the
file that is processed and cannot be changed.

project and present file


The processing automatically uses the file with the same name and the _
slave extension. However you can edit the Twin calibration if you need to.

Twin 3D data interpolation

It is recommended to use the same frequency in the two systems to avoid any
interpolation. If you have to use a lower frequency for the slave system then it
is recommended that you use a divisor of the twin master frequency to min-
imize the interpolation.
The interpolation is linear, meaning that a line is drawn between the existing
twin slave 3D data points. At any time where twin slave data does not exist,
the linearly interpolated data is used instead. The interpolated data has a zero
residual so that you can distinguish it in post processing.
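
A minimal sketch of the kind of linear resampling described here is shown below; it is an illustration of the principle, not QTM's actual implementation, and it assumes the slave samples bracket every master frame time. Resampled points are given a zero residual, as described above.

def resample_slave(slave_times, slave_points, master_times):
    # slave_points: list of (x, y, z) tuples; slave_times must be sorted ascending
    result = []
    for t in master_times:
        # index of the last slave sample at or before t, clamped to a valid segment
        i = max(0, min(len(slave_times) - 2,
                       sum(1 for st in slave_times if st <= t) - 1))
        t0, t1 = slave_times[i], slave_times[i + 1]
        w = (t - t0) / (t1 - t0)
        point = tuple(a + w * (b - a)
                      for a, b in zip(slave_points[i], slave_points[i + 1]))
        result.append((point, 0.0))  # zero residual marks resampled data
    return result

# Slave at 100 Hz merged into a master running at 200 Hz
print(resample_slave([0.00, 0.01, 0.02],
                     [(0, 0, 0), (10, 0, 0), (20, 0, 0)],
                     [0.000, 0.005, 0.010, 0.015]))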
Twin system with a shared volume

It is possible to use a twin system with a shared volume, if you want to capture
at a higher frequency in parts of the volume. For example if you want to cap-
ture the impact of a golf swing.

In this case it is best to place the L-frame in the shared volume so that you can
easily merge the data of the two systems. You can then reset the calibration on
the Twin system calibration dialog. There can still be a difference in the data
of the two systems that is less than a mm. If you find a constant difference you
can change the twin calibration data manually to improve the match. It is not
possible to use the automatic twin calibration.
When merging the data in a shared volume case, it is often best to identify the
data in both of the files before merging them in reprocessing, see chapter
"Working with QTM twin files" on page 522. Then if you use the Merge tra-
jectories option the twin slave data will be used automatically where the twin
master data is missing. Any labeled twin slave data that overlap some twin mas-
ter data will be deleted and if you want to use it you have to remove the twin
master data that is wrong and merge the files again.

Setting up multiple video systems


A Miqus Hybrid or Miqus Video system that has more video cameras than the
computer model can handle must be set up as multiple systems. A computer
with an 11th generation Intel i9 processor can capture from up to 24 video
cameras. A computer with a 10th generation Intel i9 processor can capture
from up to 16 video cameras.
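
The number of systems needed and a roughly equal split of the cameras can be calculated as in the sketch below; the function name is made up, and the per-computer limits are the ones stated above.

import math

def plan_video_systems(n_cameras, cameras_per_computer):
    # cameras_per_computer: 24 for an 11th generation Intel i9, 16 for a 10th generation Intel i9
    n_systems = math.ceil(n_cameras / cameras_per_computer)
    base = n_cameras // n_systems
    # spread the remainder so the systems are as equal as possible
    return [base + 1 if i < n_cameras % n_systems else base for i in range(n_systems)]

print(plan_video_systems(40, 24))  # [20, 20]: two systems with 20 cameras each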
With this setup you can capture synchronized video from all cameras and then
process the data in Theia. For instructions on how to calibrate and capture with
multiple video systems see chapter "Capturing with multiple video systems for
markerless tracking" on page 588.
Follow these steps to setup the system.
Connect each video system

1. Follow the instructions in chapter "Connecting a Miqus Hybrid or Video sys-


tem for markerless mocap" on page 452 to connect each of the video sys-
tems with fewer than 16 or 24 cameras. It is recommended that you divide the
number of cameras equally between the systems and at the same time have 3 video
cameras in each chain if possible.

2. Test each video system separately, before you connect the systems together.

3. Copy the camera MAC addresses for each system. The easiest way to
find the MAC addresses is to use Locate system on the Cameras page in Pro-
ject options and then click on System info. Copy the whole list with
information to a text editor and edit the file so that it looks like this:
C4:19:EC:00:0C:14 S/N 12345
C4:19:EC:00:0C:15 S/N 12346
C4:19:EC:00:0C:16 S/N 12347
It is recommended to add the serial number after the MAC address so
that it is easier to find which camera it belongs to.

4. Decide which system is going to be the Main system and connect the
Camera Sync Unit in that system.

5. The other systems will be Agent systems and must not have any Camera Sync
Unit connected in the system.

6. Make sure that the computers are connected to the same local network.

Configure each system to prepare for connecting them together

1. Start the Qualisys Firmware Installer (QFI.exe) and enable Standard PTP in all
of the systems. QFI.exe is located in the Camera_Firmware folder in the
Qualisys Track Manager folder.

a. Start QFI.exe.

b. After locating the system, click on the Advanced button.

c. Under Select PTP mode, select Use standard PTP. The Firmware
upgrade option can be deselected to speed up the process.

d. Click OK and continue installing the firmware.

e. Reboot the cameras manually to finish the change to standard PTP


mode.

2. Use QDS to set different subnets for the camera interface on each computer,
e.g. 192.168.11.1, 192.168.12.1 and 192.168.13.1.

3. On each computer enable the QDS blocklist feature, see chapter "Camera
blocklist" on page 469. Use the MAC addresses saved earlier and block the
MAC addresses of the cameras in the other systems. This means that when
the cameras start they will only get an IP address from the subnet they are
supposed to be on.

Connect the video systems together

Daisy chain the Ethernet switches so that all of the camera systems are con-
nected. The systems must be connected for the cameras to be synchronized to
the same PTP master clock, which is the Camera sync unit in the Main system.
Below is an example of how the systems are connected.

Configure QTM for synchronous start

1. For the Main system, set the Start delay to 1 second on the Cameras page
in Project options, see chapter "Start delay" on page 248. This setting is to
ensure that there is enough time to send the capture start time from the
Main system to the Agent systems.

2. For each of the Agent systems, set the Wireless/software Trigger function to
Start capture and then set Start/stop on UDP packet to Listen for Main
QTM instance, see chapter "Wireless/software Trigger" on page 267. This set-
ting configures QTM so that it only listens for the UDP packet with the cam-
era start time and not the generic UDP start/stop packet.

3. If there are other computers with camera systems on the local network, then
the port used for the UDP packet must be configured so that they are dif-
ferent. For both Main and Agent systems, change the port to e.g. 8990 in the
Capture Broadcast Port option on the Real-Time Output page in Project
options, see chapter "Real-Time output" on page 387. If the port isn't
changed then there is always a risk that a system which isn't the Main system
triggers the start of the Agent systems.

Running the system

Preparations

Choice of markers
Motion capture measurements require the use of markers. The markers can
either reflect or emit near infrared light. The first type is referred to
as passive markers, and the latter as active markers.
Passive vs Active markers

Passive markers can be made very light-weight and in many different sizes,
and are therefore usually the best option. However, in some setups it can be good
to have an active marker, for example because of unwanted reflections or at
very long distances.
Qualisys offers several types of active markers for different applications. For
more information about Qualisys active marker solutions, see chapter "Active
marker types" on page 1000.
You can select the Untriggered active marker option if you are using generic,
constantly lighting active markers. When using this setting, the strobes of the
cameras are inactivated to minimize the amount of unwanted reflections.
For more information about how to use active markers, see chapter "How to
use active markers" on page 531.
Marker size

It is important to choose the correct marker size, since a larger marker gives a
larger area to use to determine the central point of the marker. The central
point data is the input to the tracking of the markers' movement and is there-
fore very important.
Rules to determine which marker size to use are:
l Use as big markers as possible without the markers being in the way or
restraining normal movement.

l If the markers are attached very closely to each other, use as large mark-
ers as possible without the markers merging in the 2D view of the cam-
eras. If the markers merge, the central point calculation of these markers
will be ambiguous.
l To achieve a good accuracy the size of the markers in the 2D view should
be at least 200 subpixels. Check the size of the markers in the 2D view for
each camera in the Data info window. Right-click in the Data info win-
dow and click Display 2D data to choose the camera. If the size is smaller
the central point calculation will be poor.
Marker placement

The markers should be placed in a way so that they are visible during as much
of the measurement time as possible. Check that clothes or rotating objects,
such as body parts, do not move in a way that hides the markers from the cam-
era.
There is a wide range of marker sets for tracking humans for biomechanical
and animation applications. For detailed information, refer to the following
resources:
l Marker guides for animation, available via the Skeleton menu.

l Marker guides for analysis modules, available via the Show guide button
in the PAF project view.
l QAcademy tutorials. There are several QAcademy tutorials available on
how to apply specific marker sets, see https://www.qualisys.com/qacademy.



Qualisys active markers

Active markers can be used in situations when it is hard to use passive markers,
for example, when measuring outdoors at long distances, or when there are
many unwanted reflections from the measured object or the capture volume.
In addition, the Active/Naked Traqr and the Short Range Active Marker use
sequential coding for automatic identification of markers or rigid bodies.

How to use active markers

QTM settings for active markers

When using active markers you need to specify the correct marker type in the
Project Options under Cameras > Marker Mode. The following active marker
options are available:
Active
Use this mode for the sequentially coded active markers: Active or Naked
Traqr or the Short Range Active Marker. In this mode the camera strobes
are used to send a pulsed signal ahead of the exposure to trigger the act-
ive markers. Only active markers are visible as the camera strobe is inact-
ive during exposure.

Active + Passive
In this mode the camera will capture both passive and sequentially coded
active markers. This mode can be used if you need to add some tem-
porary markers to the subject and do not want to add active markers. If
you mix the passive and active markers all the time you lose some of the
advantages of both types, see chapter "How to use Active + Passive mode"
on page 533.

Long range spherical active markers (or reference markers)


Use this mode for the Long Range Active Marker or reference markers.

Untriggered active markers


Use this mode when you are using generic continuously lit active
markers. In this mode the camera strobes are inactive.



How to use sequential coding

Sequential coding is implemented in the Active and Naked Traqr and the Short
Range Active Marker. By using sequential coding, the trajectories are auto-
matically identified in QTM without the need of an AIM model or a rigid body
definition.
The sequential coding configuration and supported options depend on the type
of active marker and the camera system.
Active/Naked Traqr
The Active and Naked Traqr is configured using the Traqr Configuration
Tool. The Traqr Configuration Tool can be used to set the ID range and
the marker IDs of the individual markers on the Traqr. The Traqr supports
the use of the extended ID range option. The ID range is a global setting
that applies to the whole system. The options are:
Standard (1-170)
Standard ID range of 170 uniquely defined markers. This is the
default option.

Extended (1-740)
Extended ID range of 740 uniquely defined markers. Use this option
if you have more than 170 markers. This option is only supported
with Arqus or Miqus cameras. The extended range option cannot be
used if there are any Oqus cameras included in the system.

Short Range Active Marker


The IDs of the markers are controlled by the switch on the driver unit and
the connector that the marker chain is attached to. For example driver ID
= 2 and the second marker connection will give you marker IDs 41-48. The
Short Range Active Marker can only be used with standard ID range. The
settings on the driver allow for a maximum of 160 uniquely defined mark-
ers.
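The numbering scheme can be illustrated with a short calculation. The following Python sketch (illustrative only, not part of QTM) assumes eight markers per marker connection and four connections per driver unit; this is an inference from the example above (driver ID 2 and the second connection giving IDs 41-48), not an official formula:

# Illustrative sketch: marker ID range for the Short Range Active Marker,
# assuming 8 markers per connection and 4 connections per driver unit
# (inferred from the example: driver ID 2, connection 2 -> IDs 41-48).
MARKERS_PER_CONNECTION = 8
CONNECTIONS_PER_DRIVER = 4

def id_range(driver_id, connection):
    first = ((driver_id - 1) * CONNECTIONS_PER_DRIVER + (connection - 1)) * MARKERS_PER_CONNECTION + 1
    return first, first + MARKERS_PER_CONNECTION - 1

print(id_range(2, 2))  # -> (41, 48)

With these assumptions, five driver IDs cover the maximum of 160 uniquely defined markers mentioned above.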

Depending on the ID range the identification needs at least 21 or 41 frames to


identify the trajectories. When the trajectory is shorter, for example due to
occlusions, the ID will be set to 0. For the best results with active markers it is
important to optimize the camera setup to avoid fragmentation of trajectories.



The IDs of the markers are displayed in the ID# column of the Trajectory info
windows. QTM will automatically join all of the trajectories with the same ID.
However, you can manually join a trajectory that has an ID with a trajectory
with ID 0 by dragging the identified trajectory onto the one without an ID. The merged tra-
jectory will get the ID of the identified trajectory. You can undo the merge with
the Undo command. It is not possible to change the ID of an identified tra-
jectory.
To automatically label the trajectories with labels of your choice you need to
make an AIM model. However, in this case you do not need any movement or
correct placement of the markers to create the AIM model. You just capture a
file with the correct marker IDs and then label them as you want to place them
on the subject. Create the AIM model and the next time you apply the AIM
model the trajectories will be labeled according to their marker ID.

NOTE: In the Active + Passive mode the AIM model does not work
the same way as when you only have active markers.

How to use Active + Passive mode

The Active + Passive mode can for example be used in setups where you need
to add some temporary markers for a static model. In that case it is often
easier to add passive markers than to add more sequentially coded active mark-
ers. There are, however, disadvantages to the mixed mode and it is therefore not
recommended to be used in all of your measurements. The disadvantages com-
pared to just using the active markers are the following.



l The maximum frequency is reduced to 250 Hz because you need to flash the
strobe both to activate the active marker and to see the passive markers.
This also means that you will have less exposure time for the markers; for
example, at 250 Hz the maximum exposure time is 210 µs.

l There can be unwanted reflections in shiny materials, because you have a
flash of light during the exposure for the passive markers.

l If the AIM model includes both passive and active markers it has to be cre-
ated from a file with the correct marker placement.

l It can be harder to identify the correct IDs for the active markers, since a
passive marker that is badly tracked can be mistaken for an active marker.

Removing unwanted reflections


Delayed exposure to reduce reflections from other cameras

When to use delayed exposure

Exposure delay can be used in environments where the strobe light from one
set of cameras disturbs marker capture in other cameras. It is good to first con-
sider the following changes to the setup.

1. Test a different camera positioning. The reflections can be reduced if the


cameras do not look straight at each other.
2. Consider whether anything can be done to the floor or any other reflective area of the room
to make it less reflective.
3. It can also help to reduce the exposure time or increase the marker
threshold, because the unwanted reflections are usually less intense than
the markers.
Exposure delay will shift the time of exposure for different groups of cameras.
For example, consider a 6-camera system with three cameras located on one side and
three on the other side of a room with a highly reflective floor, where the
strobe from cameras on one side disturbs cameras on the other side. Assigning
the first three cameras to one exposure group and the other three cameras to
another exposure group will eliminate the problem of strobe interference.
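The following sketch (illustrative only, not QTM code; the exposure times and group assignments are hypothetical) shows the principle used by the Camera groups mode described below, where each group starts its exposure after the longest exposure time of the cameras in the previous group:

# Illustrative sketch: cumulative exposure start offsets for camera groups,
# assuming each group is delayed by the longest exposure time of the previous group.
exposure_us = {                              # hypothetical exposure times (microseconds)
    "cam1": 300, "cam2": 300, "cam3": 250,   # group 1, one side of the room
    "cam4": 300, "cam5": 200, "cam6": 300,   # group 2, the other side
}
groups = {1: ["cam1", "cam2", "cam3"], 2: ["cam4", "cam5", "cam6"]}

offset = 0
for group in sorted(groups):
    print(f"group {group}: exposure starts {offset} us after frame start")
    offset += max(exposure_us[cam] for cam in groups[group])

In this example group 2 exposes 300 µs after group 1, so neither group sees the other group's strobe light during its own exposure.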



Setting up delayed exposure

Follow these instructions to use the delayed exposure.

1. To use the delayed exposure, you must first find out which cameras are
causing reflections in other cameras. It is usually best to use the marker
intensity mode to see where the reflection comes from.
2. When you know the cause of the reflections you place the cameras in dif-
ferent groups on the Cameras page in the Project options dialog, see
chapter "Exposure delay" on page 236.
a. Activate the Exposure delay mode called Camera groups, which
will calculate the delay for each group automatically by comparing it
to the longest exposure time of the cameras in the previous group.
Do not use the Advanced mode unless you are absolutely sure of
what you are doing.
b. Select the cameras that you want in the group from the list to the
right.
c. Then select the group on the Camera groups setting. Make sure
that you always start with group 1 and continue with 2 and so on.
d. Repeat steps b and c for each group that you want to create. Usually
for a setup with cameras on two sides of the volume it is enough
with two groups, one for each side. It is also possible to change the
exposure group of a camera by right-clicking on the camera live feed
in the 2D view window and selecting the exposure group via the con-
text menu.
e. When the exposure delay is activated Using delayed exposure is
displayed in the status bar.
3. To check which groups the cameras are in, go to the 2D view and check
the delayed exposure setting next to the camera number.

Guidelines for use of exposure groups for fast movements

By delaying the exposure between camera groups, the markers may have
moved between the respective group exposures. The 3D tracking algorithm
applies a compensation for this displacement of the 2D positions for the
delayed cameras based on the measured movement of the markers. The com-
pensation is effective for most capture applications of human movement, such as gait or running. For optimizing the compensation for fast movements, for
example for baseball pitching or golf, take the following guidelines into
account:

1. Minimize the number of exposure groups

2. Make sure that most cameras are assigned to the lowest possible expos-
ure groups, so that most cameras are in exposure group 1, followed by
group 2, etc.
3. Use short exposure times to minimize the delay

4. Use high capture frequencies

The compensation for the 2D positions is also used in the calibration. It is


important to move the wand slowly for an optimal calibration. There is a warn-
ing in the Calibration results dialog if the wand speed is considered too high.

NOTE: Delayed exposure is not supported on the 5-series camera.

NOTE: The time of exposure will return to the default value if the camera
is in video mode.

NOTE: Reflections that come from the flash of the camera itself cannot
be removed with a delay. Then you have to either cover the reflective
material, optimize the exposure settings or use marker masks.

Marker masking

The marker masking is a tool to delete unwanted markers in the 2D data. The
masking areas are defined per camera and can be drawn manually in the 2D
view window, or created using the Auto mask function. The masks can also be
viewed and managed on the Cameras page in the Project options dialog. For
explanation of the settings, see chapter "Marker masking" on page 232.
Camera masks are applied during the measurement. The masked data is not
transmitted to QTM, and it will not be possible to restore any masked markers.



NOTE: If you want to apply masking post-hoc to a file, you can use soft-
ware masks instead, see chapter "How to use software marker masks" on
page 611.

How to use marker masking

Follow these steps to add a masking area to a camera.

1. Open a new file by clicking the New file icon .

2. Open a 2D view window so that you can see the 2D markers in preview.

3. View the cameras in either Marker or Marker intensity mode.

4. Select the Marker mask tool in the 2D view toolbar and draw a mask
over the area with the unwanted marker. The mask is indicated as a dark
green area on the 2D view of the camera.
a. The mask can be resized by dragging its edges and moved by
dragging the mask itself. To delete a mask or all masks for the current camera,
right-click on a mask and select Delete this mask, Delete all masks or
Delete all masks from all cameras.

NOTE: Marker masks are not drawn linearized. To see the true pos-
itions of the masks, you need to turn off the Show linearized data
option on the 2D view settings page, especially for wide-angle
lenses.

5. Keep adding masks until all of the unwanted markers are covered. There
can be up to 20 masks per camera for Arqus and Miqus cameras and 5 masks per Oqus camera.

How to use auto marker masking

The auto marker masking can be used to remove unwanted static reflections
automatically. Follow this procedure to apply the auto marker masking.

1. Set up the camera system so that it covers the correct volume.

2. Check if you have any unwanted markers in the cameras. Before using
auto marker masking try to remove the physical cause of the marker. Also
try changing the Exposure and Marker threshold settings to remove the
markers.
3. If the unwanted markers cannot be removed, make sure that you are in
preview mode and then open a 2D view window.
4. Select the cameras where you want to add marker masking with the cam-
era buttons. You can select all of the cameras or just some of the cam-
eras.
5. Click the Auto-create button on the Camera settings sidebar to open
the Create marker masks dialog.

6. Click Start to get the position of the unwanted markers. The auto mask
function will identify the largest reflections and mask them. If there are
reflections close to each other, the auto mask function will try to join the
masks.



a. If there already are masks on the camera you will have to choose
whether to remove the old masks or not. Answer Yes if you want to
remove them and add the new masks. Answer No if you want to
keep the old masks and append the new ones to the list.
b. There can be up to 20 masks per camera for Arqus and Miqus cam-
eras and 5 masks per Oqus camera. If there are more unwanted
reflections you will need to manually edit the masks.

NOTE: Marker masks are not drawn linearized. To see the true pos-
itions of the masks, you need to turn off the Show linearized data
option on the 2D view settings page, especially for wide-angle
lenses.

7. To see that the masks really cover a marker you can deselect the Enable
marker masks checkbox. The masks will then be inactivated and the
markers below will appear.
Active filtering for capturing outdoors

The active filtering mode is most useful in daylight conditions, because it filters
out background light. This is done by capturing extra images without the IR
flash; these images are then used to remove the background light from the
images used by the camera. Active filtering is available on all Qualisys camera
types, except the Oqus 3- and 5 series. The Miqus Video and Miqus Hybrid cam-
eras also support active filtering during calibration and in marker mode.
The standard and recommended setting is the Continuous mode, in which an
extra image without IR flash is captured before each actual image. Therefore the
maximum frequency is reduced by about 50% compared to the normal max-
imum frequency.

NOTE: Active filtering does not help if you have problem with extra reflec-
tions from the IR strobe of the cameras. In that case you must try to
remove the reflective material or use the delayed exposure setting, see
chapter "Delayed exposure to reduce reflections from other cameras" on
page 534.



How to use active filtering

Follow these instructions to set up active filtering for outdoor measurements.

1. Set up the camera system and focus the cameras.

2. Turn off the Active filtering option in the Camera settings sidebar and
look at the marker intensity mode.
l Use a fairly low Exposure time, around 200 µs. Then change the
exposure time so that the markers are just at the maximum end of
the red color.
3. Turn on the Active filtering option and check that the markers are clearly
distinguishable in preview. It is best to do this check on the subject that
you are going to measure and not on static markers.
l If the markers are not sufficiently distinct from the background, try
to increase the Marker threshold setting until it is better.
4. Continue to calibrate the system as usual.

On-camera marker discrimination


The marker settings of the cameras include settings that can be used to filter
the markers on the camera. The available methods are:
l Marker limits: for the marker limit settings, see chapter "Marker limits"
on page 235.
l Marker masking, see chapter "Marker masking" on page 536.

Discarded markers are not sent to QTM, which means that the on-camera
marker discrimination is irreversible, and cannot be undone by reprocessing a
file. The advantage of on-camera marker discrimination is that it can unload
QTM from processing obsolete markers, which can lead to increased pro-
cessing speed. This can be important, especially for real-time applications.
If you want to keep all data, filtering can also be applied post-hoc as a pro-
cessing step by preprocessing the 2D data in QTM. The alternative methods
are:



l Marker size filtering, see chapter "How to use marker size filter" on
page 613.
l Software marker masks, see chapter "How to use software marker
masks" on page 611.
On Oqus cameras, marker circularity filtering is also available. This type of fil-
tering is used in combination with 2D preprocessing of the data, which allows
you to choose how to use the information provided by the filter. For more inform-
ation, see chapter "Marker circularity filtering (Oqus only)" below.
Marker circularity filtering (Oqus only)

The non-circularity marker settings can be used to correct or delete bad or par-
tially hidden markers. The quality of 3D data and how to use the filter depends
a lot on the number of cameras in the system. If you have three or more cam-
eras covering the same markers, it is usually better to remove the non-cir-
cular markers, because then you have enough data to create the 3D data
anyway. However, if there are usually just two cameras viewing the same markers,
the data can become more complete by correcting the non-circular markers.
This will result in more 3D data, which otherwise cannot be calculated.

NOTE: Marker circularity filtering is a legacy feature, which is only sup-


ported by Oqus cameras.

How to use non-circular marker settings

The settings for the Oqus camera are activated on the Cameras page in the Pro-
ject options dialog, see chapter "Marker circularity filtering (Oqus)" on
page 234. The markers used by the filter are filtered out by the camera depend-
ing on the Circularity level setting.
How to handle the non-circular markers is then set on the 2D Preprocessing
and filtering page by the two options: Correct center point of non-circular
markers and Discard non-circular markers, see chapter "How to use cir-
cularity filter (Oqus only)" on page 610.



NOTE: If you change this option when processing a file, then you must
also retrack the 3D data for the change to have any effect.

When you turn on the filtering the markers will be color coded in the 2D view
window according to this list:
White - Circular markers
These markers are below the Circularity level and are therefore not
handled by the filter.

Red outlines
These are the outlines created from the segments sent from the camera.
If there is no marker inside the outline, then it has been discarded. It can
only happen with the Discard non-circular markers option.

Light Yellow - non-circular markers used by the 3D tracker


These markers are detected as non-circular by the camera, but have not
been corrected in the pre-processing since there was no circular fit that
was good enough. They can only appear with the Correct center point of
non-circular markers option.

Green - corrected markers


These markers have been detected as non-circular and then corrected by
the filter. The pre-processing tries to fit a circle to the best arc of the red
outline.

When to use the marker filtering and its effectiveness depend a lot on the cam-
era setup. Check these points to figure out if it will be useful in your setup:
Marker size
One thing that must be considered is the 2D size of the markers. To be
used by the filter the marker has to be larger than 320 in marker size.
Therefore all markers below this size are considered as circular markers.
Markers that are too large (>2000) are also hard for the camera to process
in time to determine whether the marker is circular or not.

Number of markers
The Qualisys camera needs to store the segments in a temporary memory
when trying to find the markers that are non-circular. This memory can be
full if there are too many markers that are on the same horizontal level in the 2D view and then the camera will stop looking for non-circular
markers. There is no certain number of markers that always works, so you need
to try it in your setup, but usually less than 10 markers can be processed
in the camera. In the case that the camera cannot process all markers,
there is a red warning in the upper left corner of the 2D view and all the
markers that the camera has not been able to process are considered to
be OK.

Camera capture rate


A faster camera capture rate decreases the available time to process the
markers in the camera, so that not all markers can be processed. In that
case there is a red warning in the upper left corner of the 2D view and all
the markers that the camera has not been able to process are considered
to be OK.

Calibration of the camera system

Introduction to calibration
The QTM software must have information about the orientation and position of
each camera in order to track the markers and calculate 3D data from the 2D
data. The calibration is done by a well-defined measurement procedure in
QTM. Calibration is, however, not needed for a 2D recording with only one cam-
era.
There are two methods that can be used to calibrate a camera system:

Wand calibration
The most common type of calibration, see chapter "Wand calibration
method" on page 547.

Fixed camera calibration


A special type of calibration mostly for large volumes with cameras moun-
ted in fixed positions, see chapter "Fixed camera calibration method" on
page 557.

The following items are important to think about before the calibration:



l Make sure that the capture area is free from unwanted reflections.
l The cameras should be set up correctly, see chapter "Optimizing the
camera settings" on page 479.
l Follow the procedures for removing unwanted reflections, see
chapter "Removing unwanted reflections" on page 534.
l Before the calibration make sure that the calibration settings are correct
on the Calibration page in the Project options dialog, see chapter "Cal-
ibration" on page 253.
l When using Wand calibration it is important that the camera system has
been placed correctly to achieve a high-quality calibration, see chapter
"Camera positioning" on page 439.
l For Fixed camera calibration the cameras should be mounted in
their final, known positions before the calibration.
It is recommended that the motion capture system is calibrated regularly, for
example before each measurement session, to make sure that the captured
data has high quality.
l Regularly check that the calibration is OK during long measurement ses-
sions.
l There is an automatic check if the cameras have moved, which is activated
on the Calibration quality page in the Project options dialog, see also
chapter "Calibration quality warning" on page 561.
A calibration starts with the Calibration dialog, see chapter "Calibration dialog"
on the next page. Click on the Calibration icon to open the dialog.
Each calibration is saved in a QCA file in the Calibrations folder in the project,
see chapter "Project folder" on page 61. The file name contains the date and
time of the calibration.

IMPORTANT: Whenever a camera in the system has moved, even the


slightest, a new calibration must be done.



NOTE: It is possible to apply a new calibration to an earlier capture by
reprocessing the file, see chapter "Reprocessing a file" on page 601. This
can be useful if you discover that a new calibration was needed after fin-
ishing a capture.

Calibration dialog

A calibration process is started with the Calibration dialog. It is opened by click-


ing either Calibrate on the Capture menu or the Calibration icon .
The frequency in the calibration is fixed at a value of 100 Hz, independent of
the used capture rate. This frequency is set to ensure a good number of frames
in the calibration and to give a good visualization when the calibration is
tracked. The frequency will be lowered if the camera system cannot measure at
100 Hz with the current settings.
The dialog contains the following information and settings:

Calibration time
Set the duration of the calibration. Make sure that the calibration time is
sufficient to cover the whole volume that needs to be calibrated without
rushing.



NOTE: For longer calibration times, it is recommended to increase
the Maximum number of frames used as calibration input para-
meter in the calibration settings.

For Fixed camera calibration the length of the calibration is not especially
important, since the data of the frames is averaged. Therefore, the min-
imum calibration time of 10 seconds can be used.

Use calibration delay


Check this option if you want to have time between pressing the button
and starting the calibration. You can specify the delay in seconds.

Use sound notification


Check this option if you want to get a sound notification from the com-
puter when the calibration is started and stopped.

Current calibration settings


The text box shows information about the current calibration settings.

Linearization parameters
The linearization parameters tab shows the status of the linearization.
The linearization files must be specified, otherwise you cannot start the
calibration. Click Load if the files have not been specified.

The dialog contains the following buttons:

Options
Open the Calibration settings to modify the settings in case they are not
correct.

Cancel
Quit the Calibration dialog.

OK
Start the calibration.



Wand calibration method
The Wand calibration method uses a calibration kit that consists of two parts:
an L-shaped reference structure (L-frame) and a calibration wand. For a descrip-
tion of the available calibration kits, see chapter "Qualisys calibration kits" on
page 995.
Place the L-frame so that the desired coordinate system of the motion capture
is obtained. It is best if all cameras in the system can see all markers on the L-
frame. If some cameras cannot see the reference structure, the “Extended cal-
ibration method” is used automatically, see chapter "Extended calibration" on
page 549.
The calibration wand is moved inside the measurement volume in all three dir-
ections. This is to assure that all axes are properly scaled. The calibration
algorithms will extract each camera's position and orientation by evaluating the
camera's view of the wand during the calibration. For more information on how
to move the wand, see chapter "Calibration tips" on the next page and "How to
move the wand" on the next page.
It is possible to refine the calibration without the L-frame. This method can be
used in situations where it is hard to place the L-frame. For more information
see chapter "Refine calibration" on page 550.
The Advanced calibration is a wand calibration method that includes optim-
ization of the linearization files of the cameras. For more information, see
chapter "Advanced calibration (beta)" on page 552.
Outline of how to calibrate (Wand calibration)

The steps below are just an outline of what should be done to calibrate the cam-
era system with the Wand calibration method.
Follow these steps to calibrate the camera system:

1. Switch on the camera system and start QTM.

2. Open a new file by clicking the New file icon .

3. Place the L-frame in the measurement volume.

4. Set the settings on the Calibration page in the Project options dialog,
see chapter "Calibration" on page 253.
5. Click OK.



6. Click the Calibration icon and set the settings in the Calibration dialog,
see chapter "Calibration dialog" on page 545.
7. Click OK.

8. Move the calibration wand in the measurement volume.

9. Check the Calibration results and click OK.

NOTE: If you have force plates there will be a warning reminding you to
measure the force plate position again, since it has most probably
changed with the new calibration.

NOTE: If any problems with the calibration process occur, check the set-
tings on the Calibration page in the Project options. If that does not
help, check the troubleshooting list in chapter "Troubleshooting cal-
ibration" on page 1024.

Calibration tips

During the calibration, the wand is preferably moved in the measurement


volume in a way that allows all cameras to see the wand in as many ori-
entations as possible. This way the cameras will be properly calibrated in all dir-
ections.
To assure that all cameras can see the wand as much as possible during the cal-
ibration, make sure that the cameras are not blocked by the person moving the
wand around in the measurement volume. This ensures that no camera will be
blocked for a longer time period.
How to move the wand

One suggestion on how to move the wand, is to move it in one direction at a


time. Start by holding the wand positioned in the Z direction, i.e. the straight
line between the two wand markers should be parallel to the Z axis. Move the
wand in the entire measurement volume. It is important to fill the entire meas-
urement volume with calibration points. Make sure that both the lower and upper parts of the volume are covered. Repeat the same procedure with the
wand positioned in the X and Y direction. It is particularly important to collect
points where there will be many markers during the motion capture.

In the pictures above the reference structure is not indicated to make the pic-
ture more distinct. The reference structure must of course always be present
during the calibration. The box in the figure represents the measurement
volume.
It is not necessary to hold the wand in distinct directions as described above,
but the wand can be moved more freely as well. The most important thing is that the
measurement volume is well covered and that the wand orientation is varied.
The best moving method may vary depending on the application. It is recom-
mended that the moving method is systematic and easy to repeat for con-
sistent results.

IMPORTANT: Keep the movements of the wand smooth and controlled.


Do NOT spin the wand at a high speed.

Extended calibration

To enable a larger measurement volume QTM uses a method called “Extended


calibration”. The Extended calibration is used automatically as soon as a cam-
era cannot see the L-frame, but has an overlap with other cameras.
In Extended calibration only some of the cameras can see the L-
frame. The other cameras are calibrated by the overlap with the reference cam-
eras' field of view. The result is a larger volume but with reduced 3D data accur-
acy compared with when all cameras can see the L-frame.
To perform an Extended calibration, make sure that at least two cameras can
see all markers of the L-frame. Then to extend the measurement volume, the
following three conditions must be fulfilled:



1. All parts of the measurement volume must be seen by at least two of the
cameras, since it is impossible to calculate 3D coordinates in parts of the
volume where only one camera can see the markers.
2. The whole measurement volume covered by the cameras must be con-
nected. This means that a marker must be able to move through the
whole volume without passing any volume that less than two cameras
cover.
3. In volumes where only two cameras overlap, the size of the volume must
be large enough to fit the calibration wand, i.e. the diameter must be lar-
ger than the wand length.
When performing an Extended calibration, it is very important
to consider how the calibration wand is moved.
To achieve the highest accuracy, and indeed to make a successful calibration at
all, it is vital that the calibration wand is moved considerably in the volume of
the cameras that can see the reference object. It is even more important to
move the wand in the volumes of overlap between different cameras. To
ensure that you have time to do this increase the calibration time and also
increase the Maximum number of frames used as calibration input setting
on the Calibration page in the Project options dialog.
For example, if two cameras can see the reference object and three are posi-
tioned to extend the volume, it is very important to move the wand in the
volumes of overlap between the two base volume cameras and the three exten-
ded volume cameras. The points collected in these volumes are the points used
to calibrate the volumes relative to each other. If the wand is not moved
enough in the volume of overlap, the extended volume of the three cameras
will be very poorly calibrated relative to the base volume.
In addition to these volumes of higher importance, it is still important that all
cameras can see the wand in as many orientations and positions as possible,
just as in a normal calibration.
Refine calibration

Refine calibration is meant to be used in situations where it is hard to place the


L-frame. The calibration is then done with only a wand; this however means
that the coordinate system rotation and position can change since there is no ref-
erence for the coordinate system. If the position of the coordinate system is
critical then it is highly recommended that you either have other markers that can define the coordinate system after refine calibration or that you use the cal-
ibration with the L-frame. There is no automated way to use other markers to
define the coordinate system so you need to calculate the rotation and trans-
lation in another program and then use the Transformation page to update
the current calibration.
It is required that the wand can be tracked in the old calibration to be able to
refine the calibration. This is because the old wand positions are used to trans-
late and rotate the coordinate system to the best position. This means that if all
cameras have moved a lot then refine is not possible to use. However, just one
camera or a few cameras can be moved to a completely new position as long
as the wand can be tracked and refine will still work.
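If the position of the coordinate system is critical and you use reference markers instead of the L-frame, the rotation and translation mentioned above can be computed in an external program. A minimal sketch (assuming Python with numpy, and that the known reference positions and the measured positions of at least three non-collinear markers are available; this is a generic rigid-body fit, not a QTM function) could look like this:

# Minimal sketch: least-squares rigid transformation (Kabsch algorithm) that maps
# measured marker positions onto their known reference positions.
import numpy as np

def rigid_transform(measured, reference):
    """Return R and t such that reference ~= measured @ R.T + t."""
    P = np.asarray(measured, dtype=float)    # N x 3, positions after refine calibration
    Q = np.asarray(reference, dtype=float)   # N x 3, known positions in the desired system
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Hypothetical reference markers, coordinates in mm:
measured  = [[10.2, 0.1, 0.0], [510.4, 1.0, 0.3], [9.8, 300.2, 0.1]]
reference = [[0.0, 0.0, 0.0], [500.0, 0.0, 0.0], [0.0, 300.0, 0.0]]
R, t = rigid_transform(measured, reference)
print(R, t)

The resulting translation and rotation can then be entered on the Transformation page; the exact form in which the rotation is entered depends on the settings available there.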
To use refine calibration start preview and then select Refine calibration on
the Capture menu. Refine calibration uses the same settings as a standard cal-
ibration, but in the result you get some extra information.

Extra results for refine calibration:


3D point refit residual
The refit residual is a measure of how well the new positions of the wand
match the wand positions using the previous calibration. It is
impossible to give exact values of when the residual is low enough, since
it depends on the volume that is calibrated. However if the Refit residual
is higher than 1 mm, then there is a warning in the results because it is
likely that the position or rotation has changed.

Number of consecutive calibrations without reference


The number of calibrations that have been done without an L-frame.
The changes in positions of the coordinate system will increase with each new refine calibration. It is not recommended to do more than 5 without
an L-frame, unless you have other markers that are used to define the
coordinate system in reprocessing.

Start guess used


This displays whether the old calibration position was used as the start
guess. For example it will be No if you have moved a camera.
Advanced calibration (beta)

The advanced calibration can be used to simultaneously optimize both the cam-
era linearizations (intrinsic calibration) and the capture volume calibration
(extrinsic calibration). In many cases, this will lead to decreased 3D residuals of
the measured trajectories and improved 3D tracking.
The advanced calibration can be especially beneficial in the following cases:
l Systems with wide-angle lenses

l Large capture volumes

IMPORTANT: The advanced calibration is currently available in QTM as a


beta feature. If you want to use the advanced calibration, it is important
to evaluate the calibration results compared to a standard calibration.

Requirements for the advanced calibration

For the advanced calibration, it is important that all cameras are sufficiently
covered by the movements of the calibration wand. Mostly, this can be
achieved by optimizing the camera setup for the used capture volume accord-
ing to the following guidelines.
l Make sure that the cameras are pointed in a way so that the central part
of the sensor is used.
l Make sure that for each camera a significant part of the sensor is covered
by the movements of the wand. The recommended sensor coverage is at
least 75%.
l Make sure that there is sufficient depth coverage for all the cameras.



How to perform an advanced calibration

The advanced calibration uses the same calibration settings as the standard
wand calibration, and is performed in the same way.
Follow these steps to perform an advanced calibration:

1. Make sure that the calibration options are correct under Project Options
> Input Devices > Camera System > Calibration.

IMPORTANT: It is important to use a sufficiently large number of


calibration points. The exact number depends on the duration of the
calibration. The recommended number of points is 3000 or more.

2. Place the L-frame in the desired location

3. Start a preview.

4. Open the Advanced calibration dialog via the Capture menu: Capture >
Advanced calibration (beta)....
5. Set the duration of the calibration and press OK to start.

6. During the calibration, move the wand through the whole capture volume,
varying the orientation of the wand (see chapter "How to move the wand"
on page 548).
7. When done, inspect the calibration results.

If the calibration passes, it will be automatically used as the current cal-


ibration of the project, and the linearization files on the Linearization page in
the project will be replaced by the new ones resulting from the advanced cal-
ibration.

Evaluation of the calibration quality

Before using the Advanced calibration, it is recommended to perform one or


more standard calibrations first, which can be used as a baseline when com-
paring the calibration results. The calibration results appear directly after per-
forming a calibration, or can be viewed for the currently loaded calibration on
the Current Calibration page in the Project Options. When comparing the res-
ults, pay special attention to the following values:



l Average residual per camera (Av Res (mm)).
The average residual per camera should be as low as possible, and the values
should be as equal as possible between cameras.
l Points per camera (Points).
The number of points used for the calculation per camera should be as
high as possible, and should be as equal as possible between cameras.
l Standard deviation of the wand length (in mm).
The standard deviation of the wand should be as low as possible.
The exact values will depend on several factors, so it is not possible to give any
absolute figures. If the results of the advanced calibration are significantly bet-
ter than those of the standard calibration using factory linearizations, it should
be safe to use it.
A good way to compare the advanced linearization with a standard calibration
is by first doing a standard calibration based on factory linearizations, and then
recalibrating the same file using the advanced calibration method. The com-
parison will then be based on exactly the same wand movements. To recal-
ibrate a calibration with the advanced method:

1. Open the original calibration file from the Calibrations folder in the pro-
ject.
2. In the Capture menu, click on the Advanced calibration (beta)... button
to open the recalibration dialog.
3. Click OK to start the recalibration.

4. When done, inspect the calibration results in the results dialog.

5. It is recommended to save the file with a different name, for example by


adding a suffix _advanced to the file name. This way it will be easier to
revisit the calibration results for comparison, or to restore the standard
calibration and factory linearizations if needed.
For more information about calibration results, see chapter "Calibration res-
ults" on page 558.



NOTE: The linearizations resulting from the advanced calibration are spe-
cific for the used camera setup. Using these linearizations in a different
setup may degrade the tracking performance.

Restoring to factory linearizations in a project

The advanced calibration always takes the factory linearizations that are stored
on the cameras as a starting point for the optimization process. After suc-
cessful advanced calibration, the linearization files on the Linearization page
in the project will be replaced with the new files, which are stored in the lin-
earization files folder (for the folder location, see "Folder options" on page 427).
The advanced calibration will not replace the factory linearizations that are
stored on the cameras.
Once the new linearization files are loaded in the project, they will be used for
subsequent standard calibrations. If you want to perform a standard cal-
ibration based on the factory linearizations, you will need to restore the lin-
earization files in the project. The easiest way to achieve this is when the
camera system is available.

1. Make sure the camera system is connected.

2. Navigate to the Linearization page under Project Options > Input


Devices > Camera System > Cameras.
3. Press the Load from Cameras button.

If the cameras are not connected or available, the factory linearization files can
be restored by loading a standard calibration that was based on factory lin-
earizations, or by loading them from a folder on the computer in which they
are stored. In case the cameras have been connected to the computer, the fact-
ory linearization files will be stored in the linearization files folder.

Use scenarios

Starting to use the advanced calibration

Before using the advanced calibration, it is important to evaluate the quality of


the calibration compared to the standard calibration. It is recommended to perform one or more standard calibrations as a baseline for evaluation of the
advanced linearization. For detailed guidelines, see chapter "Evaluation of the
calibration quality" on page 553.

Mixing advanced and standard calibrations

After using an advanced calibration, the linearization files on the Linearization


page in the project will be replaced by the optimized linearization files. These
will then also be used for following standard and refine calibrations. This brings
the following advantages.
l The following standard and refine calibrations will benefit from the optim-
ized linearizations with similar decreased residuals as the advanced cal-
ibration, while
l the standard calibration or refine calibration does not require the same
level of coverage as an advanced calibration and can be performed more
quickly.
When mixing the advanced and standard calibration, it is important to regularly
perform a new advanced calibration to refresh the optimized linearization files
for the best results.

Using projects with different camera setups

When using the same cameras in different camera setups, it is recommended


to use separate projects. Calibrations performed within one project will not
transfer to another when they contain different linearization files. As a result
advanced calibrations will not be mixed between projects.
Translating origin to the floor

For the new wand kits (carbon fiber 300 and 600, 120, active 500) the origin is
automatically translated to the corner point of the L-frame at floor level.
On custom or legacy calibration kits, the origin is by default placed in the center
of the corner marker on the L-frame. If you want to move the origin to another
position you can use the settings on the Transformation page, see chapter
"Transformation" on page 259. Use the following distances (in mm) if you want
to translate the origin to the bottom corner of the L-frame. The distances are
for the Z-axis pointing upwards, the X-axis pointing along the long arm and Y-
axis pointing along the short arm.



Wand 750
x: -13.5, y: -13.5, z: -24.5

Wand 300
x: -8.5, y: -8.5, z: -20.5

Wand 110
x: -10.0, y: -10.0, z: -7.5

NOTE: For the 300 and 750 reference, the bottom corner of the L-shape
is 1 mm outside of the U-profile. The figures above are from the corner of
the bottom plate; if you want the U-profile you should subtract 1 mm
from x and y.

Fixed camera calibration method


The Fixed camera calibration method uses fixed locations of the cameras and
the reference markers to calibrate a camera system. It can be used for systems
covering large areas, where other methods are impractical. With this method a
much larger measurement volume can be used, since a single camera does not need to see
the whole measurement volume. The disadvantage is, however, that the camera
system must be fixed to its location.
Before the Fixed camera calibration is started the exact positions of the camera
system and reference markers must be entered on the Calibration page in the
Project options dialog, see chapter "Fixed camera calibration" on page 257.

NOTE: For more information about how to install and use a fixed camera
system, contact Qualisys AB about the QTM - Marine manual. It includes
detailed descriptions of the camera installation, survey measurement,
fixed camera calibration, validation and the use of 6DOF bodies in marine
applications.



Calibration results
The Calibration results dialog is shown after a calibration is completed. It dis-
plays whether the calibration passed and the calibration quality results. The error mes-
sages of a failed calibration are described in the chapter "Calibration failed" on
page 560.

At the top you get a message if the calibration passed or not. There is also a
warning if the cameras are using Exposure delay.
There are four buttons at the bottom of the dialog; clicking them has the fol-
lowing effects.
OK
Close the dialog and the calibration file.

New measurement
Close the dialog and the calibration file and open a new capture file in pre-
view mode.

View Calibration
Track the calibration file and close the dialog.

Export
Export the calibration results to an XML file. The exported file also includes
the rotation matrix for the cameras. The default folder to save in is the
Calibrations folder in the project folder, see chapter "Project folder" on
page 61.



The current calibration can be viewed on the Current calibration page in the
Project options dialog. The page can be opened by double-clicking the cal-
ibration status icon in the bottom right corner of the Status bar.

NOTE: In a capture file the calibration results can be examined by click-


ing the Reprocessing icon and then opening the Calibration page.

Quality results

The quality results under the Camera results heading are camera specific. For
each camera ID there are the following five results:
X (mm), Y (mm) and Z (mm)
The distance (in mm) from the origin of the coordinate system of the
motion capture to the optical center of the camera. The distances are
respectively in the X, Y and Z direction.

NOTE: For wand calibrations the default origin of the coordinate


system is in the center of the corner marker, for information on how
to translate the origin see chapter "Translating origin to the floor"
on page 556.

Points
Number of points used in the calculation of the distance above. The num-
ber should be as high as possible, but without large differences between
the cameras. The maximum number of points for a Wand calibration
depends on the calibration time and the number of frames used in the cal-
ibration. If the camera has more than 500 points it is usually enough for a
normal measurement volume. For the other methods it depends on the
number of markers seen by the camera.

Avg. res. (mm)


The average residual (in mm) for the points above. The residual of the
cameras should be similar in size and as low as possible. Depending on
the measurement volume the average residual can vary between 0.5 and
1.5 mm.



NOTE: If the camera result says Unused camera, then the camera has
been deactivated on the Linearization page in the Project options dia-
log. That camera cannot be used in measurements, unless the calibration
is reprocessed, see chapter "Recalibration" on page 563.

For a Wand calibration there is also a general quality result:


Standard deviation of wand length
The standard deviation (in mm) of the wand length in the calibration.

Finally, the time when the calibration was performed is displayed at
the end.
View Calibration

With View Calibration the calibration is tracked and opened in a 3D view win-
dow. For a Wand calibration the movements of the wand are shown and the
measurement volume can be confirmed. For the other two methods the pos-
itions of the markers can be confirmed in the 3D view.

NOTE: If the calibration is opened, the window must be closed before a


new capture can be started.

Calibration failed

If the calibration fails the calibration result will say Calibration failed and an
error message is displayed after each camera ID.
The error messages are as follows:
General calibration failure
Something is wrong in the calibration. Check the calibration settings.

Wand calibration errors:


Couldn't find the fourth marker of the L-frame
The marker on the short leg of the L could not be found. Check the ref-
erence structure and that all four markers are visible in the preview win-
dow of the camera.



Couldn't find the three markers in line on the L-frame
The three markers in line on the long leg of the L could not be found.
Check the reference structure and that all four markers are visible in the
preview window of the camera.

One or more of the markers on the L-frame were unstable


The reference structure or the camera has moved during the calibration.
Make a new calibration.

Could not initialise camera


There is some general failure so that the camera cannot calculate where
the L-frame is. One possible reason is that it is the wrong linearization file.

Frame and Fixed camera calibration errors:


Less than 75% of the frames had the correct no of markers in stable
positions
Some of the markers have been hidden or moved during the calibration.
Make a new calibration.

Wrong number of markers seen by the camera in the first frame


The definition of markers seen by each camera does not match the mark-
ers in the first frame. Check that the calibration settings on the Cal-
ibration page in the Project options dialog are correct.

No markers defined to be seen by this camera


There are no markers specified in the calibration settings for this camera.
Check which markers are seen and enter them on the Calibration
page in the Project options dialog.

Calibration quality warning


The calibration quality is shown visually with the calibration status icon in the
right corner of the status bar. QTM checks both the residual of the last file and
time since the last calibration. The checks are activated on the Calibration
quality page in the Project options dialog.
The default value for the residual check is 3 mm, and it also checks if too few 2D
markers of a camera are used in the 3D tracker. For the time check the default values are 5
hours and 12 hours, so that you are reminded to calibrate the system each day.



The calibration is new and ok.

The calibration is older than the first time limit or you have had a warn-
ing from the residual check after the last measurement.

The calibration is obsolete, i.e. it has passed the second time limit. It is
recommended to calibrate again, however it can still be used.

The system is not calibrated.

If the residual check is exceeded when a file is tracked after a measurement, there
is a warning describing the error. For example, if one of the cameras exceeds the
limit you will get the warning below.

For residuals that are just slightly over the limit the data will still be ok, espe-
cially if you have more than 2 cameras that can see each marker. However it is
recommended that you check the setup to see if it can be improved and then
calibrate the system.
l First check that the 2D size of the markers is not much smaller
than in the other cameras. This can cause a higher residual than normal.
l Then you should check if the camera might have moved. For example if a
screw is not tight enough in the mounting the camera might move slightly
by its own weight causing a residual that slowly increases. This can be con-
trolled by making a measurement of just static markers 1-2 hours after
the calibration. Then if you get the error message it is very likely that the
camera has moved slightly.
If the warning instead is about the camera having too few contributions, then
you get the warning below. In this case the data is not used in the 3D tracking
so the 3D data you have has not been degraded. However it is recommended
to check the 2D data to see what the reason is and then calibrate the system.
l Turn on the 3D overlay and check if the 2D markers match the 3D mark-
ers. It could be that the camera has moved so that it no longer con-
tributes to the 3D data.



l Then check if there are a lot of extra markers. The warning can also be
caused by extra markers that do not result in 3D data.

Recalibration
An existing calibration can be recalibrated to improve it or solve problems, for
example if a calibration failed. The following problems can be corrected:
l Wrong calibration kit chosen

l Wrong coordinate system orientation

l Too low number of calibration frames

l Incorrect positions of fixed cameras and reference markers

l Incorrect linearization files

l Incorrect calibration type

l Deactivated cameras

Furthermore, it is possible to apply 2D processing to the calibration file before


recalibration, for example excluding too small markers, or masking problematic
areas with software masks, see chapter "Processing 2D data" on page 609. This
may for example help to fix a failed calibration.
Follow these steps to recalibrate an existing calibration:

1. Open the calibration file of the wanted capture file. The calibration file is
in the Calibrations folder in the project folder, see chapter "Project
folder" on page 61.



NOTE: The name of the calibration file of the current capture file
is found on the Calibration page in the File reprocessing dialog.
To open the dialog, click the Reprocessing icon when the wanted
capture file is opened.

2. Click the Calibration icon to open the Recalibration settings dialog.

3. In the dialog the Calibration and Linearization settings can be changed,


see chapter "Calibration" on page 253 and chapter "Linearization" on
page 249.
l If a camera is deactivated on the Linearization page, that camera
will not be used when a file is reprocessed with the calibration file.
Activate the camera if you want it to be used in the measurements.
4. Click OK to start recalibration.

5. The Calibration results dialog is shown. Click Use if you want to use the
reprocessed calibration as the current calibration. OK will only close the
Calibration results dialog.



NOTE: It is possible to Use a calibration that does not match the
current camera system, because it makes it easier to reprocess
files.

6. Save the calibration file and close the file.

IMPORTANT: All the capture files that have used this calibration must
then be reprocessed, see chapter "Reprocessing a file" on page 601.

Merge calibration files


Calibration files can be merged in cases where the whole measurement volume cannot
be calibrated at once. The calibration files must include the same cam-
eras, i.e. cameras that are unused for a volume must still be connected during
the calibration of that volume. Calibrations can for example be merged in the fol-
lowing situations.
l A fixed camera system when not all of the cameras can be calibrated at
once.
l Separated volumes that don't overlap, for example measuring in two dif-
ferent rooms.
Follow these steps to calibrate and then merge calibration files.

1. Disable the cameras that can't be used for a certain volume on the Lin-
earization page in Project options.
2. Calibrate with the remaining cameras in the system. Remember the name
of the calibration file.
3. Enable the other cameras on the Linearization page and disable the first set of cameras. It is possible to merge any number of calibration files, so the system can be divided into as many parts as needed.
4. Calibrate the system again and remember the name of the calibration
files if there is more than one.

l For separated volumes it is necessary to translate the coordinate sys-
tem. It is recommended that the L-frames are placed in the same dir-
ection to make it easier to measure the distance. Measure the
distance between the L-frames and enter the distance in x, y and z
on the Transformation page in Project options, see chapter "Trans-
formation" on page 259.

NOTE: For systems above and under water it is recommended to use the Twin system feature and use the Twin system calibration to define the relations between the coordinate systems.

5. When all calibrations are finished, open the Current calibration page in
Project options and click on Load other.
6. Select all of the calibration files made in the previous steps and click on
Open.
7. The calibrations are merged and the result of the merge is displayed in
the Calibration results. The merged calibration is saved as a QCA file but
it only includes the calibration results and not the individual 2D data. If
any reprocessing is needed for the calibration use the original files and
repeat the merge.

Capturing data

Introduction to capture
To capture data you need to have a project in QTM. It is recommended to make
a new project if you for example change the marker setups or if you want to
work with specific camera settings. For more information about projects see
chapter "Projects" on page 60.
Usually before you start a measurement it is best to open a new empty capture
file with New on the File menu. The file will be opened in preview mode where
you can check measurement volume and settings before starting the capture.

The capture is then started by clicking either the Capture icon or Capture on the Capture menu. The settings needed to start the capture are set in the Start capture dialog, see chapter "Start capture" on page 569. However, if the system is calibrated and the settings are correct, the new file is not necessary and you can click the Capture icon to open the Start capture dialog directly. The cameras will start in the same mode as they were in the last preview or measurement. When capturing, there will be a red border around the view window.

NOTE: When starting a preview, QTM will detect automatically if the cam-
era system has old firmware. The firmware must then be updated before
the system can be used. For more information see chapter "Firmware
update when starting preview" on page 471.

Outline of how to capture

The steps below provide an outline of how to perform a capture.

1. Switch on the camera system and start QTM.

2. Select the project you want to use or create a new project.

3. Open a new file by clicking the New file icon . The cameras will start in
the same mode as last preview or measurement.

NOTE: If the system has been calibrated and is ready for a meas-
urement you do not have to open a new file. Click the Capture icon
and go directly to step 6.

4. Check that the markers are visible, otherwise change the camera settings, see chapter "Tips on marker settings in QTM" on page 483.
5. Calibrate the system, see chapter "Calibration of the camera system" on page 543.
6. Go to the Processing page in the Project options dialog to activate any processing steps you want to apply directly after a capture.

l Activate the Auto backup option if you want to create a temporary backup of the 2D data before the other processing steps are performed, see chapter "Auto backup" on page 572.
l Optionally, activate the Store real-time data option if you want to save time between captures, but still need to be able to quickly review the data in between captures, see chapter "Store real-time data" on page 572.
7. Click the Capture icon .

8. Specify the capture settings in the Start capture dialog.

9. Check that all of the settings under the Camera system settings heading
are correct.
10. Click Start to start the capture. When the capture is finished, all of the pro-
cessing steps will be performed and then the new motion capture file is
displayed in QTM, unless batch capture is activated. In batch capture,
QTM will immediately start waiting for the next capture as soon as the pro-
cessing of the previous capture is finished. An ongoing capture can always
be stopped with Stop capture on the Capture menu or by clicking the
Stop capture icon .

NOTE: If any problems occur during capture, check the settings in the
Project options dialog and in the Start capture dialog. If that does not
help, check the troubleshooting list in chapter "Troubleshooting capture"
on page 1025.

Start capture

The Start capture dialog appears before the start of every capture or batch
capture.
Click Start to start a capture or click Options to change the settings in the Pro-
ject options dialog.
Capture period

Under the Capture period heading, the capture period is specified in seconds
or in number of frames. Fractions of a second can be specified, which will be
rounded off to the nearest number of frames. If a Qualisys camera is in Video
mode the number of Video frames, the corresponding video capture rate and
the maximum number of video frames are also displayed.

When measuring video the measurement time will be limited by the maximum
number of video frames. The maximum number of video frames depends on
the video image size, the image resolution and the internal memory of the cam-
era.
With the Continuous capture option the capture does not stop until a trigger
stop event is received or the Stop capture button is pressed.

NOTE: Continuous capture is not available when using an analog board or EMG system. It is also not available for a Twin system.

Capture delay and notification

Under the Capture delay and notification heading there are options for delay-
ing the start of the capture (Use capture delay) and for sound notification on
start and stop of the capture (Use sound notification on start and stop).
When the Use capture delay option is used the delay is specified in seconds in
the text box next to the option.
Automatic capture control

Under the Automatic capture control heading there are options for auto-
matic saving of measurement files and whether to use batch capture.
Select the Save captured and processed measurement automatically option to automatically save the measurement file. Enter the folder in which the files will be saved in the Folder option. You can Browse for a folder or use the Reset to project folder option to reset the folder to the Data folder in the current project. Set the name of the files in the Name option; an automatic counter can also be appended to the filename.
When Batch capture is selected, QTM will make several consecutive measurements. When batch capturing, Save captured and processed measurement automatically and the automatic counter must be selected. When Wait between captures is checked, the user will be prompted to start each
new capture. If unchecked, the next capture will start as soon as QTM is ready
after a previous capture. For information on batch capture, see chapter "Batch
capture" on the next page.

Camera system settings

Under the Camera system settings heading the measurement settings are displayed. The settings can be changed by right-clicking on an entry and then clicking Change or Reset to default value. The Project options dialog can also be reached by clicking Options.

Batch capture
With batch capture, QTM will capture several consecutive measurements. Batch capture is activated with the Batch capture option in the Start capture dialog, before the start of the measurements. In this dialog, the options Save captured and processed measurement automatically and Add counter must also be selected so that each measurement is saved in a separate file.
Before each new measurement in a batch capture QTM will wait for a start sig-
nal and the whole batch capture is stopped by a stop signal. These signals are
given in different ways depending on whether external trigger is used or not.
During the measurement the border of the view window will indicate the
status.
External trigger
If the external trigger is used to start each measurement, the batch capture is stopped by pressing Esc or clicking Close on the File menu. Stop capture on the Capture menu can be used during an individual capture to stop just that capture.

No external trigger
Start each measurement by clicking Yes in the Next measurement dia-
log. Stop the batch capture by clicking No in the dialog. Stop capture on
the Capture menu can be used during an individual capture to stop just
that capture.

NOTE: All of the processing steps that are selected on the Processing
page in the Project options dialog will be performed before the next
measurement can start. The Store real-time data option can be used to
reduce the time between captures, but still save the data processed in
real-time for quick inspection of the data, see chapter "Store real-time
data" below.

Auto backup
The measurement can be saved automatically directly after the data is fetched
with the Auto backup option on the Processing page in the Project options
dialog. When activated a temporary file with the 2D data is saved before the
other processing steps. If the file is very large the auto backup may take several
minutes. Then if QTM crashes during the processing of the data the file can be
retrieved at the next startup of QTM.

Store real-time data


The processed real-time data is saved directly to the QTM file when the option
Store real-time data is enabled on the Processing page. This option can for
example be used to speed up the capture process, while still being able to
quickly review the data after a capture. In this mode only the export processing
steps are active and all processing steps that change data are disabled.
All of the raw data is stored in the file, so that the data can be reprocessed.

IMPORTANT: When using the Store real-time data option, it is recom-
mended to reprocess the files afterwards. The quality of the processed
data may improve when reprocessing the file, e.g. for AIM reprocessing
which can use all of the data in the file to identify trajectories. There may
also be missing frames in case real-time frames were skipped during the
recording.

NOTE: When the capture rate is too high for the real-time processing, the real-time frequency is reduced. When storing the real-time data, this means that 3D data is only processed for part of the frames.

The following applies to the different data types, depending on whether the Process every frame option is enabled or disabled:

2D (raw)
Enabled: All raw 2D data is included in the file.
Disabled: All raw 2D data is included in the file.

2D (processed)
Enabled: All of the 2D data is processed and used for 3D tracking.
Disabled: Some 2D data may not be processed or used for 3D tracking.

3D (unidentified)
Enabled: All of the 3D data is tracked using all 2D data.
Disabled: Some 3D data may not use all of the 2D data available in the file. The result can be 3D data with reduced quality.

3D (identified)
Enabled: AIM can only use old 3D data to identify the trajectory. When reprocessing the data, AIM can use all of the data in the file for identification.
Disabled: Same as for when the option is enabled, and the reduced 3D data quality may also affect AIM.

6DOF
Enabled: Same as for 3D (identified).
Disabled: Same as for 3D (identified).

Skeleton
Enabled: Same as for 3D (identified).
Disabled: Same as for 3D (identified).

Analog
Enabled: All of the analog data is included and offsets are applied to all of the analog samples.
Disabled: All of the analog data is included and offsets are applied to all of the analog samples.

Force
Enabled: Forces are calculated for all of the samples.
Disabled: Forces are calculated for all of the samples.

EMG
Enabled: All of the EMG samples are included in the file.
Disabled: All of the EMG samples are included in the file.

Eyetracker
Enabled: Reprocessing is recommended, because the delay between eyetracker and 6DOF data is not compensated for in real-time. The Gaze data is calculated for all of the samples.
Disabled: Reprocessing is recommended, because the delay between eyetracker and 6DOF data is not compensated for in real-time. Gaze data is calculated for all of the frames that include 6DOF data.

External video
Enabled: All video frames are included and the time offset is applied to the video.
Disabled: All video frames are included and the time offset is applied to the video.

Qualisys video capture


An important feature of Qualisys cameras is the video functionality. All Qualisys
cameras can be used in video mode. This can for example be useful when set-
ting up the camera system to see the image area of the individual cameras.
Video capture with Qualisys cameras offers the following advantages:

1. Synchronized video capture with marker data

2. The possibility to use 3D data overlay

Qualisys offers a range of camera models that can be used as dedicated video
cameras. Such cameras feature a clear front glass and a white strobe. Oqus high-speed cameras also have a larger memory buffer for storing video frames. Dedicated video cameras are the Miqus Video and Video+ series, and
Oqus 2c. Furthermore, several Oqus types are available in a high-speed video
configuration (monochrome image). For more information about the various
types of Qualisys video cameras, see chapter "Video cameras" on page 434.

The image resolution and the available range of video capture frequencies
depend on the camera model. Some camera models have high-speed sensor
modes, which allow video capture at full field-of-view at reduced resolution.
Oqus high-speed cameras can capture video at rates up to 10 kHz (uncom-
pressed, reduced FOV). Some camera models feature in-camera MJPEG com-
pression, allowing longer recordings at a reduced maximum capture rate. In
standard marker cameras, the video capture rate is limited to 30 Hz. For an
overview of features of Qualisys video cameras, see chapters "Qualisys video
sensor specifications (in-camera MJPEG)" on page 927 and "High-speed video"
on page 960.

Using and calibrating Qualisys video cameras


Video capture with Qualisys cameras involves the following steps:

1. Make sure that the cameras that will be used for recording video are in
video mode.
2. Choose the video settings (capture rate, exposure, compression, etc.). The
available video settings and the interface for changing them in QTM
depends on the camera model. The video capture rate can be set indi-
vidually per camera and may be different from the capture rate of the
cameras in marker mode. Some camera models have the option to use in-
camera MJPEG compression. This allows a longer capture, however, at a
reduced maximum capture rate, see chapter "Qualisys video sensor spe-
cifications (in-camera MJPEG)" on page 927.
3. Calibration of Qualisys video cameras is optional, but required for 3D
overlay. Qualisys video cameras can be calibrated together with the
marker cameras in the system. During the calibration, the Qualisys video
camera will be automatically switched to marker mode, so it is important
to make sure that the marker settings are correct. If you do not want to
include a video camera in the calibration, you can uncheck it in the lin-
earization options, see chapter "Linearization" on page 249.
4. Start a new capture. Depending on the camera model and settings, video
is streamed to QTM during the measurement or buffered on the camera
and fetched after the capture is finished.

NOTE: Depending on the capture frequency, streaming video cameras might store some frames during the capture. These frames are fetched after the capture is finished.

5. The video from each camera will be stored in a separate AVI file and can
be played back in QTM, see chapter "Qualisys video files" on page 586.
You can then for example activate the 3D overlay on the video, see
chapter "3D data overlay on video" on page 587, and export the view with
the Export to AVI feature on the File menu.
Calibrating Qualisys video cameras

Qualisys video cameras are calibrated in the same way as Qualisys marker cam-
eras. During the calibration, the video cameras will be automatically switched
to marker mode. Follow these steps to calibrate Qualisys video cameras.

1. First, you need to set the marker settings. Switch to Marker intensity
mode and adjust the exposure and threshold so that the markers are red,
and the background is dark blue.
2. Make sure that the cameras have enough overlap with other cameras
within the system. Preferably, the L-frame is visible, but this is not
required.
3. Start a calibration. The video cameras will be calibrated together with the
marker cameras in the system.

Capture streaming video


Streaming video can be recorded with Miqus Video/Video+/Hybrid cameras and
the Oqus 2c color video camera. When using streaming video, compressed
video data is sent to QTM during the capture, allowing for long video captures.
In-camera MJPEG compression

The functionality to stream MJPEG compressed video is available on most Qualisys camera models, except the older Oqus camera models. In this mode, the image will be compressed in the camera before it is sent to QTM. This will reduce the amount of data that is sent so that you can measure video for an unlimited amount of time.

The available capture rates are lower than those for uncompressed video since
the camera has to process the image. For example, on the Oqus 2c video cam-
era at 1920*1080 the video capture rate is limited to 24 Hz. There are multiple
high-speed sensor modes available on the 2c and 5+ series. These can be used
if you want to have a higher capture rate, but want to keep the Full FOV at a
reduced resolution, see chapter "Sensor mode" on page 246 and "Qualisys
video sensor specifications (in-camera MJPEG)" on page 927.
The in-camera MJPEG compression is selected with the Video compression
option, either on the Camera settings sidebar or on the Cameras page in the
Project options dialog, see chapter "Camera settings sidebar" on page 91 and
"Video compression" on page 242. The in-camera MJPEG compression is default
when it is available. Miqus Video cameras always use in-camera MJPEG compression, so it is not available as an option.
You can specify the quality of the MJPEG codec on the Cameras page. The default quality is 50; you can increase it if you want higher quality, but then the file will be larger.
The video file will be saved with MJPEG compression. Therefore, the computer needs to have a codec that can read it to play the file. Most Windows computers can play videos compressed with the MJPEG codec without having to install a specific codec.
Streaming Video settings

The settings of streaming video cameras (Miqus Video, Miqus Video Plus and
Oqus 2c) are available under the Streaming Video settings in the Camera set-
tings sidebar. The Camera settings sidebar allows you to adjust basic settings, see
chapter "Camera settings sidebar" on page 91. Advanced video settings are
available on the Cameras page in the Project options, see chapter "Video set-
tings" on page 238.
The following steps show how you can change the settings for streaming video
cameras.

1. Choose the video capture rate. The buttons show a selection of common
video capture rates. Integer divisions or multiples of the current marker
capture rate are indicated in bold. Note that the maximum frequency, as
well as the frequency values of the buttons, may change depending on
the selected resolution and aspect ratio (steps 2 and 3).

2. Choose the video image resolution. The buttons show standard image res-
olutions available on the Miqus Video. Note that the field of view (FOV)
may change depending on the resolution setting.
3. Choose the aspect ratio. The buttons show standard aspect ratios of 16:9,
4:3 and 1:1.
4. Set the exposure.
l When using Auto Exposure, you can set the Exposure Com-
pensation to get a brighter or darker image. You can also select the
area on which the auto exposure is based using the Auto Exposure
Tool on the 2D view Toolbar (see "2D view toolbar" on page 89).
l When Auto Exposure is unchecked, you can manually set the expos-
ure by setting the Exposure Time and selecting the Gain.

NOTE: The file size of the recorded videos can become very large when
using a high resolution and a high video capture frequency. If you need
the video for documentation purposes, you can reduce the file size con-
siderably by using a lower resolution and a lower capture frequency (e.g.
540p @25 Hz).
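
To get a feeling for how much the resolution and capture rate settings matter, the following back-of-the-envelope sketch in Python compares two of the settings mentioned above. The assumed average of 0.2 bytes per pixel after MJPEG compression is a hypothetical figure chosen only for illustration; actual file sizes depend on the scene and on the selected MJPEG quality.

# Rough, hypothetical estimate of the MJPEG video data rate; not an exact QTM figure.
def mjpeg_rate_mb_per_s(width, height, fps, bytes_per_pixel=0.2):
    # bytes_per_pixel is an assumed average image size after MJPEG compression
    return width * height * bytes_per_pixel * fps / 1e6

high = mjpeg_rate_mb_per_s(1920, 1080, 86)  # 1080p at the 16:9 maximum rate of Miqus VC/VM
low = mjpeg_rate_mb_per_s(960, 540, 25)     # 540p @ 25 Hz as suggested for documentation
print(f"1080p @ 86 Hz: ~{high:.0f} MB/s, 540p @ 25 Hz: ~{low:.1f} MB/s, ratio ~{high / low:.0f}x")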

Maximum capture rate for streaming video

The maximum capture rate for streaming video cameras depends on the selec-
ted resolution and aspect ratio. The available capture frequency range for the
current combination of settings is shown under Video settings on the Cam-
eras page in Project options. The tables below give an overview of the maximum frequency values for Miqus Video and Oqus 2c.

Miqus Video (VC+)

Maximum video capture frequency (in Hz) as a function of resolution and aspect ratio for Miqus VC+.

Resolution   16:9   4:3   1:1
1440p          -    100   120
1080p         120   160   180
720p           -    400   480
540p          440   480   480

Miqus Video (VC, VM)

Maximum video capture frequency (in Hz) as a function of resolution and aspect ratio for Miqus VC and VM.

Resolution   16:9   4:3   1:1
1080p          86   111   149
720p          180   237   292
540p          333   428   565
480p          416   550   714

Oqus 2c

Maximum video capture frequency (in Hz) as a function of resolution and aspect ratio for Oqus 2c.

Resolution   16:9   4:3   1:1
1080p          24    32    41
720p           53    68    86
540p           62    77    94
480p           77    95   116

Capture high-speed video


All Oqus cameras with high-speed video capabilities can capture video up to
10000 fps. Oqus high-speed cameras can be recognized by the clear strobe
glass, and the second number in the Oqus type is 1, e.g. 310. For standard Oqus cameras, the video capture rate is limited to 30 Hz and the dark glass filters out the visible light; therefore the video functionality in standard Oqus cameras is mostly used in preview.
The Oqus camera captures video frames that are synchronized with the marker
frames of other cameras in the system; however, it is possible to use different capture rates for marker and video. If the Oqus camera has the high-speed func-
tionality it is possible to capture video within the same limitations as the
marker capture, so the video and marker capture rates can be the same.

The Oqus video settings are controlled in the Camera settings sidebar or on the Cameras page in the Project options dialog, see chapter "Camera settings sidebar" on page 91 and "Cameras" on page 225, respectively.
The recorded video data can be viewed in QTM. The video is stored in AVI
format, which can be imported in external video software for further pro-
cessing or analysis.
Information about high-speed video capture

The following issues should be considered when capturing high-speed video.


Lighting
It is important that there is enough light for recording high-speed video.
To capture video data the IR filter has been removed from the Oqus camera, i.e. a clear glass has been mounted. Up to about 60 Hz it is often possible to capture video without extra lighting; however, it depends a lot on how bright the room is.
For high speed video, meaning frequencies above 100 Hz, extra lighting is
definitely needed. High intensity lamps must be used and with increasing
frequency and distance you need more light intensity.

NOTE: If you have mounted the external IR filter on the lens that
must be removed, so that the visible light is recorded.

Data storage
High-speed video generates a large amount of data. During a capture, the
data is buffered in the camera. The maximum amount of data that can be
captured is 1.1 GByte. After the capture, the video data is downloaded
from the camera, which can take up to about 5 minutes.
The number of frames that can be captured depends on the frame rate and the image size, see chapter "High-speed video" on page 960. For another resolution, scale the amount of data per frame by the ratio of the reduced resolution to the full resolution.
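
As a hedged illustration of the buffer limit, the following Python sketch estimates how many uncompressed frames fit in the 1.1 GByte camera buffer. The assumption of 1 byte per pixel (8-bit monochrome image) and the example resolution are hypothetical; consult the tables in the "High-speed video" chapter for the actual figures for your camera.

# Hypothetical estimate of how many uncompressed high-speed video frames
# fit in the 1.1 GByte in-camera buffer, assuming 1 byte per pixel (monochrome).
BUFFER_BYTES = 1.1e9

def max_frames(width, height, bytes_per_pixel=1):
    return int(BUFFER_BYTES // (width * height * bytes_per_pixel))

full = max_frames(1280, 1024)       # assumed full resolution (hypothetical)
half = max_frames(1280, 1024 // 2)  # half the rows: each frame uses half the data
print(full, half)                   # halving the image size roughly doubles the frame count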
Outline of how to capture high-speed video

The following outline describes how to capture high-speed video with an Oqus
camera.

1. Switch on the camera system and start QTM.

2. Open a new file by clicking the New file icon . If you want to use the 3D
overlay functionality it is important to check that the camera system has
been calibrated.
3. Switch to 2D view if it is not visible, i.e. right-click in the View window and
select Switch to 2D view from the menu.
4. Right-click on the 2D view for the camera that will capture high-speed
video. Select Mode/Video to switch to Video mode. The 2D view for that
camera will switch to a video image.
5. Open the aperture to at least 4 and set the focus.

6. Change the settings for Video capture in the Camera settings sidebar.
a. Set the video Capture rate. This can be set independently of the
marker image rate.

NOTE: The image size is reduced automatically when the capture rate is changed on the Camera settings sidebar.

b. Set the Exposure time to a value that makes the image bright
enough, test until you are satisfied.
If you have no extra light the exposure time needs to be quite high,
at least 16000 microseconds or even up to 40000. This limits the cap-
ture rate that can be used. It also means that fast movements will be
blurred.
For high capture rates and measurements with fast movement,
extra lighting is needed because the exposure time must be
extremely short, sometimes as short as 100 microseconds.
Use a Codec if you want to reduce the size of the AVI files. If the file will be used for analysis it must not be compressed too much, as it can influence the analysis.

NOTE: The codec does not change the download speed since
it is applied after downloading the raw video.

NOTE: The Image format will reduce the image directly in the
camera, but then you will lose pixels.

7. A video capture takes about a second to initialize so it is recommended that you use external trigger and pretrigger to start the capture. For the settings, see chapter "Synchronization" on page 266.
8. Click the Capture icon and set the capture time and other capture set-
tings. Keep the capture time as short as possible to reduce the fetching
time.
9. Click OK and then press the trigger button when you want to start the
measurement.
10. Wait for the fetching of video data to finish. Depending on how large a file you have captured and the number of cameras, it can take up to several minutes to fetch the data.
11. The video file is displayed in the 2D view window and can be played in syn-
chronization with other data.

NOTE: If the video data is uncompressed it is not possible to play the file at normal speed as the video playback will be too slow. Then it is better to step through the file instead.

Codecs for Oqus high-speed video files

A codec can be used to significantly reduce the file size of the high-speed video
files. Any codec that can run on Microsoft DirectShow can be used in QTM to
compress Oqus high-speed video files. There are many codecs available. For a
selection of recommended codecs that have been tested with QTM, see the list
below.

Lossless codecs keep all original image information in the files, which makes them suitable for high quality video analysis. Lossy codecs remove information, usually based on what the human eye can register, which makes a video compressed with a lossy codec suitable for movie encoding.
The codec of the high-speed video file must be set before the measurement is
started, either by right-clicking on an Oqus camera in Video mode or with the
Video compression settings on the Cameras page in the Project options dia-
log. It is not possible to compress a saved video file in QTM. However, QTM will
still be able to play the file if it is compressed in another program.

IMPORTANT: Even for the codecs we recommend, we cannot guarantee that the files can be played in any program, even if the codec is installed on the computer. This is because every program differs slightly in how it plays the file and it is impossible for us to test all of the settings and configurations. Therefore we recommend that you test the codec before making the measurements.

Recommended codecs

For a list of current recommended codecs and download links, see https://www.qualisys.com/info/recommended-codecs/. More information about the codecs:
Lagarith
l A lossless codec with a typical compression of 4-8 times. The codec uses
arithmetic and RLE encoding.
l In QTM the files can be played close to the original speed.

BlackMagic/Decklink MJPEG
l A lossy codec using JPEG compression of each video frame.

l The codecs are included with BlackMagic Desktop Video software. The
software can be installed, even when you do not have a BlackMagic
design video interface.
l For the latest version tested with QTM, use the download link provided at
https://www.qualisys.com/info/recommended-codecs/.

l On the BlackMagic design download page, you can click the Down-
load Only link to download the software without registration.
FFDS (MJPEG)
l The FFDShow MJPEG codec, which was previously included with QTM, has
been discontinued.
l The FFDS MJPEG can be used with QTM version 2023.3 or earlier. For
detailed instructions, see the manual included with the specific
QTM installation.
l For QTM 2024.1 or later, the BlackMagic/Decklink MJPEG codec is recom-
mended.

Video preview in QTM


The video preview is a helpful functionality in the setup of the camera system.
It can for example be used to find reflections or look at the current field of view
of the camera. Any Qualisys camera can be changed to show a video image in preview. Follow these instructions to get the best image in preview.

1. Open a new measurement and open a 2D view window.

2. Click on Video or Marker intensity in the 2D View window menu to switch the camera to Video mode, or use the Mode options in the Camera settings sidebar to switch all visible cameras to Video or Marker intensity mode. When switching to video preview the marker cameras will continue to capture at the marker capture rate and try to keep the marker RT frequency. The video frames will then be fetched in parallel with the marker data at the highest possible frequency, which depends on the camera settings (for example zoom) and the number of cameras in video mode.
If the camera has In-camera MJPEG compression then that is used during
the preview and the RT frequency can be around 20-30 Hz. However for
the other cameras the RT frequency will usually be 5-10 Hz.

3. There are two different Video modes.
Marker intensity
In the Marker intensity mode the cameras use the same Exposure
and flash time as for markers. The video is also color coded, see
image below, so that it is easier to see the Marker threshold. The
Marker threshold will always be where the color is green, therefore
the image will change when you change the Exposure time or the
Marker threshold. Which means that anything yellow and red in
the image will be markers, when you switch back to Marker mode.

Video
In the Video mode the cameras use the exposure and flash time
from the Video settings on the Cameras page. This is the mode
that you use when you want to capture video with the Qualisys cam-
era.

4. To optimize the preview frequency, the number of pixels in the image is automatically changed depending on the 2D view size and the zoom.

5. Use the Video mode and set the following settings on the Camera settings sidebar to get a bright image in preview with non-high-speed Qualisys cameras.
Capture rate
Set the Capture rate to 10 Hz.

Exposure time
Set the Exposure time to 99000 microseconds for a standard cam-
era. If the image is still too dark lower the Capture rate and
increase the Exposure time even more.

Flash time
The flash does not change the brightness of the image a lot on long
distances. Therefore the Flash time can be set to 0. If you want to
see the markers better, e.g. when setting the focus on a marker, you
can set the flash time to about 400 microseconds.

6. To see more of the video image, double-click on the camera so that only that camera is shown in the 2D view window. It is also possible to zoom in on the image, see chapter "Selecting cameras, zooming and panning" on page 87.

7. The following are some of the things that the Video preview can be used for.
Check the cause of extra reflections
Zoom in on the extra reflection in Marker intensity mode if neces-
sary.

Check the camera field of view

Check the focus setting


Use the Video mode with full resolution to focus.

Qualisys video files


The video captures from each camera will be stored in separate AVI files at the
same location as the QTM file. The name of the AVI file(s) is the same as the
QTM file with the camera type, camera number and camera serial number
appended. Video files are linked to in the QTM file using a relative path, so that
the link is still valid when moving or copying all the files to a different location.
The video data can be compressed with a codec in QTM. You can select any
codec that is installed on the computer to compress the data. For information
about video codecs, see chapter "Codecs for Oqus high-speed video files" on
page 582. For the In-camera MJPEG mode the codec is always MJPEG.

The video file is displayed in the 2D view window in QTM, but it can also be
opened in a media player.

NOTE: If you have compressed the video file with a codec the computer
where you play the file must have that codec installed.

The AVI files from Qualisys cameras contain meta information about the QTM version, the capture time and the SMPTE time code, if used, according to the AVI standard (isft, idit and ismp).

NOTE: The meta information is deleted if the QTM file is trimmed.

3D data overlay on video

The 3D data can be overlaid on the Qualisys video data to show the 3D view from that camera's viewpoint. This can for example be used for showing the force arrow in the video of someone stepping on a force plate. Which 3D objects are displayed in the 3D overlay is configurable.

The Qualisys video will always be synchronized with the marker data. However
if the video capture rate is different from the marker capture rate the video will
of course only be exactly at the same position for corresponding frames. For
example with video capture at 30 Hz and marker capture at 120 Hz, the video
data will be updated every fourth marker frame.
Follow these steps to activate the 3D data overlay.

1. Calibrate the camera system, including the Qualisys cameras that are
used in video mode.
2. Open a 2D view window in RT/preview mode or in a file.

3. Right-click on the camera where you want to turn on the overlay and
select Show 3D data overlay.
4. The 3D elements displayed in the overlay and the opacity can be changed
on the 2D view settings page in the Project options dialog, see chapter
"2D view settings" on page 417.
5. The video data display in the 2D view can be switched between linearized
and unlinearized on the 2D view settings page. To match the video with
the 3D data the data must be linearized.

Capturing with multiple video systems for markerless tracking

The Miqus video cameras must be split into multiple systems when there are more video cameras than the computer can handle. For example, with a computer with an 11th generation Intel i9 processor, a maximum of 24 cameras can be used at full speed and resolution. A computer with a 10th generation Intel i9 processor can handle up to 16 video cameras.
The setup for multiple video systems is described in "Setting up multiple video systems" on page 525. The system will consist of one Main system with the Camera Sync Unit and Agent systems which are time synchronized with the Main system.
Calibration

First of all the camera systems must be calibrated in one calibration file so that
the video can be processed in Theia. Follow these steps to calibrate the system:

1. Disable the blocklist in QDS on the Main system.

2. Restart the cameras on the Agent systems and shut down QDS before they receive the IP address from the Agent system. The cameras will now get an IP address from the Main system.
3. Calibrate the system with all video cameras, for general advice on cal-
ibration of video cameras, see chapter "Calibrating Qualisys video cam-
eras" on page 576.
4. Enable the blocklist in QDS on the Main system.

5. Start QDS on the Agent systems.

6. Restart all cameras, so that they get the IP address from the correct sys-
tem.
Setting up synchronization

When the system is calibrated follow these steps to capture synchronized video
in all of the systems.

1. Start the capture on all of the Agent systems so that QTM is in the Waiting
for trigger mode. The cameras are waiting for the UDP packet with the
camera start time from the Main system, for configuration see chapter
"Setting up multiple video systems" on page 525.
2. Start the capture on the Main system. The Main system will trigger the
Agent systems via the UDP packet so that the video capture starts at the
exact same time.
3. Once the capture is finished the QTM and video files are saved on each
separate computer. The UDP packet includes the file name used on the
Main system so that the QTM and video files on the Agent systems auto-
matically use the same name.
Processing with Theia 3D markerless mocap software

After the video capture follow these steps to process in Theia.

1. Copy the calibration file from the Main system to the computer with
Theia.

NOTE: The computer with Theia can be the same as the Main sys-
tem computer, but it is not possible to process in Theia at the same
time as the computer is capturing video files.

2. Copy all of the video files for one capture from Main and Agent systems to
a new folder on the computer with Theia. It is important that the folder
only includes the videos from one capture for the Theia processing to
work.
3. Optionally copy the QTM files to the same folder as the video files. The
QTM files are not needed for the Theia processing, but are good to save
to be able to read the settings used for the capture. It is important to
rename the QTM files from the Agent systems since otherwise the files all
have the same name.
4. Start Theia and import the video files and calibration manually. Please
refer to the Theia manual for instructions how to perform the processing.

NOTE: Theia will rename and move the video files, so you need to
have a copy of the video files if you want to be able to open the
QTM file with videos.

Real-time streaming
The QTM real time process enables QTM to send data to any program that can
receive data on TCP/IP or UDP/IP.

Real time protocols


The following protocols are supported by QTM:

QTM Real-Time Server protocol (RT protocol)


QTM's own real-time protocol.

Open Sound Control (OSC)
A protocol widely used for sound synthesizers and other multimedia
devices. An example for MAX/MSP, MAX OSC Demo, is available at the
QTM download page that can be accessed via https://www.qualisys.com/my/.

The protocols are described in the QTM Real-Time Server protocol documentation QTM RT protocol.pdf that is included with the QTM installer in the RT Protocol subfolder ("C:\Program Files\Qualisys\Qualisys Track Manager\RT Protocol" when QTM is installed in the default location). The RT Protocol folder also contains a compiled version of the RTClientExample, which can be used for testing and troubleshooting real-time streaming. The RT protocol documentation is also available online via https://docs.qualisys.com/qtm-rt-protocol/.
QTM can also be controlled through its REST API. For more information, see
the RESTful_QTM_API PDF included with the QTM installer in the REST sub-
folder ("C:\Program Files\Qualisys\Qualisys Track Manager\REST").
The real time performance depends on the computer and graphics card specifications, the number of cameras, the number of markers and the complexity
of the used AIM model(s), amongst others. For more information and tips for
improving real-time performance, see chapter "How real time works in QTM"
on the next page.

Resources
Qualisys offers a variety of resources that can be used for real-time streaming,
including:

QTM Connect
Plugins/clients for streaming data from QTM into third party software,
such as Matlab, LabVIEW, or game engines, such as Unreal Engine or
Unity.

SDKs
Open source SDKs for developing custom QTM clients for C++, C#, Python,
etc.

The QTM Connect installers and SDKs are available via the Qualisys downloads
page at https://www.qualisys.com/downloads/ and via the Qualisys GitHub page at https://github.com/qualisys/.
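
As a minimal sketch of a custom client, the example below streams 3D marker data using the open source Qualisys Python SDK mentioned above. The package and function names (qtm_rt, connect, stream_frames) reflect the SDK at the time of writing and may differ between versions; QTM must be in preview mode, or streaming from a file must be started with Play with Real-Time output, for data to arrive.

import asyncio
import qtm_rt  # Qualisys Python SDK, typically installed with "pip install qtm-rt"

def on_packet(packet):
    # Called for every streamed frame; print the frame number and number of 3D markers.
    header, markers = packet.get_3d_markers()
    print("Frame %d: %d markers" % (packet.framenumber, len(markers)))

async def main():
    # Connect to the QTM RT server on this computer (use the QTM computer's IP otherwise).
    connection = await qtm_rt.connect("127.0.0.1")
    if connection is None:
        return
    await connection.stream_frames(components=["3d"], on_packet=on_packet)
    await asyncio.sleep(10)                 # stream for ten seconds
    await connection.stream_frames_stop()

asyncio.run(main())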

How real time works in QTM


Real time processing is always active when QTM is in preview mode. The pro-
cessing steps are selected in the Real time actions list on the Processing page in
the Project options dialog.
It is also possible to stream data from a file. Streaming from a file is started by
clicking Play with Real-Time output in the Play menu (keyboard shortcut: Ctrl
+ Shift + Space). This can be useful for testing or developing real-time client
applications.
The port settings and the client control settings can be accessed and modified
on the Real-Time Output page in the Project Options dialog. For detailed
information about the protocol and the commands, refer to the QTM
RT protocol documentation.
Streaming data

The data that is processed in real time can be viewed in QTM and streamed via
TCP/IP through the supported protocols (RT protocol and OSC). The TCP/IP
server is always active and waits for a connection from a client program.
Almost all data that is acquired and processed in QTM can be streamed, includ-
ing:
l 2D data (unlinearized, linearized)

l 3D data (Identified, Unidentified, Residuals)

l 6DOF data (Position, Rotation matrix, Euler angles, Residuals)

l Skeleton data (T-pose, Segment positions and orientations in global or


local coordinates)
l Analog data (Analog boards and EMG)

NOTE: Data is buffered in the analog equipment, which means that
there will not always be new analog data for every marker frame.

l Force data (Force, Moment and COP)

NOTE: Data is buffered in the analog equipment, which means that


there will not always be new force data for every marker frame.

l Eye tracker data (gaze vector, pupil size, etc.)

l Parameters (e.g. Label name, Statistics for the RT performance, Force


plate settings)
l Events

l Images (from cameras in Marker intensity and Video mode; only sup-
ported in preview mode, not when streaming data from file)
Controlling QTM

QTM can also be controlled via the RT protocol, for example for:
l Starting/stopping of a calibration or a capture

l Changing camera mode

l Modifying settings under Project Options
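
A hedged sketch of such control from a client, using the same Python SDK as in the streaming example, could look as follows. It assumes that a password for client control has been set on the Real-Time Output page, and the exact method names may differ between SDK versions.

import asyncio
import qtm_rt

async def remote_capture():
    connection = await qtm_rt.connect("127.0.0.1")
    if connection is None:
        return
    # Take control of QTM; "password" is the client control password set in QTM.
    async with qtm_rt.TakeControl(connection, "password"):
        await connection.new()      # open a new measurement (preview)
        await connection.start()    # start a capture
        await asyncio.sleep(5)      # capture for five seconds
        await connection.stop()     # stop the capture
        await connection.save("remote_capture.qtm")  # save the file on the QTM computer

asyncio.run(remote_capture())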

Real-time frequency

The real-time marker frequency is set on the Camera system page in the Pro-
ject options dialog. You can use the Reduced real time frequency option if a
high capture rate is required but you still want real time data during the cap-
ture. It is recommended that the reduced frequency is an integer divisor of the
capture rate.
These are the different steps in the real-time process:

1. The cameras capture data at a fixed frequency.

2. QTM processes the data in real-time as fast as possible. The processing steps are performed in parallel so that for example the 3D tracking can run faster than AIM or the Skeleton solver.
3. Data is sent on the TCP/IP output and the frequency displayed in the status bar corresponds to the last processing step for 3D data. If QTM takes too long processing a frame, the next frame captured by the cameras will not be processed for that processing step. This is shown in the main status bar as a decreased RT frequency.
Check that the processing can be done fast enough in QTM by looking at the
status bar; the RT frequency should be the same as the camera frequency.
There will also be a message in the status bar if any RT frames are dropped or if
there is a mismatch between the camera frames. For more information see
chapter "Main status bar" on page 79.
Optimization of real-time performance

Use the following guidelines to get the best possible real-time performance:


l Activate only the necessary processing steps on the Processing page.

l Set the GUI update to 15 Hz on the GUI page, or shut it off completely to
get the maximum performance.
l Check that the RT frequency is stable during the RT measurement. Lower the rate if it deviates too much from the specified capture rate.
Real-time streaming of data from external input devices

All external equipment samples, such as analog, EMG, force and eye-tracker
samples, are sent with each camera frame, e.g. if the analog capture rate is
three times the camera capture rate there will be on average three analog
samples sent with each frame. However, because of the buffering in the
external equipment the number of samples sent with each marker frame can
differ, but the total number of samples will be correct.
The analog and EMG capture is started in synchronization with the markers. The analog
boards are always started in synchronization with the sync out signal. The EMG
on the other hand needs to be started with the trigger button to be syn-
chronized, see respective EMG system in chapter "Wireless EMG systems" on
page 804.

NOTE: The external equipment sample rates are always as configured
for the capture rate in capture mode, even if the real-time processing fre-
quency is reduced.

Real time latency

Real time latency is the time it takes from the marker exposure to the time
when the TCP/IP signal can be received on another computer. The latency can
be divided into the parts in the image below. The size of the different parts
shows approximately how long the different steps will take.

This delay will depend on the following factors:


Number of cameras
Increasing the number of cameras can increase the delay, both because of the extra data and the extra complexity in tracking.

Computer
The computer performance will influence the latency, and because QTM runs on Windows the latency may also differ depending on which other programs are running.

Because the latency is system and computer dependent it is not possible to give a general latency figure for any system. For Oqus cameras the system latency can be monitored in QTM. Enable the display of latency with the Show Real-Time Latency option on the GUI page in the Project options dialog. When enabled the latency will continuously be updated in the status bar. The latency is calculated by comparing the time stamp at exposure with the time stamp of a UDP packet sent to the cameras by QTM at the end of the calculation pipeline. With this method the latency will match the latency of third-party software receiving the data stream over UDP/IP from QTM's real-time server. Contact Qualisys AB if you want more help on how to estimate the latency.

NOTE: The Real-Time Latency estimation is only implemented on Oqus
cameras.

Outline of how to use real time


The following steps describe how to use real-time and send the data to a real-
time client application.

1. Before you start the real-time, make sure that you have an AIM model or
6DOF bodies for the movement that you are going to capture, see chapter
"Automatic Identification of Markers (AIM)" on page 624 and "6DOF track-
ing of rigid bodies" on page 649, respectively.

NOTE: It is best if the AIM model is specific for the subject that will
be captured.

2. Switch on the camera system and start QTM.

3. Open a new file by clicking the New file icon .

4. Open Project options dialog and go to the Processing page.

5. Activate the Real-time actions that you want to use. For example, for
Visual3D typically the following actions are used: Pre-process 2D data,
Track each frame: 3D, Apply the current AIM models and Calculate
force data.

NOTE: The option Process every frame is disabled by default.
QTM will then skip individual frames in case the real time pro-
cessing exceeds the frame duration, which minimizes the real-time
latency by prioritizing the most recent data. When the option is
enabled, each frame is processed, which in some cases improves the real-time tracking and identification. However, it is important to note that if too many frames exceed the frame duration, then processing every frame will result in increased real-time latency.

6. Check the settings for the actions that have been selected.

7. Go to the Camera system page and set the capture rate. The maximum
capture rate depends mainly on the computer specifications, as well as
the complexity of the AIM model.

NOTE: Using a too low real-time frequency may lead to suboptimal
3D tracking depending on the velocity of the movements.

8. Go to the GUI page and set the Real time mode screen update to 15 Hz.
This step is not necessary, but it will give the computer more time to pro-
cess the marker data.
9. Test the real-time with the motion that you want to capture. Look espe-
cially at how the AIM model is applied and if the RT frequency shown in
the Status bar is close to the capture rate. If it differs too much, lower the
Real time frequency; it might also help to create a new AIM model or to change the tracking parameters.
l If the real-time is slow, close all windows, including the Data info window and the Trajectory info windows, except for a 3D view window.
l When the real-time is working fine you can even turn off the GUI on the GUI page in the Project options dialog. This will reduce the processing capacity needed by QTM.
l AIM and 6DOF data can be restarted with the F9 shortcut or the
Apply AIM model button .
10. When you are satisfied with the real-time in QTM, you can connect to QTM
from the other program or client.
Server mode

QTM can be started in server mode to minimize the interference by QTM when
running an RT client. To start QTM in server mode use the command-line argu-
ment /server when starting QTM. The modified behavior for the server mode is
as follows:

l QTM is started minimized.

l Non-critical dialogs are hidden.

l Crash dumps are automatically sent to Qualisys without user feedback.
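
For example, assuming QTM is installed in the default location mentioned earlier, a shortcut or command prompt line for starting QTM in server mode could look like this (the executable name QTM.exe is an assumption; check the installation folder if it differs):

"C:\Program Files\Qualisys\Qualisys Track Manager\QTM.exe" /server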

Processing data

Introduction to data processing


After the capture is finished the data must be processed in QTM to get 3D data,
6DOF data and force data. Some functions in QTM, such as AIM and exports,
can also be performed in processing. The processing can be done automatically
after the capture, manually in the current capture file, see chapter "Repro-
cessing a file" on the next page, or automatically on saved files by using Batch
processing, see chapter "Batch processing" on page 605.
The settings of the automatic processing, which are the same for the pro-
cessing directly after capture and batch processing, are specified on the Pro-
cessing page in the Project options dialog. The only exception is the Auto
backup option, which is only available when making a measurement and when batch processing.
The following processing steps are available on the page:
l Auto backup, see chapter "Auto backup" on page 572.

l Pre-process 2D data, see chapter "Processing 2D data" on page 609.

l Track the measurement

l 3D, see chapter "3D tracking measurements" on page 614.

l 2D, see chapter "2D tracking of data" on page 618.

l Merge with Twin slave, see chapter "Working with QTM twin files" on
page 522.

l Gap-fill the gaps, see chapter "Trajectories" on page 338.

l Apply the current AIM model, see chapter "Automatic Identification of


Markers (AIM)" on page 624.

l Calculate 6DOF, see chapter "6DOF tracking of rigid bodies" on page 649.

l Solve Skeletons, see chapter "Tracking of skeletons" on page 671.

l Apply SAL, see chapter "How to use SAL" on page 700.

l Calculate force data, see chapter "Force data calculation" on page 703.

l Calculate gaze vector data

l Export to TSV, see chapter "Export to TSV format" on page 711.

l Export to C3D format, see chapter "Export to C3D format" on page 727.

l Export to Matlab file, see chapter "Export to MAT format" on page 729.

l Export to AVI, see chapter "Export to AVI file" on page 739.

l Export to FBX file, see chapter "Export to FBX file" on page 742.

l Export to JSON file, see chapter "Export to JSON file" on page 743.

The settings of each processing step can be found in the tree to the left in the
dialog. For information about the settings, see chapter Project options.

NOTE: The two tracking options under Track the measurement are
mutually exclusive and can therefore not be used at the same time.

Reprocessing data

Reprocessing a file
Files can be reprocessed at any time. Reprocessing can be useful in the fol-
lowing scenarios:
l In case not all processing actions were applied during the capture,

l In case you used the Store real-time data option for a quick review during
the capture session,
l In case you want to modify processing options to optimize the results, for
example

l for improving the tracking,

l for changing the calibration, see chapter "Changing the calibration"


on page 604,
l for applying improved AIM models or rigid body definitions.

For reprocessing the currently open file, click the Reprocessing icon or Reprocess in the Capture menu to open the File reprocessing dialog. If you
want to process multiple files at the same time, you can use batch processing
instead, see chapter "Batch processing" on page 605.

The File reprocessing dialog contains all of the settings from the Project
options dialog that can be applied to a file in post-processing, see chapter "Pro-
cessing" on page 320. Follow these steps to set the reprocessing steps.

1. Select the reprocessing steps that you want to perform from the list.

WARNING: When Track the measurement with 3D or 2D is used,


any previous labeling, editing or other processing actions applied to
the existing trajectories will be undone. Any manual labeling or edit-
ing will be lost.

2. Choose the source for the settings from the drop-down lists to the right.
There are two options:
Measurement
Choose the measurement settings for reprocessing the file based on
the previously used settings. This is the default choice. You can
modify the settings by editing them in the dialog. The edits will only
apply to the current file.

Project
Choose the project settings if you want to use the current settings
used in the project. Any edits will update the current project settings
as well.

If you need to reprocess multiple files with the same settings, you
can use the project settings and modify them for this purpose if
needed.

NOTE: Notes on settings source:
l For the export steps, there is no choice for the settings
source. They are always based on the current project set-
tings.
l The Track the measurement settings are split into
Tracking parameters and Calibration settings so that
the source can be set separately. Make sure that the cal-
ibration is correct when using the Project option.
l The Calculate force data settings are split into Force
plate settings and Force plate location so that the
source can be set separately.

3. Click OK to start the reprocessing.


l Alternatively, press Cancel to close the dialog, for example, if you
opened it for reviewing the measurement settings.
4. Save the file to keep the changes.
l If the reprocessing does not give the expected results, you can close
the file without saving to discard the reprocessing.

NOTE: The Calibration and Linearization pages are connected for files
that have been 3D tracked. This means that you have to change the cal-
ibration to change the linearization. The Linearization page is only dis-
played so that you can check that the linearization files are correct.
However, for a 2D tracked file there is no Calibration page and you can
change the linearization on the Linearization page, see chapter "Lin-
earization" on page 249.

Changing the calibration

The calibration of the measurement can be changed in the following ways:


l By recalibrating the existing calibration of the file, see chapter "Recal-
ibration" on page 563.

l By replacing the calibration with another calibration for the same camera
setup, for example a more recent calibration if one of the cameras was
bumped during the capture session.

The calibration of a file is changed on the Calibration page in the File repro-
cessing dialog. The file's current calibration is shown under the Calibration
file heading and the results are displayed under the Calibration results head-
ing. Replace the current file with the desired calibration file by clicking Load
other and locating the file. The calibration files are located in the Calibration
folder in the project folder and the name of a calibration file includes the date
when the calibration was performed.
If the current calibration file has been recalibrated, it must be loaded again to
reload the parameters, otherwise the file is reprocessed with the old cal-
ibration results that are stored in the file.
To start reprocessing the file click OK. The new settings in the File repro-
cessing dialog are only valid for the active file. To keep the changes the file
must be saved.

Batch processing
With batch processing, several capture files can be processed with the same set-
tings. The same actions can be performed in batch processing as in the pro-
cessing that is performed after a measurement has been captured.

NOTE: If you only want to batch export files, you can also use the Batch
Exporting dialog, see chapter "Batch exporting" on page 710.

To start a batch process, follow the steps below.

1. Click Batch process on the File menu. Select the files and click Open to
open the Batch processing dialog.

l Alternatively, select multiple files in the Project view, right-click on the selection, and click Batch process... in the context menu.
2. Select the processing steps that you want to perform from the list. The
processing steps are the same as those that can be applied in post-pro-
cessing, see chapter "Processing" on page 320.
l Use the Auto backup files before processing option if you want to keep a
backup of the file. The files are saved in the same directory with the
current date and time in the filename.

WARNING: When Track the measurement with 3D or 2D is used,
any previous labeling, editing or other processing actions applied to
the existing trajectories will be undone. Any manual labeling or edit-
ing will be lost.

3. Choose the source for the settings from the drop-down lists to the right.
For some settings the only possible source is the project, for the other set-
tings there can be three possible options:
Processed file
The settings used are the ones present in each processed file and
cannot be edited. This option is always selected by default because
the other options replace the original settings from each file.

Project
The settings are copied from the current project settings to each pro-
cessed file. The settings can be edited in the tree-view to the left.
Editing the settings will change the current project settings as well.
This option is often the best to use when you want to change the set-
tings of the processed files.

Present file
The settings are copied from the present file (the file open in the
QTM window where you opened the Batch processing dialog) and
can be edited in the tree-view to the left.

NOTE: Notes on settings source:
l The Present file settings source is not available for the
Pre-process 2D data and Track the measurement
steps.
l For the export steps, there is no choice for the settings
source. They are always based on the current project set-
tings.
l The Track the measurement settings are split into
Tracking parameters and Calibration settings so that
the source can be set separately. Make sure that the cal-
ibration is correct when using the Project option.
l The Calculate force data settings are split into Force
plate settings and Force plate location so that the
source can be set separately.
l If Merge with Twin slave is used with Project settings
then the twin master file is automatically merged with a
file with the same name and ending with _slave that is loc-
ated in the same folder.

4. Click OK. The actions are performed on the files and then the files are
saved and closed.

WARNING: Files are automatically saved when batch processing. Any changes cannot be reverted, unless you restore a backup.

Loading a saved calibration

If you want to batch reprocess files with a specific calibration, it is easiest to load it into the project.
The calibration can be loaded as follows:

1. Click Load other on the Current calibration page or the Calibration page.

2. Select the calibration file (.qca) in the file dialog and open it. The file dialog
opens by default in the Calibration folder of the project, but you can also
browse for another folder.
When loading a saved calibration, the following data is loaded:
l The calibration and its Transformation settings,

l The camera system and the camera order,

NOTE: If the current camera configuration differs, it is recommended to load a more recent calibration before you start a new measurement.

l The linearization files.

NOTE: If there are newer linearizations stored in the camera, you will get a message when you start a new measurement.

Processing 2D data
The 2D data is processed when the Pre-process 2D data option is enabled on
the Processing page in Project Options. When reprocessing 2D data in a file,
the 3D, 6DOF and skeleton data must also be reprocessed for the changes to
have any effect.
The following pre-processing and filtering methods for 2D data are available:
l Circularity filter (a legacy filtering method, only supported by Oqus cam-
eras)
l Software marker masks

l Marker size filter

How to use circularity filter (Oqus only)
The circularity filter can be used to correct or delete bad or partially hidden
markers. It is only available when the Marker circularity filtering option is
enabled for the camera, see chapter "Marker circularity filtering (Oqus only)" on
page 541.
The quality of the 3D data and how to use the filter depend a lot on the number of
cameras in the system. If you have three or more cameras covering the same
markers, it is usually better to try to remove the non-circular markers, because
then you still have enough data to calculate the 3D data anyway. However, if there
are usually just two cameras viewing the same markers, the data can become more
complete by correcting the non-circular markers. This will result in more 3D
data, which otherwise could not be calculated.
The filter has two options: Correct center point of non-circular markers and
Discard non-circular markers. One of the options must be selected for the
2D preprocessing. Use the following advice when choosing the settings for the
different options; a minimal circle-fit sketch illustrating the correction idea
follows after the descriptions.
Correct center point of non-circular markers
Any markers that have been filtered out by the camera as non-circular will
be fitted to a circle if possible and merged markers are also corrected.
Circularity level
In this mode you can set the Circularity level, in the camera set-
tings, depending on how many markers you have. Try to find a level
where the camera has time to process the markers.

Effect on 3D data
This mode will give you more exact 3D data than the unfiltered data,
because the corrected markers have more exact center points than
the camera data. It has most effect if only two cameras can see the
marker. For example you can use this mode to improve the accuracy
of the calibration, since the wand markers are partially hidden by
the stick.

NOTE: Corrected merged markers are always used by the 3D
tracker. Uncorrected non-circular markers will not be used by
the 3D tracker if two other cameras with circular markers can
see the marker.

Discard non-circular markers


All markers that have been detected as non-circular by the camera are
removed from the 3D tracker.
Circularity level
Check the dark yellow markers in the 2D data to find out which mark-
ers are not used by the 3D tracker. Then set the level, in the camera
settings, according to how much you want to delete.

Effect on 3D data
This mode will give you less 2D data than the unfiltered data, and most of
the time it is actually better to try to correct the markers instead.
The biggest advantage of this mode is that it is faster than correcting
the markers. So if you have many cameras that can see each marker,
you will not lose much 3D data and it will also be more exact.
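
As a rough illustration of what fitting a non-circular marker to a circle means, the Python sketch below performs a simple least-squares (Kasa) circle fit on the visible part of a marker outline. The boundary points are made up and this is not necessarily how QTM implements the correction internally; it only shows why a fitted center can be more exact than the centroid of a partially hidden marker.

```python
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) circle fit; returns center (cx, cy) and radius."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
    b = x ** 2 + y ** 2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    radius = np.sqrt(c + cx ** 2 + cy ** 2)
    return (cx, cy), radius

# Hypothetical boundary points of a marker with true center (5, 7) and radius 3,
# of which only an arc is visible because the marker is partially hidden.
theta = np.linspace(0.2, np.pi - 0.2, 20)
arc = np.column_stack([5 + 3 * np.cos(theta), 7 + 3 * np.sin(theta)])

print(arc.mean(axis=0))   # centroid of the visible arc: clearly biased upwards
print(fit_circle(arc))    # fitted center close to (5, 7), radius close to 3
```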

IMPORTANT: If you change this option when processing a file, you must
also reprocess the 3D, 6DOF and skeleton data for the change to have
any effect.

How to use software marker masks


The software marker masks can be used to remove markers in specific areas of
recorded 2D data. Software masks can be useful to discard unwanted markers
that are contained in a limited area throughout the measurement.
Adding and applying software masks

Add software marker masks to a file with the Marker mask tool in the 2D view.
The mask is shown as a blue rectangular area. Use the Selection tool in the 2D
view to resize and move the mask with the pointer. There can be up to 100

masks per camera. Software marker masks are drawn linearized, so it is best to
keep the Show linearized data option enabled on the 2D view settings page.

To apply the software masks, the QTM file must be reprocessed with the Pre-
process 2D data step and the Software marker mask option enabled. After
processing, the masks are surrounded by blue edges, indicating that they have
been applied. The markers that are filtered are surrounded by white rect-
angles.

IMPORTANT: The 3D, 6DOF and skeleton data must also be reprocessed
for the change to have any effect.
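
Conceptually, applying a software mask simply means that any 2D marker whose center falls inside a masked rectangle is discarded before tracking. The Python sketch below illustrates this with made-up mask and marker coordinates; it is not QTM's internal implementation.

```python
# Hypothetical software marker mask (pixels) and 2D marker centers for one camera.
masks = [{"left": 100, "top": 50, "right": 300, "bottom": 200}]
markers_2d = [(150, 120), (400, 300), (299, 51)]

def inside(mask, x, y):
    """True if the marker center lies within the masked rectangle."""
    return mask["left"] <= x <= mask["right"] and mask["top"] <= y <= mask["bottom"]

kept = [m for m in markers_2d if not any(inside(mask, *m) for mask in masks)]
discarded = [m for m in markers_2d if m not in kept]

print(kept)        # markers passed on to the 3D tracker
print(discarded)   # markers filtered out (shown with white rectangles in QTM)
```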

Undoing and removing software masks

To undo the filtering by software masks, the QTM file must be reprocessed with
the Pre-process 2D data step enabled and the Software marker mask option
disabled. The masks will be preserved, but the filtering is undone. The masks
are shown as blue rectangular areas without the surrounding edge, the same
as when they were just created.
To delete masks, right-click on a mask and select one of the available options:
l Delete this mask to delete the currently selected mask,

l Delete all masks to delete all masks for the current camera,

l Delete all masks for all cameras to delete all masks for all cameras.

If the masks were applied before deleting, the markers are still filtered, as
shown by the white rectangles and the remaining blue edges indicating the
masked areas.

To undo the filtering, the file must be reprocessed with the Pre-process 2D
data step enabled (the Software marker mask option can be either enabled
or disabled).

IMPORTANT: The 3D, 6DOF and skeleton data must also be reprocessed
for the change to have any effect.

How to use marker size filter


The 2D data can be filtered on the size of the markers. Set the minimum and/or
maximum size in subpixels for the markers in the file with the options: Min-
imum marker size and Maximum marker size.
To apply the marker size filter, the file must be reprocessed with the Pre-pro-
cess 2D data step enabled, and one or both of the Minimum marker size and
Maximum marker size options enabled. When a marker is discarded by the fil-
ter, it is displayed with a white square in the 2D view, and the 2D data is not
shown in the Data info window.

To undo the size filtering, reprocess the file with the Pre-process 2D data step
enabled and the Minimum marker size and Maximum marker size options
disabled.

IMPORTANT: If the marker size filter is changed when processing a file,
the 3D, 6DOF and skeleton data must also be reprocessed for the change
to have any effect.
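
As a conceptual illustration of the size filter, the Python sketch below keeps only the 2D markers whose size lies within the configured range. The sizes and thresholds are hypothetical examples, and the code is not QTM's internal implementation.

```python
# Hypothetical 2D marker sizes in subpixels for one camera and frame.
marker_sizes = [120, 340, 980, 55, 410, 2600]

MIN_MARKER_SIZE = 100    # plays the role of the Minimum marker size option
MAX_MARKER_SIZE = 2000   # plays the role of the Maximum marker size option

kept = [s for s in marker_sizes if MIN_MARKER_SIZE <= s <= MAX_MARKER_SIZE]
discarded = [s for s in marker_sizes if not MIN_MARKER_SIZE <= s <= MAX_MARKER_SIZE]

print(kept)        # markers passed on to the 3D tracker
print(discarded)   # markers shown with a white square in the 2D view
```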

3D tracking measurements
3D tracking is the process of constructing 3D points from the 2D rays from the
cameras and sorting them into trajectories. It is activated with Track the meas-
urement and the 3D option on the Processing page and is controlled by the
tracking parameters, see chapter "3D Tracker parameters" on page 325.
How to set the 3D tracker parameters depends a lot on the measurement
setup; for some advice, see chapter "Advice on how to set 3D Tracker para-
meters" below. For an example of how to evaluate if the tracking is working
properly, see chapter "3D tracking evaluation" on page 616.
Track the measurement with the 3D option is used by default while capturing
data. Files can also be reprocessed in case tracking was skipped during the
recording, or to optimize tracking results, see chapter "Reprocessing a file" on
page 601. For information about how to optimize tracking results in case of
problems, see chapter "Troubleshooting tracking" on page 1028.

Advice on how to set 3D Tracker parameters


The default parameters for the 3D tracker will yield good tracking results in
most situations. However, in some situations, when you notice that the tracking
is not optimal, changing the 3D tracker parameters may help to improve the
results. When changing these parameters, it is beneficial to have a good under-
standing of how they influence the tracking. For more information about this,
see chapter "3D Tracker parameters" on page 325 and "Tracking and accuracy"
on page 617.
Below follows some general advice on how the 3D tracking parameters can be
adapted for special tracking situations. A small numeric sanity-check sketch follows after the parameter descriptions.

Prediction error
The Prediction error is by default set to 25 mm. If you have problems
with switching of trajectories it may help to use a lower value. For very
fast movements, for example tracking a golf club during a swing, you can
use a higher value.

NOTE: When capturing very fast movements it is important to use a high capture rate for optimal 3D tracking performance.

Maximum residual
The Maximum residual is by default set to 6 mm. For very large tracking
volumes it may help to increase the value for more robust tracking. Redu-
cing the value may help to avoid switching artifacts, but may lead to more
fragmented trajectories. A good way to estimate the Maximum residual
is to track the file and then plot the residual of all of the trajectories. Then
you can set the parameter two or three mm higher than the maximum
residual of the trajectories.

Minimum trajectory length


The minimum trajectory length is by default set to 2 frames, which is the
minimum value for a part or trajectory. Increasing the value may help to
reduce the amount of ghost markers, for example when tracking highly
reflective objects.

Minimum ray count per marker


The minimum ray count is by default set to 2, which means that a marker
can be defined in 3D when seen by two cameras. Increasing the value to 3
or more cameras may help to reduce the amount of ghost markers, for
example when tracking highly reflective objects. However, this may come
with the disadvantage of more fragmented trajectories.

Ray length limits


The ray length limits are by default calculated automatically based on the
calibration, but they can also be set manually. Using a certain minimum
ray length can help to avoid ghost markers close to the cameras if there
are many spurious reflections, for example from a water surface. For very
large tracking volumes, it can be useful to set a maximum ray length so
that cameras that are at a large distance from a marker are excluded from the calculation of its position.
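
The following minimal Python sketch illustrates two of the sanity checks mentioned above: how far a marker moves between frames (to judge whether the Prediction error is large enough) and how to derive a Maximum residual from the residuals of a tracked file. It is purely illustrative and not part of QTM; the speed, capture rate and residual values are hypothetical examples.

```python
# Illustrative sanity checks for 3D tracker parameters (not part of QTM).

def per_frame_displacement_mm(marker_speed_m_per_s, capture_rate_hz):
    """Distance a marker travels between two consecutive frames, in mm."""
    return marker_speed_m_per_s * 1000.0 / capture_rate_hz

# Example: a golf club head moving at about 40 m/s captured at 500 Hz moves
# roughly 80 mm per frame, so the default Prediction error of 25 mm would be
# far too small for that marker.
print(per_frame_displacement_mm(40.0, 500.0))   # -> 80.0 mm per frame

def suggested_max_residual_mm(trajectory_residuals_mm, margin_mm=3.0):
    """Set Maximum residual a few mm above the largest observed residual."""
    return max(trajectory_residuals_mm) + margin_mm

# Hypothetical residuals (in mm) read off a residual plot of a tracked file.
residuals = [1.2, 2.0, 1.7, 2.8, 3.1]
print(suggested_max_residual_mm(residuals))      # -> 6.1 mm
```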

3D tracking evaluation
Before data collection, it is a good idea to test that the tracking is working prop-
erly for the current camera setup. Place all markers required on the meas-
urement subject and also one marker in each corner of the measurement
volume. Then perform the following test:

1. Open a new QTM file and perform a capture. The movements should be
as similar to the planned recordings as possible.
2. Go to the View menu and open the File information dialog. In the dialog
there is a list of the tracking residuals per camera. There are two numbers
per camera. Points are the number of 2D points from the camera that are
used to calculate 3D in the measurement. Avg res is the average residual
for the 2D rays from the camera at the average marker distance in the
whole measurement.

a. Check that all cameras roughly have a similar amount of Points. If a camera has far fewer points, it either cannot see the markers or has moved since the calibration.
b. Check that the Average residual is similar across the cameras. If there is a camera that has a much higher residual, a new calibration is needed. (A small sketch of these per-camera checks follows after this list.)
3. Check the number of trajectories in the Unidentified trajectories win-
dow. Each of the trajectories should represent a marker and therefore the
number of trajectories should match the number of markers in the meas-
urement. However, if some markers have been occluded during the meas-
urement there will be some extra trajectories.

4. Check that the markers can be seen in the entire measurement volume by comparing the trajectory data to the four markers placed in the corners of the measurement volume. It is easier to see if the volume is viewed from above (XY view).
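
The per-camera checks in steps 2a and 2b can be summarized as simple rules of thumb: flag a camera whose Points count is much lower than the others, or whose average residual is much higher. The Python sketch below illustrates this with hypothetical values typed in from the File information dialog; the thresholds are arbitrary examples, not values used by QTM.

```python
# Hypothetical per-camera values from the File information dialog.
cameras = {
    1: {"points": 10500, "avg_res_mm": 0.8},
    2: {"points": 10200, "avg_res_mm": 0.9},
    3: {"points": 2100,  "avg_res_mm": 0.8},   # sees far fewer markers
    4: {"points": 9800,  "avg_res_mm": 2.6},   # clearly higher residual
}

points = sorted(c["points"] for c in cameras.values())
residuals = sorted(c["avg_res_mm"] for c in cameras.values())
median_points = points[len(points) // 2]
median_res = residuals[len(residuals) // 2]

for cam_id, c in cameras.items():
    if c["points"] < 0.5 * median_points:
        print(f"Camera {cam_id}: far fewer points - occluded or moved since calibration?")
    if c["avg_res_mm"] > 2.0 * median_res:
        print(f"Camera {cam_id}: much higher residual - a new calibration may be needed.")
```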
If the tracking test is not acceptable, check the following elements:
l Check the calibration: A faulty calibration gives erroneous data to the
tracking function, see chapter "Calibration of the camera system" on
page 543.
l Check camera positioning: Incorrect positioning results in hidden markers
and problems with the calculation of markers; see chapter "Camera pos-
itioning" on page 439.

NOTE: After rearrangement of the cameras, a new calibration is needed.

l Check the tracking parameters, see chapters "3D Tracker parameters" on page 325 and "Advice on how to set 3D Tracker parameters" on page 614.
l Check the size of the reflective markers, see chapter "Marker size" on
page 529.
For example, if the markers are not visible in 2D views, check the focus,
choose a larger marker size, or change the exposure settings (e.g.
increase exposure time, or increase the aperture). If there are many
merged markers in the 2D views, choose a smaller marker size, or change
the exposure settings (e.g. decrease the exposure time, or decrease the
aperture).

NOTE: After changing the aperture on a camera with a manual lens, a new calibration is needed.

Tracking and accuracy


Incorrect settings of the tracking parameters may degrade the accuracy of the
system.

Setting the Prediction error and the Max residual too low or too high has a
negative impact on the accuracy. If the options are set too low, camera data
that should have been used for the calculation of a trajectory can be discarded,
which can result in extra noise and gaps. If they are set too high, the risk
increases that false data is used in the trajectory, which can result in
increased noise. However, the likelihood for this is reduced because of the
outlier filtering process in the 3D calculation, so it is better to set the
parameters somewhat higher than expected.
Setting the Prediction error too high can also increase trajectory flipping, i.e.
when trajectories are swapped or get mixed up in other ways.

2D tracking of data
With the 2D tracker you can track the 2D data of one camera and present the
markers as trajectories in the 3D view window. Because just one camera is
used to track the trajectories, they will all be in a plane in front of the camera.
This means that you can only measure the distances in two dimensions.
As the 2D data is tracked only in a plane, no calibration is used to track the data.
The only setting that is needed is the distance between the plane and the
camera. That setting and the other 2D tracking settings are set on the 2D tracking
page in the Project options dialog, see chapter "2D tracking" on page 330.
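
The reason the distance is needed can be illustrated with the standard pinhole camera relation: for movement in a plane parallel to the sensor, a pixel offset corresponds to a metric offset that grows in proportion to the distance of the plane. The Python sketch below uses this generic relation with made-up numbers; QTM's actual 2D tracking computation is not necessarily identical.

```python
def pixel_offset_to_mm(pixel_offset, distance_to_plane_mm, focal_length_px):
    """Generic pinhole relation: metric offset in a plane parallel to the sensor."""
    return pixel_offset * distance_to_plane_mm / focal_length_px

# Hypothetical numbers: the same 100-pixel offset corresponds to very different
# real-world distances depending on how far away the plane is, which is why the
# Distance to measurement plane setting has to be measured correctly.
print(pixel_offset_to_mm(100, 2000.0, 1800.0))   # plane at 2 m -> about 111 mm
print(pixel_offset_to_mm(100, 4000.0, 1800.0))   # plane at 4 m -> about 222 mm
```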

NOTE: The 2D tracker can only track one camera at a time and it can only
be used on a captured file and not in real-time.

2D tracking can be applied either after a measurement as a processing step or
by reprocessing a file, see chapter "Reprocessing a file" on page 601. To use 2D
tracking after a measurement you have to activate it on the Processing page in
the Project options dialog, and then check the settings on the 2D tracking
page.

Using 2D tracking
When capturing data which will be used by the 2D tracker it is important that
the movement only takes place in a plane and that the cameras are placed so
that the camera sensor is parallel with the plane of the movement. Otherwise
the distances measured by QTM will not be correct. After you have
placed the cameras follow these steps to use 2D tracking.

1. Before you start the capture you should activate 2D tracking with Track the
measurement and 2D on the Processing page in the Project options dia-
log. Then, set the settings on the 2D tracking page, see chapter "2D tracking"
on page 330.

a. Choose which camera to use. 2D tracking can only be applied to one camera at a time. So if you have more than one camera in the camera system, you must retrack the file to get the trajectories for another camera.

b. It is a good idea to use a filter so that you do not get many short trajectories.

c. Measure the distance between the measurement plane and the camera
sensor and enter the Distance to measurement plane setting.

d. Define the orientation of the coordinate system with the three settings
for the axes.

2. Start a capture with Start capture on the Capture menu. 2D tracking cannot
be applied in real-time so you have to make a measurement to see whether
your settings are correct.

3. When the 2D tracker is finished the result is trajectories that can be pro-
cessed and handled exactly as trajectories created by 3D tracking. The only
difference is that the trajectories will all be in the same plane.

Identification of trajectories
After the trajectories have been tracked, the next step is to identify or label
them. The process of identifying and modifying trajectories is generally
referred to as trajectory management.
QTM offers a range of tools to facilitate manual and automatic identification of
markers. The following chapters describe how to manually manage trajectories,
and how to use AIM for automatic identification of markers.

Manual identification of trajectories


Even if you plan on using an AIM model to identify the trajectories, you need to
identify at least one file by hand, since an identified file is needed to create an
AIM model. Manual identification can be performed with any of the following
three methods, and they can also be used together.
Quick identification method

Manual identification is best done with the quick identification method. Follow
these steps to use this method:

1. To use the quick identification method you need to have a label list in the
Labeled trajectories window. Load a label list or enter the labels manu-
ally with Add new label on the Trajectory info window menu.
2. Select the first label in the Labeled trajectories window that you want to
identify.

3. Hold down Ctrl + Alt or select the Quick identification cursor and
click on the marker or trace in the 3D view window that matches the
selected label.
l When you click on the marker or trace, it will be added to the selec-
ted label. If you have not selected any label, the marker you click on
will be added to the first empty label in the list. If there is no empty
label in the list, the marker you click on will be added as a new label
at the end of the list, and you can edit its name at once.

l You can also add additional parts to a trajectory with the quick iden-
tification method. Select the labeled trajectory in the list, then hold
down Ctrl + Alt and click on the unidentified marker.
However, if the two trajectories cannot be joined (because of too
much overlap) the selection will just move to the next label without
any action.
l By holding down Shift when clicking on the trajectory you can join a
trajectory to the previous label in the Labeled trajectories window.
This option can for example be used if the most recently identified
trajectory only covers a part of the measurement range. In this case,
you can add more parts by clicking on the next part while holding
down Shift.
4. QTM will move the selection automatically to the next label in the list.
Hold down Ctrl + Alt every time you click on the next corresponding
marker. Continue until all markers in the Labeled trajectories window
are identified.
5. You can start and restart the process on any label in the list.
l This means that if you click on the wrong marker, you can undo the
identification with Ctrl + Z and then make sure that the correct label
is selected and use quick identification again.
Drag and drop method

Manual identification can also be performed by dragging and dropping trajectories. The following drag and drop methods are available to identify trajectories:
l From the Unidentified trajectories window to the Labeled trajectories
window.
l From the 3D view window to the Labeled trajectories window.

l One trajectory (trace) in the 3D view window to another trajectory (trace) in the 3D view window.
A trajectory can be dropped either on a label to add the data to that label or in
the empty part of the Labeled trajectories window to create a new label. If it
is dropped on a label that already contains data, the two parts will be joined if
they do not overlap.

Identify method

It is also possible to use the Identify option on the Trajectory info window
menu, which appears whenever you right-click a trajectory, either in a 3D view
window or in a Trajectory info window. For information about functions in the
Trajectory info windows, see chapter "Trajectory info window menu" on
page 144.
The following features are useful when identifying trajectories.
l The keyboard shortcut C can be used to center on the selected marker in a
3D view window.

l The Trajectory info window menu option Center trajectory in 3D, will also
center on the selected trajectory or part. However, if the trajectory is not vis-
ible at current frame it will also move the current frame to the first frame of
the trajectory.

l The keyboard shortcuts J and Shift + J can be used to jump to the next
unidentified trajectory or part.

Tips and tricks for manual identification


The following chapter describes some different tips and tricks to help you in
the manual identification of trajectories. However, it is important to first use
the AIM functionality because it will significantly reduce the identification time,
see chapter "Automatic Identification of Markers (AIM)" on page 624.
Sometimes the trajectories have to be manually identified, for example when
identifying the file for the AIM model. Then you can use the following tips to
make the process faster and easier.
Visualization tips
l Use different colors on the trajectories so that it is easier to distinguish
between them in the 3D view.
l Use bones between markers so that the marker pattern is more visible.

l View the trace of the trajectories to check which trajectories match.

l How much trace you can show depends on the measurement, for
example in a straight movement you can show the trace for the
whole measurement, but if it is a long measurement with a rotation
movement the trace has to be shortened so that you can still see
the individual traces.
l Turn off the trace for some of the labeled trajectories if it is hard to
see the individual traces.
l Enable Show labeled trajectory information and Show trajectory
count in the 3D view settings for a quick overview of the quality of the
labeling at the current frame.
l Use the Trajectory Overview window to get a quick overview of the
status of the labeled trajectories and areas that require attention (e.g.,
gaps or spikes).
l Use the Trajectory Editor window for an interactive view of the 3D data
of a trajectory.

TIP: Use the Lock key to keep the focus on the trajectory you are
working on, and uncheck the horizontal auto zoom to keep the time
line at a fixed interval.

Identification methods
l Use Quick identification to identify markers in the 3D view.

l Drag and drop traces or markers from the 3D view to the Labeled tra-
jectories list or onto a trace in the 3D view.
l If it is a long measurement start with identifying just a part of the meas-
urement and then generate an AIM model and apply that to the whole
measurement.
Other tips for navigation and trajectory management
l Use scrubbing (Ctrl + drag) and trace range zoom (Shift + Mouse wheel)
features for browsing through the measurement.

l Hover the mouse over a marker or trace in the 3D view to get information
about the range that it covers to check where it can fit. When hovering
over a marker you get the information for the whole trajectory, while hov-
ering over a trace only gives the information for that part of the tra-
jectory.
l Use the Center on trajectory tool (keyboard shortcut C) to center the
view on the trajectory that you are interested in.
l Use the Cut trajectory trace tool (keyboard shortcut X) to cut a trace
and then use Alt-click and drag and drop to move the trace to another tra-
jectory.
l Use the swap parts method to fix swapped trajectories or parts, see
chapter "Swap parts" on page 151.
l Use the Jump to next trajectory/part tool (keyboard shortcut J and
Shift + J) to jump to and center on the next unidentified trajectory or part
in time.
l Use Alt-click in the 3D view window to select only a part if parts have not
been joined correctly. Then drag and drop it to where you want it.
l If there are unwanted reflections that have been tracked in the file,
expand the traces, select all traces in an area with Shift + drag and delete
them. This way you can delete several trajectories at once. If there are
many reflections in the same position, you may have to repeat the pro-
cess several times because only the visible traces are selected.

Automatic Identification of Markers (AIM)


Automatic identification of trajectories in the QTM software is performed by a
module called AIM (Automatic Identification of Markers). The AIM model is cre-
ated from identified files and can then be applied to any measurement that cap-
tures similar motions with the same marker set. This makes AIM very flexible
and powerful - any motion can be made into an AIM model.
How AIM identifies trajectories

AIM identifies trajectories from angles and distances between markers. In the
AIM model QTM has saved the angle and distance ranges of added measurement files. This is the reason why it works better when you add more meas-
urements since the AIM model will then include more movements. When the
model is applied AIM tries to find the solution that fits these ranges the best.
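
To make the idea of stored ranges concrete, the Python sketch below computes the minimum and maximum distance seen between every pair of labels in a made-up labeled recording. It is only a simplified illustration of the bookkeeping; the real AIM model also uses angles and a more sophisticated matching, and the data here is entirely hypothetical.

```python
import itertools
import numpy as np

# Hypothetical labeled data: label -> (n_frames, 3) array of positions in mm.
rng = np.random.default_rng(0)
labeled = {
    "LKnee":  rng.normal([0.0, 0.0, 500.0],  5.0, (100, 3)),
    "LAnkle": rng.normal([0.0, 0.0, 100.0],  5.0, (100, 3)),
    "LToe":   rng.normal([150.0, 0.0, 50.0], 5.0, (100, 3)),
}

# For every pair of labels, store the smallest and largest distance in the file.
ranges = {}
for a, b in itertools.combinations(labeled, 2):
    distances = np.linalg.norm(labeled[a] - labeled[b], axis=1)
    ranges[(a, b)] = (distances.min(), distances.max())

for pair, (lo, hi) in ranges.items():
    print(pair, f"{lo:.1f} - {hi:.1f} mm")
# Adding more varied measurements widens these ranges, which is one reason why
# a model trained on more movement generalizes better.
```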
There are a few limitations to AIM that are good to know:
l AIM does not work on only two markers. The main reason is that AIM works
in local coordinates, i.e. it does not take into account whether "marker x is to
the left of marker y". AIM only cares about "what location marker x has com-
pared to the locations of markers y and z". This means that there is no
left/right, over/below or in front of/behind in the model.

l AIM needs to have movement in the first file used to create the model. Other-
wise AIM will not identify the connections between markers in the AIM
model, since they are based on the shortest possible distances between
markers.

l AIM looks for the best solution in each frame and then joins the solutions.
This means that AIM will join and split trajectories depending on the AIM
model so that you get the best match. The measurement subject can leave
and enter the volume and AIM will continue to join the trajectories.

l Because AIM is working with relations between markers, all of the markers in
the model file must be included in the measurement to ensure that all of the
trajectories can be identified. E.g. if a marker on the shoulder disappears for
a long time it is difficult for AIM to find the arm.

Generating an AIM model

AIM needs a model file to work, which is generated from a measurement or multiple measurements with identified trajectories.

General instructions

1. Make a measurement and identify all of the trajectories that will be included in the model. For the best results, follow the instructions in chapter "Guidelines for data added to AIM models" on page 631. In particular, keep the following in mind:
l Make sure that the subject in the file is moving. The movement does
not have to be exactly the same as the rest of the captures, just that
it includes most of the movement. When you add more meas-
urements to an AIM model, it becomes less important that the
added files include a lot of movement.

NOTE: Even if you want to make an AIM model for a static measurement it is important to have some movement in the first file. This is to make sure that the internal bone definitions are correct. However, the subsequent files can be static because then the hierarchy is already defined.

l Make sure that the trajectories have the correct identity throughout
the file. You can select a smaller measurement range to delete the
incorrect parts if you do not want to identify the whole file.
l In the first file used to generate an AIM model, it is recommended to
not have any static markers together with the moving subject.
Because it will make it very hard for AIM to find a solution. Only use
an AIM model with static markers if the markers are always static.
2. Select the measurement range of frames that will be used in the model.
Use the scroll boxes on the Timeline control bar to select the range.
Choose a range where the motion is typical for what you are going to cap-
ture, most of the time it is best to use the whole measurement range
unless there are large gaps or incorrect data.

3. Click the Generate model icon on the AIM toolbar or click Generate
model on the AIM menu.
4. Choose one of the options Create new model, Create new model
based on marker connections from existing AIM model or Add to
existing model, see the below chapters for a detailed description. Click
Next to start the model generation.

5. Verify the AIM bones. When creating a new model, you can also edit the
bones. For the other two options, you can only inspect the bones. Click
Next to proceed.
6. Review the results. If anything goes wrong in the process it will be dis-
played in the Results list. For example, you will get a warning if the first
file is static or if the added file is too different, and in those cases you can
choose to abort the process.

Click Finish to exit the dialog.
l If you created a new AIM model, it is loaded to the Applied models
list on the AIM page in the Project options dialog. Any other model
in this list will be moved to Previously used models list.
l If the data is added to an existing model, then the model files are
updated and nothing is changed on the Applied models or Pre-
viously used models lists.
The model can now be applied to other measurements, see chapter "Applying
an AIM model " on page 634.

Create new model

Use this option to create a new AIM model from scratch. When creating a new
model, it is very important that the bones are connected in the correct way to
reflect the hierarchy of the model. You can use the Delete bones tool in the
toolbar to remove the bones that are wrong. For more information about the
AIM bones, see chapter "How to verify and edit AIM bones" on page 632.

When done, enter a File name for the model and click OK. The model is gen-
erated and saved in the AIM models folder of the project.
There can be two warnings when generating AIM models related to static mark-
ers.
l First, if all labeled markers are too static, there is a warning because a
static measurement can lead to erroneous internal bones in the AIM
model, for example two knees can be joined. This will then be used even if
the added files are with a movement. You should only use an AIM model
with static markers if the markers are always static.
l The other warning is if some of the labeled markers in the file are static
while others are moving. The problem is the same as above that you will
get internal bones in AIM that makes it harder to apply the model. If you
need static markers you can either create a separate AIM model for those
markers or have empty labels in the first AIM model and identify them
manually.

Create new model based on marker connections from existing AIM
model

Use this option to create a new AIM model based on the AIM bones (hierarchy)
and the visual properties (colors and visual bones) of an existing AIM model.
The data of the existing AIM model is ignored, so the new AIM model is based
on the movements of the current measurement only. The labels of the meas-
urement should correspond to the labels of the existing AIM model. This option
consists of the following steps:

1. Select an existing AIM model from the file dialog. If the labels of the selec-
ted model do not correspond to those of the measurement, AIM gen-
eration will fail.
2. Review the AIM model in the AIM model visualization window. Note that
the AIM bones cannot be edited.
3. Specify the file name of the new AIM model.

Add to existing model

This option is used to extend the motion of an existing model. It can be used to
generalize the model for application to a wider range of subjects or move-
ments.
Select one or more models from the list to which you want to add the move-
ment in the current file.
l The models in the list are the models that are available on the AIM page.
By default all the models in the Applied models list are selected. Click
Add model to browse for another model.
l The file must include all the labeled trajectories that are included in the
model. It can however have other labeled trajectories as well, only those
with the same name as in the model will be used to update the model.
This means that you can update several models at the same time. E.g. if
the file includes three persons with different names on their respective
labeled trajectories, then you can select all three AIM models in the list at
once and all three models will be updated.
l If you only add to one AIM model you will get a dialog that displays the
AIM bones of the AIM model that is added to. If the AIM bones are placed
incorrectly in the model it is best to make a new AIM model and start training that one instead.
l Backups of the previous AIM model(s) are automatically saved in the AIM
models\Backups folder in the project.
Alternatively, you can use the Generate from multiple files option from the
AIM menu to add multiple files to one or more AIM models at once. When
using this option, first select the model(s) to add the data to, and then select
the measurement files. The selected files must contain all the labels that are
present in the selected AIM models.

Guidelines for data added to AIM models

Follow these guidelines when you check the data before adding it to or creating
an AIM model.
l Make sure that the subject in the file is moving. The movement does not
have to be exactly the same as the rest of the captures, just that it
includes most of the movement. When you add more measurements to
an AIM model, it becomes less important that the added files include a lot
of movement. You can verify that the movement is large enough by look-
ing at the AIM bones, see section "How to verify and edit AIM bones" on
the next page.

NOTE: Even if you want to make an AIM model for a static meas-
urement it is important to have some movement in the first file. This
is to make sure that the internal bone definitions are correct.
However, the subsequent files can be static because then the defin-
ition is already correct.

l Make sure that the trajectories have the correct identity throughout the
file. You can select a smaller measurement range to delete the incorrect
parts if you do not want to identify the whole file.
l The colors of the trajectories and any bones between them are also saved
in the model. For example to make it easier to verify the identification
after AIM has been applied, the colors of the trajectories can be set with
Set different colors on the Trajectory info window menu before cre-
ating the AIM model.

l Trajectories that are left unidentified or discarded will not be included in
the model.
l If you have several subjects in the same measurement it is recommended
to make an AIM model for each subject, see chapter "AIM models for mul-
tiple subjects in the same measurement" on page 637.
The following two steps are not as important when you add files to the
AIM model.
l The AIM model will be better if all of the trajectories are visible through-
out the whole measurement. Therefore it is a good idea to gap-fill the tra-
jectories as long as the gaps are relatively small. However if there are
large gaps it is sometimes better to omit the gaps by limiting the meas-
urement range in step 2 below. This is especially important if you use
clusters, because then you have several trajectories that move along the
same path.
l The data of each trajectory should be as good as possible. This is espe-
cially important if your model includes trajectories that are close to each
other. Then a small erratic noise on a trajectory can make two trajectory
paths come very close to each other, which makes the identification dif-
ficult. Therefore if the data is erratic, i.e. the trajectory movement is not
smooth when you play the file, you should delete the erratic data by the
following process:
l Find the frame where the erratic data starts. Split the trajectory at
that frame, see chapter "Split part after current frame" on page 151.
l Then step through the frames to locate where the trajectory data is
OK again. Split the trajectory again.
l Delete the part with erratic data that you have just created.

l Repeat these steps for all of the frames where you can find erratic
data and then gap-fill the trajectories.

How to verify and edit AIM bones

When an AIM model is created there are AIM bones created automatically
between the markers that have the least movement between them throughout the
file. Because these AIM bones are then kept even if you add more meas-
urements it is important that they are correct. Therefore there is a step when
creating a new AIM model where you can verify and edit the AIM bones.

l The AIM bones are displayed as red bones between the markers.

l There is no relationship between the AIM bones and bones created by the
user.
l The number of AIM bones is kept as small as possible.

l You can rotate, translate and zoom the 3D image of the AIM bones.

l You can delete AIM bones with the Delete bone tool, see below.

l The frame used for displaying the AIM model is the first frame that
include all or as many as possible of the markers.
To verify that the AIM bones are correct follow these instructions:

1. Check that all of the markers are displayed. If there are missing markers it
means that there is no frame with all of the markers. It is recommended
to use another file where all of the markers are available in at least one
frame.

2. Check that the AIM bones are connected between the correct segments.
For example if you are creating a model of a person that has been stand-
ing still then the knees and feet may be connected with AIM bones, see
image below. Then use the Delete bone tool to remove the bones or
select the bone and press Delete. Every time a bone is deleted AIM will
look for a new solution, so you may have to delete several bones before
you have an acceptable solution. This is especially true if the meas-
urement is static, then it is often better to create a new file with more
movement.

3. The AIM bones may not look as good as your user created bones, but that
does not necessarily mean that it is wrong. For example, the AIM bone
between a head and the body can look strange if the marker nearest to
the head is on the shoulder.
Applying an AIM model

The AIM model can be applied to files with captured motions that are similar to
any part of the motion in the model. I.e. if your model includes a lot of different
motions made by a human, then the captured trajectories can be identified if
another human makes one or more of these motions. An AIM model can be
applied either as a processing step or manually.
Below follows a description of how to apply an AIM model manually on a cap-
ture file.

1. Open a capture file in QTM.

2. Make sure that all of the trajectories are in the Unidentified trajectories
window or the Identified trajectories window. Discarded trajectories are
not used by AIM. It is also important that all required parts of the meas-
urement is included in the selected measurement range, since the AIM
model is only applied to trajectories with parts within the selected meas-
urement range.
3. Click the Apply model icon on the AIM toolbar or click Apply model on
the AIM menu. The AIM application settings dialog is displayed.

NOTE: The AIM settings are copied from Project options. If you want to use the AIM settings and models saved in the file, you need to reprocess the file and select the measurement as the settings source, see chapter "Reprocessing a file" on page 601.

4. Check that the Applied models are correct. It is possible to apply several
AIM models to the same file, or to apply the same AIM model to multiple
actors. In the latter case, set Nr To Apply to the number of actors. For
more information, see chapter "AIM models for multiple subjects in the
same measurement" on page 637.

You can also change the AIM model application parameters, see chapter
"AIM model application parameters" on page 340. However, if you add
measurements to the AIM model you usually do not have to change the
parameters.
5. Click OK, and the AIM module will try to apply the model to the measurement.
Click Cancel in the dialog to abort AIM.

If any of the models cannot be applied to the trajectories, a dialog will appear showing how many of the bodies (models) were applied.

The AIM results dialog will display the result of all of the applied AIM
models. The Partial results are AIM models where not all of the markers
have been identified. For the Failed models none of the markers have
been identified.
The most likely reason for the Partial result is that the model doesn't com-
pletely match the captured motion. Then it is recommended to manually
identify the file and then add it to the existing AIM model, see chapter
"Generating an AIM model" on page 625.

NOTE: When you get a partial solution it is recommended to retrack the file before trying to apply a modified AIM model, since the cuttings made by AIM can make it harder to apply AIM the next time.

It can also help to reduce the selected measurement range so that the
AIM model is applied only on a smaller part of the measurement. For
example, if the subject walks in and out of the volume it can help to
reduce the selected measurement range to where the subject is inside the
volume.

6. Check the file to see whether all of the trajectories have been identified
correctly. If needed, manually correct any mistakes using the trajectory
management tools described in chapter "Identification of trajectories" on
page 620. In case you encounter too many mistakes, AIM may need to be
trained with one of the new measurements, see chapter "Add to existing
model" on page 630.

NOTE: In case you encounter too many swapped trajectories, it is often best to reprocess the file with 3D tracking enabled and create a new AIM model that corresponds better with the measurements.

7. When the AIM model has been successfully applied, the capture file must
be saved to keep the changes.
When you apply a model as a processing step, either directly after a capture or
in a batch process, it works exactly like when applying it manually. The model is
set on the AIM page in the Project options dialog.
AIM models for multiple subjects in the same measurement

You can apply AIM for automatic labeling of multiple subjects at the same time.
The following scenarios can be distinguished:
l Tracking of subjects with different marker configurations and labels. This
requires a unique AIM model for each subject, which should be present in
the Applied models list in the AIM dialog or settings page.
l Tracking of subjects with similar marker configurations (e.g. multiple act-
ors with same marker set). In this situation the same AIM model can be
applied multiple times by specifying Nr to Apply in the AIM dialog or set-
tings page.
l A combination of the above.

For more information about how to create and apply AIM models when cap-
turing multiple subjects, see the chapters below.

Unique subjects

For a unique identification of multiple subjects you must make a separate AIM model for each subject. Follow the process below to generate and apply
the AIM models. For more information on how to generate and apply AIM mod-
els see "Generating an AIM model" on page 625 and "Applying an AIM model "
on page 634. If you already have a generic AIM model, which you want to use to
create new AIM models for unique subjects, see chapter "Use generic AIM
model to create unique models" on the next page.

1. Make sure that each subject has a unique marker configuration. If you
have similar types of subjects (e.g. multiple actors), you can use a dif-
ferent marker pattern to distinguish the subjects, for example by placing
four markers on the chest in different patterns.
2. Make sure that the label names of the subjects are different so that
QTM can identify the labels when you add measurements to the
AIM models.
3. Create an AIM model for each subject. This can be done per subject as
described in "Generating an AIM model" on page 625. Alternatively, you
can generate the AIM models using a single measurement with multiple
subjects as follows.
a. Make a Range Of Motion (ROM) measurement with all subjects simultaneously.
b. Label the markers per subject as described in "Manual identification of trajectories" on page 620.
c. Select the trajectories of one subject in the 3D view or the trajectory list, right-click on the selection and click Generate AIM model from selected trajectories....
d. Generate the AIM model as described in chapter "Generating an AIM model" on page 625.
l Repeat steps b. to d. for every subject.

4. To apply multiple AIM models to a measurement you must add them to the Applied models list on the AIM page in the Project options dialog.

NOTE: When you create a new AIM model all of the other models in
the list are moved to the Previously used models list.

5. You can add data to multiple AIM models from a single measurement by
selecting the applicable AIM models under the Add to existing model(s)
option. The data will then be added to the AIM model(s) with matching
label names.

Similar subjects

When tracking similar subjects with the same marker configuration, you can
use one generic AIM model and apply it multiple times. This is done as follows:

1. Add the AIM model to the Applied models list on the AIM page in the Pro-
ject options dialog.
2. Set Nr To Apply to the number of subjects to be tracked simultaneously.

3. Optionally, enable Use random trajectory color for each AIM file. This
will help to distinguish the subjects from one another.
4. Start tracking.

NOTE: It is not possible to uniquely identify subjects when applying a generic AIM model multiple times. The order of the subjects may be different for every measurement.

Use generic AIM model to create unique models

If you are measuring multiple subjects with the same marker configuration you
can create specific AIM models for each subject based on a generic AIM model.
The advantage is that this allows for identification of individual subjects.

1. First, follow the steps in "Similar subjects" above:


a. Add the AIM model to the Applied models list on the AIM page in
the Project options dialog.

b. Set Nr To Apply to the number of subjects to be tracked sim-
ultaneously.
c. Optionally, enable Use random trajectory color for each AIM file.
This will help to distinguish the subjects from one another.
d. Make a measurement with multiple subjects.

2. Select the trajectories of one subject in the 3D view or the trajectory list.

3. Optionally, a prefix can be added to the selected labels by right-clicking on the selection and clicking Add prefix to selection.... This can be helpful to distinguish subjects from each other, especially when using the same color scheme.
4. Right-click on the selection and click Generate AIM model from selected
trajectories....
5. Generate the AIM model as described in chapter "Generating an AIM
model" on page 625.
6. Repeat steps 2. to 4. for each subject.

7. Start tracking.

Editing of trajectories
The Trajectory Editor can be used to view and edit trajectory data. Manual
editing of trajectory data should be the last step in the processing of trajectories.
Before editing trajectory data, it is important to make sure that the quality of
the tracking is optimal, and that the identification of trajectories is correct and
as complete as possible. Remaining irregularities of trajectories can be edited
with the Trajectory Editor.
The main functions of the Trajectory Editor are:
l Locating and filling gaps,

l Detecting and smoothing artifacts.

You can open the Trajectory Editor window in the following ways:

l By clicking Trajectory Editor in the View menu,

l By clicking the Trajectory Editor button on the Trajectory toolbar,

l Via the keyboard shortcut Ctrl + T.

The functions and layout of the Trajectory Editor are described in chapter "Tra-
jectory Editor window" on page 159.
The following chapters describe in more detail how to use the Trajectory
Editor.

Gaps
Gaps are to be understood as missing parts within a trajectory. Two common
types of gaps are:

1. Gaps due to incomplete identification.

2. Gaps due to missing 3D data, for example in case of occlusions.

The first type of gaps can generally be kept to a minimum by improving the AIM
model. In some cases, manual identification may be needed to add unidentified
parts to the trajectory. Remaining gaps of the second type can be detected and
filled in the Trajectory Editor.
Identification and selection of gaps

Gaps are indicated in the plot area by a brown area and an amber indicator
below the time axis at the frames where data is missing. The gaps are also lis-
ted in the Gaps panel in the Points of Interest sidebar.

You can select a gap in the graph by clicking on the gap area. The selection
range is then set from the start to the end of the gap, and the gap area will be
highlighted.
Alternatively, you can select gaps in the Gaps panel in the Points of Interest
sidebar. By holding the Shift key you can select multiple gaps. The selection
range is then set from the start of the first gap to the end of the last gap. When
double clicking on a gap in the Gaps panel, the current frame will be moved to
the start of the selection.
Filling of gaps

To fill gaps follow these steps:

1. Select the trajectory you want to edit. You can select the trajectory in one
of the Trajectory info windows, or by clicking on the marker in the 3D
view window.
2. Select a gap or a frame range that includes the gaps you want to fill.

3. Press the Fill button.

The gaps included in the selected range will then be filled using the Type spe-
cified in the Fill settings. The filled parts are indicated in the plot area by a
dark blue area and a blue line below the time axis at the gap-filled frames, and
the data series is shown as a dashed line.
The available fill types can be divided into two categories. Linear, Polynomial,
Relational and Kinematic are gap fill types that interpolate the data between
the respective edges of the trajectory. Polynomial gap fill can only be applied
when there is trajectory data on both sides of the gap. Linear and relational
gap fill can also be used for extrapolation for gaps at the start or end of the cap-
ture. The filled parts of the trajectory are indicated as type Gap-filled in the Tra-
jectory info window. Static and Virtual are virtual fill types that are
independent of the data at the edges of the gap. These types can also be



applied when there is no surrounding data, for example to empty trajectories
or to gaps at the start or the end of the capture. The filled parts of the tra-
jectory are indicated as type Virtual in the Trajectory info window.
Static
Gaps are filled with fixed values for X, Y and Z as specified.

Linear
Gaps are filled by a linear interpolation between the respective edges of
the trajectory. If the gap is at the beginning or the end of a capture, the
gap will be filled with a constant value (first or last data value after or
before the gap, respectively). Options:
Max Length: Check to apply a maximum length for linear gap filling.

Frames: Specify maximum number of frames for linear gap filling.

Polynomial
Gaps are filled by a cubic polynomial interpolation between the respective
edges of the trajectory. Polynomial gap fill uses the data from two frames
before and after the gap for the calculation. If the gap starts or ends with
a trajectory part which consists of one frame, then the polynomial gap fill
will use the next available frame to calculate the polynomial. If there is no
other trajectory part then polynomial gap fill is calculated using just that
one frame. Options:
Max Length: Check to apply a maximum length for polynomial gap
filling.
Frames: Specify maximum number of frames for polynomial gap filling.

Relational
Gaps are filled based on the movement of surrounding markers. The user
specifies one, two or three context markers, which are used to define a
local coordinate system (LCS). The filled trajectory consists of a linear
interpolation in the LCS, which is then transformed to the global coordin-
ate system. If the gap is at the beginning or end of a capture, the filled
part will be extrapolated. The options can be selected from a drop down
list. Alternatively, you can drag and drop trajectories on the field if the tra-
jectory is locked. The following options are available:



Origin (required): Marker defining the origin of the LCS, preferably a
marker close to the target marker. If only Origin is defined, the filled
trajectory will be entirely translational.

X Axis: Marker defining the primary axis of the LCS. Preferably, the
movement of the primary axis should be strongly correlated with
that of the target marker. If the target marker is on the same line as
the Origin and the X Axis markers, it should be sufficient to specify
only these two context markers.

XY Plane: Marker defining the secondary axis of the LCS, fixating its
full orientation.

Rigid body: Check this option in case the three context markers are
part of a rigid structure. If checked, the pose of the LCS will be cal-
culated by means of a rigid body fit. The definition of the rigid body
will be based on the average of the relative configuration of the con-
text markers across the gap. This option can only be checked if all
three context markers are specified.

Virtual
Gaps are filled based on the movement of surrounding markers. Similar
to Relational gap fill, except that the filled part is independent of sur-
rounding trajectory data. The user specifies one, two or three context
markers, which are used to define a local coordinate system (LCS). The
filled trajectory represents the movement of the origin of the LCS, with an
optional offset specified by the user. The options can be selected from a
drop down list. Alternatively, you can drag and drop trajectories on the
field if the trajectory is locked. The following options are available:
Origin (required): Marker defining the origin of the LCS, preferably a
marker close to the target marker. If only Origin is defined, the filled
trajectory will be entirely translational.

X Axis: Marker defining the primary axis of the LCS. Preferably, the
movement of the primary axis should be strongly correlated with
that of the target marker. If the target marker is on the same line as
the Origin and the X Axis markers, it should be sufficient to specify
only these two context markers.



XY Plane: Marker defining the secondary axis of the LCS, fixating its
full orientation.

Rigid Body: Check this option in case the three context markers are
part of a rigid structure. If checked, the pose of the LCS will be cal-
culated by means of a rigid body fit. The definition of the rigid body
will be based on the average of the relative configuration of the con-
text markers across the gap. This option can only be checked if all
three context markers are specified.

Offset: Apply an offset to the virtual point, relative to the origin of the LCS.
X, Y, Z: Offset values of X, Y and Z in mm or percent.
Relative Offset (%): Check to specify offset in percent relative to the distance between the Origin and X Axis markers.

Kinematic
Kinematic gap fill of markers associated with skeleton segments or rigid
bodies based on current skeleton or 6DOF data.

NOTE: In some cases there may be a spike in the trajectory just before or
after a gap. Deleting such artifacts before gap filling can improve the qual-
ity of the filled part, in particular for the polynomial method.
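
To illustrate the interpolation carried out by the Linear fill type and the local coordinate system used by the Relational and Virtual types, here is a minimal Python sketch. It is an illustration only, not QTM's implementation; the function names, the (N, 3) array layout with NaN rows marking gaps, and the use of NumPy are assumptions made for the example.

import numpy as np

def fill_gap_linear(xyz, start, end):
    # Fill frames start..end (inclusive) of an (N, 3) trajectory in place.
    # Interpolates linearly between the frames just before and after the gap;
    # uses a constant value when the gap touches the start or end of the
    # capture, as described for the Linear fill type above.
    n = len(xyz)
    before = xyz[start - 1] if start > 0 else None
    after = xyz[end + 1] if end < n - 1 else None
    if before is None and after is None:
        return                       # empty trajectory, nothing to fill from
    if before is None:
        xyz[start:end + 1] = after   # gap at the beginning: constant fill
    elif after is None:
        xyz[start:end + 1] = before  # gap at the end: constant fill
    else:
        t = np.linspace(0.0, 1.0, end - start + 3)[1:-1, None]
        xyz[start:end + 1] = (1 - t) * before + t * after

def local_coordinate_system(origin, x_marker, xy_marker):
    # Build a 3x3 rotation matrix whose columns are the LCS axes expressed
    # in global coordinates, from the three context marker positions.
    x = x_marker - origin
    x = x / np.linalg.norm(x)                # primary axis
    z = np.cross(x, xy_marker - origin)
    z = z / np.linalg.norm(z)                # normal of the XY plane
    y = np.cross(z, x)                       # completes a right-handed system
    return np.column_stack((x, y, z))

# A relational fill expresses the target marker in the LCS on both sides of
# the gap, interpolates these local coordinates linearly, and transforms the
# result back to global coordinates: global = origin + R @ local.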

Spikes
Spikes are to be understood as discontinuities between consecutive frames
within a trajectory. The Trajectory Editor can be used as a tool to detect
spikes. Two common types of spikes are due to:

1. Tracking artifacts, for example due to occlusions or partial occlusions of markers in one or more cameras.
2. Labeling artifacts, for example due to swapping of adjacent markers.

Tracking artifacts can generally be kept to a minimum by optimizing camera setup, camera settings for the best possible visibility and distinction of mark-
ers, as well as by optimizing the tracking parameters. Labeling artifacts can be
avoided by using an appropriately trained AIM model (see chapter "Automatic



Identification of Markers (AIM)" on page 624), or repaired by swapping parts
(see chapter "Swap parts" on page 151). As a final step in the processing of
data, the resulting spikes can be edited in the Trajectory Editor.
Detection and selection of spikes

The detection of spikes is based on an acceleration threshold, which can be adjusted in the Settings Sidebar of the Trajectory Editor window.
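
As an illustration of acceleration-based spike detection, the sketch below flags frames where the acceleration magnitude exceeds a threshold. This is only an outline of the general technique, not the Trajectory Editor's exact algorithm; the function name, the (N, 3) array layout and the threshold unit are assumptions for the example.

import numpy as np

def detect_spikes(xyz, frame_rate, accel_threshold):
    # Return frame indices of an (N, 3) trajectory where the acceleration
    # magnitude (second difference of position) exceeds accel_threshold.
    dt = 1.0 / frame_rate
    accel = np.diff(xyz, n=2, axis=0) / dt**2
    magnitude = np.linalg.norm(accel, axis=1)
    return np.where(magnitude > accel_threshold)[0] + 1  # center frame index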
Spikes are indicated in the plot area by a red indicator below the time axis. The
spikes are also listed in the Spikes panel in the Points of Interest sidebar.
You can select a spike by clicking on the corresponding indicator below the
time axis. The selection range is then set from four frames before the start to
four frames after the end of the spike.
Alternatively, you can select spikes in the Spikes panel in the Points of
Interest sidebar. By holding the Shift key you can select multiple spikes. The
selection range is then set from four frames before the start of the first spike to
four frames after the end of the last spike. When double clicking on a spike in
the Spikes panel, the current frame will be moved to the start of the selection.
Smoothing

To smooth spikes follow these steps:

1. Select the trajectory you want to edit. You can select the trajectory in one
of the Trajectory info windows, or by clicking on the marker in the 3D
view window.
2. Select a spike or a frame range that includes the spikes you want to
smooth.
3. Press the Smooth button.



The spikes included in the selected range will then be smoothed using the Type
specified in the Smooth settings. The edited parts of the trajectory are indic-
ated as type Edited in the Trajectory info window.

NOTE: When applying smoothing to a selection, the whole selected range will be smoothed, not only the detected spikes.

The following smoothing types are available:


Moving Average
Smoothing by means of an unweighted moving average across a centered
window. The Window Size can be set by the user. The Moving Average
type is most suitable for smoothing of local spikes.

Butterworth
Smoothing by means of a fourth order Butterworth low-pass filter. The
Butterworth type is most suitable for reduction of high-frequency noise
across large frame ranges. The available options are:
Cutoff: Cutoff frequency specifying the pass band and the stop band
of the filter. As a rule of thumb, the cutoff frequency should be a
factor 2-3 higher than the highest frequency of interest.

Alternatively, spikes can be removed by means of deleting and gap filling. To
delete a spike, you can select a frame range and delete it by pressing the
Delete button, or by pressing the Delete key on the keyboard.



NOTE: Notes on the Butterworth filter:
l The minimum frame range for the Butterworth filter is 7 frames.
This minimum also applies to the size of parts in case the selected
frame range includes gaps. When the selected frame range includes
parts that are smaller than 7 frames, the Butterworth filter cannot
be applied.
l The Butterworth filter is applied in forward and reverse direction to
compensate for the latency of the filter. The order and cut-off fre-
quency are adjusted so that the effect of the filter is in accordance
with the specified order and cutoff frequency when applying the fil-
ter in one direction.
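
For illustration, the two smoothing types can be sketched as follows using NumPy and SciPy. These are generic examples of the techniques, not QTM's implementation; in particular, the order and cutoff compensation mentioned in the note above is not reproduced, and the moving average simply zero-pads at the edges.

import numpy as np
from scipy.signal import butter, filtfilt

def moving_average(xyz, window_size):
    # Centered, unweighted moving average of an (N, 3) trajectory.
    # window_size should be odd; edges are zero-padded for simplicity.
    kernel = np.ones(window_size) / window_size
    return np.column_stack(
        [np.convolve(xyz[:, i], kernel, mode="same") for i in range(3)])

def butterworth_smooth(xyz, frame_rate, cutoff_hz, order=4):
    # Zero-phase Butterworth low-pass filter applied forward and backward
    # (filtfilt), which removes the phase lag of a single-pass filter.
    nyquist = 0.5 * frame_rate
    b, a = butter(order, cutoff_hz / nyquist, btype="low")
    return filtfilt(b, a, xyz, axis=0)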

Adding virtual trajectories


The following chapters describe how to create virtual trajectories based on
other trajectories or at a fixed position. For instructions on how to create vir-
tual trajectories using a rigid body, see chapter "Virtual markers calculated
from 6DOF data" on page 662.
Create virtual trajectories via the Trajectory info window menu

A simple way to create a virtual trajectory is to select multiple trajectories and create a virtual trajectory corresponding to their geometric average. Follow
these steps:

1. Select one or more trajectories and right click on the selection in the Tra-
jectory info window or in the 3D View window.
2. In the context menu, click Add new trajectory > Virtual (Average of
selected trajectories). This will create a new trajectory at the geometric
average of the selected trajectories.
3. Name the new trajectory.
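
Numerically, the resulting virtual trajectory is simply the per-frame mean of the selected trajectories, as in this small sketch (an illustration only; the (markers, frames, 3) array layout and function name are assumptions):

import numpy as np

def average_trajectory(trajectories):
    # trajectories: array of shape (num_markers, num_frames, 3).
    # nanmean ignores markers that have a gap (NaN) in a given frame.
    return np.nanmean(trajectories, axis=0)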

Create virtual trajectories using the Trajectory Editor

The Trajectory Editor can be used to create virtual trajectories, giving more
control to the user to define their location. Follow these steps to create a vir-
tual trajectory with the Trajectory Editor:



1. Create a new empty trajectory via Trajectory info window menu or by
pressing the Insert (Ins) key and name it.
2. Select the new trajectory.

3. Open the Trajectory Editor window, if it is not already visible.

4. In the Settings sidebar of the Trajectory Editor window, choose fill type
Constant or Virtual.
5. Specify the fill options, depending on the chosen type:
l For type Constant, specify X, Y and Z coordinates. This will result in
a static virtual trajectory at position X, Y and Z in the global coordin-
ate system.
l For type Virtual select the trajectories on which the virtual tra-
jectory should be based. The options can be selected from a drop
down list. Alternatively, you can drag and drop trajectories on the
field if the trajectory is locked. By default, the virtual trajectory will
correspond to the trajectory selected as Origin. By specifying the X,
Y, and Z values under Offset, you can move the virtual trajectory to a
different position, relative to the local coordinate system as spe-
cified by the fill options. See chapter "Filling of gaps" on page 642 for
more information about the options.

6DOF tracking of rigid bodies


6DOF tracking identifies rigid bodies from the 3D data of a measurement and
calculates their translation and rotation. It is activated with Calculate 6DOF on
the Processing page in the Project options dialog and is controlled by the
6DOF tracking parameters, see chapter "6DOF tracking" on page 345.

6DOF versus 3D
A single point in the mathematical sense can be fully described by its three
coordinates (X, Y, Z) in a Cartesian coordinate system (3D). This can also be
described as having three degrees of freedom. A physical body, such as a ship
or an aircraft, requires three additional degrees of freedom to fully characterize its pose, namely three rotation angles. The characterization of an
object by three positional coordinates and three rotational angles is therefore
called ”six degrees of freedom” (6DOF).
To measure the 6DOF data of a physical body QTM uses 6DOF bodies (rigid bod-
ies). A 6DOF body consists of markers placed on an object at fixed positions.
The physical design of a 6DOF body is described in chapter "How to design a
6DOF body" below.
For more information about how to create rigid bodies and how to process
6DOF data, refer to the following chapters.

Creating 6DOF bodies


The creation of a rigid body for measurement of 6DOF data encompasses the
following steps:
l Physical design of the rigid body and marker placement

l Creating the rigid body definition, including


l Definition of the local coordinate system

l Choice of the reference system for calculation of 6DOF data

How to design a 6DOF body

A rigid body is a configuration of points at fixed relative positions. For the 6DOF
tracking function to work the rigid body must be defined by at least three
points, which should not be on the same line. When designing a 6DOF body,
consider the following aspects.

Physical design of the rigid body

When choosing or designing a rigid body, it is important to consider the type of markers that you want to use, as well as their placement. The rigid bodies
should allow for optimal placement of the markers. This often depends both on
the shape of the rigid body, as well as the positioning of the cameras. Gen-
erally, there should be clear lines of sight from the cameras to the markers for
optimal tracking.



In case you need to have an exact definition of the local coordinate system of
the rigid body, you can design the body in a way that the markers can be placed
at known positions.
For more detailed information about the choice of markers and their con-
figuration, see the chapters below.

Choice of markers

When designing a rigid body, take the following into consideration when choosing
markers:

Marker size
The markers should be large enough for tracking, see chapter "Marker
size" on page 529. Furthermore, the markers should be small enough so
that there is sufficient separation between the markers to minimize the
occurrence of merging of the markers in the camera views.

Active vs passive markers


In most situations, passive markers are recommended. Active markers
can be considered in the following situations:
l In large volume applications, where the distance between markers
and cameras is too large for tracking passive markers, for example
marine applications. In this case, it is recommended to use long-
range active markers (LRAM).
l In case you need to track many rigid bodies simultaneously, you can
consider the use of Active or Naked Traqrs. The active markers use
sequential coding for reliable rigid body identification of multiple
rigid bodies. Due to the automatic marker identification there is no
need to use an asymmetric or unique marker configuration. Active
markers can also be useful for robust real-time tracking of drones.

TIP: For Crazyflie drones, dedicated active marker decks for Qualisys can be purchased via Bitcraze.

Marker configuration

The following aspects are important for the tracking of rigid bodies.



Marker placement
The rigid body must consist of at least 3 markers, which should not be in a
straight line. In many cases, it can be an advantage to use 4 or more mark-
ers for more redundancy, for example in case markers may get occluded
during the measurement.

The markers should be placed in a way that they are clearly visible and
well separated in the camera views for optimal tracking.

Marker distribution
Generally, the accuracy of the 6DOF data increases with the distance
between the markers. Try to place the markers as far apart as possible in
different directions, spanning up a plane or a volume.

Asymmetry
Markers should be applied in an asymmetric configuration, so that the ori-
entation of the rigid body can be uniquely determined. When the markers
are placed symmetrically, the measured orientation of the rigid body may
flip.

NOTE: This limitation does not apply when using active markers
with marker ID.

Uniqueness
When tracking multiple rigid bodies simultaneously, they should have
unique configurations so that they can be identified. If the configurations
are the same, it is not possible to distinguish between the rigid bodies.

NOTE: This limitation does not apply when using active markers
with marker ID.

Using extra markers for the rigid body definition

The marker placement can also be important for the definition of the rigid bod-
ies. For example, markers can be placed in a way that they can be associated
with important positions, axes or planes for the definition of a model's geometry.
In some cases, these landmark positions may not be suitable for tracking the
rigid body during a measurement. In that case, you can temporarily place extra
markers at these positions and use them for the definition of the rigid body.
After completing the definition, these extra markers can be removed, while
keeping the markers used for tracking.
Here are some guidelines for how to use extra markers for creating a rigid body
definition:

1. Apply the markers for tracking and the extra markers for the definition of
the rigid body.
2. Place the model in a way that all markers are visible.
l If it is not possible to see all markers in a single measurement, you
can also make several recordings with the model in different ori-
entations.
3. Make an initial rigid body definition, see chapter "Definition of 6DOF bod-
ies" below.
l In case you have multiple recordings, use the Add to rigid body (6
DOF) function from the Trajectory info window menu to add selec-
ted markers to the rigid body definition.
l If needed, you may even add virtual markers to the definition, for
example a point in between two markers.
4. Use the Translate body and Rotate body methods to change the local
coordinate system of the rigid body according to your specifications.
5. You can now remove the extra markers from the physical rigid body.
l You must either remove the corresponding points from the rigid
body definition, or check their Virtual option, so that QTM will not
try to track the point as a marker. When using the virtual option, the
point will be added as a virtual trajectory when measuring the rigid
body in QTM, which can for example be useful for visualization pur-
poses.
Definition of 6DOF bodies

When the 6DOF body has been designed according to the previous chapter you
must add the rigid body definition to the 6DOF Tracking page in QTM. This can
be done using one of the following methods:



I. Use the Define rigid body (6DOF) method from the Trajectory info win-
dow menu,
II. Use the Acquire Body method from the 6DOF Tracking page,

III. Use the Load Bodies option from the 6DOF Tracking page to load stored
rigid body definitions from a file,
IV. Use the Add Body and Add Point options from the 6DOF Tracking page
to manually add a rigid body and its points.
The first two methods can be used to create a new rigid body definition based
on a measurement. The latter two require that the points in the rigid body
definition are known, either from prior measurement or by design.

NOTE: Markers can be included in multiple rigid body definitions. This


can for example be used for having several representations (local coordin-
ate systems or reference systems) for the same rigid body. The marker
should have the same label in the respective rigid body definitions.

Creating new rigid body definitions from a measurement

New rigid body definitions are most commonly created from a measurement.
This is the easiest way to define the exact positions of the points for the best
tracking results.

Preparations

Before creating the rigid body definition, the following should be taken into
account:

The marker labels


If you want to be able to identify the markers, you can label them in
advance. The easiest way is to record a file and label the markers before
creating the rigid body definition using method I. When selecting labeled
trajectories, the labels will be used for the points of the rigid body. When
selecting unlabeled trajectories, the points of the rigid body will be named
as the rigid body followed by a number (e.g., "My rigid body - 1").



NOTE: You can always change the labels of the points afterward by
editing the rigid body definition.

The alignment of the rigid body


The alignment of the rigid body relative to the global coordinate axis will
be used to set the orientation of the local coordinate system when defin-
ing the rigid body. This can be an easy way to set the initial orientation.
When defining the rigid body from a file using method I, the first frame of
the file or the selected range will be used to set the orientation.

Method I: Define rigid body (6DOF)

The Define rigid body (6DOF) method can be used to create a rigid body defin-
ition from selected trajectories in a measurement. This can be done during pre-
view or from a file. An advantage when creating the rigid body from a file is that
the rigid body points can be based on an average of multiple frames. The
object is allowed to move during the recording. This way the rigid body points
can be obtained from an average across many poses, making the definition less
dependent on local measurement errors.
Follow these steps to create a new rigid body definition:

1. Select the trajectories associated with the rigid body in the 3D view win-
dow or the Trajectory info window.
2. Right-click on the selection and choose Define rigid body (6DOF) from
the context menu. Alternatively, use the keyboard shortcuts F8 or Shift +
F8. There are two options:
l Current Frame (Shift + F8): Create rigid body definition based on
marker configuration of the current frame.
l Average of frames (F8): Create rigid body definition based on the
average configuration of the markers in the current capture. The
advantage of this method is that the statistics of the marker pos-
itions are taken into account. The Bone tolerance setting of the
body will be based on these statistics. This option is not available
when in Preview mode.
3. Specify the name of the rigid body.



The new rigid body definition is added to the Rigid bodies lists in the Project
options and to the current file. The origin of the local coordinate system of the
rigid body is set to the geometric center of the points included in the rigid body
definition. The orientation is set to that of the global coordinate system at the
current frame (when using Current Frame) or at the start frame of the file or
the selected time range (when using Average of frames). You can modify the
local coordinate system by translation and rotation (see chapters "Translate body" on
page 350 and "Rotate body" on page 352) or change the reference coordinate
system (see chapter "Coordinate system for rigid body data" on page 354).

NOTE: When defining a new body in this way, the 6DOF data is re-cal-
culated in the file, which means that the 6DOF data of all other bodies in
the file will be updated as well.

Method II: Define 6DOF bodies using Acquire body method

The Acquire body method can be used to create a rigid body from markers that
are detected during preview. Rigid bodies created using this method are added
to the rigid body list in the Project Options.
Follow these steps to create a rigid body using the Acquire body method:

1. Start QTM and start a preview by clicking the New file icon.

2. Place the rigid body in the measurement volume so that the rotation of
the desired local coordinate system is known in reference to the global
coordinate system. One way is to place the body so that the desired local
coordinate system is aligned with the global coordinate system and then
the local origin can just be translated to the desired position.
3. Check that the markers on the 6DOF body do not merge in any of the cam-
eras' 2D views.
4. Open the Project options dialog in QTM and go to the 6DOF Tracking
page.
5. Click Acquire body to open the Acquire body dialog.

6. Click Acquire.



When the collection is done the new 6DOF body definition will be added to the
rigid body list on the 6DOF Tracking page. The origin of the local coordinate
system of the rigid body is set to the geometric center of the points included in
the rigid body definition. The orientation is set to that of the global coordinate
system during the acquisition. You can modify the local coordinate system by trans-
lation and rotation (see chapters "Translate body" on page 350 and "Rotate
body" on page 352) or change the reference coordinate system (see chapter
"Coordinate system for rigid body data" on page 354).

Adding an existing rigid body definition

Method III: Load from file

Rigid body definitions can be loaded from an XML file using the Load bodies
button in the 6DOF Tracking page. This action will replace any rigid bodies
present in the list with those in the file. All rigid bodies that are loaded from a
file are by default enabled.
The XML file can be created by saving rigid bodies using the Save bodies but-
ton in the 6DOF Tracking page. The XML file can also be edited, for example,
you can add, delete or modify rigid bodies in the file before loading it.

Method IV: Manually create a rigid body definition

You can manually add a rigid body definition to the rigid body list using the Add
body button in the 6DOF Tracking page. You can then add points to the rigid
body definition using the Add point button. This method can be used if you
use a rigid body with known marker positions, for example one created using a
3D printer.

Editing rigid bodies

Rigid bodies and points can be edited by double clicking on a property or by using the Edit
buttons (Edit color, Edit point, Edit label). For information about the prop-
erties, see chapter "Rigid bodies" on page 346.



Definition of local coordinate system

When you have defined the points of the 6DOF body in QTM you can change
the definitions of the local coordinate system. The local coordinate system is by
default placed in the geometric center of the points.
The local coordinate system is used in the calculation of rotation and position
of the measured rigid body in reference to a reference coordinate system.
Therefore it is important that the local coordinate system is defined according
to the specifications of the measurement. The local coordinate system should
have a well-defined orientation and location in reference to the points in the
6DOF body definition. Use a definition where the normal orientation
of the body is the same as no rotation, i.e. aligned with the reference coordinate system.
When you have decided where the local coordinate system should be, use the
Translate and Rotate functions on the 6DOF Tracking page to specify the
local coordinate system, see chapter "Rigid bodies" on page 346.
Then you should also decide which coordinate system the 6DOF body data
should refer to. This is done in the Coordinate system for rigid body data dia-
log, which is opened by double-clicking on Global origin on the 6DOF bodies
page. See chapter "Coordinate system for rigid body data" on page 354 for the
alternatives.

NOTE: If you want to change these settings in a capture file you must
reprocess the file with the new setting.



Creating an active Traqr rigid body

The active Traqr is designed to be used as a rigid body. Each Traqr has the
same marker setup and is instead differentiated by the active IDs. The IDs of
the markers of the active Traqrs can be managed with the Traqr Configuration
Tool, for more information refer to its manual.
Follow these steps to create your active Traqr rigid body in QTM:

1. Place one active Traqr in the volume with the rotation that you want com-
pared to the global coordinate system.
2. Start preview with New on the File menu.

3. Open the 6DOF Tracking page in Project Options.

4. Click on Acquire body to create a new rigid body. Repeat this step for
each Traqr.

NOTE: If you know the IDs for each Traqr then you can acquire the
same Traqr and edit the IDs in the Rigid bodies list.

Tracking 6DOF bodies


Rigid bodies can be tracked in real time and in a capture. To track rigid bodies,
3D tracking and Calculate 6DOF must be enabled on the Processing page in
the Project options dialog. You can also use reprocessing to edit the 6DOF
data, see chapter "Calculating 6DOF data" on the next page.
The rigid body tracking behavior is defined by the settings in the 6DOF Track-
ing page in the Project options. Rigid bodies can be enabled or disabled for
tracking. The main parameters for controlling the tracking are Bone tolerance
and Max. residual. These parameters specify the tolerance for deviations of
measured trajectories from the rigid body definition. Optionally, 6DOF data can
be smoothed by choosing one of the available Filter presets, see chapter
"Smoothing 6DOF data" on page 356. The filtering is applied both in real time
and in a capture.



QTM will identify the trajectories of the 6DOF bodies and place them in the
Labeled trajectories window. For each 6DOF body the trajectories are named
according to the definition of the 6DOF body, see image below. The color of the
trajectories will be slightly brighter than the 6DOF body color.
If the bodies cannot be tracked in a file, you will get a warning stating how many
bodies have failed. There is also a warning if a label is repeated in the
Labeled trajectories list, in which case only the first label is used by the 6DOF
calculation.

NOTE: If several bodies share the same marker definition and name, the
trajectories will only be shown once in the Labeled trajectories window.

Calculating 6DOF data


When 6DOF tracking is activated, QTM attempts to identify the measured rigid
bodies from the 3D data. It is done by comparing the distances between points
in the 6DOF body definitions to all distances between the computed markers’
locations. The following rules are used for calculation of 6DOF data.
l For a marker to be included in the 6DOF body none of the distances to the
other markers in the body may exceed the Bone tolerance.
l When the trajectories have been identified the tracker computes the loc-
ation and orientation of each measured rigid body in the coordinate sys-
tem of the motion capture. The 6DOF tracker will automatically move
trajectories that are outside the Bone tolerance to the unidentified
labels. This is done in RT as well as when tracking a file (a sketch of the pose computation is shown after this list).
l If you have identified trajectories manually in a file the 6DOF cal-
culation will try to use all of the data, and then there will be no 6DOF
data in frames where a marker is outside the tolerance. To make the



6DOF calculation remove the data outside the tolerance you can
activate the Reidentify all body markers option while repro-
cessing, see below.
l A rigid body with sequentially coded active markers, such as the active
Traqr, uses only the ID and not the Bone tolerance to identify the mark-
ers.
l The 6DOF data is only calculated if the 6DOF body residual is less than the
Max residual.
l If there are frames when there are less than three markers that can be
tracked on the 6DOF body, the 6DOF tracking function cannot calculate
the 6DOF body in those frames.
l Each 6DOF body definition can just have one solution in the meas-
urement, but the 3D trajectory can be included in several 6DOF bodies as
long as the 6DOF definition is the same and the body points have the
same names.
l When a Filter preset has been selected, the 6DOF data will be smoothed
according to the preset parameters both in real time and in a capture, see
chapter "Smoothing 6DOF data" on page 356.
The 6DOF data can be recalculated in a capture file. This is because each 6DOF
body consists of labeled trajectories, from which the 6DOF data is calculated.
This means that if the 6DOF data is wrong you can look at the 3D data to find
the problem. To edit the 6DOF data just change the 3D data of the labeled tra-
jectories and then reprocess the file with only the Calculate 6DOF option activ-
ated. Then QTM will recalculate the 6DOF data from the available 3D data.

NOTE: To delete all 6DOF data in a file you can reprocess the file with
Calculate 6DOF with an empty list of rigid bodies.

When reprocessing a file the 6DOF data is reprocessed in the following ways
depending on the processing steps.



Calculate 6DOF without the Reidentify all body markers option
Data in all of the 6DOF labels
When there are data in all of the labeled trajectories for the 6DOF
body, then the 6DOF data is calculated directly from that data. QTM
does not try to identify any other trajectories.

At least 3 6DOF labels with data


If at least 3 6DOF labels contain data, then QTM will calculate the
6DOF data from those and not try to identify the rest of the labels.

Less than 3 6DOF labels with data


When less than 3 labels in the 6DOF body contain data, QTM will try
to identify the 6DOF body again. This means that the identified tra-
jectories will be unidentified and the body calculated from scratch.

Calculate 6DOF with the Reidentify all body markers option


All cases
With the reidentify option the reprocessing will try to reidentify all of
the markers in the 6DOF bodies. Unless the labels are also included
in an AIM model, then those trajectories will keep their identity.

Apply the current AIM models


6DOF body not included in the AIM model
The 6DOF bodies that are separate from the AIM model will not
change at all, unless Calculate 6DOF is applied as well.

6DOF body included in the AIM model


6DOF bodies that are included in the AIM model will be reidentified.
However the 6DOF data is not recalculated unless the processing
step Calculate 6DOF is activated.

Virtual markers calculated from 6DOF data


The 6DOF bodies can be used to calculate virtual markers in relation to the
6DOF position. These markers are calculated both in real-time and in a file, and
can then be exported and analyzed like any other marker. The virtual markers
have the type Virtual in the Trajectory info window and the residual is always
0. The trace of a virtual marker is a dotted line, just like the trace of a gap-filled part.
There are two different ways of creating virtual markers:



Virtual point in 6DOF body definition
The virtual trajectory of the virtual point is calculated as soon as the 6DOF
body has data, i.e. there must be at least 3 real markers in a frame to cal-
culate the virtual marker. To make a point virtual in the 6DOF body defin-
ition you just select the checkbox in the Virtual column of the Rigid
bodies list, see chapter "Rigid bodies" on page 346.
To get the desired virtual position you can either put a temporary marker
in the correct position when creating the body and then remove the
marker when making the measurement. Or you can add the point with
the Add Point option on the 6DOF bodies page and enter its position
manually.

Virtual trajectory part in gaps of 6DOF marker


The virtual trajectory part is calculated when a real marker is lost and the
Calculate missing markers in rigid bodies option on the 6DOF bodies
page is activated. There must be at least 3 real markers left in the 6DOF
body to calculate the virtual marker in a frame.
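
In both cases the virtual position follows directly from the rigid body pose: with rotation matrix R and body position p for a frame, a point defined at local coordinates v in the body definition ends up at p + R·v in global coordinates. A minimal sketch (the function name is an assumption):

import numpy as np

def virtual_marker_position(rotation, position, local_point):
    # rotation: 3x3 matrix, position and local_point: 3-vectors.
    return np.asarray(position) + np.asarray(rotation) @ np.asarray(local_point)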

Rotation angles in QTM


The rotation angles of a rigid body can be defined in any number of ways, see
chapter "Euler angles" on page 392. The Qualisys standard of the rotation
angles are defined as:
l Rotation around the X-axis is called roll.

l Rotation around the Y-axis is called pitch.

l Rotation around the Z-axis is called yaw.

l Positive rotation is defined as clockwise rotation when looking in the direction of the axis.
l The angles are applied to the local coordinate system in the order: roll,
pitch and finally yaw. Therefore to find the rotation of a rigid body with
given roll, pitch and yaw angles from QTM, apply the rotations in the same
order: first roll, then pitch and finally yaw.
l QTM uses the following default ranges for the angles, see also the figures
below:



In these ranges, roll, pitch and yaw are unambiguous and can
describe any orientations of a rigid body.

IMPORTANT: When the pitch (φ) is close to ±90°, small changes in the
orientation of the measured rigid body can result in large differences in
the rotations because of the singularity at φ=±90°, see chapter "Rotation
angle calculations in QTM" on page 1010.

Below follows an example to show the definitions of the rotation angles. It
starts with a 6DOF body, which is in alignment with the global coordinate system.

First the local coordinate system is rotated around the X-axis (roll) with an
angle θ to the new positions y’ and z’ of the Y- and Z-axes.

After the roll the local coordinate system rotates around the Y-axis (pitch) with
the Y-axis in its new position. The X- and Z-axes are rotated with an angle φ to the
new positions x’ and z’.



Finally the local coordinate system is rotated around the Z-axis (yaw) with the
Z-axis in its final position. The X- and Y-axes are rotated with an angle ψ to the
new positions x’ and y’.

After the rotations the rigid body has a new orientation in reference to the
global coordinate system, see figure below.

Another description of rotations is to use the rotation matrix, which does not
have a singularity. QTM uses the rotation matrix internally to describe the rota-
tion of rigid bodies, and when exporting 6DOF to TSV files the rotation matrix is
included for all bodies in all frames, together with roll, pitch and yaw angles.
For a description of the calculation of the angles from the rotation matrix, see
chapter "Rotation angle calculations in QTM" on page 1010.

6DOF real-time and analog output


With 6DOF real-time output you can export the 6DOF data to another com-
puter. The data that can be exported are position, rotation angles, rotation mat-
rix and residual. The 6DOF data will use the same definitions as you have
specified in QTM and the 6DOF real-time output is accessed the same way as
the 3D real-time, see chapter "Real-time streaming" on page 590. For detailed
information about the real-time protocol, see the RT protocol documentation.
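
As an example of consuming the 6DOF real-time output from another computer, the sketch below uses the Qualisys Python SDK. The package name (qtm_rt), the "6d" component name and the call signatures are taken from the publicly available SDK examples and should be treated as assumptions to verify against the RT protocol documentation and the SDK version you use.

import asyncio
import qtm_rt  # Qualisys Python SDK

def on_packet(packet):
    # Each rigid body is reported as (position, rotation) for the frame.
    header, bodies = packet.get_6d()
    for position, rotation in bodies:
        print(position, rotation)

async def main():
    connection = await qtm_rt.connect("127.0.0.1")  # IP of the QTM computer
    if connection is None:
        return
    await connection.stream_frames(components=["6d"], on_packet=on_packet)

if __name__ == "__main__":
    asyncio.ensure_future(main())
    asyncio.get_event_loop().run_forever()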
6DOF analog output

With the analog output option the information about 6DOF bodies’ positions
and rotations can be used in feedback to an analog control system. To enable
analog output a D/A board must be installed in the measurement computer.



Analog output is only available with the Measurement Computing board PCI-
DAC6703. This is a 16 channel board with 16 bit resolution. To install the card
insert it in any available PCI slot and then start the program Instacal from Meas-
urement computing.
The analog output is activated on the 6DOF analog export page in the Project
options dialog and it is only used during measurements of 6DOF bodies. To be
able to output the analog data the 6DOF body must be tracked, i.e. Calculate
6DOF must be activated and a 3D View window must be open. The analog sig-
nal is sent whenever the 6DOF body is tracked, which means that the output
will only be in sync with every captured frame in preview and in 6DOF real-time
output.

NOTE: In regular capture it will only work when Display the last
fetched frame is selected and then it will only be used on the frames
that are tracked and displayed.

The data values that will be used are selected on the 6DOF Analog export
page, see chapter "6DOF analog export" on page 388. Since the required board
has 16 channels the output is limited to 16 data values of 6DOF bodies. In order
to maximize the use of the 16 bit resolution, the data on each channel can be
scaled; the resulting value is then converted to a voltage which rep-
resents the value’s proportional position within the output range.

NOTE: The output of a channel will be 0 V if the body is not found. If the
input value is outside of the input range the output will be either the Out-
put min or the Output max value depending on the input value.
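
The mapping from a 6DOF data value to an output voltage can be illustrated as a proportional scaling with clamping at the range limits, as sketched below (an illustration of the behavior described above; the parameter names are assumptions):

def value_to_voltage(value, input_min, input_max, output_min, output_max):
    # Clamp at the limits, as described in the note above.
    if value <= input_min:
        return output_min
    if value >= input_max:
        return output_max
    fraction = (value - input_min) / (input_max - input_min)
    return output_min + fraction * (output_max - output_min)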



Rigid body meshes

A rigid body mesh can be used to visualize the object tracked by a rigid body.
QTM supports meshes in the format of the Wavefront 3D object (.obj ) files. To
use a 3D mesh copy the .obj file and all of the related .mtl and texture files to
the Meshes folder of the project. The mesh can be associated with a rigid body
in the following two ways.
From 3D view window
Right-click in the 3D view and select Rigid Body -> Change Mesh of Rigid
Body.

From Project Options


Double click on the Mesh column for the rigid body on the 6DOF tracking
page.

The Rigid Body Mesh Settings dialog is opened, see chapter "Rigid body Mesh
Settings dialog" on page 358. Select the mesh file to use from the list and
modify the settings. Use the Apply button to verify that it looks correct in the
3D view. It is often best to use trial and error to find the position and rotation
for the mesh. Use these tips to position the mesh.



1. Use the Scale option to set the correct scale for a mesh.

2. Always scale the mesh first before translation and rotation, because the
distance to the mesh coordinate system is changed when scaling.

NOTE: Using really large meshes can slow down the rendering in QTM.

NOTE: The Meshes folder can be set to any folder on the Folder options
page in Project options. This means that the same meshes can be used
by different projects.

Sharing a file with a rigid body mesh

When sharing a file with a rigid body mesh it is important to include the mesh
files with the QTM file. This can be done in the following ways:
l Share the whole project.

l Share the QTM file and mesh files together. QTM will find the mesh files if
they are located in the same folder as the QTM file. The other user can also
copy the mesh files to the Meshes folder in their project.

Examples of how to use 6DOF bodies


How to use 6DOF bodies

The standard way to use the 6DOF bodies is to create a body for each sep-
arate object that you want to measure. In this case the markers on each subject
have no relation to markers on the other subjects. Follow these instructions to
use 6DOF bodies.

1. To create a 6DOF body follow for example the instructions in chapters "How to design a 6DOF body" on page 650 and "Definition of 6DOF bod-
ies" on page 653.



2. It is important to think about the definition of rotation and position of the
local coordinate system, see chapter "Definition of local coordinate sys-
tem" on page 658.
3. It is also important to consider which Euler angles definition you want to
use, because the calculated angles will differ depending on the definition,
see chapter "Definition of custom rotation axes" on page 393.
4. The points in the 6DOF definition can be renamed on the 6DOF bodies
page.
5. Remember to turn on the Calculate 6DOF option on the Processing
page.
These are some tips on how you can improve the 6DOF data.
l If you are streaming the data in real time it can help to activate the Do
not require that the whole body is visible before identifying it the
first time setting on the 6DOF bodies page so that the identification
starts even if a marker is hidden.
l The 6DOF data can sometimes be improved with marker filtering, see
chapter "2D Preprocessing and filtering" on page 323.
How to use 6DOF bodies in an AIM model

In this case the 6DOF bodies are placed on parts that move together, e.g.
clusters placed on a human subject. Then the best approach is to use an AIM
model to identify the markers and then calculate the 6DOF bodies from the
already calculated markers.

1. Create the bodies and their definition normally. Make sure that you name
the points of the bodies in the same way as they are named in the AIM
model.
2. Create an AIM model from the subject, see chapter "Generating an AIM
model" on page 625. If you already have an AIM model with the correct
marker setup you can add the current measurement to that AIM model.

NOTE: The AIM model can contain markers that are not included in
6DOF bodies.



3. Activate both Apply the current AIM model and Calculate 6DOF on the
Processing page. The AIM model will be applied first and then the 6DOF
bodies will be calculated from the identified labels.

NOTE: If AIM fails to identify a marker, the 6DOF calculation will not
try to identify it either even if you select the Reidentify all body
markers in reprocessing. In most cases the best way to fix this is to
manually identify the data and add it to the AIM model.

4. There can still be 6DOF bodies in the file that are not included in the AIM
models. These will only be identified and calculated when the Calculate
6DOF option is activated, i.e. if you reprocess the file and only apply the
AIM model the trajectories of the separate bodies will not be reidentified.
How to use virtual markers in an AIM model

The virtual markers in the 6DOF functionality can be used in a regular AIM
model if it contains markers that are actually rigid bodies. This example
describes the case when the 6DOF data is only used to create virtual markers
and therefore the actual 6DOF data is not really important.

1. Create an AIM model as described in chapter "Generating an AIM model" on page 625. If you want to use a completely virtual point then you can
add that as an empty label in the AIM model.
2. Disable the display of the rigid bodies with the Show rigid bodies option
on the 3D view settings page. Otherwise you will see for example the
coordinate systems in the 3D view window.
3. Increase the Bone tolerance to 20 mm on the 6DOF Tracking page. This
will make sure that the 6DOF bodies can be calculated even if the marker
positions are not completely rigid.
4. Also activate the option Calculate missing markers in rigid bodies on
the same page so that the virtual markers are calculated when a real
marker is lost.



NOTE: There must be at least three real markers identified in a
frame to calculate the virtual marker.

5. When you measure a new subject:


a. Unless you are really sure that markers on the rigid body have not
moved, delete all of the bodies in the Rigid bodies list.
b. Create the 6DOF bodies from a static file of the subject where AIM
has identified the markers. Select at least three markers that do not
move in relation to each other, and then use the Define rigid body
option in the Trajectory info window menu. The trajectories will
keep the labels they got from the AIM model.

NOTE: All of the real markers in the 6DOF body must be included in the AIM model; the others will not be identified.
However 6DOF bodies that are completely virtual will be cal-
culated even if they are not in the AIM model.

6. Make sure that both Apply the current AIM model and Calculate 6DOF
are activated on the Processing page.

Tracking of skeletons
The following chapters describe how to define and track skeletons in QTM. The
main applications of skeleton tracking are animation, sports biomechanics, and
virtual reality and gaming. The specific workflow may depend on the applic-
ation. Generally, tracking of skeletons involves the following steps:
Choice of marker set
Skeleton tracking requires the use of a dedicated marker set. For more
information about the available marker sets and the possibilities to cus-
tomize them, see chapter "Marker sets for skeleton tracking" on the next
page. It is also possible to use a custom skeleton definition in QTM, see



chapter "Using a custom skeleton definition" on page 682.

Setting up models for automatic labeling of markers or Traqrs


Automatic labeling of markers or Traqrs is an important part of an effi-
cient workflow. The optimal approach may depend on the application. For
the recommended approaches, see chapter "Automatic labeling of mark-
ers or Traqr configuration for skeleton tracking" on page 682.

Skeleton calibration
Create a skeleton definition or update an existing one, see chapter "Ske-
leton calibration" on page 690.

Modifying the skeleton definition


Edit the skeleton definition to change the scaling factor, marker weights
or segment definitions, see chapter "How to modify the skeleton defin-
ition" on page 694.

Measuring skeleton data


Real time tracking and capturing of skeleton data, see chapter "How to
measure skeleton data" on page 699.

Processing skeleton data


For information about the available processing options, see chapter "How
to process skeleton data" on page 700.

Exporting and streaming


Export or stream skeleton data for further processing in an external pro-
gram or real time applications, see chapter "Export and streaming of skel-
eton data" on page 702

Marker sets for skeleton tracking


Skeleton tracking in QTM requires the use of a designated marker set. There
are three marker sets available. The choice depends mainly on the application.
Animation
The recommended marker set for animation is the Qualisys Animation
Marker Set, see chapter "Qualisys Animation Marker Set" on the next
page. The Animation marker set can be combined with the Qualisys Full
Fingers Marker Set or Qualisys Claw Marker Set for hand tracking, see
chapter "Qualisys Full Fingers Marker Set" on page 677 and"Qualisys
Claw Marker Set" on page 676.



Sports and biomechanics
The recommended marker set for sports and biomechanics applications
is the Qualisys Sports Marker Set, see chapter "Qualisys Sports Marker
Set" on the next page.
VR and gaming
The recommended marker set for location based virtual reality (LBVR)
and gaming applications is the Traqr VR Marker Set, see chapter "Traqr
VR Marker Set" on page 675.
The marker sets can be customized in the following ways.
l You can add extra markers, for example for more robust tracking of specific
movements. For more information, see chapter "Adding extra markers to a
skeleton" on page 679.

l You can change the names of the default labels, see chapter "Skeleton
marker label mapping" on page 681.

l You can create a custom skeleton, see chapter "Using a custom skeleton
definition" on page 682.

Qualisys Animation Marker Set

The Qualisys Animation Marker Set is dedicated to character animation applic-


ations using the Qualisys Skeleton Solver. For detailed information about the
Qualisys Animation Marker Set, you can open the marker set guide via the Ske-
leton menu.

The marker set guide contains information about:



l The markers and their placement.

l The use of optional markers for a better definition of the orientation of


the skeleton segments.
l T-pose requirements, see also chapter "T-pose" on page 691.

l The use of extra markers, see also chapter "Adding extra markers to a
skeleton" on page 679.
Two AIM models for the Qualisys Animation Marker Set are included in the
installation of QTM in the subfolder Models\AIM\:
Animation.qam
Generic AIM model with the default markers.

Animation_Optional.qam
Generic AIM model including both default and all the optional markers as
described in the Animation Marker Set Guide.

Animation AIM models with fingers


To track fingers the Animation marker set can be combined with the Full
Fingers marker set or Claw marker set, for more information about these
AIM models see chapters "Qualisys Full Fingers Marker Set" on page 677,
or "Qualisys Claw Marker Set" on page 676, respectively.

For more information about automatic labeling of the markers, see chapter
"Automatic labeling of markers or Traqr configuration for skeleton tracking" on
page 682.
It is possible to use alternative labels for the markers. This can be useful if you
have existing files with a different marker set. For more information on how to
use a different mapping of markers, see chapter "Skeleton marker label map-
ping" on page 681.
Qualisys Sports Marker Set

The Qualisys Sports Marker Set is dedicated to sports and biomechanics applic-
ations using the Qualisys Skeleton Solver. As opposed to the animation marker
set, the segments associated with the sports marker set are defined in a con-
ventional and biomechanical way which helps to compute and interpret the
joint angles more easily. For detailed information about the Qualisys Sports
Marker Set, you can open the marker set guide via the Skeleton menu.



The marker set guide contains information about:
l The markers and their placement.

l The use of static markers required for the skeleton calibration.

l The use of optional extra markers, see also chapter "Adding extra mark-
ers to a skeleton" on page 679.
Two AIM models for the Qualisys Sports Marker Set are included in the install-
ation of QTM in the subfolder Models\AIM\:
Sports_Static.qam
Generic AIM model including static markers for the skeleton calibration.

Sports_Dynamic.qam
Generic AIM model without static markers for dynamic measurements.

Both AIM models are pre-trained for trouble free automatic labeling of a wide
range of movements and actors. For guidelines on how to use these
AIM models in different scenarios, see chapter "Using AIM for sports and bio-
mechanics" on page 685.
It is possible to use alternative labels for the markers. This can be useful if you
have existing files with a different marker set. For more information on how to
use a different mapping of markers, see chapter "Skeleton marker label map-
ping" on page 681.
Traqr VR Marker Set

The Traqr VR Marker Set is dedicated to VR and gaming applications using the
Qualisys Skeleton Solver. When using the Traqr VR marker set, the Qualisys Ske-
leton Solver utilizes the 6DOF data from six Traqrs placed on the back, head,
hands and feet to calibrate and track the skeleton. It is recommended to use
the Active Traqr for the best real time performance, even though it is possible



to use passive rigid bodies as well. When using backpacks or HMDs it is also
possible to embed the Naked Traqr in these objects for tracking the back and
the head of players.
For detailed information about the Traqr VR Marker Set, you can open the
marker set guide via the Skeleton menu.

The marker set guide contains information about:


l The naming and placement of the Traqrs.

l T-pose requirements, see also chapter "T-pose" on page 691.

l The use of extra Traqrs for improved tracking.

Skeleton tracking with the Traqr VR Marker Set is easy and requires almost no
preparation once the Traqrs have been configured and set up in your QTM pro-
ject. Since the skeleton tracking relies entirely on rigid body tracking, there is
no need to train an AIM model for labeling markers. For more information
about how to set up the Traqrs for VR skeleton solving, see chapter "Setting up
the Traqrs for VR skeleton tracking" on page 687.
Qualisys Claw Marker Set

The Qualisys Claw Marker Set is dedicated to hand animation applications using
the Qualisys Skeleton Solver. For detailed information about the Qualisys Claw
Marker Set, you can open the marker set guide via the Skeleton menu.

The marker set guide contains information about:



l The markers and their placement.

l Calibration requirements, see also chapter "Hand calibration poses" on


page 692.
Four AIM models for the Qualisys Claw Marker Set are included in the install-
ation of QTM in the subfolder Models\AIM\:
Animation_Claw.qam
Generic AIM model with the default markers for the Animation marker set
and the Claw marker set.

Animation_Optional_Claw.qam
Generic AIM model including default and all the optional markers for the
Animation marker set and default markers for the Claw marker set.

Claw_left.qam, Claw_Right.qam
Generic AIM models with the default markers for the Claw marker set,
respectively for the left and right hand. These AIM models are only for
identifying the hands. They can't be combined with the Animation
AIM models, because then the marker labels will overlap. Use the Anim-
ation_Claw or Animation_Optional_Claw AIM models to combine Anim-
ation and Claw marker set.

For more information about automatic labeling of the markers, see chapter
"Automatic labeling of markers or Traqr configuration for skeleton tracking" on
page 682.
It is possible to use alternative labels for the markers. This can be useful if you
have existing files with a different marker set. For more information on how to
use a different mapping of markers, see chapter "Skeleton marker label map-
ping" on page 681.
Qualisys Full Fingers Marker Set

The Qualisys Full Fingers Marker Set is dedicated to hand animation applications
using the Qualisys Skeleton Solver. For detailed information about the Qualisys
Full Fingers Marker Set, you can open the marker set guide via the Skeleton
menu.

The marker set guide contains information about:
l The markers and their placement.

l Calibration requirements, see also chapter "Hand calibration poses" on page 692.
There are no AIM models included for the Qualisys Full Fingers Marker Set. This is
because the finger markers differ a lot between subjects, so it is recommended to
create a new AIM model for each subject. To get the correct marker names,
use the label lists included in the installation of QTM in the subfolder
Models\AIM\:
Animation_FullFingers.txt
Label list with the default markers for the Animation marker set and the
Full Fingers marker set.

Animation_Optional_FullFingers.txt
Label list including default and all the optional markers for the Animation
marker set and default markers for the Full Fingers marker set.

FullFingers_left.txt, FullFingers_Right.txt
Label lists with the default markers for the Full Fingers marker set,
respectively for the left and right hand. These label lists can be used to
create individual AIM models for identifying the hands. The resulting
AIM models can't be combined with the Animation AIM models, because
then the marker labels will overlap. Use the Animation_FullFingers or
Animation_Optional_FullFingers label lists to combine Animation and Full
Fingers marker set.

For more information about automatic labeling of the markers, see chapter
"Automatic labeling of markers or Traqr configuration for skeleton tracking" on
page 682.

It is possible to use alternative labels for the markers. This can be useful if you
have existing files with a different marker set. For more information on how to
use a different mapping of markers, see chapter "Skeleton marker label map-
ping" on page 681.
Adding extra markers to a skeleton

Extra markers or rigid bodies can be placed on the subject and included in the
skeleton definition. This option can be useful for improving the tracking of the
segments of the skeleton or for creating unique marker configurations when
capturing multiple actors. Follow these steps to add extra markers to a skel-
eton:
l The marker label or rigid body name must have the same prefix as the
skeleton, e.g., JD_... for a skeleton with name JD. The prefix should end
with an underscore; the first underscore in the label will be considered as
the separator for the skeleton name.
l The marker or rigid body will be automatically assigned to a segment
based on its placement. However, if you want to associate the marker with
a specific segment, you can include the segment name in the marker label,
e.g., JD_Hips_... for a marker associated with the Hips segment of the
skeleton definition of JD. The assignment of extra markers can also be
altered via a dialog when calibrating the skeleton.

TIP: Refer to the marker set guides for the segment names.
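
To illustrate how such labels are interpreted, here is a small Python sketch (the label JD_Hips_Extra1, the helper function and the segment list are hypothetical examples, not part of QTM):

    # Hypothetical illustration of the extra marker naming convention described above.
    # The prefix before the first underscore is the skeleton name; an optional segment
    # name can follow to indicate which segment the marker should be associated with.
    def parse_extra_marker_label(label, known_segments=("Hips", "Spine1", "Spine2")):
        skeleton, _, rest = label.partition("_")        # "JD_Hips_Extra1" -> "JD", "Hips_Extra1"
        segment_hint, _, remainder = rest.partition("_")
        if segment_hint in known_segments:
            return skeleton, segment_hint, remainder    # segment explicitly given in the label
        return skeleton, None, rest                     # segment assigned automatically by QTM

    print(parse_extra_marker_label("JD_Hips_Extra1"))   # ('JD', 'Hips', 'Extra1')
    print(parse_extra_marker_label("JD_Extra2"))        # ('JD', None, 'Extra2')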

The use of extra markers will depend on your camera configuration and the
type of movements you want to capture. For the Animation marker set, here
are some recommendations for the body part location of extra markers:
l On RightHand and LeftHand

l On Spine2 (above Chest marker)

l On RightShoulder and LeftShoulder (clavicles)

l On Spine1 (below BackL and BackR markers)

Assigning extra markers to a segment

When calibrating a skeleton with extra markers or rigid bodies, QTM displays a
dialog showing the assigned segments for each extra marker or rigid body. If
needed, you can change the assignment to a different segment, or specify None
if the marker or rigid body should not be used for skeleton solving. The func-
tions of the buttons are:
OK: Approve the chosen segment assignments and proceed with cal-
ibration and solving.

Skip: Proceed with calibration and solving without using the extra mark-
ers. The extra markers will not be included in the skeleton definition.

Cancel: Interrupt the calibration and solving process.

NOTE: When using the Qualisys Sports Marker Set, the segments
RightToeBase, LeftToeBase, Spine1, Spine2 and Neck are locked. Extra mark-
ers added to these segments will primarily affect their respective
(unlocked) parent segments and that way contribute to the skeleton solv-
ing.

Skeleton marker label mapping

The skeleton solver automatically recognizes the default marker labels of the
Qualisys Animation Marker Set and the Qualisys Sports Marker Set. If you want to
use alternative marker labels, for example to apply the skeleton solver to exist-
ing measurements, you can create a custom mapping. If you are using an altern-
ative marker set, the marker positions should correspond closely to those
described in the marker guides of the Qualisys marker sets for Animation and
Sports.
Follow these steps to create a custom mapping.

1. Go to the Skeleton Solver page in the Project Options.

2. Save the default skeleton marker label mapping by pressing the Save but-
ton. Fill in a name in the Save as dialog and press Save. The file contains
all labels of both the Animation and Sports marker sets.
3. Open the created file in a text editor (for example Notepad).

4. Edit the marker names contained in the <Name> tags and save the file
when done.
5. Load the file in the Skeleton Solver page under the Marker Label Map-
ping heading by opening the drop-down menu and choosing Custom
(Load from file).
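
If many labels need to be changed, step 4 can also be scripted. The lines below are a minimal sketch, assuming only that the saved mapping file is XML and contains <Name> elements as described in step 4; the file name and the example labels are hypothetical.

    import xml.etree.ElementTree as ET

    # Hypothetical sketch: replace default marker names with alternative labels in the
    # saved mapping file. Only the text of <Name> elements, as described above, is changed.
    label_map = {"HeadL": "LFHD", "HeadR": "RFHD"}      # default label -> alternative label (examples)

    tree = ET.parse("my_label_mapping.xml")             # file saved in step 2 (example name)
    for elem in tree.iter():
        if elem.tag == "Name" and elem.text in label_map:
            elem.text = label_map[elem.text]
    tree.write("my_label_mapping.xml")
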
If you want to apply your label mapping to an existing file, you will first need to
reprocess the file with the new skeleton solver settings including the correct
mapping before you can calibrate the skeleton.

1. First, make sure that your label mapping file is loaded in the project set-
tings.

2. Reprocess the file with the Solve Skeletons option enabled and select
project settings.

3. After reprocessing the file you can calibrate the skeleton from the file.

4. You can now (batch) reprocess the other files with Skeleton solver project
settings to apply the calibrated skeleton.

Using a custom skeleton definition

It is also possible to solve custom skeletons with QTM by importing an XML file with a
valid skeleton definition. Qualisys supports a workflow for creating and editing
skeleton definitions in Maya that can be used with QTM using QTM-Connect-
For-Maya available on https://github.com/qualisys.

Automatic labeling of markers or Traqr configuration for skeleton tracking
To facilitate automatic labeling of markers, generic AIM models for the sup-
ported marker sets are included in QTM. The AIM models are stored in the QTM
installation (default location: C:\Program files\Qualisys\Qualisys Track Manager)
in the subfolder Models\AIM. For information about how to use AIM for different
applications, refer to the following chapters:
For animation applications with the Qualisys Animation Marker Set, Full
Fingers Marker Set and Claw Marker Set, see chapter "Generating an AIM
model for animation" below.

For sports and biomechanics applications with the Qualisys Sports Marker
Set, see chapter "Using AIM for sports and biomechanics" on page 685.

NOTE: The folder with generic AIM files includes a text file with the label
list for each AIM model.

To apply one or more AIM model(s) follow these steps:

1. Add the model(s) to the Applied models list on the AIM page in Project
Options.

2. Make sure that the Apply current AIM models option is checked as an action
(real time and capture) on the Processing page under Project Options.

When using the Traqr VR Marker Set, the Traqrs need to be configured and set
up for rigid body tracking, see chapter "Setting up the Traqrs for VR skeleton
tracking" on page 687.
Generating an AIM model for animation

This chapter describes the best practice for using AIM for skeleton tracking for
animation.
It is highly recommended to create a specific AIM model for each actor to be
tracked. To create a new AIM model, follow these steps:

1. On the AIM page in Project Options, add one of the generic AIM models
for animation to Applied models. The following generic AIM models are
available in the QTM installation in the subfolder Models\AIM:
a. The file Animation.qam contains a generic AIM model for the default
markers of the Qualisys Animation Marker Set.

b. The file Animation_Optional.qam contains a generic AIM model for
the default markers and all the optional markers described in the
Animation Marker Set Guide.
c. The file Animation_Claw.qam contains a generic AIM model for the default
markers of the Qualisys Animation and Qualisys Claw Marker Sets.
d. The file Animation_Optional_Claw.qam contains a generic AIM model
for default and all the optional markers for the Animation marker
set and default markers for the Claw marker set.
e. The files Claw_left.qam and Claw_Right.qam contain generic
AIM models with the default markers for the Claw marker set,
respectively for the left and right hand. These AIM models are only
for identifying the hands. They can't be combined with the Anim-
ation AIM models, because then the marker labels will overlap. Use
the Animation_Claw or Animation_Optional_Claw AIM models to com-
bine the Animation and Claw marker sets.
2. Make a capture of the actor. A good practice is to make a capture includ-
ing a T-pose and a Range of Motion sequence, which can be used afterward
to calibrate and validate the skeleton definition.
3. Make sure that the labeling by the selected AIM model is correct.

NOTE: You may also load the label lists for each respective
AIM model and label the trajectories manually. The label lists are
available in the subfolder Models\AIM. This option needs to be used
for the Full Fingers Marker Set, because there are no generic
AIM models for the Qualisys Full Fingers Marker Set. The reason is
that the identification of the finger markers is improved by cap-
turing a specific AIM model for that actor.

4. If you are using extra markers, add the labels manually to include them in
the new AIM model.
5. Add a prefix to labels. You must end the prefix with an underscore, e.g.,
JD_ for a skeleton with name JD.

NOTE: The first underscore in a trajectory label will be used as the
separator for the skeleton name.

NOTE: You must have different prefixes on the right and left hand
if the hands are captured without the body.

6. Create a new AIM model, see chapter "Generating an AIM model" on page 625 for more detailed information.
If needed, you can improve the AIM model by training it with more movement
data, see chapter "Generating an AIM model" on page 625 for more detailed
information.
If you want to capture multiple actors, repeat the above steps for each actor.
When done, add the AIM models of all actors to be included in the capture to
Applied models on the AIM page in Project Options.
Using AIM for sports and biomechanics

There are two generic AIM models for the sports marker set available in the
installation of QTM in the subfolder Models\AIM:
l The file Sports_Static.qam contains all markers, including the static mark-
ers that must be included for the skeleton calibration
l The file Sports_Dynamic.qam contains all markers that are needed for skel-
eton tracking.
To apply one of these AIM models, add it to the Applied models list on the
AIM page under Project Options.
The best way to use AIM for sports and biomechanics depends on your specific
application. Two possible use scenarios are suggested below, but of course you
can choose an alternative approach that works best for your application.
Scenario 1
Use the generic AIM models included with QTM for calibration and track-
ing. The generic AIM models include a standard prefix Q_ for all the
marker labels, corresponding to a standard skeleton name Q. This scen-
ario is only suitable when measuring a single person.

For the static pose:
l Use the static AIM model Sports_Static.qam for labeling the static
pose. The static AIM model includes the static markers that are
required for the skeleton calibration.

For the dynamic measurements there are the following options:


l Use the static AIM model Sports_Static.qam if you want to keep the
static markers during the dynamic measurements.

l Use the dynamic AIM model Sports_Dynamic.qam if you want to do the dynamic measurements without the static markers.

NOTE: It is not possible to automatically label extra markers with
the generic AIM models. You need to create a new AIM model if you
want to include extra markers.

NOTE: The prefix of the marker labels can be changed by editing
the AIM file (.qam) in a text editor using search and replace.

Scenario 2
Create a specific AIM model for an individual person. This scenario is sim-
ilar to the one described for animation applications, see chapter "Gen-
erating an AIM model for animation" on page 683.

For the static pose:


1. Use the generic static AIM model Sports_Static.qam for automatic
labeling of the first capture for skeleton calibration.

2. Optionally, you can rename the prefixes of the markers for creating
a skeleton definition with a different name. The easiest way to do
this is to select the trajectories in the trajectory info window, remove
the prefix of the selected trajectories and add a new prefix. Do not
forget to use an underscore character "_" as a separator.

3. Use a static pose to calibrate the skeleton, see chapter "Skeleton cal-
ibration" on page 690.

For the dynamic measurements:


1. Capture a file with a range of motion. Use one of the generic AIM
models to label it, with or without static markers, depending on the
marker setup you want to use for your dynamic trials.

2. Rename the prefixes so that they correspond to the skeleton name.

3. If you want to use extra markers you need to manually label them.
Use the same prefix if you want to include them in the skeleton
definition.

4. Create a new AIM model, see chapter "Generating an AIM model" on page 625 for more detailed information.

If needed, you can improve the AIM model by training it with more move-
ment data, see chapter "Generating an AIM model" on page 625 for more
detailed information.

If you want to capture multiple persons, repeat the above steps for each per-
son. When done, add the AIM models of all persons to be included in the capture
to Applied models on the AIM page in Project Options.
Setting up the Traqrs for VR skeleton tracking

To get started with VR skeleton tracking using the Traqr VR Marker Set, you will
first need to configure your Traqrs and set them up in your QTM settings.
For setting up your Traqrs, follow these steps:

1. Traqr configuration
When using the Active Traqr make sure that all markers have a unique
marker ID. Use the Traqr Configuration Tool to change the configuration
of the Traqrs if needed.
2. Definition of the rigid bodies
Create a rigid body definition for each Traqr. The easiest way to do this is
to attach a set of 6 Traqrs to a person and do a short T-pose capture.
Then, for each Traqr, select the markers in the 3D view window and press
F8 to create a new rigid body and add it to the project. The rigid bodies
should be named according to the naming conventions described in the
Traqr VR Marker Guide, for example, Player1_Head for the Traqr
attached to the head of Player1. The prefix before the first underscore,
here Player1, will be the name of the associated skeleton. For more
information about creating rigid bodies, see chapter "Creating 6DOF bod-
ies" on page 650.
After defining the rigid bodies, check the definitions on the 6DOF Track-
ing page under Project Options. Make sure that the rigid bodies are cor-
rectly named and that the marker IDs are correct when using the Active
Traqr.
When you are using a backpack or HMD to track the back or the head, you
will need to translate the origin of the rigid body to the correct position
(forward so it is placed on the back or top of the head) using the Trans-
late dialog. The default location of the origin is the geometric center of
the markers. For the translation it is easiest if the rigid body orientations
are aligned with the person in T-pose. This can be achieved by having the
person stand aligned with the global axes during the capture used to
define the rigid bodies, or afterwards by using the Reset Rotation button
on the 6DOF Tracking page while in preview.
Optionally, configure the other rigid body settings. It is recommended to
increase the Maximum residual and the Bone tolerance values, for
example to 25 mm or higher, for maximizing fill rates of 6DOF data. You
can also use filters for smoothing, which may lead to a better playing
experience. See the picture below for an example of a typical 6DOF track-
ing set up for a single player.

3. Attach the Traqrs to the players

The Traqrs can be attached to the player using dedicated strap mounts.
Make sure that the Traqrs are securely attached to the correct positions.
When needed adjust the mounting straps so that they are not too loose.
You can use labels to avoid mixing up the Traqrs. However, make sure not
to cover the IR eye when attaching a label to an Active Traqr.
4. Calibrate the skeleton
Start a new measurement (Ctrl+N) in QTM and calibrate the skeleton(s)
while the player(s) are in T-pose. The skeleton definition will be added to
the QTM project. When a skeleton definition with the same name is
already present in QTM it will be updated. The alignment of the players or
the rigid bodies during the calibration is not important since this will be
taken into account by the calibration. The below figure shows a skeleton
in the 3D view window after calibration.

Once the Traqrs have been defined as rigid bodies in the project, only steps 3
and 4 are needed to prepare the player(s) for a session.

Skeleton calibration
Skeleton calibration needs to be done to fit a skeleton definition to the meas-
ured marker positions. The skeleton calibration can be applied in real time dur-
ing preview or to a recorded capture.
Before applying the skeleton calibration, make sure that the actor stands in the
correct calibration pose. The pose depends on the marker set.
l For the Traqr VR Marker Set a T-pose is required for a correct calibration.

l For the Qualisys Animation Marker Set a T-pose is required for a correct cal-
ibration. Optional markers can be added for a better definition of the ori-
entation of the segments. For more information, refer to the marker set
guide that can be opened in QTM via the Skeleton menu.
l For the Qualisys Full Fingers Marker Set a Hand calibration pose is
required for a correct calibration.
l For the Qualisys Claw Marker Set a Claw calibration pose is required for a
correct calibration.
l For the Qualisys Sports Marker Set make sure that the static markers are
present. The actor or subject should stand in upright position with both
feet flat on the floor. A specific pose is not required, since the marker pos-
itions in the Qualisys Sports Marker Set provide sufficient information for
correct definition of the skeleton segments.
Press the Calibrate skeletons button (keyboard shortcut F10) to apply the skel-
eton calibration. When applying the skeleton calibration for the first time, a
new skeleton definition will be added to the Skeleton Solver page under Pro-
ject Options. When the skeleton is already defined, the skeleton calibration
will replace the existing skeleton definition in the project.
If there are multiple actors in the preview or capture, the skeleton calibration is
applied simultaneously to all actors.
When the skeleton calibration is applied to a capture, the new skeleton defin-
ition will be automatically applied to the data in the current measurement
range.
It is possible to calibrate and solve upper and lower body parts of the skeleton,
see chapter "Partial skeleton calibration" on page 692.

Once the skeleton is calibrated it is possible to modify the skeleton definition,
see chapter "How to modify the skeleton definition" on page 694.
It is also possible to adapt the scale of the skeleton tracked in QTM to an anim-
ated object, see chapter "Scale factor" on page 693.
The calibrated Animation skeleton has native support for motion gloves, see
chapter "Glove" on page 344.

NOTE: To apply a new or updated skeleton definition to existing cap-
tures they need to be reprocessed using the Skeleton Solver settings of
the project.

T-pose

When using the Qualisys Animation Marker Set, the actor must stand in a correct
T-pose when applying the calibration. When applying the skeleton calibration to
a file, make sure that the actor stands in T-pose in the current frame. For a cor-
rect T-pose, make sure that the following requirements are fulfilled:
l Thighs and shanks must be vertical making a small gap between both
ankles.
l Feet must be parallel to each other and pointing to the front of the sub-
ject.
l Head and neck must be aligned with the spine, i.e., the subject must stand
with the head straight, facing forward.
l Arms, forearms and hands must be parallel to the floor. Check that the
HandOut and WristOut markers are horizontally aligned.
l Palms of the hands must face the floor.

l Arm and forearm must be aligned. A virtual line should go through the
glenohumeral joint (underneath ShoulderTop marker), the elbow joint and
the wrist joint.
l When combined with the Qualisys Full Fingers Marker Set for the hands:
l Fingers must be straight, no bending.

l Hands parallel to the floor.

l When combined with the Qualisys Claw Marker Set for the hands:
l Fingers must be straight, no bending.

l Thumb held tight against the index finger.

Hand calibration poses

When using the Qualisys Full Fingers Marker Set or Qualisys Claw Marker Set, the
actor must have their hands in a correct Calibration pose when applying the cal-
ibration. When applying the skeleton calibration to a file, make sure that the
actor has their hands in a Calibration pose in the current frame. For a correct
Calibration pose, make sure that the following requirements are fulfilled:

Full Fingers Marker Set


l Fingers must be straight, no bending.

l Hands parallel to the floor.

Claw Marker Set


l Fingers must be straight, no bending.

l Thumb held tight against the index finger.

Partial skeleton calibration

It is possible to define and solve only a part of the skeleton.


For the Animation Marker Set the upper body can be solved. For detailed
information about the required markers, see the Animation Marker Set
guide.

For the Sports Marker Set the lower and upper body can be solved. For
detailed information about the required markers, see the Sports Marker
Set guide.

Once the required markers are labeled, calibrating and solving the skeleton
works similarly to calibrating and tracking the complete skeleton:

1. You can create an AIM model including the markers of the partial skeleton
to facilitate tracking, see chapter "Automatic labeling of markers or Traqr
configuration for skeleton tracking" on page 682.

2. When solving the skeleton only the segments included in the skeleton
definition will be tracked.
3. You can modify the scale factor or marker weights, see chapter "How to
modify the skeleton definition" on the next page.
4. You can export and stream partial skeleton data, see chapter "Export and
streaming of skeleton data" on page 702.
An alternative way to apply partial skeleton solving is by removing segments
from a complete skeleton definition. For more information, see chapter "How
to modify the skeleton definition" on the next page.
Scale factor

The scale factor can be used to indicate the scale of the skeleton tracked in
QTM relative to an animated object, for example, an avatar in an external anim-
ation application. The scale factor can be set in the following ways:

1. By setting the Scale factor (%) value for a skeleton in the Skeletons list
on the Skeleton Solver page. For scaling up the skeleton, use a per-
centage larger than 100%, and for scaling down use a percentage less
than 100%.
2. By modifying the value in the Scale tag in the skeleton definition XML and
re-importing it in the project. In the XML file the scale factor is defined as
the inverse ratio. For example, a Scale value of 0.8 in the XML file cor-
responds to a scale factor of 125% in QTM. Alternatively, the skeleton
definition can also be updated by sending an XML packet via the real time
protocol, see the QTM RT Protocol documentation included in the
QTM installation for more information.
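
As a small worked example of the inverse relation mentioned in point 2, the following lines convert between the XML Scale value and the QTM scale factor percentage (the helper functions are hypothetical, based only on the 0.8 / 125% relation stated above):

    # Hypothetical helpers illustrating the inverse relation between the XML Scale
    # value and the Scale factor (%) shown in QTM (e.g. Scale 0.8 corresponds to 125%).
    def xml_scale_to_percent(xml_scale):
        return 100.0 / xml_scale

    def percent_to_xml_scale(percent):
        return 100.0 / percent

    print(xml_scale_to_percent(0.8))    # 125.0
    print(percent_to_xml_scale(125.0))  # 0.8
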
The scale factor is applied to the following data:
l Real time skeleton data streamed by QTM.

l The Skeletons and Characters data in the FBX export.

l Skeleton data in TSV and MAT export.

NOTE: The scale factor is not applied to the skeleton data that is dis-
played in the Data info window.

How to modify the skeleton definition
It is possible to influence the skeleton tracking by modifying the skeleton defin-
ition. The skeleton definition can be modified in two ways:

1. Using a skeleton template that modifies the parameters that are used
when creating the skeleton definition in the skeleton calibration process,
see chapter "Skeleton template" below.
2. By manually editing the parameters in the skeleton definition after the
skeleton calibration, see chapter "Manual editing of the skeleton defin-
ition" on page 696.
For more information about editing a skeleton definition or template, see
chapter "Skeleton XML editing" on page 697.
Skeleton template

The skeleton solver can use a skeleton template for the skeleton calibration.
The template can modify the degrees of freedom and the weights of markers
for segments used during the skeleton calibration. When the Skeleton tem-
plate option is set to default then QTM uses the predefined parameters for
each skeleton marker set.

NOTE: Changing the template will not update existing skeletons.

Follow this process to create and use a skeleton template.

1. Go to the Skeleton solver page in Project options.

2. Make sure that there is a skeleton in the list.

3. In the Skeleton template section, click on the drop-down arrow on the
Edit button and select Generate file and edit.
4. Select the skeleton that will be used for the template from the list and
enter a name in Template name.

5. A file is created from the skeleton and opened automatically in Notepad.

l The template is created using the definition for the skeleton. It
means that the parameter names and values will depend on the
Qualisys skeleton marker set. Extra markers are included in the tem-
plate and if marker label mapping is used then the alternative label
is used in the template.
l The file only includes the values from the skeleton that can be mod-
ified by the template.
l The template is only used when calibrating one of the predefined
skeleton marker sets, so there is no point in using a custom skeleton
to create the skeleton template.
6. Modify the DegreesOfFreedom and Weight values in the file. For more information
about editing the template, see chapter "Skeleton XML editing" on the
next page. The names and the structure of the file must not be modified,
since then it will not match the skeleton marker set that is calibrated. For
details about the parameters in the file please contact Qualisys AB.
7. Save the file and make sure the new template is selected in the Skeleton
template option.
8. Calibrate the skeleton to apply the new parameters, see chapter "Skeleton
calibration" on page 690.

NOTE: The skeleton calibration uses the template for all skeleton types.
For example, a skeleton template created from a Sports marker set will
change the corresponding parameters in an Animation marker set as
well.

Manual editing of the skeleton definition

Once a skeleton is defined it is possible to influence the tracking by modifying
the skeleton definition. The skeleton definition can be modified in two ways:

1. By importing a modified skeleton XML file.


a. Use the Save button in the Skeleton Solver page to export the skel-
eton definition XML file .

b. Edit the XML file in a text editor, for example Notepad. For more
information about editing the skeleton definition, see chapter "Ske-
leton XML editing" below.
c. Use the Load button in the Skeleton Solver page to import the mod-
ified skeleton definition XML file .
2. By sending an XML packet via the real time protocol, see the QTM RT Pro-
tocol documentation included in the QTM installation for more inform-
ation.

NOTE: To apply a modified skeleton definition to existing captures they
need to be reprocessed with the modified skeleton definition.

NOTE: All manual modifications will be undone when recalibrating a skel-
eton in QTM. If you want to apply the same modification to subsequent
calibrations, consider using a skeleton template instead.

Skeleton XML editing

The skeleton definition or template can be edited in a text editor. The most
important elements that can be edited are listed below. For the complete XML
specification, see the information about Skeleton XML parameters in the RT pro-
tocol documentation.
Marker weight
Edit the value of the <weight> tag to change the relative weighting of a
marker for solving the segment pose.

Degrees of freedom
Add or remove degrees of freedom for a segment by adding or removing
the corresponding tags under the <DegreesOfFreedom> tag.

It is also possible to apply constraints to the degrees of freedom by
adding or modifying upper and lower bounds.

The following elements can also be edited in a skeleton definition:


Scale factor
Change the skeleton scale factor by modifying the <scale> tag. For more
information about the scale factor, see chapter "Scale factor" on
page 693.

Segment labels
Edit the segment label to use alternative segment names.

NOTE: Modified segment names cannot be used in combination
with a Skeleton template.

Marker labels
Edit the marker labels, for example to apply the skeleton definition to
existing labeled files with different marker names.

NOTE: For calibrating a skeleton with alternative marker labels, you
will need to use a label mapping, see chapter "Skeleton marker label
mapping" on page 681.

Removing segments
Remove segments for partial skeleton solving. Note that the root segment
and all intermediate segments between the root and the respective end
segments should be present.

NOTE: All elements can be modified when manually editing the skeleton
definition, but only marker weights and degrees of freedom of segments
can be modified in the skeleton template.
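
These edits can also be scripted when many values need to change at once. The sketch below is only an example, assuming that the exported skeleton XML contains weight elements as described above; the exact element names and file layout are not guaranteed here, so inspect the exported file first. The file names and the factor are hypothetical.

    import xml.etree.ElementTree as ET

    # Rough sketch: halve all marker weights in an exported skeleton definition.
    # Element names are matched case-insensitively since the exact casing is not
    # assumed here; verify against a file exported with the Save button.
    tree = ET.parse("JD_skeleton.xml")                   # exported skeleton definition (example name)
    for elem in tree.iter():
        if elem.tag.lower() == "weight" and elem.text:
            elem.text = str(float(elem.text) * 0.5)
    tree.write("JD_skeleton_modified.xml")               # import this file with the Load button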

How to measure skeleton data


For measuring skeleton data, make sure that the Solve Skeletons option is
checked under Processing in the Project Options. Skeleton data can be cal-
culated both in real time and in a capture. The skeleton data can be displayed
and plotted in real time and from a file via a Data info window, see chapter
"Skeleton data information" on page 172 for more information.

NOTE: The real time skeleton data may differ from the skeleton data in a
capture as the process used in real time is optimized for faster cal-
culation.

For animation applications it is good practice to start and end each capture
with the actor(s) standing in T-pose. This allows for the possibility to recalibrate
the skeleton if the quality of the fit is insufficient, for example if a marker has
moved.

How to process skeleton data
During post-processing you can label and edit the trajectories as usual. In addi-
tion to the standard methods you can use skeleton assisted labeling (SAL) and
kinematic gap filling. For more information about SAL see chapter "How to use
SAL" below.
You can apply kinematic gap filling to the markers included in the skeleton
definitions, see chapter "Filling of gaps" on page 642. The kinematic gap fill
option is available in the Trajectory Editor window or via the Trajectory Info
Window menu.
After you have made changes to trajectories, for example (re)labeling, gap filling or
smoothing, the file needs to be reprocessed to recalculate the skeleton data as
follows:

1. Open the Reprocessing dialog by clicking the Reprocessing icon or
Reprocess in the Capture menu.
2. Check the Solve Skeleton option. Make sure that other processing
options that may alter the trajectories, e.g., Track the measurement and
Apply the current AIM models are unchecked.

NOTE: Use the Marker count threshold option to control how
many identified markers are needed to solve a skeleton, see
chapter "Marker count threshold" on the next page.

3. Press the OK button.

How to use SAL

Skeleton assisted labeling (SAL) identifies trajectories using the segment mark-
ers in a skeleton. The unidentified trajectory part that is closest to a segment
marker and fulfills the set distance criteria will be identified as the cor-
responding labeled trajectory. The options for SAL can be set at the SAL set-
tings page.
Use the Claim threshold option to set the required closeness between a
marker and a segment marker for claiming the associated trajectory label. The
default value is 20 mm. Use a lower value when markers can be close to each
other for example when solving fingers.

Use the Disqualification threshold option to set the maximum tolerated dis-
tance for a claimed trajectory part from a segment marker at any frame. This
option can be used to prevent wrong markers that happen to be close to a
missing segment marker at some instance from being accepted as a solution.
It is recommended to start with a rather high value (e.g. 500 mm) since the
solved skeleton may be offset due to the missing marker. Decrease the value
when too many wrong markers are labeled through SAL.
SAL can be used both in real-time and in post-processing. Activate the cor-
responding Apply SAL processing step on the Processing page in the Project
options dialog.
Real-time
To use SAL in real-time it is required to have a solved skeleton, which
means that the AIM and Solve Skeletons processing steps must be activ-
ated as Real time actions. The AIM algorithm has priority over the SAL
algorithm, but SAL can label trajectories which were missed by AIM. These
labeled trajectories are then used for the next frame of AIM and skeleton
solver processing steps.

Post-processing
To use SAL as a post-processing step it is required that the file has solved
skeletons. Apply SAL either via the Reprocessing dialog or with the
Identify trajectories using skeleton (SAL) option on the Skeleton menu. When
changing the SAL settings, the file should be reprocessed with the new val-
ues for them to take effect. Check that the labeled trajectories are correct
and then run the Solve skeletons processing step again in reprocessing
to update the skeleton data. Repeat the steps of SAL and skeleton solving
as many times as needed to get the skeleton data.

NOTE: The process can be combined with manual labeling, for
example by drag and drop of an unidentified trajectory on a segment
marker.

Marker count threshold

The Marker count threshold controls the percentage of identified segment
markers needed to solve a skeleton. You can use this setting for example to
make QTM solve skeletons with few labeled trajectories. The percentage is set
from 1 to 100 % of the skeleton segment markers.
The threshold is applied according to the following rules.
l The percentage is rounded down to the nearest number of segment mark-
ers. For example if the skeleton has 10 segment markers then a 35%
threshold means that at least 3 segment markers are needed to solve the
skeleton.
l Each trajectory in the skeleton is associated with at least one segment
marker. Some trajectories are associated with two segment markers, e.g.
the elbows. This means that the number of segment markers is usually
larger than the number of trajectories in the skeleton.
l The skeleton root segment must have at least 3 identified trajectories. For
the Animation and Sports marker sets the root segment is the pelvis. For
the hand skeletons the root segment is the hand.
l The finger skeletons consist of sub-skeletons for each finger. The per-
centage applies to each skeleton individually, so that the skeleton for the
body can be solved even if many finger markers are missing. Note that
the markers on the hand are included in the body skeleton so if those are
missing in combination with a high marker count threshold, then the body
skeleton is not solved.

Export and streaming of skeleton data


You can export skeleton data for further processing in third-party software,
such as MotionBuilder and Maya. The export formats are:
l FBX: a widely used format for exchange of 3D data for animation soft-
ware, see chapter "Export to FBX file" on page 742
l TSV: a tab-separated text format, see chapter "Export to TSV format" on
page 711
l MAT: Matlab data file, see chapter "Export to MAT format" on page 729

l JSON: a structured text format, see chapter "Export to JSON file" on page 743.
Real time streaming of skeleton data is supported by the QTM RT protocol.
QTM connect applications are available on GitHub for MotionBuilder, Maya,
Unity and Unreal. The streamed skeleton data can be scaled by using a scale
factor, see chapter "Scale factor" on page 693. For more information on how to
stream skeleton data, see the documentation included in the respective QTM
connect applications or the documentation on the RT Protocol included in the
QTM installation.
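
As an illustration, a minimal streaming sketch using the open-source Qualisys Python SDK (the qtm_rt package) could look like the lines below. The component name "skeleton" and the get_skeletons() accessor are assumptions based on the RT protocol component naming and may differ between SDK versions; refer to the SDK documentation for the exact names.

    import asyncio
    import qtm_rt   # open-source Qualisys Python SDK (assumed installed via pip)

    # Rough sketch of streaming skeleton data in real time from QTM.
    def on_packet(packet):
        skeleton_data = packet.get_skeletons()           # accessor name is an assumption
        print("Frame", packet.framenumber, "- skeleton data received:", skeleton_data is not None)

    async def main():
        connection = await qtm_rt.connect("127.0.0.1")   # IP address of the computer running QTM
        if connection is None:
            return
        await connection.stream_frames(components=["skeleton"], on_packet=on_packet)
        await asyncio.sleep(10)                          # stream for 10 seconds
        await connection.stream_frames_stop()

    asyncio.run(main())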

Force data calculation

Calculating force data


The force data is calculated from the analog signals of the force plate. It uses
the parameters of the force plate on the Force plate page in the Project
options dialog to calculate the correct values, see chapter "Force plate set-
tings" on page 362. Force data can be calculated as a processing step or manu-
ally with the Recalculate forces icon .
The reset signal is sent at the following operations in QTM:
l New file

l Changing a setting in Project option in QTM during preview

l Just before the Start capture dialog is opened before a capture

l In a batch capture just before QTM starts Waiting for next meas-
urement/trigger

NOTE: It is important to not stand on the force plate at the start of the
measurement, if you use the Remove offset/drift option on the analog
data. It is also important to not stand on a force plate when the reset sig-
nal is sent.

When the Recalculate forces command is used the settings of the force plate
can be changed in the File reprocessing dialog. The settings in the dialog are
the same as when the file was created. However, if the settings were not spe-
cified for the motion capture the current settings in the Project options dialog
are copied instead.

Viewing force data
The force data in a capture file can be viewed in the Data info window and as a
vector in the 3D view window. The data can be used to confirm that the analog
capture works and that the settings for the force plate are correct. For inform-
ation about the data in the Data info window and how to plot it see chapter
"Force data information" on page 176.

In the 3D view window the force data is displayed as a vector with its base in
the Center Of Pressure (COP). The vector displays the opposite force applied to
the force plate, that is the reaction force. The purple squares represent the
force plates and are placed at the force plate coordinates that are specified on
the Force plate page in the Project options dialog. The light blue force traces
(Pedotti diagrams) display the force data for the selected measurement range.
Place the mouse over a force plate or force arrow to view a tool tip with the
force data in the current frame.
The display of force arrows, plates and traces can be toggled on the 3D view
settings page in the Project options dialog. The color and size of the force can
also be changed on this page.

NOTE: The force plates that are activated on the Force data page will be
shown in the 3D view window even if there is no analog data. This can be
used if the force is collected by another program, but you want to see the
force plate location in QTM.

NOTE: If you transform the global coordinate system the force plate
coordinates will be the same, which means that you have to change them
to move the force plate to the correct location, see chapter "Force plate
location" on page 382.

NOTE: The Z direction of the force plate coordinate system is pointing
down.

To make the most of the force data it can be exported to an analysis software.
The best formats to use are TSV, C3D or MAT, because then the force plate
location is included in the file, and for TSV and MAT the force data is included
as well, see chapter "Data export to other applications" on page 710. For
example, Visual3D uses C3D and recalculates the forces from the original data,
so the result can differ somewhat from what is displayed in QTM.

How to use events

Adding events
Events can be used to mark something that is happening. Events are sent in
real time and can be added to a QTM file both during and after the meas-
urement.
There are two ways to create an event.
Trigger event
You can use an external trigger to generate events during a meas-
urement. When using the Qualisys trigger button, it is recommended that
you release the button quite quickly, because releasing the trigger button
can sometimes also generate an event. If you have trouble with extra
events, you can increase the hold-off time on the Synchronization page.

The event functionality is activated by default but can be changed with the
Generate event setting for the external trigger on the Synchronization
page, see chapter "Trigger ports" on page 273.

NOTE: Events cannot be created if you have selected to stop on the
external trigger.

When events are created during a capture they are stored in the QTM
file as Trigger or Trigger start events. The Trigger start event is only used
for the start event of a pretrigger measurement. You can change the default
color of these events on the Synchronization page in Project options,
see chapter "Synchronization" on page 266.

The timing of the Trigger event will be the exact time when the signal is
received by the camera. It is therefore the most exact way to set an event,
especially if the trigger signal is generated automatically. Since the
time of the event can be any time between two frames, it is
most likely not placed exactly at the capture of a frame. The frame num-
ber of the event will be rounded to the nearest frame.

Manual event
A manual event can be created with the Add event button or by using
the keyboard shortcut Ctrl+E. The events can be created with the button
both during RT/capture and after a measurement. In file mode, events
can also be added via the Events submenu by right-clicking in the
Timeline control bar.

When events are created during a capture they are stored in the QTM
file as Manual events. The timing of the Manual event will be the time
when you press the button in QTM. This means that it is most likely not
placed exactly at the capture of a frame. The frame number of the event
will be rounded to the nearest frame.

Events created with the Add event button in a file will open the Add event
dialog. The event will be placed on the current frame in the file. You can
change the Label of the event and also Time, Frame and Color.

You can also use event shortcuts in the Add event dialog to create the
events. Double-click on an event shortcut in the list to load the label name
and color. To edit the shortcuts click on Edit event shortcuts, that opens
the Events page in the Project options dialog, see chapter "Events" on
page 430.

Viewing and editing events
The events are displayed above the timeline at the time of the event as a red tri-
angle . In plots the event is displayed as a red line at the time of the event.
Place the mouse over the event on the timeline to see information about the
event. For information about all of the events open the Edit event list dialog,
for example by right-clicking on an event and then on Edit event list.
The Label, Time and Frame of events in a file can be edited in the following
ways.
l Right-click on an event in the Timeline control bar and select Edit event. It
will open the Edit event dialog where you can change the Label, Time and
Frame.

l You can go to the next and previous event in the file with Page Down and
Page Up respectively.

You can access all of the events in the Edit event list dialog, which can be
opened by right-clicking on an event and then on Edit event list.

Add
Add a new event with the Add event dialog.

Remove
Remove the selected event.

Edit
Open the Edit event dialog where you can change the Label, Time and
Frame.

Goto
Go to the frame of the event in the file.

Exporting events
The events can be exported to other programs with the C3D, TSV and MAT
export. In the C3D export the events are always included. The format of the
C3D file follows the C3D standard.

NOTE: For Visual3D 2020.8.3 or later it is required to use the option Fol-
lowing the C3D.org specification for the C3D export.

For the TSV export you need to activate the Include events option to export
the events. For more information about the TSV format, see chapter "Motion
data (.tsv)" on page 713.
For the MAT export you need to activate the Events option to export the
events. For more information about the MAT format, see chapter "MAT file
format" on page 730.

How to use Euler angles


Euler angles (rotation angles) are the way that QTM shows the rotation of a
6DOF body and skeleton segments. It is also how you enter any rotation that
should be applied to a global or a local coordinate system. It is therefore
important to understand how Euler angles work to be able to use 6DOF data
correctly.
Euler angles are used to represent rotations in the user interface of QTM.
Internally, rotations are represented by rotation matrices in QTM. This means
that changing the Euler angles definition only affects the way rotations are dis-
played and exported. The rotation angles are converted according to the cal-
culations described in chapter "Calculation of rotation angles from the rotation
matrix (Qualisys standard)" on page 1011. The same calculations are then used
to acquire the measured rotation angles, since they can be more directly inter-
preted and visualized by the user.

IMPORTANT: When you change the Euler angles definition that change
will be effective immediately everywhere in QTM. This means for example
that if you open a file after you have changed the definitions it will be dis-
played using the new definitions and not those used when the file was
saved. It also means that the angles on the Transformation page will
change to reflect the rotation of the global coordinate system with the
new definition.

In QTM you can define the Euler angles as any possible rotation of a right-hand
coordinate system, see chapter "Euler angles" on page 392. By default QTM
uses the Qualisys standard definition, which is described in the chapter "Rota-
tion angles in QTM" on page 663.

Data export to other applications


Exported measurement data can be analyzed in other applications. Data can be
exported in the following ways:
l Manually, via the File/Export menu.

l Via the Batch Exporting dialog, see chapter "Batch exporting" below.

l As a processing step, including reprocessing and batch processing of files, see chapter "Introduction to data processing" on page 600.

Batch exporting
With batch export several files can be exported with the same settings at once.

1. In the File menu, go to Export... and click on Batch Export....

2. Select the files you want to export in the file dialog with the mouse and by
holding the Control or Shift key, and press Open.
3. Select one or multiple export formats in the Batch Exporting dialog.

4. To review or change the export options for a specific export format, click
on it in the Batch Exporting list on the left pane. The options correspond to
the Project Options.
5. Press OK to export the files.

Export to TSV format


By exporting the data to TSV format you can analyze the data in any other pro-
gram that reads text files, e.g. Excel. Click Export on the File menu and then To TSV
to export to the TSV format (Tab Separated Values).

For information about the settings see chapter "TSV export" on page 397. The
frames that are included in the export are shown under the Selected range
heading. The range is set by the measurement range on the Timeline control
bar.
The following data types can be exported as TSV (Tab Separated Values) from
QTM:
l 3D and 2D motion data

l 6DOF data

l Skeleton data

l Analog data

l Force data

l Eye tracker data

Each data type is exported as a separate file. By default, TSV files have the
same name as the QTM file with an additional suffix, depending on the data
type. In case there are multiple devices (e.g. analog devices, force plates), one
file per device is exported with an index added to the suffix.
Motion data (.tsv)

The motion data file contains the data of the trajectories of the motion capture
and it has two parts: file header and data. It can contain either 2D or 3D data
depending on the settings on the TSV export page in the Project options dia-
log.

Header

The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable value as a
second word. Each variable is on a new line. The following variables are avail-
able in the file header:
FILE_VERSION
Version number of export format, currently 2.0.0.

NOTE: FILE_VERSION is not included for versions earlier than 2.0.0.

NO_OF_FRAMES
Total number of frames in the exported file.

NO_OF_CAMERAS
Number of cameras used in the motion capture.

NO_OF_MARKERS
Total number of trajectories (markers) in the exported file.

NOTE: Only included in 3D data export.

FREQUENCY
Measurement frequency used in the motion capture.

NOTE: When using external timebase the frequency is set to the
actual frequency for the Multiplier/Divisor mode and to EXT for
the Edge triggered mode.

DESCRIPTION
At present not in use by QTM.

NOTE: Only included in 3D data export.

TIME_STAMP
Date and time when the motion capture was made. The date and time is
followed by a tab character and then the timestamp in seconds and ticks
from when the computer was started.

NOTE: The time stamp is recalculated when trimming the file.

WARNING: The time stamps indicate the start of the capture
according to the QTM software. The time stamps may not exactly
correspond to the first frame of the capture and it is discouraged to
use them for synchronization with data from other devices con-
nected to the same or another synchronized computer.

DATA_INCLUDED
Type of data included in the file, i.e. 2D or 3D.

EVENT
Each event is added on a new row starting with the word EVENT, fol-
lowed by the name of the event, frame number and time, each separated
by a tab character.

NOTE: The events are only added if the Include events option is
active in the TSV export settings.

MARKER_NAMES
List of trajectory labels (trajectories in the Labeled trajectories window)
separated by tab characters. Unidentified trajectories have no names and
are therefore only represented by a tab character. The number of names
in the list corresponds to the value given by the NO_OF_MARKERS vari-
able.

NOTE: Only included in 3D data export.

TRAJECTORY_TYPES
List of trajectory type for each exported trajectory separated by tab char-
acters. The list is sorted in the same order as the MARKER_NAMES. The
available types are Measured, Mixed, Virtual, Gap-filled and - (empty
label).

Data

In 3D export the data part follows on a new line after the last marker name.
The trajectory data (in mm) is then stored in tab-separated columns, where
every row represents one frame. Each trajectory has one column for each dir-
ection (X, Y and Z). The data for the first marker is therefore stored in the first
three columns and data for the second marker is stored in column 4, 5 and 6
etc.
In 2D export the data part follows on a new line after the variable DATA_
INCLUDED. The data part starts with a row with the number of the cameras.
The marker data for each camera is then given below its corresponding head-
ing, e.g. Camera: 1. Every marker has four columns with data, which is the
same as that shown in the Data info window: x, y, xSize and ySize. Each row in
the marker data represents one frame and only the markers that are visible in
that frame are included. The order of the markers can therefore not be used
to identify the markers.

NOTE: By default the 2D data in the TSV export is linearized in contrast
to that in the Data info window. Therefore the 2D data will differ
between the two. You can change to raw unlinearized data by disabling
the Export linearized 2D data option.

There are three options in the TSV export which you can use to add more
information to the file.
When the Export time data for every frame option is checked, two
columns will be added containing time data. The first column contains the
frame number and the second contains the time in seconds relative to the
start of the capture. If the camera frames contain timestamps (SMPTE or
IRIG), a third column with timestamp data is added. The format of the
timestamp string depends on the type of timestamp used.

When the Write column header option is checked, a header is added
above each column describing the contents of that column.

When the Include type information per frame option is checked, a
column is added for each trajectory with the trajectory type per frame.
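
To give an idea of how an exported 3D TSV file can be read programmatically, here is a minimal Python parsing sketch. It assumes a 3D export with the TSV header included and without the extra time or type columns; the file name is an example.

    # Minimal sketch for reading an exported 3D TSV file (header included, no extra
    # time or type columns). Header lines are skipped and each data row is turned
    # into a dict of marker name -> (X, Y, Z) in mm. Unidentified trajectories have
    # empty names; a real parser would need to handle those separately.
    def read_3d_tsv(path):
        markers, frames = [], []
        with open(path) as f:
            for line in f:
                fields = line.rstrip("\n").split("\t")
                if fields[0] == "MARKER_NAMES":
                    markers = fields[1:]
                    continue
                try:
                    values = [float(v) for v in fields]
                except ValueError:
                    continue                             # header line, not a data row
                frames.append({name: tuple(values[i * 3:i * 3 + 3])
                               for i, name in enumerate(markers)})
        return markers, frames

    markers, frames = read_3d_tsv("capture0001.tsv")     # example file name
    print(len(frames), "frames,", len(markers), "markers")
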
6DOF data format (_6d.tsv)

The TSV export of files with 6DOF data creates a TSV file (.tsv) with a file header
and a data part. The variable names in the file header are followed by a tab
character and then the value. Each variable is on a new line.

Header

The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable value as a
second word. Each variable is on a new line. The following variables are avail-
able in the file header:

FILE_VERSION
Version number of export format, currently 2.0.0.

NOTE: FILE_VERSION is not included for versions earlier than 2.0.0.

NO_OF_FRAMES
Total number of frames in the exported file.

NO_OF_CAMERAS
Number of cameras used in the motion capture.

NO_OF_BODIES
Total number of rigid bodies in the exported file.

FREQUENCY
Measurement frequency used in the motion capture.

NOTE: When using external timebase the frequency is set to the
actual frequency for the Multiplier/Divisor mode and to EXT for
the Edge triggered mode.

DESCRIPTION
At present not in use by QTM.

TIME_STAMP
Date and time when the motion capture was made. The date and time is
followed by a tab character and then the timestamp in seconds from
when the computer was started.

DATA_INCLUDED
Type of data included in the file, i.e. 6D.

EVENT
Each event is added on a new row starting with the word EVENT, fol-
lowed by the name of the event, frame number and time, each separated
by a tab character.

NOTE: The events are only added if the Include events option is
active in the TSV export settings.

BODY_NAMES
Tab-separated list with the names of the rigid bodies in the exported file.

BODY_FILTERS
Tab-separated list with the names of the used filter presets for the
respective rigid bodies.

TRANSLATION_ORIGIN
Tab-separated list with the translation origin for each rigid body. The
alternatives are Global, Relative 'Name of reference rigid body' and Fixed [X,
Y, Z].

ROTATION_ORIGIN
Tab-separated list with the rotation origin for each rigid body. The altern-
atives are Global, Relative 'Name of reference rigid body' and Fixed [Rota-
tion matrix].

Data

On a new line after the last rigid body name follows a tab-separated list of
the data headings for the rigid bodies. The headings are:
X, Y and Z
The position of the origin of the local coordinate system of the rigid
body. Where X, Y and Z are the distance in mm to the origin of the
coordinate system for rigid body data, see chapter "Coordinate sys-
tem for rigid body data" on page 354.

Roll, Pitch and Yaw
Roll, pitch and yaw of the rigid body in degrees.

NOTE: The names and their definition will change if the defin-
ition is changed on the Euler angles page in the Project
options menu.

Residual
The average of the errors (in mm) of each measured marker com-
pared to the 6DOF body definition. This error is probably larger than
the 3D residual.

Rot[0] - Rot[8]
The elements of the rotation matrix for the rigid body. Where the ele-
ments are placed in the matrix according to the following table:

Rot[0] Rot[3] Rot[6]
Rot[1] Rot[4] Rot[7]
Rot[2] Rot[5] Rot[8]

NOTE: For information about the rotation matrix, see "Rotation
angle calculations in QTM" on page 1010.

The data part follows on a new line after Rot[8]. The data is stored in tab-sep-
arated columns, where each row represents a frame. The columns are in the
same order as the heading list described above. If there is more than one rigid
body, their frames are stored on the same rows as the first body. They are just
separated by two tab characters after the Rot[8] data of the previous body.
There are two options in the TSV export which you can use to add more inform-
ation to the file.
When the Export time data for every frame option is checked, two
columns will be added containing time data. The first column contains the
frame number and the second contains the time in seconds relative to the
start of the capture. If the camera frames contain timestamps (SMPTE,
IRIG or Camera time), a third column with timestamp data is added. The
format of the timestamp string depends on the type of timestamp used.

When the Write column header option is checked, a header is added
above each column describing the contents of that column.

NOTE: Each rigid body name is only entered before its respective X
column header; the following headers only include the contents of
the column.
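
To make the column layout concrete, here is a minimal Python sketch that reads the data part of a _6d.tsv file. It assumes the TSV header is included, the time columns are not exported, and each body occupies 16 columns (X, Y, Z, Roll, Pitch, Yaw, Residual, Rot[0]-Rot[8]) separated by an empty column; it also rebuilds the 3x3 rotation matrix from the Rot[0]-Rot[8] columns as laid out in the table above.

    # Minimal sketch for reading a QTM 6DOF TSV export (_6d.tsv).
    # Assumptions: TSV header included, no extra time columns, and 16 data
    # columns per rigid body (X, Y, Z, Roll, Pitch, Yaw, Residual,
    # Rot[0]..Rot[8]) with an empty separator column between bodies.
    import numpy as np

    N_COLS = 16  # data columns per rigid body

    def read_6dof_tsv(path):
        header, data_rows = {}, []
        with open(path) as f:
            for line in f:
                fields = line.rstrip("\n").split("\t")
                try:
                    float(fields[0])            # data rows start with a number
                except ValueError:
                    header[fields[0]] = fields[1:]
                    continue
                data_rows.append([float(v) for v in fields if v != ""])

        bodies = [name for name in header.get("BODY_NAMES", []) if name]
        data = np.array(data_rows)              # frames x (bodies * N_COLS)
        result = {}
        for i, name in enumerate(bodies):
            block = data[:, i * N_COLS:(i + 1) * N_COLS]
            # Rot[0]..Rot[8] are stored column by column, hence the transpose.
            rotation = block[:, 7:16].reshape(-1, 3, 3).transpose(0, 2, 1)
            result[name] = {"position": block[:, 0:3],    # X, Y, Z in mm
                            "rpy": block[:, 3:6],         # roll, pitch, yaw in degrees
                            "residual": block[:, 6],
                            "rotation": rotation}
        return result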

Skeleton data (_s.tsv)

The skeleton data files contain the skeleton data included in the capture. There
will be one file for each skeleton in the capture. The file name has suffix _s_
<skeleton name>, where the last part corresponds to the name of the respect-
ive skeletons on the Skeleton Solver page in Project Options. The file contains
a file header and a data part. The variable names in the file header are followed
by a tab character and then the value. Each variable is on a new line.

Header

The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable value as a
second word. Each variable is on a new line. The following variables are avail-
able in the file header:
FILE_VERSION
Version number of export format, currently 2.0.0.

NOTE: FILE_VERSION is not included for versions earlier than 2.0.0.

NO_OF_FRAMES
Total number of frames in the exported file.

NO_OF_CAMERAS
Number of cameras used in the motion capture.

FREQUENCY
Measurement frequency used in the motion capture.

NOTE: When using external timebase the frequency is set to the
actual frequency for the Multiplier/Divisor mode and to EXT for
the Edge triggered mode.

TIME_STAMP
Date and time when the motion capture was made. The date and time is
followed by a tab character and then the timestamp in seconds from
when the computer was started.

REFERENCE
Reference used for the skeleton data. Global: all segment positions and
rotations are expressed relative to the global coordinate system. Local: all
segment positions and rotations except the Hips segment are relative to
their respective parent segment.

SCALE
The scale setting used for the skeleton. In the TSV file the scale factor is
defined as the inverse ratio. For example, a SCALE value of 0.8 in the TSV
file corresponds to a scale factor of 125% in QTM.

SOLVER
The solver used for the skeleton.

When the Write column header option is checked, a header line is added
above each column describing the contents of that column. The column
headings for each segment are:
Segment name
Name of the respective skeleton segments. There will be no data in
the column below.

X, Y and Z
Segment position data in mm.

QX, QY, QZ, QW
Segment rotation data in global coordinate system expressed as quaternions.

Data

The data is stored in tab-separated columns, where each row represents a
frame. The format of the data part depends on the chosen options.
When Write column header is disabled there are 7 columns per seg-
ment containing the position and orientation data. Columns for the
respective segments are appended without any extra space, making up a
total of 154 columns for all 22 segments.

When Write column header is enabled there are 8 columns per seg-
ment. The first column below the segment name is empty, the remaining
7 columns contain the position and orientation data for the segment. The
total number of columns is 176 for all 22 segments.

When the Export time data for every frame option is checked, two
columns will be added containing time data. The first column contains the
frame number and the second contains the time in seconds relative to the
start of the capture. If the camera frames contain timestamps (SMPTE,
IRIG or Camera time), a third column with timestamp data is added. The
format of the timestamp string depends on the type of timestamp used.
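
As an illustration of this layout, the following Python sketch reads a skeleton TSV file exported with the column headers enabled and without the extra time columns, slicing out the position and quaternion columns per segment. The file name and parsing details are assumptions; adjust them to your export settings.

    # Minimal sketch for reading a QTM skeleton TSV export (_s_<name>.tsv).
    # Assumptions: column headers written, no extra time columns, i.e.
    # 8 columns per segment (empty name column + X, Y, Z, QX, QY, QZ, QW).
    import numpy as np

    def read_skeleton_tsv(path):
        with open(path) as f:
            lines = [line.rstrip("\n").split("\t") for line in f]

        # The column header row is the first row that contains the heading QX.
        header_idx = next(i for i, row in enumerate(lines) if "QX" in row)
        header_row = lines[header_idx]
        width = len(header_row)
        # Segment names head the first column of every group of 8 columns.
        segments = [header_row[i] for i in range(0, width, 8) if header_row[i]]

        data = []
        for row in lines[header_idx + 1:]:
            if not any(row):
                continue
            row = (row + [""] * width)[:width]           # pad/trim ragged rows
            data.append([float(v) if v else np.nan for v in row])
        data = np.array(data)

        result = {}
        for i, name in enumerate(segments):
            block = data[:, i * 8 + 1:i * 8 + 8]         # skip the empty name column
            result[name] = {"position": block[:, 0:3],   # X, Y, Z in mm
                            "quaternion": block[:, 3:7]} # QX, QY, QZ, QW
        return result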
Analog data (_a.tsv)

The analog data files contain the data of the analog capture. Each file contains
the data from one analog board or EMG system. If there is only one source of
analog data the file name ends with _a. If there is more than one source of
analog data the files are numbered in the same order as they appear in the
Data info window (_a_1, _a_2 and so on). There are two parts in the file: file
header and data.

Header

The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable. Each variable
is on a new line. The following variables are available in the file header:
FILE_VERSION
Version number of export format, currently 2.0.0.

NOTE: FILE_VERSION is not included for versions earlier than 2.0.0.

TOT_NO_OF_CHAN
Total number of channels in the exported file.

NO_OF_CALC_CHAN
At present not used by QTM.

TIME_STAMP
Date and time when the measurement was made. The date and time is fol-
lowed by a tab character and then the timestamp in seconds from when
the computer was started.

DESCRIPTION
At present not used by QTM.

DATA_INCLUDED
Type of data included in the file. Set to ANALOG by the QTM software.

CHANNEL_NAMES
List of tab-separated channel names. The number of names in the list cor-
responds to the number of channels given by the TOT_NO_OF_CHAN vari-
able.

CHANNEL_GAIN
List of tab-separated gains for each channel.
At present they are all set to 1 by QTM.

CHANNEL_FREQUENCIES
List of tab-separated analog sampling frequency values for each channel.
The sampling frequencies can differ between channels.

CHANNEL_SAMPLE_COUNTS
List of tab-separated sample count values for each channel. The number
of samples can differ between channels.

FP_LOCATION, FP_CAL, FP_GAIN
At present not used by QTM.

Data

The data part follows on a new line after FP_GAIN. The data of the analog chan-
nels are then stored in tab-separated columns, one column for each channel
and one row per sample. The data is always saved with 6 digits.

NOTE: For analog boards, the units of the exported data are in V. EMG
data from integrated devices are exported in μV or mV, depending on the
units provided by the device. For other data types from integrated
devices, the units are converted to SI units associated with the data type.

There are two options in the TSV export which you can use to add more inform-
ation to the file.
When the Export time data for every frame option is checked, each
data column is preceded by two columns containing time data. The first
column (SAMPLE) contains the sample number and the second column
(TIME) contains the time in seconds relative to the start of the capture.

When the Write column header option is checked, a header is added
above each column describing the contents of that column.
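
A minimal Python sketch for loading an exported analog TSV file is shown below. It assumes that the TSV header, the column headers and the time data columns are included, so that the column header row starts with SAMPLE; the file name is a placeholder.

    # Minimal sketch for loading a QTM analog TSV export (_a.tsv).
    # Assumptions: TSV header, column headers and time data columns included,
    # so the column header row starts with SAMPLE.
    import pandas as pd

    def read_analog_tsv(path):
        with open(path) as f:
            lines = f.readlines()
        header_idx = next(i for i, line in enumerate(lines)
                          if line.startswith("SAMPLE"))
        # Skip the file header so that the column header row becomes the header.
        return pd.read_csv(path, sep="\t", skiprows=header_idx)

    analog = read_analog_tsv("capture_a.tsv")    # placeholder file name
    print(analog.columns.tolist())               # SAMPLE, TIME and the channel names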
Force data (_f.tsv)

The force data files contain the data of the force plates. Each file contains the
data from one force plate. The file names end with _f and are indexed with the
number of the force plate in the same order they appear on the Force data
page (_f_1, _f_2 and so on). There are two parts in the file: file header and data.

Header

The header is included when the Include TSV header option is checked in the
TSV export settings.
In the file header, the variable names are written as one word (without any
embedded spaces) followed by a tab character and the variable. Each variable
is on a new line. The following variables are available in the file header:
NO_OF_SAMPLES
Total number of samples in the exported file.

FREQUENCY
Sampling frequency of the force data.

TIME_STAMP
Date and time when the measurement was made. The date and time is fol-
lowed by a tab character and then the timestamp in seconds from when
the computer was started.

FIRST_SAMPLE
Original number of the first frame in the range that is exported from the
QTM software. The start time of the exported data can then be calculated
as FIRST_SAMPLE / FREQUENCY.

DESCRIPTION
Information about the coordinate system of the force plate in the file. It
can be either Force data in local (force plate) coordinates or Force
data in world (lab) coordinates.

DATA_INCLUDED
Type of data included in the file. Set to Force by the QTM software.

FORCE_PLATE_TYPE
The type of force plate.

FORCE_PLATE_MODEL
The model of the force plate.

FORCE_PLATE_NAME
The name of the force plate defined on the Force data page.

FORCE_PLATE_CORNER_POSX_POSY_X, FORCE_PLATE_CORNER_POSX_
POSY_Y and FORCE_PLATE_CORNER_POSX_POSY_Z
Position (in mm) of the top left corner when looking at the internal force
plate coordinate system. The position is in the measurement coordinate
system.

FORCE_PLATE_CORNER_NEGX_POSY_X, FORCE_PLATE_CORNER_NEGX_
POSY_Y and FORCE_PLATE_CORNER_NEGX_POSY_Z
Position (in mm) of the top right corner when looking at the internal force
plate coordinate system. The position is in the measurement coordinate
system.

FORCE_PLATE_CORNER_NEGX_NEGY_X, FORCE_PLATE_CORNER_NEGX_
NEGY_Y and FORCE_PLATE_CORNER_NEGX_NEGY_Z
Position (in mm) of the bottom left corner when looking at the internal
force plate coordinate system. The position is in the measurement
coordinate system.

FORCE_PLATE_CORNER_POSX_NEGY_X, FORCE_PLATE_CORNER_POSX_
NEGY_Y and FORCE_PLATE_CORNER_POSX_NEGY_Z
Position (in mm) of the bottom right corner when looking at the internal
force plate coordinate system. The position is in the measurement
coordinate system.

FORCE_PLATE_OFFSET_X, FORCE_PLATE_OFFSET_Y and FORCE_PLATE_


OFFSET_Z
Offset (in mm) applied to the force plate position. X, Y and Z are defined in
the internal coordinate system of the force plate.

FORCE_PLATE_LENGTH
The length of the force plate (in mm).

FORCE_PLATE_WIDTH
The width of the force plate (in mm).

Data

The data part follows on a new line after FORCE_PLATE_WIDTH. A header is
always included above each column describing the contents of that column.
The data of the force plate are then stored in tab-separated columns, in the fol-
lowing order: Force_X, Force_Y, Force_Z, Moment_X, Moment_Y, Moment_Z,
COP_X, COP_Y, and COP_Z and one row per sample. The data is in the internal
coordinate system of the force plate and always saved with 6 digits. The data is
in N (forces), Nm (moments) and mm (COP).
There are two options in the TSV export which you can use to add more inform-
ation to the file.
When the Export time data for every frame option is checked, two
columns will be added containing time data. The first column contains the
frame number and the second contains the time in seconds relative to the
start of the capture.

When the Write column header option is checked, a header is added
above each column describing the contents of that column.
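
As an example, the Python sketch below loads an exported force TSV file and computes the peak vertical force. It assumes that the time data columns are not exported, so the column header row starts with Force_X and the columns follow the order listed above; the file name is a placeholder.

    # Minimal sketch for loading a QTM force TSV export (_f_1.tsv).
    # Assumptions: time data columns not exported, so the column header row
    # starts with Force_X and the columns follow the order listed above.
    import pandas as pd

    def read_force_tsv(path):
        with open(path) as f:
            lines = f.readlines()
        header_idx = next(i for i, line in enumerate(lines)
                          if line.startswith("Force_X"))
        return pd.read_csv(path, sep="\t", skiprows=header_idx)

    force = read_force_tsv("capture_f_1.tsv")    # placeholder file name
    # Forces are in N, moments in Nm and COP in mm, in the force plate system.
    print("Peak vertical force (N):", force["Force_Z"].abs().max())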
Eye tracker data (_g.tsv, _e.tsv)

The eye tracker data files contain the data of eye tracking devices included in
the capture. The data exported depends on the eye tracking device used. For
more information about the export of specific eye tracker devices, refer to the
detailed information of the device.
l For Tobii, see " Process and export Tobii gaze vector data" on page 879.

Export to C3D format


Click Export on the File menu and then To C3D to export to the C3D format.
The C3D format is a motion capture industry file format. It supports sim-
ultaneous storage of motion data, analog data and other data types in the
same file.

NOTE: C3D data can be exported even if the file does not contain any 3D
data. For example if you have measured forces or EMG.

For information about the settings see chapter "C3D export" on page 400. The
frames that are included in the export are shown under the Selected Range
heading. The range is set by the measurement range on the Timeline control
bar.
The Zero Force Baseline option is only used on force data and its frames are
set in relation to the exported range. However the range is also displayed in
relation to all frames. Make sure that the force plates are unloaded during the
frames that are used for correction. Otherwise the data will be wrong in, for
example, Visual3D.
C3D file format

The C3D export creates a C3D file, for information about the binary C3D format
see http://www.c3d.org.
The C3D format has the following limitations for analog data:

l The analog frequency should be an integer multiple of the capture rate.

l All analog channels should have the same sample frequency.

If this is not the case for the analog data stored in the QTM file, the analog data
will be resampled at a multiple of the capture rate equal to or higher than the
highest analog frequency. For example, if the marker frequency is 120 Hz, the
EMG frequency is 1500 Hz, and the analog frequency is 1200 Hz, all of the
analog data will be resampled to 1560 Hz.
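
The resampled rate in the example above follows from taking the lowest integer multiple of the capture rate that is equal to or higher than the highest analog frequency. The small Python illustration below reproduces that calculation; it is only a sketch of the rule described here.

    # Illustration of the resampling rule: the analog data is resampled to the
    # lowest integer multiple of the capture rate that is equal to or higher
    # than the highest analog frequency.
    import math

    def c3d_analog_rate(capture_rate, analog_rates):
        return capture_rate * math.ceil(max(analog_rates) / capture_rate)

    print(c3d_analog_rate(120, [1500, 1200]))    # prints 1560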
Multiple subjects can be included in a C3D file, for more information see
chapter "Parameter Groups" on page 401.

Export to MAT format


Click Export on the File menu and then To MAT to export to the MAT format.
The MAT file can then be opened in Matlab. You can select the data that is
exported on the Matlab file export page in the Project options dialog, see
chapter "Matlab file export" on page 402.

MAT file format

When the data from QTM is exported to a MAT file a struct array is saved in the
file. The struct array is named the same as the file. If the file name does not
start with an English letter, a prefix qtm_ will be added to the name of the struc-
ture array. To use the struct array, write the name of the struct array and the
fields with a period between them. If several files have been exported to Mat-
lab, write the variable as QTMmeasurements(1), QTMmeasurements(2) and
so on to get the data of the file.
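
If you prefer to read the exported MAT file in Python instead of Matlab, it can be loaded with SciPy as in the minimal sketch below. The file name is a placeholder, the field names follow the structure documented below, and the simplify_cells option requires a reasonably recent SciPy version.

    # Minimal sketch for reading a QTM MAT export in Python with SciPy.
    # The struct array is named after the file ("capture" is a placeholder);
    # the field names follow the structure documented below.
    from scipy.io import loadmat

    mat = loadmat("capture.mat", simplify_cells=True)
    qtm = mat["capture"]

    print("Frames:", qtm["Frames"])
    print("Frame rate:", qtm["FrameRate"])

    labeled = qtm["Trajectories"]["Labeled"]
    print("Labeled trajectories:", labeled["Count"])
    print("First label:", labeled["Labels"][0])
    # Data dimensions: trajectories x (X, Y, Z, residual) x frames
    print("Data shape:", labeled["Data"].shape)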
The struct array contains different fields depending on if the file includes 3D,
6DOF, analog (including EMG), force, Eye tracker, SMPTE timecode and events
data. The fields and their contents are described below:
FileVersion
Version number of export format (array with 3 elements), currently 2, 0, 0.

NOTE: FileVersion is not included for versions earlier than 2, 0, 0.

File
File name and directory path.

Timestamp
Time when measurement was started. In date format YYYY-MM-DD,
HH:MM:SS.SSS, followed by a tab character and the timestamp in seconds
and ticks from when the computer was started.

NOTE: The time stamp is recalculated when trimming the file.

WARNING: The time stamps indicate the start of the capture
according to the QTM software. The time stamps may not exactly
correspond to the first frame of the capture and it is discouraged to
use them for synchronization with data from other devices con-
nected to the same or another synchronized computer.

StartFrame
The measurement start frame number.

Frames
Number of frames.

FrameRate
Frame rate in frames per second.

NOTE: When using external timebase the frequency is set to the
actual frequency for the Multiplier/Divisor mode and to EXT for
the Edge triggered mode.

Trajectories
Struct with fields Labeled for labeled markers and, optionally Uniden-
tified for unidentified markers. These fields are structs with the following
fields:
Count
Number of trajectories in the window.

Labels
A list of the trajectory labels.

NOTE: This field is only included in the Labeled struct array.

Data
The location of the 3D points (in mm) of the trajectories in the win-
dow. The data is given in a matrix with the dimensions: Trajectories
* X, Y, Z direction and Residual * Frames.

Type
Type specification of the markers per frame. The data is given in a
matrix with the dimensions: Trajectories*Frames. The types have val-
ues 0-4, which indicate: Missing=0, Measured=1, Gap-filled=2, Vir-
tual=3, Edited=4.

TrajectoryType
A list of the type for each trajectory. For information about the tra-
jectory types see chapter "Data in Trajectory info windows" on
page 138.

Analog/EMG
Struct array with data from the analog capture. The analog and EMG data
from integrated wireless EMGs are stored in separate struct arrays, but
the structure is the same.
BoardName
The name of the board that was used in the capture.

NrOfChannels
The number of channels that were used in the capture.

ChannelNumbers
The channel numbers that were used in the capture.

Labels
An array with the names of the channels that were used on the ana-
log board.

Range
The range of the channels on the analog board.

NrOfFrames
The number of exported motion capture data frames.

SamplingFactor
The multiplication factor compared with the motion capture frame
rate. The SamplingFactor is specified per channel in a 1 x NrOfChan-
nels array.

NrOfSamples
The number of exported analog samples per channel (1 x NrOfChan-
nels array).

Frequency
The sampling frequency of the analog data per channel (1 x
NrOfChannels array).

Data
Analog data per channel. The data is formatted as a matrix with the
dimensions NrOfChannels x max(NrOfSamples). In case the chan-
nels have different frequencies, the shorter rows are appended with
NaNs.

NOTE: For analog boards, the units of the exported data are
in V. EMG data from integrated devices are exported in μV or
mV, depending on the units provided by the device. For other
data types from integrated devices, the units are converted to
SI units associated with the data type.

Force
Struct array with data from the force plates. The elements of the struct
array contain the data of the respective force plates present in the file.
ForcePlateName
The name of the force plate that was used in the capture.

NrOfFrames
The number of exported motion capture data frames.

SamplingFactor
The multiplication factor compared with the motion capture frame
rate.

NrOfSamples
The number of samples in the analog capture.

Frequency
The frequency of the analog capture.

Force
The force data in newton (N), the data is given for X, Y and Z dir-
ection.

Moment
The moment data in newton meter (Nm), the data is given for X, Y
and Z direction.

COP
The centre of pressure on the force plate (in mm), the data is given
for X, Y and Z direction.

ForcePlateLocation
The location of the force plate in measurement coordinate system.
The corners are in the order upper left, upper right, lower right and
lower left seen in the force plate coordinate system.

ForcePlateOrientation
Coordinate system in which force data is expressed: 0 (local force
plate coordinates), 1 (global coordinate system).

ForcePlateOffset
The force plate offset values as filled in for the specific force plate.
The order is the same as on each respective force plate calibration
settings in Project options.

RigidBodies
Struct with data for the 6DOF bodies. The struct contains the data of all
rigid bodies present in the file.

Bodies
The number of 6DOF bodies.

Name
The names of the 6DOF bodies.

Filter
Struct array with information about the filter used for the respective
rigid bodies.
Preset
Name of the filter preset used for the rigid body.

CoordinateSystem
Struct array with information about the reference coordinate system
used for the respective rigid bodies.
Reference
Coordinate system option used for the rigid body. The possible
options are Global, Relative and Fixed.

ParentRigidBody
The number of the reference rigid body used for the Relative
coordinate system option. For the Global and Fixed options the
number is set to zero.

DataOrigin
The fixed origin (x,y,z) used for the Fixed coordinate system
option. For the Global and Relative options the value is set to
(0,0,0).

DataRotation
The fixed rotation matrix in relation to the global coordinate
system used for the Fixed coordinate system option. For the
Global and Relative options the value is set to the unit matrix.

Positions
The position of the origin of the measured rigid body’s local coordin-
ate system. It is given as a matrix with the dimensions: Bodies *
Distances (X, Y and Z) * Frames. The distances are in mm to the ori-
gin of the coordinate system of the motion capture.

Rotations
The rotation matrices of the rigid bodies. It is given as a matrix with
the dimensions: Bodies * Rotation matrices (elements 0-8) * Frames.
The elements are placed in the matrix according to the following
table:

[0] [3] [6]
[1] [4] [7]
[2] [5] [8]

NOTE: For information about the rotation matrix, see "Rotation
angle calculations in QTM" on page 1010.

RPYs
The roll, pitch and yaw of each rigid body. It is given as a matrix with
the dimensions: Bodies * Rotation angles (roll, pitch and yaw) *
Frames. The rotation angles are in degrees.

NOTE: The matrix will always be called RPYs even if the defin-
itions are changed on the Euler angles page in the Project
options menu.

Residuals
The residual of the rigid bodies.

Skeletons
Struct array with data of the skeletons. The elements of the struct array
contain the data of the respective skeletons present in the file.
SkeletonName
The name of the skeleton [char array].

Solver
The type of solver used for the skeleton.

Scale
The scale setting used for the skeleton. In the MAT export the scale
factor is defined as the inverse ratio. For example, a Scale value of
0.8 corresponds to a scale factor of 125% in QTM.

Reference
Reference used for the skeleton data. Global: all segment positions
and rotations are expressed relative to the global coordinate sys-
tem. Local: all segment positions and rotations except the Hips seg-
ment are relative to their respective parent segment.

NrOfSegments
Number of segments of the skeleton [double].

SegmentLabels
Names of the segments of the skeleton [1 x NrOfSegments cell array
with char elements].

PositionData
Position data (X, Y, Z) of the segments of the skeleton [3 x NrOfSeg-
ments x Frames double].

RotationData
Segment rotation data in global coordinate system expressed as qua-
ternions (QX, QY, QZ, QW) [4 x NrOfSegments x Frames double].

SMPTETimecode
Struct array with the SMPTE timestamps of the frames in the file.
Hour
The hour of the timestamp.

Minute
The minute of the timestamp.

Second
The second of the timestamp.

Frame
The SMPTE frame number of the timestamp.

Missing
Indicates if the SMPTE timecode is extrapolated if the SMPTE syn-
chronization source is lost during the measurement.

NOTE: In QTM versions 2.15 and earlier an additional field
Subframe is included, representing an index of marker frames
within a SMPTE frame.

NOTE: The values are represented as int64 numbers in Matlab.

IRIGTimecode
Struct array with the IRIG timestamps of the frames in the file.
Year
The year of the timestamp.

Day
The day of the timestamp.

Hour
The hour of the timestamp.

Minute
The minute of the timestamp.

Second
The second of the timestamp.

Tenth
The decisecond of the timestamp.

Missing
Indicates if the timecode is extrapolated if the IRIG synchronization
source is lost during the measurement.

NOTE: The values are represented as int64 numbers in Matlab.

CameraTimecode
Struct array with the camera timestamps of the frames in the file. For
more information about Camera time, see chapter "Timestamp" on
page 284.

NOTE: The CameraTimecode struct array is only available when the
Camera time option is enabled during the capture.

Tick
Tick value of the camera timestamp. This value represents the time in
seconds multiplied by a factor of 10^7.

Missing
Indicates if the timecode is extrapolated if the synchronization
source is lost during the measurement.

NOTE: The values are represented as int64 numbers in Mat-
lab.

Events
Struct array with a list of all the events in the file.

NOTE: This struct array is only included if the capture file contains
events.

Label
The label of the event.

Frame
The corresponding frame of the event. The time will be rounded to
the nearest frame.

Time
The time of the event.

GazeVector and EyeTracker
The contents of these fields are described in the chapter "Process and
export Tobii gaze vector data" on page 879.

Export to AVI file


With the export to AVI file option you can create a video file of any camera view,
2D view or 3D view window. There are three different ways to make an export;
the options for the exports are the same and are described in the chapter "AVI
Export" on page 404.
File menu
On the File/Export/To AVI menu you can export to an AVI file from the cur-
rently open file. You can select any open window or a saved view to be
used for the export. The settings that are used in the export are then
copied to the Project options. This includes the Previous settings view in
the Window settings list.

A 3D view window will have the current rotation and zoom throughout the
AVI file so it is important to set them as you want them. In a 2D view win-
dow the currently active camera views are included in the export; however,
when resizing the video dimensions the positions of the views may
change compared to the original layout.

Use the Preview/Resize button to open a window where you can check
the dimensions and also change the dimensions.

Right-click on 3D or camera view
On the right click menus of the 3D and 2D view windows there is an
option to export that view to an AVI file. In this case only the current view
is available for export and the settings are not copied to the Project
options.

NOTE: Only the camera view that you right-click on is exported
from the 2D view and the dimensions of the video will default to
the number of pixels used in the capture. If you want to export several
camera views then you can use the export to AVI option on the File
menu.

Processing step
The AVI export can be done as a processing step either directly after a meas-
urement, in reprocessing or in batch processing. The view that is used in
the export is the one saved in Previous settings in the Window settings
list. Since the Previous settings view is changed when you make an
export, it is important that you save any view that you want to use with
the Save view option. Then you can select that view again for the pro-
cessing step before the processing.

It is recommended to use a codec when creating the AVI file, since the video file
will be very large uncompressed. For more information about recommended
codecs, see chapter "Recommended codecs" on page 583.
The video export is done by displaying the view that you want to export in a spe-
cial window that is then captured to an AVI file. Because the video has to be dis-
played in the window, the export will be faster on a computer with a good
external graphics board. The processing time can also be reduced by using no
more than 30 Hz and by making the dimensions of the video export smaller.

If you are exporting a Qualisys video to an AVI file, e.g. with a 3D overlay, then
the video image will be linearized. That is, the same parameters that are used to
correct the marker data are applied to the video data to correct for the lens dis-
tortion. Therefore the pixels will not match exactly with the original video
image. The linearization can be turned off with the Show linearized data
option on the 2D view settings page in Project options.
The exported AVI files contain meta information about the QTM version, the
capture time and the SMPTE time code, if used, according to the AVI standard
(isft, idit and ismp).

Export to FBX file


Click Export on the File menu and then To FBX to export to the FBX format.
The FBX format is widely used for exchange of 3D data for animation software.
The FBX format is especially useful for exporting skeleton-related data in QTM
for the use in animation software such as MotionBuilder, Maya or Blender. For
more information about the types of exported data, see chapter "FBX export"
on page 409.

Export to JSON file
Click Export on the File menu and then To JSON... to export to the JSON
format. Select the data components to be exported in the dialog. The JSON
(JavaScript Object Notation) format is an open standard text-based format. For
more information about the types of exported data, see chapter "JSON export"
on page 411.

Export to TRC file
Click Export on the File menu and then To TRC... to export to the TRC format.
The type of exported data is the labeled 3D trajectory data. In the TRC settings
dialog, choose the mapping of the axes. Most commonly, the Y-axis is defined
as the upward pointing axis in the TRC format.

A TRC file (.trc) is a text-based 3D data file format, including meta data about
the capture, such as capture rate. The TRC format is used by OpenSim. If your
file contains force data, it can be exported separately as a STO file, see chapter
"Export to STO file" below.

Export to STO file


Click Export on the File menu and then To STO... to export to the STO format.
In the STO settings dialog, choose the mapping of the axes. The type of expor-
ted data is the ground reaction force (GRF) data from included force plates.
Most commonly, the Y-axis is defined as the upward pointing axis in the STO
format.

A STO file (.sto) is a text-based file format for ground reaction force data, includ-
ing meta data about the capture, such as the sample rate. The STO format is
used by OpenSim, most commonly in combination with a TRC file containing 3D
trajectory data, see chapter "Export to TRC file" on page 744.

External devices and integrations

How to use analog boards


An analog board is needed on the measurement computer to capture analog
signals together with the motion capture data.
QTM supports the following analog boards:
l USB-2533

l USB-1608G

Any analog device which has an output voltage between ± 10 V can be con-
nected to the analog board. QTM will however only make calculations on data
from force plates. For information about force plates and EMG devices, see
chapters "How to use force plates" on page 756 and "How to use EMG" on
page 803.

Installing drivers for the A/D board


The Instacal software (drivers for the A/D board) must be installed on the com-
puter before the installation of the hardware. Instacal is included in the QTM
installer. Alternatively, the latest version of Instacal can always be downloaded
from www.mccdaq.com.
After Instacal has been installed you can go on to install and configure your A/D
board.
l For the 64-channel USB-2533 board, see chapter "Installing the USB-2533
board" on the next page.
l For the 16-channel USB-1608G board, see chapter "Installing the USB-
1608G board" on page 750.
For more information about InstaCal read the README.TXT file from Meas-
urement Computing that is put in the folder for the Instacal software.

Installing the USB-2533 board

Follow this procedure to install the USB-2533 A/D board:

1. Connect the supplied power supply to the POWER connection. It is important
to connect the power supply before the USB, because when the power
adapter is connected to the board it cannot get the power from the USB
connector. With a laptop the power supply must always be used, otherwise
the analog board will not start.
2. Connect the USB port on the analog board to an USB port on the meas-
urement computer.
3. In case the Found New Hardware Wizard is opened, follow these steps.
First, the wizard installs the MCC USB2 Loader Device and then it installs
the USB-2533 board. In the installation use the options No, not this time
and Install the software automatically (Recommended). When done,
restart the computer.
4. Run the Instacal software from the Windows Start menu.

5. Instacal will detect the USB A/D board and display the following dialog.

6. Click OK. The board will then be listed in Instacal as shown below.

7. Right-click on the board and then click on Configure. Then change No. of
Channels to 64 Single Ended. The reason for using single ended is
among other things that force plates usually have this type of signal out-
put.

NOTE: It is possible to run this board in differential mode. However,
the wiring inside the connection box has not been optimized for dif-
ferential signals. For differential signals HI and LO are separated by 8
channels, e.g. Channel 1 HI and LO would be connected to Ch. 1
and Ch. 9, respectively.

It is important to check that the XAPCR Edge setting matches the setting
for the Synchronization output on the Synchronization page in the Pro-
ject options dialog. The default TTL signal polarity is Negative which
matches XAPCR Edge = Falling.

CAUTION: Keep the XAPCR Pin Direction setting to Input.
Changing this setting may cause damage to the board or to the cam-
era.

8. Click OK and then exit Instacal.

When the board is properly installed it will be listed on the Input devices page
in the Project options dialog in QTM.
Installing the USB-1608G board

Installing the USB-1608G board requires that InstaCal has been installed on the
computer. If needed, reinstall QTM with the InstaCal option checked. Follow
this procedure to install the USB A/D board:

1. Connect the USB-1608G board to USB port.

2. Start InstaCal via the Windows start menu.

3. The USB-1608G board should appear in the plug and play dialog. Click OK.

4. Open the configuration dialog (double click or right click and select "Con-
figure").

5. Set "Input mode" to "Single Ended (16 ch).

Connection of analog board
The analog board is activated on the Input Devices page in the Project
options dialog, see chapter "Input Devices" on page 218. The analog board set-
tings are saved in the project with the serial number of the board. So after an
analog board has been used in a project, it remains visible even if you disconnect
the board from the computer or it is not active in Instacal.
It is important to activate channels, where a cable is connected, on the page for
the active analog board in the Project options dialog. It is also a good idea to
not use the channels where nothing is connected, since these channels will only
make the file larger.
The number of analog samples (frames) depends on the Sample rate setting
on the page of the analog board in the Project options dialog, see chapter
"Sample rate" on page 292. The number of analog frames, therefore, does not
have to be the same as the number of frames in the marker capture.
Another setting on the page for the analog board in the Project options dialog
is the remove offset and drift option. Offset means, for example, that a force plate
has an output that differs from 0 V when it is unloaded. Drift means, for example,
that the output from the force plate slowly changes even when it is unloaded.
There is a warning in QTM if the offset value is higher than 5 V, see chapter
"Analog offset warning" on the next page.
To get synchronized analog data and motion capture data, a synchronization
signal must be sent to the analog board, either from a sync out connector on
an Oqus camera or one of the sync out ports of the Camera Sync Unit.

NOTE: For Oqus it is recommended to connect the sync cable to the
Master camera if the synchronization signal is faster than 5000 Hz.

The recommended synchronization option is the frame synchronization. Activ-
ate it with the External sync (frame sync) option for the analog board, see
chapter "Sample rate" on page 292. For this option you must connect the Sync
out connection from an Oqus camera or a Camera Sync Unit to the Sync con-
nection on the back of the box.
For USB-2533 the trigger edge that is used by the board can be set in Instacal
see chapter "Installing the USB-2533 board" on page 748. Make sure that the
XAPCR edge setting for the board matches the TTL signal polarity setting on
the Synchronization page in the Project options dialog.
The other option is to simultaneously start the analog capture with the camera
system. Then the frequency is set internally by the analog board, which means
that there may be a small drift between the systems. To use this option connect
the Sync out connection from an Oqus camera or one of the Sync Outputs of
the Camera Sync Unit to the External trigger or Trig connection on the A/D
board. Then select the Simultaneous start option for the analog board, see
chapter "Sample rate" on page 292.

Analog offset warning


QTM will warn you if the offset that is being removed from a channel is higher
than 5 V. This is because it usually indicates that there is more offset than you
would normally want to remove. For example, if you stand on a force plate when
the measurement starts and the offset is removed, the force data will be
wrong.

The offset compensation is activated for each analog board on its respective
page, see chapter "Compensate for analog offset and drift" on page 295. The
warning is always displayed at the end of a measurement if the offset is too
high.

You have the following two options in the warning after you have captured a
file.

NOTE: If you have selected the Remove offset in real-time option, the
warning will appear directly when you start the capture. The capture will
however continue and if you like you can wait to choose what you want to
do until the capture is finished. If you choose what to do before the file
has finished, the warning will not appear again.

Subtract offset anyway


Subtract the analog offsets on the board in the file and keep the offset
compensation options activated on the Analog board page.

Don't subtract offsets for this board


Do not subtract the offsets on the board in the file and turn off the offset
compensation options on the Analog board page.

NOTE: If you do not want to turn off the offset compensation
options, select Subtract offsets anyway and then right click in the
Data info window to turn off the offset compensation on the chan-
nels with too high analog offset, see chapter "Analog data inform-
ation" on page 174.

If the In RT too option is activated on the Analog board page, then there is
also a warning when you start RT/preview with New on the File menu. If you do
not use the RT output it is recommended to inactivate the Remove offset in

real-time option, because for example if you zero a force plate after you star-
ted RT/preview then you will get the wrong force data during RT. However the
data in a captured file will still be ok, because the offset is calculated again
from the first samples in the file.

You have the following options when the warning is displayed in RT/preview
when you start a new measurement with New on the File menu.

NOTE: The offset check is performed every time the RT has to be restar-
ted, for example if you change a setting in Project options.

Subtract offset anyway


Subtract the analog offsets on the board in RT and keep the offset com-
pensation options activated on the Analog board page.

Don't subtract offsets for this board


Do not subtract the offsets on the board in RT and turn off the offset com-
pensation options on the Analog board page.

NOTE: If you do not want to turn off the offset compensation for all
of the channels, select Go to setting... and then select on which
channels to use the offset compensation, see chapter "Channels" on
page 297.

Go to settings...
Open the Project options dialog and go to the Analog boards page. You
have to select the correct analog board manually to change the offset set-
tings, see chapter "Compensate for analog offset and drift" on page 295.

Analog saturation warning

The analog data is checked after a measurement so that no channels have
been saturated, i.e. that the analog value has not reached the limits of the
range. This warning is shown in the status bar and in the Messages window.

How to use force plates

Digital force plate integrations


Connecting AMTI Digital force plates

When using an AMTI Gen5, OPT-SC (Optima Signal Conditioner) amplifier or the
AccuGait Optimized and AccuPower Optimized plate you need to follow these
steps. The steps are the same for all of the amplifiers and plates so the instruc-
tions only refer to Gen5.

1. Download the drivers to the computer. There is a Download device
drivers link on the Input Devices page in Project Options or you can
copy them from the CD supplied with AMTI Gen5.
2. Connect the AMTI Gen5 to the computer. The installer will ask for the
drivers; browse to the folder where you saved them. Alternatively, follow
the instructions for how to install the AMTI drivers, which are included in
the ZIP-file with the driver.
3. Connect a BNC cable from Sync out on the Oqus camera to the Gen-
lock/Trigger input connection on the AMTI Gen5 amplifier. An adapter is
needed from the BNC connector to the RCA connection of AMTI Gen5; this
is supplied by Qualisys AB.

4. Connect the force plate to the AMTI Gen5 amplifier.

These are the hardware connections that are needed. The next step is to activ-
ate the AMTI Gen5 force plate in QTM.

1. Go to the Input Devices page in the Project options dialog.
a. Activate the AMTI Gen5 amplifiers that are connected to the system.

2. Go to the page for each amplifier under the Force plates page, see
chapter "AMTI Digital force plates" on page 306.
a. Set the Sync source option to the device that the sync cable is con-
nected to.
b. Click on Advanced to change the frequency of the force plate. You
will go to the Synchronization page with just the device used as
Sync source selected. Change the Synchronization output setting
to the desired frequency, see chapter "Synchronization output" on
page 285. It is recommended to use the Camera frequency mul-
tiplier option to set the frequency, because for example when

exporting to C3D then the force frequency must be a multiple of the
marker capture rate.

NOTE: If there is an analog board in the system that is also
frame synchronized, then it is recommended to use the same
camera as synchronization source for the analog board and
the AMTI Gen5. Then you can use the Sample rate option on
the Analog board page to control the frequency on both of the
boards.

3. Go to the Force data page in the Project options dialog. Rename the
force plate if you want it to have a custom name.
4. Open the Force plate page for each AMTI Gen5 force plate.
l Enter the position of the force-plate, see chapter "Force plate loc-
ation" on page 382. It is good to do this after each new calibration
especially if the calibration L is not placed at the same position.
5. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured files, make sure that it is activ-
ated both for Real-time actions and Capture actions.
6. Test the force plates in QTM. If there is no force, check that the Sync cable
is connected correctly.
l If the cable is connected, make a test in AMTI Netforce to see if there
is force data in that program.

NOTE: The AMTI Gen5 plate is reset at the following operations in
QTM:
l New file
l Changing a setting in Project options in QTM during preview
l Just before the Start capture dialog is opened before a capture
l In a batch capture just before QTM starts Waiting for next measurement/trigger

Connecting Arsalis force plates

The Arsalis force plates are digitally integrated in QTM. For further information
about the force plates, refer to the manufacturer's documentation.
The sections below describe how to connect the force plates and how to set
them up in QTM.

Hardware connections

The hardware is connected in the following ways.


Connection of the force plates
The force plates can be connected to the same computer running QTM or
a different computer connected through the same local area network

(LAN). Most commonly, the Arsalis connection box is directly connected to
a computer via an Ethernet adapter. When connecting to the same com-
puter that is running QTM, make sure that it is not connected to the same
physical network as the Qualisys camera system.
These instructions describe a setup in which the force plates are con-
nected to the same computer that is running QTM. For detailed instruc-
tions on how to connect the force plates and install the 3D-Forceplate
software for controlling the force plates, refer to the manufacturer's doc-
umentation.

Synchronization
Hardware synchronization is required. It requires a Camera
Sync Unit for Arqus or Miqus systems, or a Sync/Trigger splitter for Oqus
systems.
Connect one of the Sync out outputs of the Camera Sync Unit to the Trig
In port of the Arsalis connection box. If you are using an Oqus camera as
sync device, use the Sync out connector of the Sync/Trigger splitter. In
the Synchronization settings, set the Synchronization output mode for the
Sync Out port to Multiplier with a multiplier of 1. Make sure that the polar-
ity matches between the settings in QTM and 3D-Forceplate. Use the
default option of Negative polarity for the sync out signal.

Set up a data stream

Before setting up the connection in QTM, start a data stream in the 3D-For-
ceplate software.

1. Start the 3D-Forceplate software. If you need to unlock, try the default
password 1234.

2. Press Stream data, and zero the force plate. This opens up a 3D-For-
ceplate Data Streaming window with information about the connection.

NOTE: If the Stream data option is not available, contact Arsalis sup-
port.

3. If running the 3D-Forceplate software on a different computer, make note
of the Local server IP address for setting up the connection in QTM.
4. Make sure that the synchronization settings for the data stream are cor-
rect. To open the settings dialog, double click on the gray Hardware area
in the 3D-Forceplate data streaming window.
Start trigger if enabled should be set to on a falling edge on TRIG input.

Set up and configuration in QTM

Once the 3D-Forceplate data stream is set up, the Arsalis device can be added
and configured in QTM.

1. Open the Input Devices page in the QTM Project Options.

2. Click the Add Device button, and select Arsalis in the drop down menu.

3. Check the Arsalis item in the Input Devices list. The Arsalis device should
now show up as an input device under the Force Plates category.
4. Open the Arsalis settings page, see chapter "Arsalis" on page 308.

5. Fill in the Local server IP address in the IP address setting. If the force
plate is connected to the same computer that is running QTM, you can
use the localhost IP address (127.0.0.1).
6. Fill in the Port Number of the server. Make sure that it is the same as in
the 3D-Forceplate Streaming Window.
7. Press the Locate Force Plates button in QTM. If the connection is estab-
lished, the information about the Arsalis force plates will show up in the
list below.
8. Make sure that the Trigger mode option is set to start only to get syn-
chronized start of the force plates.
9. Finalize configuring the device by setting the sample rate with the Fre-
quency option.
10. You can also zero the force plates from the Arsalis settings page.

When the force plates have been added to QTM, the next step is to configure
the force data calculation. The steps below presume that the force plate loc-
ations for your setup have already been defined in the 3D-Forceplate software.

1. Go to the Force Data page.

2. Click on Define Plates to import the definitions for all the current Arsalis
force plates to the Force plates list.
3. Make sure that the plates you want to use are enabled in the list. You do
not need to do any other settings for the plates, but you can open the set-
tings for each force plate by double-clicking on it in the list.

4. Make sure that the force plate dimensions are known in the Force plate
status settings list. The force plate dimensions are retrieved automatically
when connecting the force plates in QTM.
5. You can set the location of the force plate with the View/Edit button, but
it is recommended to use the locations that are entered in 3D-Forceplate.
This requires that the L-frame for the camera calibration is placed at the
origin used to define the locations in 3D-Forceplate.
6. Optionally, activate the COP threshold to suppress the visualization of
the force vector in QTM when there is no load on the force plate.
7. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured files, make sure that it is activ-
ated both for Real-time actions and Capture actions.

Capturing, viewing and exporting data

To collect data with the Arsalis force plates, simply start a capture in QTM. The
3D-Forceplate Streaming window should show an active client connection when
QTM is in preview and during capture.
To view the Arsalis data during preview or a capture, open a Data Info window
via the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data. The Arsalis analog data includes forces, COP, free moment
and several other signals. To show the force data calculated by QTM, right-click
on the Data Info window and select Force data.

When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
Connecting Digital Bertec force plates

The Bertec force plates are digitally integrated in QTM when used with the
digital amplifiers AM6500 and AM6800. For further information about the force
plates and how to install them physically, refer to the manufacturer's doc-
umentation.
The sections below describe how to connect the force plates and how to set
them up in QTM.

Hardware connections

The hardware is connected in the following ways.


Connection of the force plates
The force plates are connected with a USB connection to the computer
running QTM. It is recommended to use a USB 3.0 or 3.1 port on the
computer.
Synchronization
Hardware synchronization is required. It requires a Camera
Sync Unit for Arqus or Miqus systems, or a Sync/Trigger splitter for Oqus
systems.

Connect one of the Sync out outputs of the Camera Sync Unit to the
Sync port of the Bertec Amplifier. If you are using an Oqus camera as
sync device, use the Sync out connector of the Sync/Trigger splitter.
The Sync port on the AM6500 requires an adapter from the BNC con-
nector to the SMB connection; this is supplied by Bertec. The Sync port
on the AM6800 amplifier is available on the analog breakout cable sup-
plied with the amplifier.

Software requirements

Make sure that the latest version of the Bertec Digital Plugin is installed. Follow
these steps to download and install the Bertec Digital Plugin:

1. In QTM, open the Input Devices page under Project Options.

2. Click the Download device drivers link.

3. Download the installer for the Bertec digital integration.

4. Run the installer.

Set up and configuration in QTM

Add input device

Once Bertec force plate is connected to the computer, the device can be added
and configured in QTM.

1. Open the Input Devices page in the QTM Project Options.

2. Click the Add Device button, and select Bertec Corporation Device in the
drop down menu.

3. Check the Bertec Corporation Device item in the Input Devices list. The
Bertec Corporation Device device should now show up as an input
device under the Force Plates category.
4. Open the Bertec Corporation Device settings page, see chapter "Bertec
corporation device" on page 311.
5. Press the Detect Plates button in QTM. If the connection is established,
the information about the Bertec force plates will show up in the list
below.
6. Make sure that the Bertec sample rate set with the Frequency option is
the same as the frequency set for the used Synchronization output port
on the Synchronization settings page.

Synchronization settings

To configure the synchronization, follow these steps:

1. Open the Bertec Corporation Device settings page, see chapter "Bertec
corporation device" on page 311.
2. To change the frequency of the force data, set the Frequency value and
press the Sync Settings button. Check that the frequency values for the
force plate channels are updated.
3. Open the Synchronization page under Project Options > Input Devices
> Camera System.
4. Use the following settings for the used Synchronization output (Out 1,
Out 2 or Synchronization output):
l Mode: Independent frequency

l Output Frequency: Set the Output Frequency value to the same


Frequency value as specified on the Bertec Corporation Device set-
tings page.

Set up of force data

When the force plates have been added to QTM, the next step is to configure
the force data calculation.

1. Go to the Force Data page.

2. Click on Define Plates to import the definitions for all the current Bertec
force plates to the Force plates list.
3. Make sure that the plates you want to use are enabled in the list.

4. Make sure that the force plate dimensions are known in the Force plate
status settings list. The force plate dimensions are retrieved automatically
when connecting the force plates in QTM.
5. Set the location of the force plate with the Generate or View/Edit but-
ton, see chapter "Force plate location" on page 382.
6. Optionally, activate the COP threshold to suppress the visualization of
the force vector in QTM when there is no load on the force plate, see
chapter "COP (Center Of Pressure) threshold" on page 386
7. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured files, make sure that it is activ-
ated both for Real-time actions and Capture actions.

Capturing, viewing and exporting data

To collect data with the Bertec force plates, simply start a capture in QTM. The
force data is automatically synchronized with the start of the capture. Make
sure to re-zero the force plates when needed using the Zero Plates button on
the Bertec Corporation Device settings page, see chapter "Bertec corporation
device" on page 311.
To view the Bertec data during preview or a capture, open a Data Info window
via the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data. The Bertec analog data includes forces and moments. To
show the force data calculated by QTM, right-click on the Data Info window
and select Force data.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
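
As an illustration of what "closest integer multiple" means, the sketch below computes the rate that a single analog channel would be resampled to. This is a simplified, hypothetical example in Python, not QTM's export code; the actual rate may be higher when other analog devices in the file use higher sample rates.

# Simplified sketch (not QTM's export code): the C3D analog rate for one
# channel is the integer multiple of the capture frequency closest to the
# channel's own sample rate (and at least 1 x the capture frequency).
def c3d_analog_rate(capture_hz: float, analog_hz: float) -> float:
    factor = max(1, round(analog_hz / capture_hz))
    return factor * capture_hz

print(c3d_analog_rate(100.0, 960.0))    # markers at 100 Hz, analog at 960 Hz -> 1000.0
print(c3d_analog_rate(250.0, 1925.93))  # markers at 250 Hz, EMG at 1925.93 Hz -> 2000.0
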
Connecting Kistler digital force plates

Hardware requirements

The Kistler force plate integration supports Kistler force plates with digital out-
put of the following types:
l Force plates with built in digital output (e.g., Digital force plate Type
9667AA...)
l Force plate with charge output after upgrade (e.g., Type 9281EA/E or
9287CA/C with corresponding DAQ 2.0 Type 5437A1). For more inform-
ation about upgrading force plates to digital force plates, contact Kistler
support.


For the synchronization, you need a Kistler Sync Box Type 5699A and a Qualisys
system with a Camera Sync Unit. If you have an Oqus system, you need a syn-
c/trig splitter cable (art. 510870) connected to the control port of one of the
cameras.

NOTE: If you have force plates connected to a Kistler DAQ Type 5695B,
please refer to chapter "Connecting Kistler DAQ Type 5695B" on
page 780.

Software requirements

The following software is required for configuring and using Kistler digital force
plates with QTM.
Kistler software:
l BioWare (including DataServer)

l SetupWizard.exe: Network setup wizard for Kistler devices

Please refer to Kistler resources or support for information about version requirements and download.
Make sure that the latest version of the Kistler force plate integration for QTM
is installed. Follow these steps to download and install the integration:


1. In QTM, open the Input Devices page under Project Options.

2. Click the Download device drivers link.

3. Download the installer for the Kistler force plate integration.

4. Run the installer.

Configuration of force plates

The Kistler system should be configured for use with QTM according to the
instructions below. For more detailed information, contact Kistler or Qualisys
support.

Network configuration

The force plates are connected to the computer through Ethernet. It is recom-
mended to connect the Kistler system to the same network as the Qualisys cam-
eras. Use the Kistler SetupWizard utility to configure the network settings of the
Kistler devices (force plates and Kistler Sync Box). The Kistler devices should be
configured with static IPv4 addresses.

NOTE: If the static IP addresses of the Kistler devices are known, you can
also configure the Qualisys camera network to use the same subnet as
the Kistler devices, rather than reconfiguring the network settings of the
Kistler devices. For instructions on how to configure the network adapter
settings with QDS, see chapter "Advanced" on page 467.

To configure the network settings of the Kistler devices, follow these steps:

1. Take note of the subnet that is used for the camera network. In QTM, this
can be found on the Camera System page under Project Options > Input
Devices as the Interface in the Camera system settings information tab
(typically, 192.168.254.x).
2. Run SetupWizard.exe and click Find devices. All Kistler devices connected
to the network should show up in the list.


3. Select a force plate and click Next.

4. Set the IPv4 mode to Static IP. Optionally, change the configured name to
help you identify the force plate. Click Next when done.


5. Enter a unique IP address for the device on the subnet noted in step 1.
For example, if the subnet is 192.168.254.x, x can be set to any number
between 2 and 254 (address 1 is reserved for the QTM computer). An
example address plan is shown after this list.

6. Press Next to store the configuration changes to the device. The SetupW-
izard should show the confirmation page.
7. Press Start over to configure the next device, or Exit when all devices are
correctly configured.

NOTE: Once you set the IP addresses of the Kistler devices, you may
need to reboot the Qualisys camera system to make sure that they don't
have overlapping IP addresses.
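
The example below shows a hypothetical address plan for a setup with two force plates and a Sync Box on the default camera subnet. The addresses are only an illustration; any free addresses on the subnet can be used.

QTM computer (camera network adapter)   192.168.254.1
Kistler force plate 1                   192.168.254.2
Kistler force plate 2                   192.168.254.3
Kistler Sync Box Type 5699A             192.168.254.10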

Force plate configuration

Next, the force plate configuration should be defined in the Kistler BioWare
software and stored in a configuration file. Follow these steps:


1. Open the Kistler BioWare program.

2. In the Setup menu, go to Hardware > A/D board and select Ethernet DAQ
Device(s).
3. Open the Device Setup window (Setup > Hardware > Devices) and click
New....

4. Click Find... to discover the connected Kistler force plates.


5. Select the first force plate from the list and click OK.

6. Add the force plate to the Active Devices list.


7. Optionally, you can edit the properties of the force plate, such as the amp-
lifier ranges for shear and vertical forces. These properties will be stored
in the configuration file.

8. Repeat steps 4-6 until all force plates are present in the Active Devices list.


Once all devices have been added to the Active Devices list, you can create the
configuration file. The order of the force plates in the configuration file, and
consequently in the input device list in QTM is the same as that in the Active
Devices list. Follow these steps to create the configuration file:

1. In the Setup menu, click Save DataServer configuration file....

2. Name the file config.xml and save it in the BioWare XML folder (typically
C:\Kistler\BioWare\XML).
3. Open the config.xml file in a text editor and add the following lines after
the <ConfigurationName> line to add the Kistler Sync Box (replace Seri-
alNumber and Address with the actual serial number and IP address,
respectively):

<EthernetTriggerDevice>
  <Type>bio-digital</Type>
  <SerialNumber>1234567</SerialNumber>
  <Manufacturer>Kistler</Manufacturer>
  <Name>bio-digital</Name>
  <Address>123.456.789.123</Address>
</EthernetTriggerDevice>

4. Save the modified config.xml file.

Hardware setup


The Kistler system is connected as follows:

1. Connect up to 16 Kistler digital force plates to the Kistler Sync Box (Type
5699A). The digital Kistler devices can be connected in a daisy chain.
2. Use the Power/Ethernet cable set (Type 5793) to power the Kistler devices
and to connect them to the computer via Ethernet. It is recommended to
use the same network as used for the Qualisys camera system using an
Ethernet switch.

NOTE: Contact Qualisys support for a recommended Ethernet switch.

3. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Sync input of the Kistler Sync Box with a BNC cable. If you are using an
Oqus camera as sync device, use the Sync out connector of the Syn-
c/Trigger splitter.

Set up and configuration in QTM

Add input device

Add the Kistler force plates to QTM:

1. In QTM, open the Input Devices page under Project Options.

2. Click the Add Device button and select Kistler Force Plates in the drop
down menu.

3. Check the Kistler Force Plates item in the Input Devices list. The Kistler
Force Plates device should now show up as an input device under the
Force Plates category. The force plates should be included in the device
list as defined in the Kistler config.xml file.


4. Open the Kistler Force Plates settings page to access the Kistler device
settings, see chapter "Kistler Force Plates" on page 313.
5. To change the frequency of the force data, set the Frequency value and
press the Sync Settings button. Check that the frequency values for the
force plate channels are updated.

NOTE: It is highly recommended to use a sample frequency that is an
integer multiple of the capture frequency. For example, with a camera
capture rate of 250 Hz, a force sample rate of 1000 Hz (4 × 250 Hz) is a
suitable choice.

Synchronization settings

To configure the synchronization, follow these steps:

1. In QTM, open the Synchronization page under Project Options > Input
Devices > Camera System.
2. Use the following settings for the used Synchronization output (Out 1,
Out 2 or Synchronization output):
l Mode: Independent frequency

l Output Frequency: 1 Hz

l TTL signal polarity: Positive

Set up of force data

For setting up the force data in QTM, follow these steps:

1. In QTM, open the Force Data page under Project Options > Processing.

2. Click the Define Plates button to add the Kistler force plates to the Force
Plates list.


3. Make sure that the plates you want to use are enabled in the list.

4. Select a force plate and click the Edit Plate button (or double click) to edit
the force data settings.

5. Make sure that the force plate dimensions are known in the Force plate
status settings list. The force plate dimensions are retrieved automatically
when connecting the force plates in QTM.
6. Set the location of the force plate with the Generate or View/Edit but-
ton, see chapter "Force plate location" on page 382.


7. Optionally, activate the COP threshold to suppress the visualization of
the force vector in QTM when there is no load on the force plate, see
chapter "COP (Center Of Pressure) threshold" on page 386.
8. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured files, make sure that it is activ-
ated both for Real-time actions and Capture actions.

Capturing, viewing and exporting data

To collect data with the Kistler force plates, simply start a capture in QTM. The
force data is automatically synchronized with the start of the capture. The force
plates are automatically re-zeroed when starting a preview or a capture, so
make sure that the force plates are unloaded at these instances.
To view the Kistler data during preview or a capture, open a Data Info window
via the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data. The Kistler analog data includes forces and moments. To
show the force data calculated by QTM, right-click on the Data Info window
and select Force data.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
Connecting Kistler DAQ Type 5695B

Hardware requirements

The Kistler force plate integration supports Kistler force plates that can be con-
nected via the Kistler DAQ Type 5695B.
The following hardware is required:
l Kistler DAQ Type 5695B (or A)

l Sync/trig splitter cable for Kistler DAQ (e.g., art. 230133)

For the synchronization, you need a Qualisys system with a Camera Sync Unit.
If you have an Oqus system, you need a sync/trig splitter cable (art. 510870)
connected to the control port of one of the cameras.


Software requirements

The following software is required for configuring and using the Kistler DAQ
Type 5695B with QTM:
l Instacal (included with QTM, see chapter "Software installation" on
page 54)
l Kistler BioWare (including DataServer)

Please refer to Kistler resources or support for information about version requirements and download.
Make sure that the latest version of the Kistler force plate integration for QTM
is installed. Follow these steps to download and install the integration:

1. In QTM, open the Input Devices page under Project Options.

2. Click the Download device drivers link.

3. Download the installer for the Kistler force plate integration.

4. Run the installer.

Configuration of force plates

The Kistler system should be configured for use with QTM according to the
instructions below. For more detailed information, contact Kistler or Qualisys
support.

Configuration of the Kistler DAQ Type 5695B

The Kistler DAQ must be configured in Instacal:

1. Connect the Kistler DAQ Type 5695B to the computer via USB and make
sure it is switched on.
2. Open the Instacal program. The Kistler DAQ should appear as a USB-2533
board in the list.


3. Double click on the board to open the configuration dialog.

4. Configure the board with the following settings:


l No. of channels: 64 Single Ended

l XAPCR Edge: Falling

5. Click OK and close Instacal.

Force plate configuration

The force plate configuration is done in the Kistler Bioware program.


First, select the A/D board as follows:


1. Open the Bioware program.

2. In the Setup menu, click Hardware > A/D board to open the Data Acquis-
ition Configuration dialog.

3. In the Data Acquisition Dialog, select USB DAQ Device, select the USB-
2533 (Type 5695) board, and click OK.

Continue to configure the connected force plates:

1. In the Setup menu, click Hardware > Devices to open the Device Setup
dialog.


2. Click the New button to add a new force plate.

3. Follow the instructions in the Device Wizard to complete the configuration
of the force plate. Please refer to Kistler documentation and the force
plate calibration sheet to make sure that the configuration is done correctly.


4. Repeat these steps for all connected force plates.

When done with the configuration of the force plates, save the configuration
file:

1. In the Setup menu, click Hardware > Save DataServer Configuration File.

2. Name the file config.xml and save it in the BioWare XML folder (typically
C:\Kistler\BioWare\XML).
The force plate range settings are set in the force plate properties and included
in the config.xml file. If you want to change the range settings, you will need to
change the force plate properties in Bioware and overwrite the config.xml file.


Hardware setup

The Kistler system is connected as follows:

1. Make sure that the Kistler DAQ Type 5695B is connected to the computer
via USB and that it is switched on.
2. Connect up to 8 Kistler digital force plates to the Kistler DAQ.

3. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Sync input of the Kistler Trig/sync splitter cable with a BNC cable. If you
are using an Oqus camera as sync device, use the Sync out connector of
the Sync/Trigger splitter.

Setup and configuration in QTM

Add input device

Add the Kistler force plates to QTM:

1. In QTM, open the Input Devices page under Project Options.

2. Click the Add Device button and select Kistler Force Plates in the drop
down menu.


3. Check the Kistler Force Plates item in the Input Devices list. The Kistler
Force Plates device should now show up as an input device under the
Force Plates category. The force plates should be included in the device
list as defined in the Kistler config.xml file.
4. Open the Kistler Force Plates settings page to access the Kistler device
settings, see chapter "Kistler Force Plates" on page 313.
5. To change the frequency of the force data, set the Frequency value and
press the Sync Settings button. Check that the frequency values for the
force plate channels are updated.

NOTE: It is highly recommended to use a sample frequency that is an integer multiple of the capture frequency.

WARNING: The Kistler DAQ also shows up as a USB-2533 device in the
input device list. Make sure to only check the Kistler Force Plates device,
and NOT the USB-2533 input device.

Synchronization settings

To configure the synchronization, follow these steps:

1. In QTM, open the Synchronization page under Project Options > Input
Devices > Camera System.
2. Use the following settings for the used Synchronization output (Out 1,
Out 2 or Synchronization output):


l Mode: Independent frequency

l Output Frequency: Use the same frequency as set on the Kistler Force Plates settings page.
l TTL signal polarity: Negative

Set up of force data

For setting up the force data in QTM, follow these steps:

1. In QTM, open the Force Data page under Project Options > Processing.

2. Click the Define Plates button to add the Kistler force plates to the Force
Plates list.

3. Make sure that the plates you want to use are enabled in the list.

4. Select a force plate and click the Edit Plate button (or double click) to edit
the force data settings.


5. Make sure that the force plate dimensions are known in the Force plate
status settings list. The force plate dimensions are retrieved automatically
when connecting the force plates in QTM.
6. Set the location of the force plate with the Generate or View/Edit but-
ton, see chapter "Force plate location" on page 382.
7. Optionally, activate the COP threshold to suppress the visualization of
the force vector in QTM when there is no load on the force plate, see
chapter "COP (Center Of Pressure) threshold" on page 386.
8. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured files, make sure that it is activ-
ated both for Real-time actions and Capture actions.

Capturing, viewing and exporting data

To collect data with the Kistler force plates, simply start a capture in QTM. The
force data is automatically synchronized with the start of the capture. The force
plates are automatically re-zeroed when starting a preview or a capture, so
make sure that the force plates are unloaded at these instances.
To view the Kistler data during preview or a capture, open a Data Info window
via the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data. The Kistler analog data includes forces and moments. To
show the force data calculated by QTM, right-click on the Data Info window
and select Force data.


When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.

Connection of analog force plates


With a force plate the force can be captured together with the motion capture.
The force plate is connected to any free channels on the analog board of the
measurement computer, see chapter "How to use analog boards" on page 747.
The data of any force plate with an analog output between ± 10V can be col-
lected into QTM through the supported analog boards. QTM supports cal-
culation of force data from analog data from the force plates of the following
manufacturers: AMTI, Bertec and Kistler.
For information about how to use the force data see chapter "Force data cal-
culation" on page 703. To calculate the force the settings for the force plate
must be specified on the Force plate page in the Project options dialog, see
chapter "Force plate settings" on page 362.
Connecting Kistler force plates

Kistler force plates must have both the analog channels and a control cable con-
nected to the analog board. The following is a description of how to install a
Kistler force plate with an analog board. It only describes the connection from
the Kistler connection box to the analog board. For a description of the con-
nection between the force plate and the connection box, please refer to the
Kistler manual.

NOTE: For information about digital Kistler integrations in QTM, see
chapters "Connecting Kistler digital force plates" on page 768 and
"Connecting Kistler DAQ Type 5695B" on page 780.

Start with the hardware connections to the analog board. The picture below is
an example of the setup.


1. Make sure that the force plate is working in Kistler's BioWare.

2. Check whether you have a Kistler connection box 5606 or Kistler control unit
5233A2.

3. Connect the analog signals (8 BNC cables) from the Kistler box to 8 channels
on the analog board.

l Make sure that the analog channels are connected in the correct order.

l Do not connect the analog signals from one force plate to different ana-
log boards.

4. Connect the digital control cable between the Digital I/O port on the analog
board to which the Kistler force plate channels are connected and the Kistler
unit. The force plate is then controlled with the settings on the Force plate
control settings page in the Project options dialog.
The connection differs between the analog boards. Check that you have
the correct cable with Qualisys AB.
USB-2533
There are two Digital I/O ports (DSUB15) on the front of the analog
board.


5606
Connect the DSUB15 end of the cable (230118) to port A on the ana-
log board. Connect the DSUB37 end of the cable to the Digital input
on the 5606 box, i.e. the bottom right connector.

5233A2
You can either use a cable that controls two force plates per I/O port
(230137) or a cable that controls one force plate on the first I/O port
(230129).
Connect the DSUB15 end of the cable to port A on the analog board.
Connect the DSUB37 end of the cable to the DSUB37 connector on
the 5233A2 units.
Remember to press the Remote on the 5233A2 units to activate the
digital control.

NOTE: If you want to use port B on the analog board you must have
a cable for 2 force plates in port A so that you can control the
3rd and 4th force plates from port B.

The Digital I/O is also used to reset the force plate before a measurement.
It is important to not stand on a Kistler force plate when the reset signal is
sent. It is sent at the following operations in QTM:
l New file

l Changing a setting in the Project options dialog in QTM during preview

l Just before the Start capture dialog is opened before a capture

l In a batch capture just before QTM starts Waiting for next meas-
urement/trigger

NOTE: The auto-zeroing of the force data can be disabled on the
Kistler Force-Plate Control Settings page in the Project Options
dialog, see chapter "Force plate auto-zero" on page 302.


5. Remember to check that the sync out signal from the camera system is con-
nected to the Sync input on the analog board, see chapter "Connection of
analog board" on page 752. Otherwise the analog capture will not start.

These are all the connections needed to connect the force plate to the
analog board. Next, add the force plate in QTM by following these steps:

1. Go to the Analog board settings page in the Project options dialog. Make
sure that it is the analog board where the analog channels and the digital
I/O cable are connected.
a. Activate the channels for the force plate. You can rename the analog
channels to the Kistler signal names so that it is easier to define the
channels for the force plate.
b. Set the Sample rate for the analog board:
External Sync (recommended)
Select the correct synchronization output port and sample rate
to get the desired frequency, see chapter "Sample rate" on
page 292.

Trigger start
Specify the frequency in multiples of the marker capture rate.
For normal gait measurements you can use a sample rate of
600-1000 Hz. For sport measurements you need a bit higher
sample rate.

c. Go to the Force plate control settings page and add the number of
Kistler force plates that you want to control to the list.
2. Create the force plates on the Force data page in the Project options dia-
log.
3. Open the Force plate page.
a. Enter all of the calibration parameters for the force plates, see
chapter "Kistler force plate calibration parameters" on page 370.
They are found in the manual for the force plate. Use the option
Select by forceplate control, so that the ranges used in the cal-
culation are always correct.


b. Select the correct analog channels for each force-plate, see chapter
"Kistler force plate settings" on page 373.
4. Enter the position of the force plate, see chapter "Force plate location" on
page 382. It is good to do this after each new calibration, especially if the
calibration L-frame is not placed at the same position.
5. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured files, make sure that it is activ-
ated both for Real-time actions and Capture actions.
6. Test the force plates in QTM. If there is no force, first check whether there
is a signal on the analog channels.
a. If there are signals on the analog channels, the error is in the settings
in QTM. Check steps 1-5 above.
b. If there are no analog signals:
Disconnect the digital I/O cable from the Kistler unit and connect the
cable for BioWare.
Start a measurement in BioWare and then start a measurement in
QTM at the same time. The analog signals are sent to both BioWare
and QTM, so you can see the force data in both.
l If there are now analog signals in QTM, the control signals are
not sent from QTM to the Kistler force plate. Connect the
digital I/O cable again. Check that it is connected to the correct
ports and that the Remote button is pressed on the 5233A2
control unit.

NOTE: It is possible to see the current ranges on the
5233A2 control unit and on some Kistler amplifiers.
Check them so that the ranges are the same as in QTM.

l If there is still no analog signal, the BNC cables are probably
connected to the wrong channels. Double-check the connections.


Connecting AMTI and Bertec force plates

For AMTI and Bertec force plates only the analog channels need to be con-
nected to the analog board. The following is a description of how to install an
AMTI or Bertec force plate in QTM. It only describes the connection from the
AMTI or Bertec amplifier to the analog board. For a description of the con-
nection between the force plate and the amplifier, please refer to the AMTI or
Bertec manual.

NOTE: For information about digital AMTI and Bertec integrations in
QTM, see chapters "Connecting AMTI Digital force plates" on page 756
and "Connecting Digital Bertec force plates" on page 764.

IMPORTANT: For the AMTI portable plate you need an extra box [art.
no. 230009] from Qualisys AB that outputs the 8 analog channels
as BNC.

Start with the hardware connections to the analog board. The picture below is
an example of the setup.

1. If you have AMTI's or Bertec's software make sure that the force plate is
working in those programs.


2. Connect the analog signals from the AMTI amplifier (6 BNC cables), or from
the AMTI portable or Bertec amplifier (8 BNC cables), to 6 or 8 channels,
respectively, on the analog board.
l Make sure that the analog channels are connected in the correct
order.
l Do not connect the analog signals from one force plate to different
analog boards.
3. Remember to check that the sync out signal from the camera system is
connected to the Sync input on the analog board, see chapter "Con-
nection of analog board" on page 752. Otherwise the analog capture will
not start.
These are all the connections needed to connect the force plate to the
analog board. Next, add the force plate in QTM by following these steps:

1. Go to Analog board settings page in the Project options dialog.


a. Activate the channels for the force plate. You can rename the analog
channels to the force plate signal names so that it is easier to define
the channels for the force plate.
b. Set the Sample rate for the analog board, it is specified in multiples
of the marker capture rate. For normal gait measurements you can
use a sample rate of 600-1000 Hz. For sport measurements you
need a bit higher sample rate.
2. Create the force plates on the Force data page in the Project options dia-
log.
3. Open the Force plate page.
a. Enter all of the calibration parameters for the force plates, see
respectively chapter "AMTI force plate calibration parameters" on
page 363 and "Bertec force plate calibration parameters" on
page 368.


NOTE: Notes on calibration parameters:
l AMTI is supplied with the calibration file, use it to import
the calibration matrix.
l Bertec usually only supplies the diagonal of the cal-
ibration matrix in the manual.

b. Select the correct analog channels for each force-plate, see respect-
ively chapter "AMTI force plate settings" on page 364 and "Bertec
force plate settings" on page 369.
4. Enter the position of the force plate, see chapter "Force plate location" on
page 382. It is good to do this after each new calibration, especially if the
calibration L-frame is not placed at the same position.
5. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured files, make sure that it is activ-
ated both for Real-time actions and Capture actions.
6. Test the force plates in QTM. If there is no force, first check whether there
is a signal on the analog channels.
l If there are signals on the analog channels, the error is in the settings
in QTM. Check steps 1-5 above.
l If there are no analog signals, check whether the BNC cables are
connected to the wrong channels.

How to use instrumented treadmills

Connecting a Gaitway-3D instrumented treadmill


The Gaitway-3D is an instrumented treadmill based on a joint design by
h/p/cosmos and Arsalis. It is a single-belt treadmill that measures the ground reac-
tion forces and torques in three directions. For further information about the
treadmill, refer to the manufacturer's documentation.
The sections below describe how to connect the treadmill and how to set it up
in QTM.


Hardware connections

The hardware is connected in the following ways.


Connection of the treadmill
The treadmill can be connected to the same computer running QTM or a
different computer connected through the same local area network (LAN).
Most commonly, the Gaitway-3D amplifier is directly connected to a com-
puter via an Ethernet adapter. When connecting to the same computer
that is running QTM, make sure that it is not connected to the same phys-
ical network as the Qualisys camera system.

These instructions describe a setup in which the treadmill software is running
on the same computer that is running QTM. For detailed instructions on how
to connect the treadmill and install the Gaitway-3D software for con-
trolling the treadmill, refer to the manufacturer's documentation (manual
reference: TM-MAN-0004-ARS).

Synchronization
The use of hardware synchronization is optional but recommended. Hard-
ware synchronization requires a Camera Sync Unit for Arqus or Miqus sys-
tems, or a Sync/Trigger splitter for Oqus systems.

Connect the MEAS. TIME output of the Camera Sync Unit to the Trigger in
port of the Gaitway-3D amplifier. If you are using an Oqus camera as sync
device, use the Sync out connector of the Sync/Trigger splitter and in the
Synchronization settings, set the Synchronization output mode to Meas-
urement time.


Set up a data stream

Before setting up the connection in QTM, start a data stream in the Gaitway-3D
software.

1. Start the Gaitway-3D software. If you need to unlock, try the default pass-
word 1234.

2. Press Stream data, and zero the force plate. This opens up a Gaitway-3D
Data Streaming window with information about the connection.

NOTE: If the Stream data option is not available, contact h/p/cosmos support.

3. If running the Gaitway-3D software on a different computer, make note of
the Local server IP address for setting up the connection in QTM.
4. Make sure that the synchronization settings for the data stream are cor-
rect. To open the settings dialog, double click on the gray Hardware area
in the Gaitway-3D data streaming window.


The Start trigger if enabled setting should be set to on a falling edge on TRIG input.

Set up and configuration in QTM

Once the Gaitway-3D data stream is set up, the treadmill can be added and con-
figured in QTM.

1. Open the Input Devices page in the QTM Project Options.

2. Click the Add Device button, and select Gaitway-3D Instrumented Treadmill
in the drop down menu.

3. Check the Gaitway-3D item in the Input Devices list. The Gaitway-3D
device should now show up as an input device under the Instrumented
Treadmills category.
4. Open the Gaitway-3D settings page, see chapter "Gaitway-3D" on
page 315.
5. Fill in the Local server IP address of the treadmill. If the treadmill is con-
nected to the same computer that is running QTM, you can use the loc-
alhost IP address (127.0.0.1).


6. Press the Connect button in QTM. If the connection is established, the
information about the Gaitway-3D device should show up in the Gaitway-
3D section on the settings page, and the Gaitway-3D Streaming Window
should show an active client connection.
7. Finalize configuring the device by setting the sample rate. You can also
zero the force plate from the Gaitway-3D settings page.
When the treadmill has been added to QTM, the next step is to configure the
force data calculation.

1. Open the Gaitway-3D force plate settings under Project Options > Pro-
cessing > Force Data.

2. Make sure that the force plate dimensions are known in the Force plate
status settings list. The force plate dimensions are retrieved automatically
when connecting the treadmill in QTM.
3. Set the location of the force plate. The recommended option is to press
the Use default button to set the default location. This requires that the
L-frame is placed at the origin of the treadmill at the rear right corner
when calibrating the camera system, and that it is level with the Gaitway-
3D surface. For alternative options, see chapter "Force plate location" on
page 382.


4. Optionally, activate the COP threshold to suppress the visualization of
the force vector in QTM when there is no load on the force plate. However,
the Gaitway-3D has a built-in COP threshold which automatically detects
whether the subject is walking or running. For walking the threshold is 50 N
and for running it is 150 N.
5. Activate the Calculate force data option on the Processing page. To see
the force both in preview and in captured files, make sure that it is activ-
ated both for Real-time actions and Capture actions.
Capturing, viewing and exporting data

To collect data with the Gaitway-3D, simply start a capture in QTM.


To view the Gaitway-3D data during preview or a capture, open a Data Info win-
dow via the View menu (keyboard shortcut Ctrl + D), right-click in the window
and select Analog data. The Gaitway-3D analog data includes forces, COP, free
moment and several other signals. To show the force data calculated by QTM,
right-click on the Data Info window and select Force data.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.


Decomposition of force data

Both the Gait and the Running Analysis Module support the automatic decom-
position of force data with Arsalis’ software to reconstruct separate ground
reaction forces for the left and right foot. To use this decomposition, make sure
that:
l The Gaitway-3D is connected to the computer when starting the analysis.

l The export of analog data for TSV is enabled in the Project Options.

For more information, refer to the manual of the Gait or Running Module.

NOTE: The decomposition of Gaitway-3D force data requires Gait Module version 2.1.2 or later, or Running Module version 7.0.4 or later.

How to use EMG

Introduction
With an EMG (electromyography) device the muscle activity can be measured
together with the motion capture. QTM can collect EMG data directly from sev-
eral EMG devices. For an overview of integrated EMG devices and instructions
on how to connect them, see chapter "Wireless EMG systems" on the next
page.
It is also possible to connect an EMG device via an analog board. Then the EMG
device is connected to any free channels on the analog board of the meas-
urement computer, see chapter "How to use analog boards" on page 747. The
channels must be activated on the page for the analog board in the Project
options dialog, see chapter "Channels" on page 297. The EMG device must
have an analog output with an output voltage between ± 10 V.


NOTE: QTM can be used to record and plot EMG voltage data with the
mocap data. For more advanced analysis of EMG data you have to export
it to other analysis software.

Wireless EMG systems


Delsys Trigno Integration

This chapter describes the Delsys Trigno integration in QTM. This integration
supports Delsys Trigno Centro and Delsys Research+ systems.

Hardware requirements

The following base stations are supported:


l Delsys Trigno Centro

l Delsys Research+

NOTE: A Delsys Trigger module is required for recording synchronized data with a Delsys Research+ System.

For both types of base stations, all sensors that are compatible with Delsys
Trigno Discover software are supported.
For the synchronization, you need a Qualisys system with a Camera Sync Unit.
If you have an Oqus system, you need a sync/trig splitter cable (art. 510870)
connected to the control port of one of the cameras.

NOTE: For Oqus users.


If you are connecting an analog board or force plates in addition to your
EMG system, you will need a second sync/trig splitter cable. This will be
used to connect the other external system through the control port of a
different Oqus camera. Please contact Qualisys Sales if you need to pur-
chase a second splitter.


Software requirements

The following software is required for configuring and using Delsys Trigno with
QTM:
l Delsys Trigno Discover, version 2.0.1.3 or higher.

Please refer to Delsys resources or support for more information about ver-
sion requirements and download.
Make sure that the latest compatible version of the Delsys Trigno integration
for QTM is installed. Follow these steps to download and install the integration:
l In QTM, open the Input Devices page under Project Options.

l Click the Download device drivers link.

l Download the installer for the Delsys Trigno integration.

l Run the installer.

NOTE: If you need to update the Delsys Trigno Discover software or the
Delsys Trigno integration for QTM, you must first uninstall the previous
version before running the installer.

Hardware setup

Delsys Trigno Centro

The base station is connected as follows:

1. Connect the Delsys Trigno Centro base station to the computer via USB.

2. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
first trigger input of the base station with a BNC cable. If you are using an
Oqus camera as sync device, use the Sync out connector of the Syn-
c/Trigger splitter.


Delsys Research+

The base station is connected as follows:

1. Connect the base station to the computer via USB.

2. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Trigger input of the Delsys Trigger module with a BNC cable. If you are
using an Oqus camera as sync device, use the [Trig in/Sync out] con-
nector of the Sync/Trigger splitter.


Sensor configuration

Before using the Delsys device with QTM for the first time, the sensors must be
configured with the Delsys Trigno Discover software. This is done in a similar
way for Delsys Trigno Centro and Delsys Research+ systems. Follow these steps
to set up and configure the sensors:

1. Connect the Delsys base station to the computer with USB and make sure
it is switched on.
2. Open the Delsys Trigno Discover software.


3. To add sensors, press the + button, pair the first sensor and continue pair-
ing the subsequent sensors. For detailed instructions, see the Delsys
Trigno Discover manual.
4. To choose the sensor mode, press the gear icon for a sensor, and specify
the sensor mode and sample rates in the Sensor Configuration dialog.


5. In the Trial Settings tab, specify the channel names for the channels of the
respective sensors.
6. Press the Preview Stream button and let the data run for a couple of
seconds to apply the configuration.
7. Close the Delsys Trigno Discover software.

For more detailed information, please refer to Delsys help resources or sup-
port.

Setup and configuration in QTM

Add input device

Add the Delsys Trigno device to QTM:

1. In QTM, open the Input Devices page under Project Options.

2. Click the Add Device button and select Delsys Trigno in the drop down menu.

3. Check the Delsys Trigno item in the Input Devices list. The Delsys Trigno
device should now show up as an input device under the EMGs category.

Device settings

The Delsys Trigno device settings are managed via the Delsys Trigno settings
page.


The Delsys Trigno page contains the following buttons to communicate with the
device and a list with settings for the sensors included in the configuration.

Restore Default Settings


Reset settings to their default values.

Synchronize Settings
Synchronize changed settings to the Delsys Trigno device.

The settings list contains a top section with common settings and a section with
individual settings for each sensor.

Common settings
The common settings are always visible.

API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.

Integration version
The version number for the integration.

Start Input Trigger


Check to use hardware synchronized start.


Stop Input Trigger/Output Triggers
Not used.

Individual settings and information for each sensor


The individual settings are displayed for the sensors included in the con-
figuration.

Sensor name, number and type


Each sensor name starts with Delsys Trigno, followed by the sensor
name, number and type.

SID
Serial number of the sensor.

Battery Percentage
Indication of the charge level of the battery. Press the Synchronize
Settings button to update the reading.

Channels
Information about the channels of the sensor (channel name, units,
sample frequency).

Configuration in QTM

Setting up Delsys Trigno in QTM

Follow these steps to set up the Delsys device and sensors in QTM:

1. Make sure that the base station is connected to the computer, and that it
is switched on.
2. Take out the sensors from the charger and switch them on by holding
them against the lock decal on the charging dock.
3. Open the Delsys Trigno settings page under Project Options > Input
Devices > EMGs.
4. Enable the Start Input Trigger check box, and disable the Stop Input Trig-
ger and Output Triggers checkboxes.


5. Press the Synchronize Settings button.

NOTE: During the “Synchronize Settings” process, a Windows
“switch to” dialog box may pop up. Select “switch to”, which will
bring up the Windows panel, which can then be closed by clicking
outside of it.

6. Check that the sensor settings are correct, as configured in the Trigno Dis-
cover software.
l In case you enable or disable sensors, press the Scan button to
update the sensor list.
7. Press OK to close the Project Options dialog.

Synchronization settings

To configure the synchronization, follow these steps:

1. In QTM, open the Synchronization page under Project Options > Input
Devices > Camera System.
2. Use the following settings for the used port Out 1, Out 2 or Syn-
chronization output (Oqus cameras):
l Mode: System Live Time

l TTL signal polarity:


l For the Trigno Centro base station: Positive


l For the Research+ base station: make sure that the polarity cor-
responds to the polarity switch on the Delsys Trigger module.

Capturing, viewing and exporting data

To stream or collect data with Delsys Trigno, simply start a preview or a capture
in QTM.
To view the Delsys Trigno data during preview or in a capture, open a Data
Info window via the View menu (keyboard shortcut Ctrl + D), right-click in the
window and select Display Analog data.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
Delsys Trigno (API integration)

This chapter describes the Delsys Trigno API integration in QTM. The
API integration is recommended if you are using Trigno Avanti-style sensors
(button-less, with a large arrow-shaped status LED). For a complete list of sup-
ported sensors, see "Hardware requirements" on the next page.
The Trigno API integration does not support Trigno legacy sensors (classic and
IM, with a button and a small status LED). If you have Trigno legacy sensors,
you can use the Delsys Trigno SDK integration instead, see "Delsys Trigno EMG
(SDK legacy integration)" on page 826.

IMPORTANT: The Delsys Trigno API integration will be discontinued in
the near future. It is recommended to use the latest Delsys Trigno integ-
ration, see chapter "Delsys Trigno Integration" on page 804.

For using the Delsys Trigno API integration with QTM, follow these steps:

1. Check the hardware requirements

2. Add Delsys Trigno API as input device in QTM, see chapter "Setting up
Delsys Trigno (API) in QTM" on page 816.
3. Choose a synchronization method and connect the equipment accord-
ingly, see chapter "Connecting Delsys Trigno (API)" on page 816.


4. Configure the sensors and channels in QTM, see chapter "Configuration of
Delsys Trigno (API) in QTM" on page 820.
5. Capture, view, and export data, see chapter "Capturing, viewing, and
exporting data" on page 825.

Hardware requirements

Delsys hardware

l Delsys Trigno base station.

NOTE: Trigno Lite is also supported but not recommended for use
with QTM. The Trigno Lite does not support hardware syn-
chronization and the bandwidth is limited to 4 data slots.

l Delsys Trigno sensors, see list of supported sensors below.

Hardware needed for synchronization

l Qualisys Camera Sync Unit, or a sync/splitter cable if you are using an Oqus
system.

l Delsys Trigger Module

The following types of sensors are supported in the Delsys Trigno API integ-
ration.

EMG sensors

l Trigno Avanti Sensor

l Trigno Mini Sensor

l Trigno Duo Sensor

l Trigno Quattro Sensor


l Trigno Spring Contact Adapter

l Trigno Snap Lead Sensor

AUX sensors

l Trigno Analog Input Adapter

l Trigno Goniometer Adapter

l Trigno Load Cell Adapter

l Trigno EKG Biofeedback Sensor

l Trigno 4-Ch FSR Adapter

Sensors with limited support

l Trigno Galileo Sensor

l Trigno Maize Sensor

NOTE: Trigno classic and IM sensors are not supported by the Delsys
API integration. If you have Trigno legacy sensors, you can use the Delsys
Trigno SDK integration, see "Delsys Trigno EMG (SDK legacy integration)"
on page 826.

Version information

The following firmware is required for use with the Delsys API integration:
l Delsys Trigno base station firmware: MA2919-BE1506-DS0806-US2008-
DA0901/0000.

l Sensor firmware: 40.49.

For more information about compatibility of Delsys firmware and software ver-
sions or up/downgrading Delsys firmware, refer to Delsys documentation or
support.


USB drivers

The API integration requires that the 64-bit version of the Delsys USB drivers is
installed. The easiest way to install the correct drivers is by installing the Delsys
Trigno Discover software. For more information, refer to Delsys documentation
or support.

Setting up Delsys Trigno (API) in QTM

For measuring data with the Delsys Trigno API, start by adding it as an input
device as follows.

1. Go to QTM Project Options > Input Devices, and press the Add Device
button.
2. Select Delsys Trigno API from the drop down menu and click OK. The
Delsys Trigno API is now added to the Input devices list.

3. Select Delsys Trigno API in the Input devices list. This will add the Delsys
Trigno API settings page under Input Devices > EMGs.

Connecting Delsys Trigno (API)

The Delsys Trigno base station must be connected via USB to the computer run-
ning QTM. The exact connection of the hardware depends on the method used
for synchronization. Three possible synchronization methods are described
below:

1. Measurement time synchronization

2. Trigger synchronization with Qualisys trigger button

3. Trigger synchronization with Delsys Trigger Module button


NOTE: It is also possible to acquire data with Delsys Trigno in
QTM without hardware synchronization, but this is not recommended.

Measurement time synchronization

When using measurement time synchronization, connect the hardware according to the below schematic.

Follow these steps to connect the hardware and configure the synchronization:

1. Connect the MEAS. TIME output of the Camera Sync Unit to the
START Trigger input of the Delsys Trigger Module, using a BNC cable.
2. In the QTM Project Options, go to Input Devices > EMGs > Delsys
Trigno API and set Synchronization input to Measurement Time.
3. Make sure that the polarities of the synchronization signals are set cor-
rectly, see chapter "Polarity of synchronization signals" on page 819.

NOTE: If you have an Oqus system without a Camera Sync Unit, use the
Sync out connector of the sync/trigger splitter, and set the syn-
chronization output mode to Measurement time in the Synchronization
settings.

Trigger synchronization with Qualisys trigger button

When using a Qualisys trigger button for synchronization, connect the hard-
ware according to the below schematic.


Follow these steps to connect the hardware and configure the synchronization:

1. Use a BNC T-connector to connect the Trig NO port of the Camera Sync
Unit with the Qualisys trigger button and the START Trigger input of
the Delsys Trigger Module.
2. In the QTM Project Options:
a. Go to Input Devices > EMGs > Delsys Trigno API and set Syn-
chronization input to Trigger.

b. Go to Input Devices > Camera System > Synchronization and set
Trig NO function to Start capture.

3. Make sure that the polarities of the synchronization signals are set cor-
rectly, see chapter "Polarity of synchronization signals" on the next page.
4. Use the Qualisys trigger button to start your captures.

NOTE: If you have an Oqus system without a Camera Sync Unit, use the
Trig in connector of the sync/trigger splitter, and set the Trigger port
function to Start Capture in the Synchronization settings.

Trigger synchronization with Delsys Trigger Module button

When using the button on the Delsys Trigger Module, connect the hardware
according to the below schematic.


Follow these steps to connect the hardware and configure the synchronization:

1. Connect the Trig NO port of the Camera Sync Unit to the START Trigger
output of the Delsys Trigger Module, using a BNC cable.
2. In the QTM Project Options:
a. Go to Input Devices > EMGs > Delsys Trigno API and set Syn-
chronization input to Trigger.

b. Go to Input Devices > Camera System > Synchronization and set
Trig NO function to Start capture.

3. Make sure that the polarities of the synchronization signals are set cor-
rectly, see chapter "Polarity of synchronization signals" below.
4. Use the button on the START Trigger input side of the Delsys Trigger Mod-
ule to start your captures.

NOTE: If you have an Oqus system without a Camera Sync Unit, use the
Trig in connector of the sync/trigger splitter, and set the Trigger port
function to Start Capture in the Synchronization settings.

Polarity of synchronization signals

QTM uses by default negative polarity for the Measurement time and Trig NO
synchronization signals, which means that the trigger signal corresponds to a
falling edge. It is therefore recommended to set the Edge selectors on the
Delsys Trigger Module to negative polarity as indicated in the below illustration.


In QTM, the signal polarity settings are found under Project Options > Input
Devices > Synchronization. Make sure that the signal edge/polarity settings
for Trig NO and Measurement time are set to Negative (default values).

Configuration of Delsys Trigno (API) in QTM

The Delsys Trigno API configuration is managed via the Delsys Trigno API set-
tings page under Project Options > Input Devices > EMGs.


Scanning of sensors

The upper section of the Delsys Trigno API settings page displays information
about the connected base station. Make sure that the correct firmware is
installed on the base station, see "Hardware requirements" on page 814.
For populating the sensor list with the available sensors, follow these steps:

1. Activate the Delsys Trigno sensors.

2. Open the Delsys Trigno API settings page in the QTM Project Options.

3. Press the Scan button. A dialog comes up with information of how many
sensors were detected. Press OK to proceed with the detected sensors.
For canceling and keeping the previous list of sensors, press X (close win-
dow) in the upper-right corner.

The detected sensors appear in the order as they are paired to the Trigno base
station. The pairing of sensors is managed in Delsys Trigno Discover software.
For more information about pairing the sensors to the base station, refer to
Delsys documentation or support.


The sensor and channel configuration settings are stored in the QTM project
and retained even in case sensors are absent during a scan.

Configuration of sensors

Once the sensors have been detected, you can view and configure the sensors
in the sensor list.

The sensor list contains the following information:


Header: The header shows the number of data slots that are in use, as
well as the total number of data slots available.

sid: Sensor serial number.

Battery: Battery level of sensor in percent. The battery level is updated during a scan or a measurement.

Type: Sensor type.

Name: Name of sensor (editable field). QTM will assign a default name,
based on the sensor type and the position in the list.

Data Slots: Number of data slots occupied by the sensor. The number of
data slots may depend on the bandwidth required for the selected mode.

Mode: Currently selected sensor mode (drop down menu).

Firmware: Current firmware version of the sensor.

Configuration of the sensors:


Sensor name
Edit the Name field of a sensor if you want to use a custom name. This
will also update the default channel names associated with the sensor in
the channel list.


Sensor mode
Click on the right side of the Mode field to open the drop down menu
with available modes, and select the desired mode. This will also update
the channel information associated with the sensor in the channel list. For
recommendations on sample rates when using multiple sensors, see
"Notes on sample rate per channel" on page 825.

NOTE: When certain channel types are absent in a mode, any cus-
tomizations made to those channels will be erased when finalizing
the changes.

Edited and updated fields in the sensor and channel list are highlighted in yel-
low. The changes will be finalized when pressing Apply or OK. Press the Cancel
button to discard any pending changes, for example if you changed to an
undesired mode leading to loss of custom channel names.

Configuration of channels

The channel list contains the following information:


Row number: Numbering of the available channels in the list.

Sensor: Name of sensor from the sensor list.

Name: Channel name (editable field). The default name assigned by QTM
is based on the data type.

Active: Checkbox for enabling or disabling the channel.

Type: Data type associated with channel.

Default Output Name: Checkbox for use of default or custom name.

Output Name: Name of the channel, editable if Default Output Name is unchecked.

Hz: Sample rate in Hz of channel.

Editing and customization of channel information:


Selection of channels
Enable or disable channels by checking or unchecking the Active check-
box.

Customization of channel names


Uncheck the Default Output Name checkbox to edit the Output Name
field.

Edited and updated fields in the channel list are highlighted in yellow. The
changes will be finalized when pressing Apply or OK. Press the Cancel button
to discard any pending changes.

TIP: Once you have completed the configuration of sensors and channels, create a Project backup, or a Project preset if you want to use the same sensor configuration for multiple projects.

Synchronization method

In the last section of the Delsys Trigno API, you can select the synchronization
method. Make sure that the synchronization method corresponds to the hard-
ware setup, see "Connecting Delsys Trigno (API)" on page 816. The available
options are:
Trigger: Synchronization using a trigger signal.

Measurement time (default): Synchronization using the MEAS. TIME signal from the Camera Sync Unit.

Unsynchronized (not recommended): Data collection without hardware-based synchronization. This option allows for recording data in situations when no hardware for synchronization is available. With this option, the synchronization of Delsys data and mocap data is not guaranteed.



WARNING: The Unsynchronized mode cannot be used in com-
bination with any type of trigger start of captures in QTM.

WARNING: Using the wrong synchronization method for the used hard-
ware setup will lead to synchronization errors.

Notes on sample rate per channel

The sample rates of the Delsys Trigno data channels are defined per channel, depending on the selected sensor modes. In QTM, all channels are automatically upsampled to the rate of the channel with the highest sample rate. This device frequency is determined by the selected sensor modes and is not influenced by the selection of channels in the channel list. The upsampling is done in real time by repeating the current value when no new value is available for a channel. This means that channels with lower sample rates are updated in steps in the analog data stored in QTM.

When measuring with multiple sensors, it is recommended to choose sensor modes with the same EMG frequency whenever possible. When using auxiliary channels (acceleration, gyro, etc.) in combination with EMG, it is recommended to use modes with auxiliary frequencies that have an integer relation to the EMG frequency. For example, when using an EMG frequency of 1925.93 Hz in combination with an acceleration frequency of 148.15 Hz, the sample rate factor is 13.
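
As an illustration of this behavior, the sketch below (plain Python with NumPy, not part of QTM) upsamples a low-rate auxiliary channel by repeating the last available value, which reproduces the stepwise pattern described above. The rates and the array contents are only example values taken from the text.

    import numpy as np

    def sample_and_hold(values, factor):
        # Upsample a channel by an integer factor by repeating each value.
        # This mimics how lower-rate channels appear in the stored analog data.
        return np.repeat(np.asarray(values), factor)

    emg_rate = 1925.93        # EMG frequency in Hz (example mode)
    acc_rate = 148.15         # acceleration frequency in Hz (example mode)
    factor = round(emg_rate / acc_rate)   # integer relation: 13

    acc_samples = np.array([0.0, 0.1, 0.2])        # hypothetical acceleration samples
    acc_at_device_rate = sample_and_hold(acc_samples, factor)
    print(factor, len(acc_at_device_rate))         # 13, 39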

Capturing, viewing, and exporting data

After setting up and configuring the Delsys Trigno device and sensors, you can
start capturing data. Before you start a measurement session, it is good prac-
tice to first scan for the available sensors on the Delsys Trigno API settings
page and make sure that all sensors used in the project are detected and have
sufficient battery level. If the sensors have not been scanned after opening the
QTM project, a scan will be automatically performed at the beginning of the
first measurement.
To view Delsys data during preview or a capture, open a Data Info window via
the View menu (keyboard shortcut Ctrl + D), right-click in the window and
select Analog data.



When exporting, all Delsys data will be exported as analog data from a single
device. The sample rate corresponds to that of the channel with the highest
sample rate, see "Notes on sample rate per channel" on the previous page.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
Delsys Trigno EMG (SDK legacy integration)

QTM supports integration with Delsys Trigno systems. The Delsys Trigno SDK
integration is a legacy integration, which can be used if you have Delsys Trigno
classic or IM sensors. If you have a Delsys system with Trigno Avanti and other
types of button-less sensors, also recognizable by a large arrow-shaped status
LED, it is recommended to use the newer Delsys Trigno integration, see chapter
"Delsys Trigno Integration" on page 804.

IMPORTANT: The Delsys Trigno SDK integration will be discontinued in the near future. It is recommended to use the latest Delsys Trigno integration, see chapter "Delsys Trigno Integration" on page 804.

The Delsys Trigno SDK integration in QTM supports the following types of
Trigno and Trigno Avanti sensors.

Trigno Avanti sensors

l Trigno Avanti Sensor (EMG)

l Trigno Avanti Snap Sensor

l Trigno Avanti Spring Contact Sensor

l Trigno Avanti EKG Sensor

l Trigno Avanti Mini Sensor

Trigno sensors

l Trigno Legacy EMG Sensor

l Trigno Spring Contact Adapter



l Trigno Snap Lead EMG Sensor

l Trigno Standard EMG Sensor

l Trigno EKG Biofeedback Sensor

l Trigno Load Cell Adapter

l Trigno Goniometer Adapter

l Trigno MiniHead Sensor

l Trigno Analog Input Adapter

l Trigno IM Sensor

l Trigno DR Sensor

The following instructions for Delsys Trigno EMG will only cover how to connect
to and capture data in QTM. For more detailed information on EMG and Trigno
please refer to the Delsys documentation.

Trigno installation

Follow these instructions to install a Delsys Trigno system on a new computer for use with the Delsys Trigno SDK integration.

1. Connect and install the Delsys Trigno system according to the chapters
"Trigno computer installation" on the next page and "Trigno syn-
chronization connections" on page 830.
2. Start the Trigno system.

3. Go to the Input Devices page in the Project options dialog.


a. Click on the Add Device option.



b. Select Delsys Trigno SDK (legacy) from the drop-down list and click
OK.
4. Activate the Delsys Trigno SDK in the list.

5. Double-click on the Delsys Trigno SDK line and go to the Delsys Trigno
SDK page.
l Make sure that the Connection settings are working, see chapter
"Delsys Trigno QTM settings" on page 834
6. Activate the EMG and acceleration channels that you want to use.

Trigno computer installation

Before you connect the Trigno system to the computer, you need to install the Delsys Trigno Control Utility program, which is included in the Trigno SDK. It is highly recommended to download the SDK via QTM under Project Options > Input Devices > Download device drivers to make sure that you use a compatible version.
The SDK integration requires that the 32-bit version of the Delsys USB drivers is
installed. The easiest way to install the correct drivers is by installing the Delsys
EMGWorks software. For more information, refer to Delsys documentation or
support.

NOTE: Make sure that your Trigno base station and sensors have the
firmware installed that is compatible with the Delsys SDK version used for
QTM. For upgrading or downgrading firmware, please refer to Delsys doc-
umentation or support.

Follow these instructions to start the Delsys Trigno EMG.

1. Connect the power and USB connector of the Trigno Module.

2. Start the Trigno Control Utility program.



NOTE: We recommend using the Trigno Control Utility that comes with the Trigno SDK, even if you have Delsys EMGworks software installed. Make sure that you run SensorBaseControl.exe, which is located under C:\Program Files (x86)\Delsys, Inc\Trigno SDK.

a. The first time you use the Trigno system on a computer you need to pair the sensors. Click Pair on the sensor images in the program and then hold down the button on the sensor for about 3 seconds until it is detected. For Trigno Avanti sensors, the sensor is paired by tapping it over the built-in magnet of the base station, indicated by the lock decal.
b. Press the Configure button and make sure that the following set-
tings are correct:



l On the tab Digital Output, make sure that the option Back-
wards Compatibility is checked.

WARNING: Unchecking the Backwards Compatibility option will lead to unsynchronized EMG data.

l On the tab Orientation Filter, make sure that the option Turn
on Orientation Filter for all sensors is unchecked.

c. You can configure the sensors (e.g., the gain of the accelerometers)
by pressing the configure button of the sensors. For Avanti IMU
sensors, make sure to use the EMG+IMU mode.

NOTE: QTM automatically sets the Avanti sensor mode to EMG+IMU if the chosen mode is not supported.

d. For more information on how to set up the Trigno system, please refer to the Delsys documentation.

Trigno synchronization connections

For synchronization of the Trigno system with the Qualisys system, the use of the Delsys Trigger Module is required. There are two synchronization options:



Use of the trigger button for synchronization, see chapter "Trigno
trigger connection" below. For this option, the use of a trigger but-
ton is required.

Use of Measurement Time for synchronization, see chapter "Trigno Measurement Time connection" on the next page. Use this option if you want to start a measurement without a trigger button.

For correct synchronization of the EMG data, it is very important to select the
correct Input option corresponding to the used hardware connection in the
Trigno QTM settings, see chapter "Delsys Trigno QTM settings" on page 834.

WARNING: Using the wrong Input option in the QTM settings will lead
to synchronization errors.

Trigno trigger connection

Follow these instructions to connect the trigger to the Trigno EMG.

1. Connect the Delsys Trigger Module to the Trigger port on the Trigno
base station.

2. Put a BNC T-coupling on the Start Input BNC connector (green side of
the module) so that you can also connect a trigger button to the input.
The measurement must be started with the trigger button.
3. Connect the Start Input BNC connector to the trigger input of the cam-
era system.



l When using the Camera Sync Unit, use the Trig NO input on the
Camera Sync Unit.

NOTE: The Trig NC input on the Camera Sync Unit cannot be used with the Qualisys trigger button.

l When using an Oqus camera for synchronization, use the Trig in connector on the control port splitter cable.
4. Make sure that you select the correct starting edge with the Start Input
Edge selector on the module. The default setting is falling edge for the
trigger signal on an Oqus system.

NOTE: The green LED on the trigger button of the Trigger module
lights up when the trigger signal arrives.

In QTM, make sure the synchronization settings are correct.

1. Under Project Options go to Input Devices > EMGs > Delsys Trigno. Set Input to Trigger.

2. Open the Synchronization page in the Project Options and manage the
Trigger port(s) settings, see chapter "Trigger ports" on page 273.
l For Oqus, set Function to Start capture and make sure that TTL sig-
nal edge is set to Negative.
l When using a Camera Sync Unit, set Trig NO: Function to Start cap-
ture and make sure that Trig NO: TTL signal edge is set to Negative.

Trigno Measurement Time connection

Follow these instructions to set up the hardware for synchronization of the Trigno system using Measurement Time.



1. Connect the Delsys Trigger Module to the Trigger port on the Trigno
base station.

2. Connect the Start Input BNC connector (green side of the Delsys Trigger
Module) to the synchronization output of the Qualisys system.
l When using a Camera Sync Unit, use the MEAS. TIME port on the Camera Sync Unit.
l When using an Oqus camera for synchronization, use the Sync out
connector on the control port splitter cable.
3. Make sure that you select the correct starting edge with the Start Input
Edge selector on the Delsys Trigger Module. The default setting in QTM
is negative polarity, or falling edge.

NOTE: The green LED on the trigger button of the Delsys Trigger
Module lights up when the synchronization start signal arrives.

In QTM, make sure the synchronization settings are correct.

1. Under Project Options go to Input Devices > EMGs > Delsys Trigno. Set Input to Measurement Time.



2. Open the Synchronization page under Project Options.

l When using a Camera Sync Unit, make sure that under Measurement
time the TTL signal polarity is set to Negative, see chapter "Meas-
urement time (Camera Sync Unit)" on page 290.

l When using an Oqus camera for synchronization, select the camera to which the splitter cable is connected. Under Synchronization Output set Mode to Measurement time, and make sure that TTL signal polarity is set to Negative, see chapter "Synchronization output" on page 285.

Delsys Trigno QTM settings

The Delsys Trigno SDK page, with settings for the Delsys Trigno EMG, is
included in the Project options dialog when Delsys Trigno is activated on the
Input Devices page. To use Delsys Trigno you need to install the drivers that can be found on the Download device drivers link on the Input Devices page.
For more information about how to use Trigno, see chapter "Delsys Trigno EMG
(SDK legacy integration)" on page 826.
The settings page contains the different options for the Trigno EMG. To change
any Trigno settings that are not included on the page, go to the Trigno Control
Utility.
The following settings are available for Delsys Trigno:
Connection settings
Address
The IP address of the computer that is running the Trigno Control
Utility. The default is localhost which is the same as the QTM com-
puter.

Command port, EMG data port, ACC data port, IM EMG data and
IM Aux port
Ports used to communicate with the Trigno Control Utility. The
defaults are 50040, 50041, 50042, 50043 and 50044.
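
If QTM does not report OK after the port numbers, a quick, informal way to check whether the Trigno Control Utility is reachable is to test the ports from the QTM computer. The sketch below uses only the Python standard library and the default address and ports listed above; it only checks basic TCP reachability and does not speak the Delsys protocol, so treat a failure as a hint rather than a diagnosis.

    import socket

    def port_open(host, port, timeout=2.0):
        # Return True if a TCP connection to host:port succeeds within the timeout.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    host = "localhost"  # the computer running the Trigno Control Utility
    for port in (50040, 50041, 50042, 50043, 50044):
        print(port, "open" if port_open(host, port) else "closed")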

Synchronization settings

Input
Select the synchronization input. The selected input depends on which sig-
nal from the Qualisys system is used as synchronization signal. The
options are:
Trigger: This option should be selected when the trigger button is
connected to the Delsys Trigger Module as described in chapter
"Trigno trigger connection" on page 831. When this option is selec-
ted, the trigger delay from the Qualisys system is compensated for.

Measurement Time: This option should be selected when the synchronization or measurement time output of the Qualisys system is connected to the Delsys Trigger Module as described in chapter "Trigno Measurement Time connection" on page 832.

Use trigger in real time


The Use trigger in real time option activates the use of the trigger button in Real-time mode. When the option is activated, real-time processing will not start until you press the trigger button. This ensures the synchronization in real time. If you do not use the real-time output, you can uncheck the option; then you only need to press the trigger button before the measurement.

NOTE: Activate the Use trigger in real time option when the setup
is finished to minimize the number of times you have to press the
trigger button.

Channels
Channel name
Activate the EMG and Acceleration sensors and enter a name for the channels. Make sure that the channel numbers match those that are active in the Trigno Control Utility. The name of the EMG channel is the same as the Channel name. The auxiliary data channels have the channel name with a suffix, for example _ACC_X, _ACC_Y and _ACC_Z for the acceleration data.

Auxiliary data
Check this option for the channels for which you want to retrieve
auxiliary data from the sensors (e.g., acceleration, gyro, mag-
netometer). If you do not need the auxiliary data, uncheck it to
reduce the amount of analog data in the QTM files.

Making a measurement with Trigno

With the Trigno EMG it is possible to capture up to 16 EMG channels including auxiliary data. Follow these steps to make a measurement with Trigno EMG in QTM.

1. The first time a Trigno EMG is connected to a computer you must follow
the instructions in the chapter "Trigno computer installation" on
page 828.
2. Make sure to set up the synchronization correctly according to the instruc-
tions in chapter "Trigno synchronization connections" on page 830.
3. Make sure that the Trigno Control Utility program is running on the com-
puter. This program is needed for the communication with the EMG sys-
tem.



l Check that the EMG sensors that you want to use are active in the
program and that they are paired with the correct number.
l The EMG+IMU mode is the only supported mode for Avanti sensors.
If another mode is selected in Trigno Control Utility then the mode
is automatically changed to EMG+IMU.
4. Make sure that Delsys Trigno is activated on the Input Devices page.

5. Go to the Delsys Trigno page and select the settings below, for more
information see chapter "Delsys Trigno QTM settings" on page 834.
l Make sure that QTM has a connection to the Trigno Control Utility program. If it does, an OK is shown after the port numbers.
l Activate the EMG and accelerometer sensors you want to use in the
list. Do not activate sensors that are not needed since it will only res-
ult in larger files.
6. Close the Project options dialog and start a new measurement.
l When Use Trigger in Real Time is selected, the Trigno data must be
triggered by an external trigger signal to be synchronized in real
time. Then the dialog below will appear every time you start a new
measurement and every time you change something in the Project
options dialog. If you do not use the real-time output from QTM, disable the Use Trigger in Real Time option on the Delsys EMG page.
This option requires that the trigger start option is used for syn-
chronization.

7. Open the Data info window on the View menu and then right-click and
select Display Analog data.
8. The data from the Delsys EMG is listed with the analog data. The channels
are called the Channel name for the EMG data. The auxiliary channels
are called the Channel name and then _ACC_X and so on. The Board name is Delsys Trigno. (A sketch for separating EMG and auxiliary channels by these name suffixes follows after these steps.)

NOTE: The Trigno EMG and auxiliary data is automatically upsampled to 2000 Hz. The original sample frequencies are 1925.925 Hz for EMG data and 148.148 Hz for auxiliary data.

9. When starting a capture the start of the EMG data will be synchronized
with the motion data according to the selected synchronization method.
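
When post-processing exported analog data, it can be convenient to separate the EMG channels from the auxiliary channels using the suffix convention described in step 8. The sketch below is plain Python, not a QTM function; the channel labels and the list of suffixes are hypothetical examples and should be adapted to your own channel names and activated auxiliary data.

    def group_trigno_channels(labels):
        # Split analog channel labels into EMG and auxiliary channels,
        # based on the "_ACC_X" style suffixes described above.
        aux_suffixes = ("_ACC_X", "_ACC_Y", "_ACC_Z")  # extend if other auxiliary data is used
        emg, aux = [], []
        for label in labels:
            (aux if label.endswith(aux_suffixes) else emg).append(label)
        return emg, aux

    # Hypothetical channel names as they might appear in the Data Info window
    labels = ["Biceps", "Biceps_ACC_X", "Biceps_ACC_Y", "Biceps_ACC_Z", "Triceps"]
    print(group_trigno_channels(labels))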

Export Trigno EMG and auxiliary data

The EMG data is resampled to 2000 Hz via the Delsys SDK when recorded in
QTM. Auxiliary sensor data is upsampled in QTM to 2000 Hz. For the export to
MAT and TSV the sample frequency will be 2000 Hz for all Trigno sensor data.
When you are using C3D export it is recommended to choose a capture fre-
quency with a direct integer relationship to 2000 Hz to avoid additional res-
ampling when exporting to C3D. See chapter "C3D file format" on page 728 for
more information.
Cometa EMG

QTM supports integration with the Cometa Wave Plus wireless EMG system.
The supported sensor types are EMG and IMU sensors. A single Cometa base unit allows the use of a maximum of 16 EMG and/or IMU sensors. It is possible to connect two base units for the use of a maximum of 32 sensors.

NOTE: Refer to Cometa documentation or support for how to set up two base units.

For use with QTM the Cometa system should include the following com-
ponents:



l A base unit

l A docking base with power supply

l EMG and/or IMU sensors

l A trigger box with BNC connector

The following chapters cover how to connect to and capture Cometa EMG and
IMU data in QTM. For more detailed information, please refer to the Cometa
documentation.
There is also a tutorial available at QAcademy for connecting and using Cometa
EMG with QTM.

IMPORTANT: The Cometa EMG integration will be discontinued in the near future. It is recommended to use the latest Cometa Systems integration, see chapter "Cometa Systems" on page 844.

Cometa installation

Before using the Cometa EMG system for the first time you will need to install the drivers. You can download the current drivers via QTM under Project Options > Input Devices > Download device drivers. Follow these steps to install the drivers:

1. Connect the Cometa base unit to a USB port on the QTM computer. Win-
dows will automatically detect the device.
2. Unzip the file containing the drivers (typically called emgmusbdrivers_...zip).
3. In Windows explorer, locate the folder containing the files that are com-
patible with your computer (e.g. Win10\x64).
4. Right-click on the file EmgMUsb.inf and choose Install from the context
menu.
To add the Cometa device as an input device in QTM, follow these steps:

1. Open the Input Devices page in the Project Options dialog.

2. Click on the Add Device option.



3. Select Cometa from the drop-down list and click OK.

4. Select Cometa in the Input Devices list and click OK.

Cometa synchronization setup

For synchronizing Cometa with Qualisys motion capture data, the following
accessories are required:
l A Cometa trigger box with BNC connector.

l A Camera Sync Unit (Miqus or mixed Qualisys system) or an Oqus Sync/Trig splitter cable (Oqus only system).
The synchronization is based on the Measurement Time synchronization out-
put of the Qualisys system. There are two options, depending on your Qualisys
system.
For a Qualisys system including a Camera Sync Unit:
l Connect the MEAS. TIME output of the Camera Sync Unit with a BNC
cable to the BNC connector of the Cometa trigger box.
For an Oqus system without Camera Sync Unit:

1. Connect the Sync/Trig splitter cable to the control port of one of the
Oqus cameras.
2. Connect the Sync out connector of the Oqus splitter cable with a BNC
cable to the BNC connector of the Cometa trigger box.



3. In QTM open the Project Options dialog and go to the Synchronization
page. Select the camera used for synchronization and set the Mode
option under Synchronization output to Measurement time, see
chapter "Synchronization output" on page 285.

NOTE: For Oqus users.


If you are connecting an analog board or force plates in addition to your
EMG system, you will need a second sync/trig splitter cable. This will be
used to connect the other external system through the control port of a
different Oqus camera. Please contact Qualisys Sales if you need to pur-
chase a second splitter.

Cometa QTM settings

The Cometa page, with settings for the Cometa EMG, is included in the Project
options dialog when Cometa is activated on the Input Devices page.

The following settings are available for Cometa EMG.



Calibrate Imu sensors
Press to calibrate the used IMU sensors to compensate for gyroscope and
accelerometer offset. When using fused data, the current orientation of
the sensors is set as start position.

Channels
Channel name: Name of the channel. Click in the text area to edit.

Active: Check sensors that are currently used for data acquisition in
QTM.

Sensor type: Select the type of sensor (Emg or Imu) per channel.

Imu Acquisition Type


Choose the data type for the used IMU sensors. The options are:
RawData 284Hz: Acquisition of 3-axis accelerometer (ACC_X, ACC_Y,
ACC_Z), gyroscope (GYRO_X, GYRO_Y, GYRO_Z) and magnetometer
(MAG_X, MAG_Y, MAG_Z) data at a sample rate of approximately 284
Hz (sample ratio 1/7 of 2000 Hz).

Mixed6xData 142Hz: Acquisition of raw and fused data at a sample rate of approximately 142 Hz (sample ratio 1/14 of 2000 Hz). In addition to the raw data, quaternions (Q_X, Q_Y, Q_Z, Q_W) are included representing the orientation of the IMU sensors.
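
When post-processing the fused IMU data, the quaternion channels can be converted to a rotation matrix. The sketch below is a generic unit-quaternion conversion in plain Python/NumPy, not part of QTM; it assumes the (Q_X, Q_Y, Q_Z, Q_W) ordering listed above, so check the Cometa documentation for the exact conventions (handedness, reference frame) before relying on it.

    import numpy as np

    def quat_to_matrix(qx, qy, qz, qw):
        # Convert a unit quaternion (x, y, z, w) to a 3x3 rotation matrix.
        qx, qy, qz, qw = np.array([qx, qy, qz, qw]) / np.linalg.norm([qx, qy, qz, qw])
        return np.array([
            [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw)],
            [2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw)],
            [2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy)],
        ])

    # The unit quaternion (0, 0, 0, 1) gives the identity rotation
    print(quat_to_matrix(0.0, 0.0, 0.0, 1.0))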

Making a measurement with Cometa

For making a measurement with Cometa in QTM, follow these steps.

1. Set up the Cometa system according to the instructions in chapter


"Cometa installation" on page 839.
2. Set up the synchronization according to the instructions in chapter
"Cometa synchronization setup" on page 840.
3. Take the sensors to be used from the docking base and check that the
LEDs of the assigned channels on the base unit are lighting up.
4. Select the used channels in the QTM settings and make sure that the
sensor type is set correctly, see chapter "Cometa QTM settings" on the
previous page.



5. When using IMU sensors, make sure to select the correct IMU acquisition
type and calibrate the sensors before starting a new series of meas-
urements. It is recommended to regularly recalibrate the sensors. Refer
to Cometa documentation for detailed recommendations.
6. Press New or Ctrl+N to start a measurement.

7. For viewing the EMG and IMU data, open a Data Info window in the View
menu, right-click and select Display Analog data.
8. The data from the Cometa EMG is listed with the analog data. The chan-
nels are called the Channel name for the EMG data. The IMU channels
are called the Channel name and then _ACC_X and so on. The Board
name is Cometa.

9. Start a capture via the Capture dialog (Ctrl+M). The start of the EMG and
IMU data will be synchronized with the motion data.

NOTE: In Preview mode there will be a certain latency of the EMG and IMU data.



NOTE: If hardware synchronization is used and the synchronization signal is not received by the Cometa device, QTM will show the following warning after the capture has finished. When saving the measurement, the data will not be synchronized.

10. EMG and IMU data can be exported to several export formats, see chapter
"Export Cometa EMG and IMU data" below.

Export Cometa EMG and IMU data

Cometa EMG and IMU data can be exported to TSV, MAT and C3D file formats.
For the TSV and MAT export, make sure that the Analog data type is selected.
The sample frequency of all Cometa EMG and IMU data is 2000 Hz. The actual
sample rate of IMU data depends on the used IMU acquisition type, see chapter
"Cometa QTM settings" on page 841.
When you are using C3D export it is recommended to choose a capture fre-
quency with a direct integer relationship to 2000 Hz to avoid additional res-
ampling when exporting to C3D. See chapter "C3D file format" on page 728 for
more information.
Cometa Systems

This chapter describes the Cometa Systems integration in QTM. The following
Cometa systems are supported:
l Cometa WavePlus

l Cometa WaveX

The following chapters cover how to connect and use Cometa System devices
in QTM. For more detailed information about Cometa System devices, please
refer to the Cometa documentation.



Requirements

Hardware requirements

The Cometa Systems integration requires a Qualisys camera system with a Cam-
era Sync Unit. If you have an Oqus system, you need a sync/trig splitter cable
(art. 510870) connected to the control port of one of the cameras.

NOTE: For Oqus users.


If you are connecting an analog board or force plates in addition to your
EMG system, you will need a second sync/trig splitter cable. This will be
used to connect the other external system through the control port of a
different Oqus camera. Please contact Qualisys Sales if you need to pur-
chase a second splitter.

The following Cometa systems are supported.

WavePlus
For use with QTM the Cometa WavePlus system should include the following
components:
l A receiver unit

l A docking base with power supply

l WavePlus compatible sensors: Mini Wave EMG, WaveTrack IMU, Pico EMG

l A trigger box with BNC connector

WaveX
For use with QTM the Cometa WaveX system should include the following com-
ponents:
l A receiver unit with power supply

l A docking base with power supply



l WaveX compatible sensors: MiniX, PicoX, TrackX, PicoLite

l A trigger box with BNC connector

Driver install
Before using the Cometa system for the first time you will need to install the drivers. You can download the current drivers via QTM under Project Options > Input Devices > Download device drivers. The following drivers are available:
l For WavePlus, download the file emgmusbdrivers_...zip.

l For WaveX, download the file WaveX_driver_...zip.

Follow these steps to install the drivers:

1. Connect the Cometa base unit to a USB port on the QTM computer. Windows
will automatically detect the device.

2. Unzip the file containing the drivers.

3. In Windows explorer, locate the folder containing the files that are com-
patible with your computer (e.g. Win10\x64).

4. Right-click on the file EmgMUsb.inf and choose Install from the context
menu.

For more information or troubleshooting, please refer to Cometa documentation or support.

Software requirements

The following software is required for configuring and using Cometa Systems
with QTM:
Cometa software:
l EMG and Motion Tools



The Cometa EMG and Motion Tools software comes with your Cometa system, or can be downloaded from the Cometa website at https://www.cometasystems.com/emg-and-motion-tools/. Please refer to Cometa resources or support for more information.
Make sure that the latest compatible version of the Cometa Systems integration
for QTM is installed. Follow these steps to download and install the integration:

1. In QTM, open the Input Devices page under Project Options.

2. Click the Download device drivers link.

3. Download the installer for the Cometa Systems integration.

4. Run the installer.

Hardware setup

How to connect

The Cometa system (WavePlus or WaveX) is connected as follows:

1. Connect the Cometa Receiver unit with USB to the computer running
QTM.
2. Connect the Synchronization output (Out 1 or Out 2) of the Camera Sync Unit to the Cometa trigger box connected to the Cometa Receiver. If you are using an Oqus camera as sync device, use the Sync out connector of the Sync/Trigger splitter.



For operating the Cometa system, please refer to Cometa documentation.

Sensor configuration

The sensor configuration is defined in the Cometa EMG and Motion Tools soft-
ware. The software can be used to create sensor configuration files for both
WavePlus and WaveX systems. The sensor configurations contain information
about the sensor types and labels that are required for use in QTM. You can create multiple configurations that can later be selected in the device
setup in QTM.
The sensor configurations are created as outlined below. For detailed inform-
ation, please refer to the Cometa documentation.

1. Open the EMGandMotionTools software. The device connection should be recognized.
2. Click the gear icon to open the Data Capture Configuration window.

3. In the Sensors tab, select the sensor modes for the respective sensors and
choose the data protocols used for the respective modes.



4. In the EMG tab, define the labels for the respective EMG channels. This
can be done by dragging and dropping the muscles indicated in the
Muscle map onto the channel name fields.

5. In the IMU tab, define the labels for the respective IMU sensors. This can
be done by dragging and dropping the joints from the Joints map onto the
sensor name fields.



6. When done, press the Save button to save the configuration. Choose a
folder to save the configuration in and remember it for the device setup
in QTM.

Setting up Cometa Systems in QTM

Add input device

To add the Cometa Systems device to QTM, follow these steps:

1. In QTM, open the Input Devices page in Project Options.

2. Click the Add Device button, select Cometa Systems from the list, and
click OK to add it to the Input Devices list.
3. Check Cometa Systems in the Input Devices list. This will add the
Cometa Systems settings page under Input Devices > EMGs.

Device settings

The Cometa Systems configuration is managed via the Cometa Systems set-
tings page under Project Options > Input Devices > EMGs.



The Cometa Systems settings page contains the following buttons to com-
municate with the device and a list with settings for the sensors included in the
configuration.
Restore Default Settings
Reset settings to their default values.

Synchronize Settings
Synchronize changed settings with the Cometa Systems device.

Calibrate IMU
Calibrate the IMU sensors.

The settings list contains a top section with common settings and a section with
sensor information.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.

Integration version
The version number for the integration.



Configuration file path
Text field for specifying the path at which the Cometa configuration
files are stored.

File name
Name of the selected configuration file. If there are multiple con-
figuration files, they can be selected from the dropdown menu.

Message
Display of messages from the Cometa Systems device.

Protocol information
Information about the used protocols and sample rates. In case of
multiple sample rates, all channels are resampled to the highest
sample rate.

Cometa Systems device (EMG)


List of sensors and their status or type according to the used con-
figuration. Note that one sensor can be associated with multiple data
channels.

Configuration in QTM

Select sensor configuration

Follow these steps to select the sensor configuration that you created earlier
with the EMGandMotionTools software (see chapter "Sensor configuration" on
page 848):

1. In Configuration file path, specify the folder name containing the stored
Cometa sensor configurations.
2. Press the Synchronize Settings button.

3. Under File name, select the configuration file you want to use.

4. Press the Synchronize Settings button again. The message field should now show that the configuration was loaded successfully, and the protocol and channel information should be updated.



Synchronization settings
The synchronization output port (Out 1 or Out 2 on the Camera Sync Unit) con-
nected to the Cometa trigger box must be configured as follows:

1. Open the Synchronization page under Project Options > Input Devices
> Camera System.
2. Go to the settings for the used synchronization output (Out 1, Out 2 or
Synchronization output), and configure it as follows:

l Mode: System live time

l TTL signal polarity: Negative

Capturing, viewing and exporting data

After setting up and configuring the Cometa Systems device and sensors, you
can start capturing data. Before you start a measurement session, it is good
practice to make sure that all sensors used in the project are connected and
have sufficient battery level.

Calibration of IMU sensors

If you are using IMU sensors, you must calibrate them before starting the meas-
urements. This is needed to compensate for gyroscope and accelerometer off-
set. Follow these steps to calibrate the IMU sensors:

1. Make sure the camera system is idle.

2. Place all sensors on a flat surface, oriented in the same direction.

3. Open the Cometa Systems settings page under Project Options > Input
Devices > EMGs.
4. Press the Calibrate IMU button and wait a few seconds.

5. Press OK to close the settings page.

For more information about IMU sensor calibration, please refer to Cometa doc-
umentation.



Viewing data

To view Cometa Systems data during preview or a capture, open a Data Info
window via the View menu (keyboard shortcut Ctrl + D), right-click in the win-
dow and select Analog data.

Exporting data

When exporting, all Cometa Systems data will be exported as analog data from
a single device. The sample rate is normally fixed at 2000 Hz, except for WaveX
if there are no sensors in EMG mode included in the sensor configuration. In
the latter case, the output frequency corresponds to the selected IMU mode.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.
Noraxon EMG

This chapter describes the Noraxon EMG integration for QTM. The following
Noraxon systems are supported:
l Noraxon Ultium EMG

l Noraxon Desktop DTS EMG

The following instructions will cover how to connect and use Noraxon EMG
devices in QTM. For more detailed information about Noraxon EMG, please
refer to the Noraxon documentation.

Requirements

Hardware requirements

The following hardware is required:


l A Noraxon receiver (Ultium or Desktop DTS)

l Noraxon EMG sensors and a sensor charging dock (IMU and Motion sensors are not supported)
l A Phono 3.5 mm to BNC sync cable (art. 230049, not included with the Nor-
axon equipment), or a Phono 3.5 mm (mono) to BNC-Female adapter.



For the synchronization, you need a Qualisys system with a Camera Sync Unit.
If you have an Oqus system, you need a sync/trig splitter cable (art. 510870)
connected to the control port of one of the cameras.

NOTE: For Oqus users.


If you are connecting an analog board or force plates in addition to your
EMG system, you will need a second sync/trig splitter cable. This will be
used to connect the other external system through the control port of a
different Oqus camera. Please contact Qualisys Sales if you need to pur-
chase a second splitter.

For connecting the receiver via USB to the computer, you need the Noraxon
USB driver, which can be downloaded from https://www.noraxon.com/noraxon-download/noraxon-usb-driver/.
For more detailed information about Noraxon devices, refer to Noraxon doc-
umentation.

Software requirements

Make sure that the latest version of the Noraxon EMG integration for QTM is
installed. Follow these steps to download and install the integration:

1. In QTM, open the Input Devices page under Project Options.

2. Click the Download device drivers link.

3. Download the installer for the Noraxon EMG integration.

4. Run the installer.

Hardware setup and configuration

The Noraxon EMG device should be configured for use with QTM according to
the instructions below. For more detailed information, contact Noraxon or
Qualisys support.

Add input device

First, you must add the Noraxon EMG device to QTM:



1. In QTM, open the Input Devices page under Project Options.

2. Click the Add Device button and select Noraxon EMG in the drop down
menu.

3. Check the Noraxon EMG item in the Input Devices list. The Noraxon EMG
device should now show up as an input device under the EMGs category.

Hardware setup

Noraxon Ultium EMG


Hardware connections

The Noraxon Ultium EMG device is connected as follows:



1. Connect the Noraxon Ultium receiver to the computer via USB.

2. Connect the sensor docking station to the Ultium receiver with the ded-
icated cable.
3. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Sync input of the Ultium receiver with a phono 3.5 mm to BNC cable. If
you are using an Oqus camera as sync device, use the Sync out connector
of the Sync/Trigger splitter.
Device and sensor configuration

Follow these steps to set up your Noraxon Ultium EMG device and sensor configuration.

1. Open the Noraxon EMG settings page under Project Options > Input
Devices > EMGs.
2. Press the Setup button to open the Noraxon Hardware setup program. The
Ultium device should be shown as one of the detected devices.
3. Drag the Ultium device to the Selected Devices tab. This will open the
Ultium Setup dialog (if not, double click on the Ultium device).

4. In the Ultium General setup, add your sensors to the configuration. The
easiest way is to press the Detect Sensors in Chargers button, making
sure that all sensors you want to use are present in the connected sensor
docking stations.
When all sensors are listed, you can change the labels. The labels will be
used as the channel names in QTM.



Make sure to check the Use Noraxon MyoSync checkbox.

5. Click the Advanced button to open the Advanced setup dialog.

6. In the Advanced setup page, make sure to check the Invert sync input
checkbox.
Optionally, check the Enable EMG IMU accel and/or Enable EMG IMU
gyro & mag checkboxes to include auxiliary data from the EMG sensors.

7. Click OK to store the configuration and exit the hardware setup.



Noraxon Desktop DTS
Hardware connections

The Noraxon Desktop DTS EMG device is connected as follows:

1. Connect the Noraxon Desktop DTS receiver to the computer via USB.

2. Make sure that the sensors are charged in the sensor docking station.

3. Connect the Out 1 or Out 2 port of the Qualisys Camera Sync Unit to the
Sync in input of the Desktop DTS receiver with a phono 3.5 mm to BNC
cable. If you are using an Oqus camera as sync device, use the Sync out
connector of the Sync/Trigger splitter.
Device and sensor configuration

Follow these steps to set up your Noraxon Desktop DTS device and sensor configuration.

1. Open the Noraxon EMG settings page under Project Options > Input
Devices > EMGs.
2. Press the Setup button to open the Noraxon Hardware setup program. The
Desktop DTS device should be shown as one of the detected devices.



3. Drag the Desktop DTS device to the Selected Devices tab. This will open the
Desktop DTS Hardware Setup page.
4. Press the Detect button to detect the configuration of the connected
Desktop DTS device. The sensor and foot switch lists should now show up.

5. Add your sensors to the configuration by typing in their serial numbers.


You can change the labels, which will be used as the channel names in
QTM.



6. Click the General button to open the General setup dialog.

7. In the General setup page, select the following settings:


a. Sync mode: MyoSync (input)

b. Hardware sync channel: MyoSync(Digital Channel 1)



8. Click OK to store the configuration and exit the hardware setup.

Configuration in QTM

Device settings

The Noraxon EMG device settings are managed via the Noraxon EMG settings
page under Project Options > Input devices > EMGs.



The Noraxon EMG page contains the following buttons to communicate with
the device:
Restore Default Settings
The restore button has no function, since device setup and configuration
is done via the Noraxon EMG interface, see Setup button.

Synchronize Settings
Update the device and channel information.

Setup
Open the Noraxon EMG hardware setup interface for modifying the
device and channel configuration.

The settings list contains a top section with common settings and a section with
device and channel information.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.

Integration version
The version number for the integration.

Noraxon EMG device settings


Name, Manufacturer, Product, SDK version, Channels
Information about the Noraxon EMG device and channel con-
figuration.

Synchronization settings

To configure the synchronization, follow these steps:

1. In QTM, open the Synchronization page under Project Options > Input
Devices > Camera System.
2. Use the following settings for the used port (Out 1, Out 2 or Syn-
chronization output):



l Mode: System live time

l TTL signal polarity: Negative

Capturing, viewing and exporting data

To collect data with Noraxon EMG, simply start a capture in QTM.


To view the Noraxon EMG data during preview or in a capture, open a Data
Info window via the View menu (keyboard shortcut Ctrl + D), right-click in the
window and select Analog data.
When exporting to C3D, the analog data will be resampled to the closest
integer multiple of the capture frequency, or higher depending on all analog
data stored in the QTM file, see chapter "C3D file format" on page 728.

How to use eye tracker devices

Eye tracking hardware in QTM


There are two levels of integration for the eye trackers. Some eye trackers are integrated in QTM so that their data is captured together with the rest of the data. Other eye trackers capture their data in their own program, and the data can then be combined in other software such as Matlab.
Integrated eye trackers in QTM
Tobii - www.tobii.com
QTM supports Tobii Pro Glasses 2 and 3. The gaze vector data from
Tobii can be captured directly in QTM. The gaze vectors can then be
viewed and exported. The settings and how to handle the data is
described in the chapter "Tobii eye trackers" on the next page.

Eye trackers captured in external software


ASL - www.asleyetracking.com
ASL has made an integration called EyeHead to their Mobile Eye EyeVison program that retrieves 6DOF data from QTM in real time. In addition to head tracking, the ASL integration can also calculate gaze vectors and intersection points for gaze vectors and surfaces.
For a basic description of the integration see chapter "ASL EyeHead
integration" on page 889. For any detailed question about the ASL
integration please refer to ASL support or their manual.

Smart-Eye - smarteye.se
For the Smart Eye hardware it is possible for the Qualisys system to lock onto the capture frequency of the Smart Eye camera. The start synchronization is then handled via a real-time event from QTM that is sent to the Smart Eye software. Finally, the Smart Eye and Qualisys systems can be aligned via markers on the Smart Eye calibration equipment. For questions about the Smart Eye integration please contact [email protected].

Tobii eye trackers


QTM supports Tobii Pro Glasses 2 and 3. The eye trackers are integrated with
QTM so that you can view and export gaze vector data. For the integration,
Qualisys-Tobii connectivity kits for Tobii 2 and Tobii 3 are available, containing:
l Attachments for 6DOF tracking of the glasses (specific for Tobii 2 or Tobii 3)

l A cable for hardware synchronization

The connectivity kit can be purchased from Tobii or via Qualisys AB.
To use Tobii eye trackers in QTM, follow these steps:
l Set up the Tobii Glasses in QTM, see chapters "Setting up Tobii Pro
Glasses 2 in QTM" on the next page or "Setting up Tobii Pro Glasses 3 in
QTM" on page 868.
l Adding a gaze vector in QTM, see chapter "How to use gaze vectors in
QTM" on page 873.
For information about capturing and processing data, refer to chapters "Making a measurement with Tobii" on page 878 and "Process and export Tobii gaze vector data" on page 879.
For more information about the eye tracker data and the available gaze vector
settings, see chapters "Tobii data in QTM" on page 883 and "Gaze vector set-
tings" on page 887.



Setting up Tobii Pro Glasses 2 in QTM

Set up and configuration in QTM

To connect the Tobii Pro Glasses 2 eye tracker to QTM, the QTM computer and
the Tobii Recorder must be connected to the same local network. The Tobii
Recorder can be connected via WiFi. Follow these instructions to add Tobii to
your project:

1. Configure the Tobii glasses network connection (WiFi) using the Tobii Pro
Glasses Controller application.
2. To add the Tobii eye tracker device, open the Input Devices page in the
Project options dialog and click on Add device.

3. Select the Tobii Pro Glasses 2 from the list of devices and click OK.

4. Make sure that the Tobii device is enabled in the list. Then double-click on
the Tobii line in the list to open the Tobii page.
5. On the Tobii page enter the following information.

Address
The IP address or name of the Tobii glasses. Press the Find Glasses but-
ton to automatically locate them on the network and update the address
field.



Capture rate
Select capture rate. Note that not all versions of the glasses support 100
Hz.
Hardware Sync
Enable synchronized capture start between QTM and Tobii glasses. See
chapter "Setting up hardware synchronization" below for how to set up
hardware synchronization.
Offset
The offset of the Tobii data compared to the marker data. You can test this by capturing a subject looking at a moving marker. After that you will need to reprocess the file with different data offsets and find the offset for which the gaze diverges as little as possible from the marker (see the sketch at the end of this section for one way to automate this search). One way to see this is to turn on the Gaze vector trace on the 3D View Settings page in Project options. Press “Default Offset” to reset the offset to the default value.

NOTE: The latency may depend on the system configuration, for example when using Tobii glasses in combination with other input devices for QTM. If you are not using hardware synchronization, make sure to check the offset and modify it if needed.

Calibrate
Initiate Tobii calibration. Calibration is done the same way as in the Tobii
Pro Glasses Controller application using the Tobii calibration card.
When the glasses are connected and set up in QTM, you can proceed to add a
gaze vector, see chapter "How to use gaze vectors in QTM" on page 873.
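
If you want to automate the offset search described under Offset, the idea is to shift the eye tracker data by a range of candidate offsets and keep the shift that minimizes the angle between the gaze direction and the direction from the gaze origin to the moving marker. The sketch below is plain NumPy applied to data you have already exported from QTM (gaze origins, gaze directions and marker positions resampled to a common rate); all array names are hypothetical and this is not a QTM function.

    import numpy as np

    def mean_gaze_error(origins, directions, marker, shift):
        # Mean angle (radians) between gaze directions shifted by `shift` frames
        # and the direction from the gaze origin to the marker.
        n = len(marker) - abs(shift)
        g = directions[shift:shift + n] if shift >= 0 else directions[:n]
        o = origins[shift:shift + n] if shift >= 0 else origins[:n]
        m = marker[:n] if shift >= 0 else marker[-shift:-shift + n]
        to_marker = m - o
        cosang = np.sum(g * to_marker, axis=1) / (
            np.linalg.norm(g, axis=1) * np.linalg.norm(to_marker, axis=1))
        return np.mean(np.arccos(np.clip(cosang, -1.0, 1.0)))

    def best_offset(origins, directions, marker, max_shift=50):
        # Try all frame shifts in [-max_shift, max_shift] and return the best one.
        shifts = list(range(-max_shift, max_shift + 1))
        errors = [mean_gaze_error(origins, directions, marker, s) for s in shifts]
        return shifts[int(np.argmin(errors))]

The frame shift returned by such a search can then be converted to a time offset using the gaze sample rate and entered as the Offset value described above.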

Setting up hardware synchronization

If you want to make sure that the data from the Tobii Pro Glasses 2 is syn-
chronized with the motion capture data, you can use the hardware syn-
chronization for a simultaneous start of the recording.
The following items are required for hardware synchronization:
l A synchronization cable included in the Tobii-Qualisys connectivity kit.

l A Camera Sync Unit for Arqus or Miqus systems.

Set up the connections as follows:



1. Use a BNC cable to connect the Tobii sync cable to the MEAS. TIME port
on the Camera Sync Unit.
2. Connect the 3.5 mm connector of the Tobii sync cable to the Data synchronization port on the Tobii recorder.
For an Oqus system with a Trigger/Sync splitter cable, set up the connections as follows:

1. Connect the Trigger/Sync splitter cable to the control port of one of the
cameras.
2. Use a BNC cable to connect the Tobii sync cable to Sync out on the split-
ter cable.
3. Connect the 3.5 mm connector of the Tobii sync cable to the Data syn-
chronization port on the Tobii recorder.
4. Do the following settings in QTM:
a. Open the Synchronization page under Project Options.

b. Select the Oqus camera which is used for the synchronization.

c. Set Mode under Synchronization output to Measurement time.

NOTE: The synchronization connection is only needed at the start of the capture. After starting the capture, the synchronization cable may be disconnected.

NOTE: Hardware synchronization and offset compensation are only applied to captured data. It is not possible to compensate for latency in real time.

Setting up Tobii Pro Glasses 3 in QTM

Set up and configuration in QTM

For using the Tobii Pro Glasses 3 with QTM, the QTM computer and the Tobii
Recording unit must be connected to the same local network. For connecting
and setting up your Glasses 3 device, follow the instructions at connect.tobiipro.com. You will also need to install the Glasses 3 Controller application, which you can download from the Tobii website (link included in the instructions at connect.tobiipro.com).

Connecting a single Glasses 3 device

For connecting a single Glasses 3 device for use with QTM, the following con-
nection options can be used.
l WiFi access point: connect the QTM computer to the network with name
"TG03B-XXXXXXXXXX" (serial number of Recording unit). Default pass-
word: TobiiGlasses.
l Cable (router): Connect the Tobii recorder with an Ethernet cable to a
router. When using a wireless router, the computer running QTM can be
connected via WiFi.
For a detailed description of the connection options, refer to the Connection
Guide in the Glasses 3 Controller application.
After connecting the Glasses 3 device, it should be accessible in the Glasses 3 Controller application.

NOTE: The connection option Cable (direct) cannot be used in combination with QTM, because then the Glasses 3 device will not get an IP address.

Connecting multiple Glasses 3 devices

For connecting multiple Glasses 3 devices for use with QTM, the following con-
nection options can be used.
l Cable (router): Connect the Tobii Recorder units with an Ethernet cable to
a router. When using a wireless router, the computer running QTM can be
connected via WiFi.
l Alternatively, the Tobii Recorder units can be connected wireless through
a WiFi router. This requires setting up an alternative network con-
figuration on the Tobii Recorder devices.
For a detailed description of the connection options, refer to the Connection
Guide in the Glasses 3 Controller application or other Tobii resources.



After connecting the Glasses 3 devices, they should be accessible in the Glasses 3 Controller application.

NOTE: For setting up a wireless connection to a router, you can add a network configuration to the Tobii device via the web interface. To access the web interface, open a browser and type http://tg03b-xxxxxxxxxx (serial number of Recorder unit) in the browser's address bar. For more information about the Glasses 3 web interface, contact Tobii support.

Adding Tobii Pro Glasses 3 to QTM

To set up the glasses as an input device in QTM, follow these steps:

1. Open the Project Options and navigate to the Input Devices page.

2. On the Input Devices page, click the Add Device button.

3. In the Add Device dialog, select Tobii Pro Glasses 3 from the Select
device drop down menu.

4. Open the Tobii Pro Glasses 3 settings page under Project Options >
Input Devices > Eye Trackers.
5. Press the Locate Glasses button. After a short while the connected
Glasses 3 devices should be shown in the settings list.

Tobii Pro Glasses 3 device settings

The Glasses 3 settings are shown on the Tobii Pro Glasses 3 settings page
under Project Options > Input Devices > Eye Trackers.



The settings page contains a list of the connected Glasses 3 devices and their
settings. The device properties include information about the connection and
the hardware, including serial numbers and firmware version, and a list of avail-
able data channels. The following properties can be set:
Gaze Frequency
The default Gaze frequency is 50 Hz. A higher Gaze frequency can be set for Glasses 3 devices that support higher sample frequencies.

Hardware Synchronized
Enable synchronized capture start between QTM and Tobii glasses.
See chapter "Setting up hardware synchronization with Tobii Pro
Glasses 3" on the next page for how to set up hardware syn-
chronization.

When the glasses are connected and set up in QTM, you can proceed to add a
gaze vector, see chapter "How to use gaze vectors in QTM" on page 873.

Calibrating the glasses

At the start of a recording session, the glasses need to be calibrated to make sure that the gaze data is accurate. The calibration can be done in the Glasses 3 Controller application. For more information, refer to Tobii documentation.



NOTE: Alternatively, the glasses can be calibrated via the web interface
of the Tobii Glasses 3 device. To access the web interface, open a browser
and type http://tg03b-xxxxxxxxxx (serial number of Recorder unit) in the
browser's address bar. For more information about the Glasses 3 web
interface, contact Tobii support.

Latency information

The Tobii data stream arrives in QTM over the network with a certain latency.
This will affect the real time calculation of the gaze vector. For reliable syn-
chronization of captured data, it is recommended to use hardware syn-
chronization. If you cannot use hardware synchronization, you can estimate the
latency as outlined in chapter "Tips for compensating latency" on page 877.
When not using hardware synchronization or when using the Tobii data in real
time, please take note of the following:
l The latency of Tobii data may vary between trials.

l The latency of the Tobii device data (e.g. pupil data), stored as analog data
in QTM is not compensated for when applying latency compensation to
the gaze vector calculation.
l The latency may increase when using Glasses 3 software to record data
and stream video from the Tobii glasses.
l When using multiple glasses, latencies can be quite large (e.g. up to a few
seconds) and may differ between devices.
l The latency may depend on the quality of the network.

Setting up hardware synchronization with Tobii Pro Glasses 3

If you want to make sure that the data from the Tobii glasses is synchronized
with the motion capture data, you can use the hardware synchronization for a
simultaneous start of the recording.
The following items are required for hardware synchronization:
l A synchronization cable included in the Tobii-Qualisys connectivity kit.

l A Camera Sync Unit for Arqus or Miqus systems.

Set up the connections as follows:



1. Use a BNC cable to connect the Tobii sync cable to one of the Syn-
chronization Output ports (Out 1 or Out 2) on the Camera Sync Unit. For
an Oqus system, use the Sync out connector on a sync/trigger splitter
cable instead of a Camera Sync Unit output port.
2. Connect the 3.5 mm connector of the Tobii sync cable to the Data syn-
chronization port on the Tobii recorder.
3. In the Synchronization settings under Project Options > Input Devices
> Camera system:
a. Set the Output mode to System live time.

b. The TTL signal polarity can be set to any value (Negative or Positive).

NOTE: The synchronization connection is only needed at the start of the
capture. After starting the capture, the synchronization cable may be
disconnected.

NOTE: Hardware synchronization and offset compensation are only
applied to captured data. It is not possible to compensate for latency in
real time.

How to use gaze vectors in QTM

Setting up a gaze vector in QTM

After connecting and setting up a Tobii eye tracker device, follow these steps to
add a gaze vector in QTM.

1. Go to the Gaze Vector page under Project Options > Processing. For an
overview of the gaze vector options, see "Gaze vector settings" on
page 887.
2. Click on Add to add a new gaze vector. The gaze vector includes the defin-
ition for both left and right eye.
3. Double click on the gaze vector to open the Gaze vector dialog.



4. Select the eye tracker device from the Eye Tracker drop-down list.

5. Associate a rigid body with the eye tracker. The rigid body is needed to
project the gaze vector in the 3D space. You can select the rigid body from
the Rigid body drop-down list.
l The list contains predefined rigid bodies for the selected
Tobii eye tracker, see chapter "Tobii rigid body definitions" below for
more information.
l It is also possible to choose a custom rigid body from the rigid bod-
ies that are included on the 6DOF Tracking page, for example if you
want to use a refined rigid body definition.
l For tips on how to optimize the tracking of the glasses, see chapter
"Tips for improving the tracking of the glasses" on the next page.
6. If you are not using hardware synchronization you can specify an offset to
compensate for the latency of the recorded eye tracker data.
l For tips on how to estimate the latency, see chapter "Tips for com-
pensating latency" on page 877.
7. Optionally, check the Use median filter option for smoothing the gaze
position and gaze vector data.
8. On the Processing page under Project Options, make sure that the pro-
cessing steps Calculate 6DOF and Calculate gaze vector data are selec-
ted for both Real time actions and Capture actions.

Tobii rigid body definitions

The Qualisys-Tobii connectivity kits for Tobii 2 and Tobii 3 include marker sets
that can be attached to the glasses. QTM includes predefined rigid body defin-
itions corresponding to the available attachments, that can be chosen from the



rigid body drop-down list in the gaze vector definition dialog.
For Tobii Pro Glasses 2, three sets with different layouts are avail-
able. The layout number is specified on the box containing the kit
items. The predefined rigid body definitions in QTM are named
Tobii Layout x, where x needs to be replaced with the layout num-
ber.

For Tobii Pro Glasses 3 there are separate attachments for the left
and the right side of the glasses. The marker set number is spe-
cified on the box containing the kit items. When you have multiple
glasses, the left and right attachments can be used in four different
combinations. The predefined rigid body definitions in QTM are
named Tobii3-Set-Lx-Ry, where x and y need to be replaced with
the respective set numbers. Left (L) and Right (R) are considered
from the perspective of the person wearing the glasses.

Tips for improving the tracking of the glasses

The predefined rigid body definitions in QTM for the Tobii marker attachments
give a good starting point for tracking the glasses. However, the actual marker
positions may deviate from the predefined ones, which may lead to suboptimal
tracking results.
The tracking can be improved in the following ways:
l Create a refined rigid body definition (see section below).

l Add a marker to the rigid body used to track the glasses, or remove one,
to make it more asymmetric.
l In some cases it can also help to use an AIM model for labeling the mark-
ers, for example to avoid swapping of the left and right clusters in QTM.
Refinement of the Tobii rigid body definition

TIP: Instead of following the steps below, you can use the Refine rigid
body script, which is included with the scripting tools at
https://github.com/qualisys/qtm-scripting.

To refine the Tobii rigid body definition, follow these steps:



1. Make a short capture with the Tobii glasses with the original rigid body
definition. If you want to add extra markers, make sure they are included
in the capture.
2. Select a range on the time line with good quality tracking and make sure
that the Tobii layout trajectories are correctly labeled. The 6DOF data
should be present. Check that the local axes are correct: the Z axis should
point forward from the glasses and the Y axis should point up.
l After manual changes of the trajectories you need to reprocess the
file with the Calculate 6DOF option enabled.
l In case 6DOF data is missing although the labeling is correct, reprocess
the data with a larger bone length tolerance of 10-15 mm in the
6DOF settings.
3. Select the trajectories of the glasses, right click the selection, and choose
Define rigid body (6DOF) (choose Average of frames if you have
checked the tracking and labeling quality across the selected time range).
Give the rigid body a different name (for example "Tobii-refined").
After adding the new definition, you will need to correct the definition as fol-
lows.

1. Open the 6DOF Tracking page under Project Options.

2. Select the new rigid body (from now on called Tobii-refined)

3. Press the Translate button, choose the option To current position of
this rigid body, and select the predefined Tobii rigid body. Press OK
twice (Translate dialog and Project Options).
4. Reprocess the file with the Calculate 6DOF option enabled and select
6DOF settings from project.
5. Go back to the 6DOF Tracking page under Project Options and select
the Tobii-refined rigid body.
6. Press the Rotate button, choose the option Rotate as this rigid body,
and select the predefined Tobii rigid body. Press OK twice (Rotate dialog
and Project Options).
7. Reprocess the file, make sure to check the Calculate 6DOF option and
select 6DOF settings from project.



8. Check in the 3D View window that the local axes of the two rigid bodies
perfectly overlap.
Now that you have a refined rigid body definition, make sure it is used for the
eye tracker.

1. Open the Gaze Vector page under Project Options and select it as the
rigid body for your Tobii eye tracker.
2. Open the 6DOF Tracking page under Project Options, set the Bone
length tolerance value to 5 mm, and remove the unused rigid body defin-
itions.

Tips for compensating latency

When not using hardware synchronization, there will be a certain latency of the
recorded Tobii eye tracker data. The latency needs to be compensated in QTM for
a correct calculation of the gaze vector.
The latency may be dependent on a number of factors, such as the network
connection or simultaneous use of the Tobii Glasses Controller software. When
using multiple Tobii Glasses 3 devices, the latency increases for the subsequent
devices since they are started sequentially.
The latency can be estimated as follows:

1. In QTM, configure the gaze vector using no rigid body and zero offset.

2. Make a capture with the glasses on. During the capture, look at a fixed
point, for example a marker in front of you, while gently shaking or nod-
ding your head.
3. Create a plot of the pitch or roll angle of the Tobii 6DOF data and another
plot of the gaze vector data, and align the plots.
l The latency can be estimated by measuring the delay between the
peaks in the 6DOF angle and the corresponding peaks in the gaze
vector.
4. Reprocess the gaze vector data with the estimated latency.
l For Tobii Glasses 2 the latency is specified by the Offset value (in mil-
liseconds) in the Input device settings page.



l For Tobii Glasses 3, the latency is specified by the Eye tracker off-
set value (in milliseconds) in the Gaze vector dialog.
5. Check if the peaks of the rigid body angle and the gaze vector are aligned
now.
l If the peaks are aligned you have found the correct estimate.

6. When you reprocess the gaze vector data associated with the rigid body
definition of the glasses, the gaze vector should point at the fixed position
you were looking at during the capture.
To make sure that the latency is consistent, capture a number of trials and
check that the latency compensation is correct for all trials.

NOTE: The latency compensation is only applied to the gaze vector cal-
culation in QTM. The recorded eye tracker data or analog data from the
Tobii device is not affected by the compensation.
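If you prefer to compute the delay instead of reading it off the plots in step 3, the
delay between the 6DOF angle and the gaze vector signal can be estimated with a
simple lag search. The following Python sketch is only an illustration and assumes
that the two signals have been exported (for example via the TSV export) and
resampled to a common rate fs; the function name, the resampling step and the
one-second search window are assumptions, not part of QTM.

    import numpy as np

    def estimate_latency_ms(head_angle, gaze_component, fs, max_lag_s=1.0):
        # Normalize both signals so the correlation is scale independent.
        a = np.asarray(head_angle, dtype=float)
        g = np.asarray(gaze_component, dtype=float)
        a = (a - a.mean()) / a.std()
        g = (g - g.mean()) / g.std()
        n = min(len(a), len(g))
        a, g = a[:n], g[:n]
        best_lag, best_score = 0, -np.inf
        # Only positive lags are tested, since the gaze data is assumed to
        # lag behind the motion capture data.
        for lag in range(int(max_lag_s * fs) + 1):
            score = abs(np.dot(a[:n - lag], g[lag:n]))
            if score > best_score:
                best_lag, best_score = lag, score
        return 1000.0 * best_lag / fs  # estimated latency in milliseconds

The returned value can then be entered as the offset (step 4 above) before
reprocessing, after which the alignment of the peaks should still be verified as
described in step 5.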

Making a measurement with Tobii

The eye tracker data is collected together with the marker data in QTM and can
be reprocessed and exported together with the other data, see chapter "Process
and export Tobii gaze vector data" on the next page. Follow these steps to
capture eye tracker data in QTM:

1. Make sure that the Tobii eye tracker device(s) are connected and correctly
set up in QTM.
2. Make sure that you have activated Calculate gaze vector data and Cal-
culate 6DOF on the Processing page in Project options.
3. Calibrate the glasses using the Tobii calibration card.
l For Tobii Glasses 2, the calibration can be done using the Tobii Pro
Glasses Controller application or via the Input device settings page
in QTM, see chapter "Setting up Tobii Pro Glasses 2 in QTM" on
page 866.
l For Tobii Glasses 3, the calibration can be done using the Glasses 3
Controller application or via the web interface, see chapter "Setting
up Tobii Pro Glasses 3 in QTM" on page 868.



4. Start preview in QTM and the gaze vector is displayed in the 3D view win-
dow as soon as the 6DOF body is identified.
5. You can also view the gaze vector data in the Data info window, see
chapter "Tobii data in QTM" on page 883.
6. The eye tracker data is captured automatically with the other data in
QTM.
l When not using hardware synchronization the capture is started via
software, leading to a certain latency of the recorded Tobii eye
tracker data. The latency can be compensated for the gaze vector cal-
culation in QTM, see chapter "Tips for compensating latency" on
page 877.
l When using hardware synchronization, the capture is started sim-
ultaneously with the mocap data via a sync cable. For instruction on
how to set up hardware synchronization, see chapters "Setting up
hardware synchronization" on page 867 or "Setting up hardware syn-
chronization with Tobii Pro Glasses 3" on page 872.
Process and export Tobii gaze vector data

Reprocessing of gaze vector data

The Gaze vector data can be reprocessed in a file, both in reprocessing and
batch processing. This is useful if you need to update the gaze vector data
because of changed 6DOF data.
If you have changed the 6DOF data then you need to reprocess the Gaze vector
data to update it. Follow these steps to reprocess it.

1. Click on Reprocess on the Capture menu.

2. Then select Calculate gaze vector data in the Processing steps.

l If you need to change the rigid body that is associated with the gaze
vector, then go to the Gaze vector page and double click on the gaze
vector in the list.

3. Click OK in the File reprocessing dialog.



Export formats for gaze and eye tracker data

The gaze vector and eye tracker data can then be exported and analyzed in
external programs. You can either export it to TSV or MAT files.

TSV

To export the eye tracker data select the data type Eye tracker in the TSV
export options. The gaze vector and eye tracker data will be exported in sep-
arate files, for example *_g_1.tsv and *_g_2.tsv with gaze vector data for the left
and right eye, respectively. The data included in the gaze vector export (_g.tsv)
is:
GAZE_VECTOR_NAME
The name of the gaze vector.

NO_OF_SAMPLES
The number of samples in the gaze vector data.

FREQUENCY
The frequency of the gaze vector data.

TIME_STAMP
Date and time when the motion capture file was made. The date and time
is followed by a tab character and then the timestamp in seconds from
when the computer was started.

START_OFFSET
The offset time for the first gaze vector frame. It is needed since the
measurement range may not always be cut exactly at the start of a gaze
vector frame. Subtract the offset from the start time of the gaze vector
data so that it actually starts before the marker data.

HW_SYNC
Indicates if hardware sync was used (value: YES) for the eye tracker data
or not (value: NO).

FILTER
Indicates if the gaze vector data was filtered (value: YES) or not (value:
NO).



Gaze vector data
Columns with the gaze vector data in the following order: X Pos, Y Pos, Z
Pos, X Vec, Y Vec, Z Vec.
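As an illustration of how the exported gaze vector file can be read by an external
program, the Python sketch below collects the header fields listed above and
gathers the numeric columns into an array. The exact layout of the file (header
order and any column label rows) may vary between QTM versions, so treat this
as a starting point rather than a definitive parser.

    import numpy as np

    HEADER_KEYS = {"GAZE_VECTOR_NAME", "NO_OF_SAMPLES", "FREQUENCY",
                   "TIME_STAMP", "START_OFFSET", "HW_SYNC", "FILTER"}

    def read_gaze_tsv(path):
        header, rows = {}, []
        with open(path) as f:
            for line in f:
                fields = line.rstrip("\r\n").split("\t")
                if fields[0] in HEADER_KEYS:
                    header[fields[0]] = fields[1] if len(fields) > 1 else ""
                else:
                    try:
                        rows.append([float(v) for v in fields])
                    except ValueError:
                        pass  # skip any non-numeric rows, e.g. column labels
        # Data columns: X Pos, Y Pos, Z Pos, X Vec, Y Vec, Z Vec
        return header, np.array(rows)

A similar approach can be used for the *_e.tsv eye tracker files described below,
which use the same style of header fields.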

For Tobii Pro Glasses 2 devices, the eye tracker data is exported as *_e.tsv files.
The data included in the eye tracker data export (_e.tsv) is:
EYE_TRACKER_NAME
The name of the eye tracker device.

NO_OF_SAMPLES
The number of samples in the eye tracker data.

FREQUENCY
The frequency of the eye tracker data.

TIME_STAMP
Date and time when the motion capture file was made. The date and time
is followed by a tab character and then the timestamp in seconds from
when the computer was started.

START_OFFSET
The offset time for the first eye tracker data frame.

HW_SYNC
Indicates if hardware sync was used (value: YES) for the eye tracker data
or not (value: NO).

Pupil diameter data
Columns with the pupil diameter data in the following order: LEFT_PUPIL_
DIAMETER, RIGHT_PUPIL_DIAMETER.

For Tobii Pro Glasses 3 devices, the eye tracker data is exported as analog data,
see chapter "Analog data (_a.tsv)" on page 722.

MAT

To export the eye tracker data select the Eye tracker option for the
MATLAB file export. The struct array of the MAT file then includes the following
data:



GazeVector
The struct array consists of the following elements for each eye. For
example write GazeVector(1).GazeVectorName to access the name for the
left eye.
GazeVectorName
The name of the gaze vector.

NrOfSamples
The number of samples in the gaze vector data.

StartOffset
The offset time for the first gaze vector frame. It is needed since the
measurement range may not always be cut exactly at the start of a gaze
vector frame. Subtract the offset from the start time of the gaze vector
data so that it actually starts before the marker data.

Frequency
The frequency of the gaze vector data.

HWSync
Indicates if hardware sync was used for the eye tracker data or not.
The values can be 0 (no hardware sync) or 1 (hardware sync).

Filter
Indicates if the gaze vector data was filtered or not. The values can
be 0 (not filtered) or 1 (filtered).

GazeVector
Array with the gaze vector data in the following order: X Pos, Y Pos, Z
Pos, X Vec, Y Vec, Z Vec.

EyeTracker
Struct array with the following eye tracker data.
EyeTrackerName
The name of the eye tracker device.

NrOfSamples
The number of samples of eye tracker data.

StartOffset
The offset time for the first eye tracker data frame.



Frequency
The frequency of the eye tracker data.

HWSync
Indicates if hardware sync was used for the eye tracker data or not.
The values can be 0 (no hardware sync) or 1 (hardware sync).

EyeTracker
Array with the pupil diameter data in mm in the following order: left
pupil, right pupil.

For Tobii Pro Glasses 3 devices the eye tracker data is exported as analog data,
see chapter "MAT file format" on page 730.
Tobii data in QTM

The Gaze vector data is displayed in the 3D view window as a vector; its color
and length, for example, can be modified on the 3D view settings page in
Project options.
There are two different types of data available for the Tobii eye trackers in
QTM.



1. The eye tracker device data.
l For Tobii Pro Glasses 2, the device data is stored as Eye tracking data
in QTM, containing the measured pupil diameters.
l For Tobii Pro Glasses 3, the device data is stored in QTM as analog
data, containing all data channels retrieved from the eye tracker
device.
2. The gaze vector data.

Gaze vector data

The gaze vector is provided by the Tobii recorder and then transformed in QTM
to global 3D coordinates through the pose of the associated rigid body. The
gaze vector data can be viewed in a Data Info window as Gaze Vector data.
The gaze vector data includes:
Device
The device is the name of the Gaze vector. (L) and (R) stand for left and
right eye.

X Pos
This is the X position (mm) for the origin of the Gaze vector in the Global
coordinate system.

Y Pos
This is the Y position (mm) for the origin of the Gaze vector in the Global
coordinate system.

Z Pos
This is the Z position (mm) for the origin of the Gaze vector in the Global
coordinate system.

X Vec
This is the X value of the Gaze vector in the Global coordinate system.

Y Vec
This is the Y value of the Gaze vector in the Global coordinate system.



Z Vec
This is the Z value of the Gaze vector in the Global coordinate system.

Frame Number
This is the frame number of the Tobii data. Since the frame rate of the
Tobii data is mostly lower than the marker data, it means that the Tobii
data is updated less often than the marker data.
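Conceptually, the transformation from the eye tracker's local coordinates to the
global values listed above uses the rotation and position of the associated rigid
body from the 6DOF data: the gaze origin is rotated and translated, while the gaze
direction is only rotated. The minimal Python sketch below illustrates the idea;
the variable names are illustrative and this is not the internal QTM implementation.

    import numpy as np

    def gaze_to_global(R, t, origin_local, direction_local):
        # R: 3x3 rotation matrix and t: position (mm) of the rigid body,
        # both taken from the 6DOF data for the current frame.
        origin_global = R @ origin_local + t      # a point: rotate, then translate
        direction_global = R @ direction_local    # a direction: rotate only
        return origin_global, direction_global / np.linalg.norm(direction_global)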

Eye Tracker data

Tobii Pro Glasses 2

The eye tracker data from Tobii Pro Glasses 2 devices can be viewed in a Data
Info window as Eye Tracker data. The eye tracker data for Tobii Pro Glasses 2
includes:
Device
The name of the Tobii eye tracker device under Input devices. The name
for a single Tobii device is by default "Tobii". Additional Tobii devices are
numbered with 2, 3, etc.

L Pupil D
The diameter of the left pupil in mm.

R Pupil D
The diameter of the right pupil in mm.

Frame Number
Frame number of the Tobii data. Note that the Tobii frame number is not
the same as the QTM frame number.



Tobii Pro Glasses 3

The eye tracker data from Tobii Pro Glasses 3 devices can be viewed in a Data
Info window as Analog data. The eye tracker data for Tobii Pro Glasses 3
includes:
Channel
Channel name indicating the type of data.

Value
Value of the recorded data including the units.

Board Name
Name of the Tobii Glasses 3 device (serial number of the Recorder unit).

Channel No
Analog channel number.



Gaze vector settings

The Gaze Vector page contains a list of the current gaze vectors and the set-
tings needed to calculate a gaze vector from the eye tracker data. Use the fol-
lowing options to modify the list.
Add
Add a new gaze vector to the list.

Edit (or double-click on a gaze vector)
These two alternatives open the dialog below for the selected gaze vector.

The options in the dialog define how to calculate the Gaze vector.
Eye tracker
Select an eye tracker from the list.



In Project options the list contains all of the eye trackers that are
available on the Input devices page. For reprocessing with file set-
tings, the list contains all of the eye trackers that were used in the
capture.

Eye tracker offset
The Eye tracker offset is the number of milliseconds that the eye
tracker data is delayed compared with the marker data. This value is
then used as a compensation to align the data in the file.

Rigid body
Select the rigid body that is mounted on the eye tracker. This is used
as reference for the gaze vector calibration.

NOTE: In Project options the list contains all of the rigid bodies
that are available on the 6DOF Tracking page. For reprocessing
with file settings, the list contains all of the rigid bodies that
were used in the capture.

Use median filter
Check this option for applying a median filter for smoothing of the
gaze position and gaze vector data.

Delete
Delete the selected gaze vector.

Right-click to open menu

From the menu you can Change name of the selected gaze vector and
Remove gaze vector.



ASL EyeHead integration
The ASL EyeHead Integration system combines information from the head
mounted eye tracker with information from the camera system about the head
position to compute the line of gaze in the environment. The head mounted eye
tracker determines the line of gaze relative to the head, and the camera system
determines the position and orientation of the head in the environment.
For the integration you need one rigid body on the head of the subject and
a rigid body on a tool that measures the position of the eye. Finally you
need to define a plane with a laser pointer or a pointer wand. All of this
equipment can be ordered from ASL. The data from the rigid bodies are then
used to calculate the gaze in the Mobile Eye EyeVision program.
For more information about the integration please contact ASL.

How to use motion gloves

Connecting Manus gloves


Overview

The MANUS integration allows for the use of MANUS gloves for finger tracking
in combination with the skeleton solver in QTM.

Components and requirements

l MANUS Glove(s) and MANUS Glove dongle

l MANUS Core and Dashboard software

l QTM 2024.1 or higher

l QDevice plugin for Manus

MANUS setup

In the MANUS Knowledge Center (https://documentation.manus-meta.com),
open the manual for the glove type you are using and follow the instructions
on the "First Time Setup" page.



Connect the Manus Gloves

Connect the MANUS gloves to your computer.

1. Connect the MANUS dongle to the computer via USB.

2. Start the MANUS Core program.

3. Make sure that the gloves are recognized and correctly configured in the
MANUS Core Dashboard.
After connecting you can close the MANUS Core Dashboard. MANUS Core will
continue to run in the background even when the MANUS Core Dashboard is
closed.

Setup of Manus Gloves in QTM

Follow these steps to set up Manus Gloves in QTM:

1. Download the QDevice installer for Manus Gloves from
https://www.qualisys.com/downloads/ and run the installer.
2. In QTM, open the Input Devices page under Project Options.

3. Click the Add Device button and select Manus Gloves in the drop down
menu.
4. Open the Manus Gloves settings page under Input Devices > Gloves to
access the MANUS device settings, see chapter "Manus Gloves" on
page 317.
5. Specify the IP address of the computer running the Manus Core software
(use 127.0.0.1 when the gloves are connected to the same computer run-
ning QTM).
6. Choose a Model Type depending on the skeleton type used in QTM
(Qualisys Skeleton/Metahuman).
7. Press the Synchronize Settings button to locate the gloves and syn-
chronize the settings.



Making a measurement with Manus Gloves

Create a skeleton

Apply the markers to the actor according to the Qualisys Animation marker set
guide. The markers are the same for the Qualisys Animation and the MetaHu-
man skeleton models. Attach the LHandOut and RHandOut markers to the
gloves in the positions specified in the marker set guide. Use a T-pose to cal-
ibrate the skeleton.
The use of gloves is supported for the Qualisys Animation skeleton without any
further customizations. For using the MetaHuman skeleton model, a custom
skeleton is required. Contact Qualisys support for more information.

Create bindings

Go to the Glove processing page under Project Options > Processing, see
chapter "Glove" on page 344.
The glove processing step settings dialog is used to create bindings, which
associate a glove with the skeleton that its data will be applied to. To create a new binding,
select an available glove in the bottom row of the bindings grid and then select
the associated skeleton.
Once the glove bindings are created, you can stream and capture skeleton data
including the hands driven by the gloves.
Capturing, viewing and exporting data

To stream or collect data with Manus, simply start a preview or a capture in
QTM.
To view the glove data in QTM, open a Data Info window, right click in the dis-
play area and click Analog data.
To view the skeleton data, open a Data Info window, right click in the display
area and click Skeleton data.
The data can be plotted by selecting the rows in the Data Info window, right-
click, and select the data to plot from the Plot sub-menu.
The skeleton data can be exported to FBX, TSV, MAT and JSON formats, see
chapter "Data export to other applications" on page 710.



Connecting StretchSense gloves
The StretchSense integration allows for the use of StretchSense gloves for fin-
ger tracking in combination with the skeleton solver in QTM. The use of gloves
is natively supported by the Qualisys Animation skeleton. For the use of gloves
with custom skeletons, such as the MetaHuman skeleton, contact Qualisys sup-
port.
Components and requirements
l StretchSense Glove(s) and dongle. For supported glove types, see the
StretchSense website: https://stretchsense.com/.
l StretchSense Hand Engine Pro software. Download the license and latest
version from your StretchSense user account.
l QTM 2024.1 or higher.

l StretchSense Gloves Integration plugin for QTM.

Installing the StretchSense Gloves Integration

Make sure that the latest compatible version of the StretchSense Gloves Integ-
ration for QTM is installed. Follow these steps to download and install the integ-
ration:

1. In QTM, open the Input Devices page under Project Options.

2. Click the Download device drivers link.

3. Download the installer for the StretchSense Gloves Integration.

4. Run the installer.

Setting up the StretchSense gloves

Follow the instructions below for setting up the StretchSense gloves. For more
detailed information, refer to StretchSense resources at
https://stretchsense.com/support/.

Connecting the gloves to the computer

Connect the StretchSense gloves to your computer:



1. Connect the StretchSense dongle to the computer via USB

2. Start the StretchSense Hand Engine Pro program

3. Select the gloves from the device dropdown

Glove calibration and mapping

Calibrate the gloves as follows:

1. Click the Calibrate button

2. Follow the on-screen instructions guiding you through the calibration


sequence
Set up the mapping:

1. Select Tools > Remapping

2. Choose File -> C:\Program Files\Qualisys\Common\Input_
Devices\QStretchSense_QDevice\Qualisys_Remap.fbx



3. Select LeftHand for Left Hand Root and RightHand for Right Hand Root.

4. Set up the joints as follows. Press Submit and close the window when
done.



NOTE: The LeftInHandThumb and RightInHandThumb joints are not
mapped.

Set up streaming:

1. For the respective gloves, select Qualisys_Remap.

2. Switch on TCP Streaming for the left and the right hand.

Setup and configuration in QTM

Add input device

Add the StretchSense device to QTM:



1. In QTM, open the Input Devices page under Project Options.

2. Click the Add Device button and select StretchSense in the drop down
menu.

3. Check the StretchSense item in the Input Devices list. The StretchSense
device should now show up as an input device under the Gloves category.

Device settings

The StretchSense device settings are managed via the StretchSense settings
page.

The StretchSense page contains the following buttons to communicate with the
gloves and a list with settings for the gloves included in the configuration.
Restore Default Settings
Reset settings to their default values.
Synchronize Settings
Synchronize changed settings to the StretchSense device.



The settings list contains a top section with common settings and a section with
individual settings for each glove.
Common settings
The common settings are always visible.
API version
The API version used when creating the integration. Hover over the
version number to compare with the API version used by QTM.

Integration version
The version number for the integration.

Hand Engine IP
IP address of the Hand Engine server.

Glove Ports
Specify the port numbers of the gloves according to the con-
figuration in Hand Engine. Use commas to separate the port num-
bers for the respective gloves.

Individual settings and information for each glove


The individual settings are displayed for the StretchSense gloves included
in the configuration.
Name
Name of the glove according to the Hand Engine configuration.

Channels
List of channels and sample frequency.

Configuration in QTM

Setting up the device in QTM

To set up the gloves in QTM, follow these steps:

1. Open the StretchSense device settings page under Project Options >
Input Devices > Gloves.
2. Specify the Hand Engine IP address. Use 127.0.0.1 when on the same
computer.
3. Specify the port numbers for the gloves, separated by commas.



Processing settings

The use of glove data requires a calibrated skeleton in QTM with matching
hand hierarchies. Glove data is natively supported by the Qualisys Animation
skeleton. The following processing steps need to be set up for using StretchSense
glove data.
Enable glove processing in QTM:

1. Go to Project Options > Processing.

2. Enable Apply glove data.

Create bindings:

1. Go to Project Options > Processing > Glove.

2. Select an available glove in the bottom row of the bindings grid.

3. Select the associated skeleton.

Capturing, viewing and exporting data

To stream or collect data with StretchSense, simply start a preview or a capture
in QTM.
To view the glove data in QTM, open a Data Info window, right click in the dis-
play area and click Analog data.
To view the skeleton data, open a Data Info window, right click in the display
area and click Skeleton data.
The data can be plotted by selecting the rows in the Data Info window, right-
click, and select the data to plot from the Plot sub-menu.
The skeleton data can be exported to FBX, TSV, MAT and JSON formats, see
chapter "Data export to other applications" on page 710.

How to use external video devices


QTM can record video data, including the sound, together with the motion
capture data. This data is only for documentation purposes and is not used in
the calculation of 3D data. The video source can be either via DirectShow
(Blackmagic Design cards or DV/webcam devices), see chapters "Video capture
with Blackmagic Design cards" below and "DV/webcam devices" on page 909,
respectively.
The video capture via DirectShow starts and stops in synchronization with the
motion capture. The synchronization is done via Windows OS, therefore exact
synchronization cannot be guaranteed. It is, however, within a quarter of a
second. It is possible to change the offset of the video to get a more accurate
synchronization, see chapter "Video offset" on page 911.
The capture rate of the video data depends on the hardware of the video
device, on the settings of the video device and on the general performance of
the measurement computer that QTM runs on. To set the settings of the video
device, right-click in the view of the video camera in a 2D view window, in the
preview mode, to open the View window menu. Click Video camera settings
to open the dialog with settings. For information about the settings see the
manual of the video device.
The recorded video data will be saved as an AVI file in the same folder that is
selected for the capture file (QTM file) and it will have the same name as the
capture file. It can be played in any media player that supports the AVI format
as well as viewed in QTM together with the motion capture data. For more
information about the display of video data, see chapter "External video
devices in 2D view" on page 100.

Video capture with Blackmagic Design cards


QTM supports the video capture cards Intensity Pro and DeckLink Mini
Recorder from Blackmagic Design. These are PCI Express cards that can capture video
directly from a video camera. The Intensity Pro card supports video from dif-
ferent types of video sources (e.g. HDMI, Component, S-video and Composite).
The card can also capture the sound on the HDMI input or from an analog RCA
input. The DeckLink Mini Recorder card supports video and sound from HDMI
and SDI.
Installing the BlackMagic Design card

It is important to note that you need one Blackmagic Design card for each
video source that you want to capture. Therefore, you also need a PCI Express
slot on the computer for each card. If you have a computer made before 2008
there might not be that many PCI Express slots on the motherboard. It is also



required that the graphic board can handle hardware acceleration, which
means that a motherboard integrated graphics card will not work. Contact
Qualisys AB to make sure that it works on your computer.
Follow these steps to install the card in the computer:

1. Install the drivers (Desktop Video x.x.msi). It is recommended to down-


load the latest version from www.blackmagicdesign.com.
2. Shut down the computer.

3. Install the card in a free PCI express slot.

4. Start the computer and click through the driver installation.

5. When the installation is finished, open the Desktop Video Setup applic-
ation.
a. Make sure that the input is set to HDMI Video & HDMI Audio,
unless you are using another input.
b. Close and click OK if there is an elevation question.

For more information about how to install the card see the manuals from Black-
magic Design.
Connecting a video source to the Intensity Pro card

The Intensity Pro card can capture from different types of video sources (HDMI,
Component, S-video and Composite). The card can also capture the sound on
the HDMI input or from an analog RCA input. However it can only capture from
one of the video inputs at a time so you have to follow these instructions when
connecting a video source.

1. Select the correct input for your video source on the Blackmagic Control
Panel on the Windows control panel. You get the best image with HDMI
input, but you can use that with either HDMI audio or Analog RCA audio.
You do not have to change any other setting on the control panel.
2. Connect the video source to the input you have chosen in the Blackmagic
Control Panel. You have to use the breakout cable for other inputs than
HDMI.



IMPORTANT: Make sure that you connect the HDMI cable to the
HDMI input, see image below.

IMPORTANT: If you are using the breakout cable, make sure that
you read the labels on the connectors so that you use the input con-
nectors and not the output connectors. For more information about
the analog inputs see the manual from Blackmagic Design.

Connecting a video source to the Decklink Mini Recorder card

The Decklink Mini Recorder card can capture video and sound from either
HDMI or SDI. However it can only capture from one of the video inputs at a
time so you have to follow these instructions when connecting a video source.

1. Select the correct input for your video source on the Blackmagic Control
Panel on the Windows control panel. On a standard camcorder the out-
put is always HDMI, so set the video source to HDMI input. The SDI out-
put is only available on pro camcorders.
You do not have to change any other setting on the control panel.
2. Connect the video source to the input you have chosen in the Blackmagic
Control Panel.
Using Blackmagic Design video source in QTM

QTM uses DirectShow to capture the video and audio from the Intensity Pro
card. It will, therefore, appear as a video device called Decklink Video Capture
on the Video Devices page in Project options. To use the board follow the
instructions below:



1. Before you start QTM you must select the correct input on the card on the
Blackmagic Control Panel on the Windows control panel, see chapter
"Connecting a video source to the Intensity Pro card" on page 900 or "Con-
necting a video source to the Decklink Mini Recorder card" on the pre-
vious page. HDMI usually gives you the best image. You do not have to
specify the output or any of the other settings.
2. On the Input Devices page, select the checkbox in the Enabled column
next to the Decklink Video Capture option. If you have several cards, they
will be numbered starting from the top of the PCI express slots.

IMPORTANT: Do not select the BlackMagic WDM Capture option;
it does not work in QTM.

3. Start a new measurement and open a 2D view window.

4. The Video view is most probably black, because the Video format is on
the default value. Right-click on the Video view for the Blackmagic card
and select Video camera settings.

5. Select the Video Format of the video source and click OK. You must check
which format is actually used by the video camera, there is usually a sep-
arate setting for the HDMI output. If you select the wrong format the
image will be black in QTM.



NOTE: You have to use OK to see if the settings work, because
Apply will not update the settings.

The options for the Video Format go from NTSC to HD1080p 24. The
first five settings for PAL and NTSC are ambiguous, but they correspond
to the following formats.
l NTSC

l NTSC 23.98

l PAL

l NTSC Progressive

l PAL Progressive

The rest of the formats can be interpreted directly from the settings, e.g.
HD 720p 59.94 option is 720p at 59.94 Hz. If you use the HDMI input, QTM
will automatically detect the number of pixels in the image and scale the
video view accordingly.
6. Then if you want to record sound with the video, right click on the Video
view again and select the correct audio source on the Connect to audio
source option. QTM will remember the audio source as long as the Intens-
ity Pro card is installed on the computer.

7. It is recommended to use a codec to compress the video files, otherwise
the files will be extremely large. For information about recommended
codecs, see chapter "Recommended codecs" on page 583.



8. Because the video is recorded with DirectShow the start of the video is
not exactly synchronized with the marker data. It is therefore recom-
mended that you look at the offset in a file and change it with the Set
time offset option on the menu. The offset will usually be the same as
long as you have the same measurement setup. For more information see
chapter "Video offset" on page 911.
Settings on Sony HDR-CX330

On a new Sony HDR-CX330 camera you need to change the HDMI output and
some other settings to make it work best with QTM and the Blackmagic Design
card. Click on Menu with the control button to change the video settings and
the settings menu below is displayed on the camera.

HDMI Resolution

First of all you must set the HDMI resolution so that you know what Video set-
tings to use in QTM.

1. Go to Setup and scroll down to HDMI Resolution and open that option.

2. Select the option that you want to use. The recommended option is
720p/480p which will give you 720p 59.94 Hz.



The list below shows different resolutions and the matching Video
format setting in QTM. For information about how to change the setting
in QTM, see chapter "Using Blackmagic Design video source in QTM" on
page 901.
l Auto - Unknown

l 1080p -
Not supported by Intensity Pro because the frequency is too high.
l 1080i - HD 1080i 59.94 - 8 bit 4:2:2 YUV
This option is interlaced which is not recommended when being
viewed on a computer, because when played there will be horizontal
lines in the image.
l 720p - HD 720p 59.94 - 8 bit 4:2:2 YUV
Recommended option because it is the highest possible resolution
that uses progressive scanning. The image is good, but the files will
be large so it is recommended to compress the files in QTM.

NOTE: If the video camera is a PAL model the frequency changes
from 59.94 Hz to 50 Hz and NTSC to PAL.

Other recommended settings

Then it is recommended to change the two settings that make the camera
operate better.

Demo mode

Turn off Demo mode on the camera, otherwise the camera will start showing
you demo pictures after a while.



l Go to Setup and scroll down in the list until you find the Demo mode
option and turn it off.

Face detection

l Go to Camera/Mic on the first menu and scroll down to the Face Detec-
tion option and turn it off. If not turned off there will be a rectangle in the
image as soon as a face is detected.

Settings on Sony HDR-CX430V

On a new Sony HDR-CX430V camera you need to change the HDMI output and
some other settings to make it work best with QTM and the Blackmagic Design
card. Click on Menu on the top left corner on the touch screen to change the
video settings and the settings menu below is displayed on the camera.



HDMI Resolution

First of all you must set the HDMI resolution so that you know what Video set-
tings to use in QTM.

1. Go to Setup and scroll down to HDMI Resolution and open that option.

2. Select the option that you want to use. The recommended option is
720p/480p which will give you 720p 59.94 Hz.

The list below shows different resolutions and the matching Video
format setting in QTM. For information about how to change the setting
in QTM, see chapter "Using Blackmagic Design video source in QTM" on
page 901.
l Auto - Unknown

l 1080p/480p -
Not supported by Intensity Pro because the frequency is too high.
l 1080i/480i - HD 1080i 59.94 - 8 bit 4:2:2 YUV
This option is interlaced which is not recommended when being
viewed on a computer, because when played there will be horizontal
lines in the image.



l 720p/480p - HD 720p 59.94 - 8 bit 4:2:2 YUV
Recommended option because it is the highest possible resolution
that uses progressive scanning. The image is good, but the files will
be large so it is recommended to compress the files in QTM.
l 480p - NTSC - 8 bit 4:2:2 YUV (The fourth item in the list in QTM)
This is the same resolution as the American analog TV resolution,
but it uses progressive scan so there are no lines in the image. You
can use this option if you want smaller video files than 720p/576p.
l 480i - NTSC - 8 bit 4:2:2 YUV (The top option in the list in QTM)
This is the American TV analog resolution. It is interlaced and has a
low resolution.

NOTE: If the video camera is a PAL model the frequency changes
from 59.94 Hz to 50 Hz and NTSC to PAL.

Other recommended settings

Then it is recommended to change the two settings that make the camera
operate better.

Demo mode

Turn off Demo mode on the camera, otherwise the camera will start showing
you demo pictures after a while.
l Go to Setup and scroll down in the list until you find the Demo mode
option and turn it off.



Face detection

l Go to Camera/Mic on the first menu and scroll down to the Face Detec-
tion option and turn it off. If not turned off there will be a rectangle in the
image as soon as a face is detected.

Panasonic AW-HE2

To set up the Panasonic camera for use with QTM, connect the power adapter
and an HDMI cable. Refer to the Panasonic manual for how to connect the
power adapter and switch on the camera. The LED on the front of the camera
should light green when the camera is switched on.
Connect the Panasonic AW-HE2 camera with a standard HDMI cable to the
input of the BlackMagic design card. Follow the instruction under "Using Black-
magic Design video source in QTM" on page 901 to enable video capture in
QTM.
For Video format (step 5 in "Using Blackmagic Design video source in QTM" on
page 901) use HD 720p 59.94 Hz - 8 bit 4:2:2 YUV.

DV/webcam devices
Only DV/webcam devices that support the Microsoft DirectShow API can be
used with the QTM software. This is true for most webcams; however, there are
very few video cameras equipped with a DV output or FireWire 4-pin
connection. This is a requirement to be able to stream out video in real time. The
option for video cameras without FireWire is to use the Intensity Pro card, see
chapter "Video capture with Blackmagic Design cards" on page 899.
There can be several devices connected to QTM, but there might be problems
with the video capture if too many are connected. The reason is that the video
data stream becomes too large and the frame rate of the video devices will



therefore be lowered. If you have problems capturing many video cameras at
the same time, you can instead import a link to a video file after the
measurement, see chapter "Import video link" on page 912.
The video devices are connected to QTM on the Input devices page in the Pro-
ject options dialog, see chapter "Input Devices" on page 218. For the DV cam-
eras the audio is recorded directly with the video. Webcams, on the other hand,
have the audio sent separately to the computer, which means that you have to
select it manually, see chapter "Selecting audio source" on the next page.

NOTE: For information about how to connect the video device to the
computer see the manual of the video device.

Compression of video from video cameras


QTM can compress the video recording during the capture of video data. To
activate the compression for a video camera, right-click in the view of that cam-
era in a 2D view window. Then click on Choose video compression and select
the codec you want to use. You have to select the codec you want individually
for each video camera that is used in the capture. For more information about
recommended codecs, see chapter "Recommended codecs" on page 583.
The codec settings for individual cameras can be changed by choosing Video
compression settings from the menu after right-clicking in the video view.
This will open the settings for the currently selected codec.
Since QTM uses standard Microsoft DirectShow routines for video playback, the
video can be compressed after it has been captured as well.

NOTE: Qualisys AB does not offer support on video compression issues
in external software.



Video offset
Because the Video synchronization is done via Windows it is not always exact.
However, there is a way to set the offset for your current setup. Follow this
procedure:

1. Open the View window menu by right-clicking in an open Video view win-
dow.
2. Then click on Set time offset and enter the starting time of the video in
the dialog. The starting time is in the measurement time, i.e. with a 1 s
starting time the video file will start at 1 s when looking in the 3D view win-
dow.
3. Play the file and check that it is ok.

NOTE: Because of the offset there will be a part in the beginning or
at the end with no video data.

4. Repeat the steps for each video file that has been captured. The offset will
be remembered for the video device until it is unplugged from the com-
puter.

Selecting audio source


QTM can record the audio from DirectShow devices together with a DirectShow
video source. This can for example be the microphone on the computer or
audio on the Intensity Pro card. The available audio sources are listed in the
menu; however, for DV cameras this option is not available because the audio is
already included in the video.
To set the audio source for a video source, follow these instructions:

1. Start a new measurement and open a 2D view window.

2. Right-click on the Video view that you want to set the audio source for and
click on Connect to audio source in the menu.



3. Then select the audio source you want from the list. QTM will remember
the audio source as long as the video source is available on the computer.

Import video link


The import video link feature can be used when video files have been captured
externally from QTM. However, the start of the video capture must be
triggered in some way to ensure synchronization with the marker capture.
How to trigger the video device can be found in the video manuals, but not all
video devices can be triggered externally.
To import the video file, click Import/Add link to video file on the File menu.
Browse to the video file, which must be an AVI file. The video file does not need
to be the same length as the marker capture, since the video length is encoded
in the file so that the video frames will be placed at the correct positions. If the
video has not started in synchronization with the marker capture, an offset can
be applied to the video, see chapter "Video offset" on the previous page.

How to use generic devices

Connecting the h/p/cosmos treadmill


The h/p/cosmos treadmills can be integrated with QTM if they use the MCU6
user terminal, for example the pluto, mercury, quasar and pulsar modules.
Hardware connections

Connect the h/p/cosmos treadmill to an open Ethernet port on the same
computer as QTM. Refer to the h/p/cosmos manual for information on how to
configure the treadmill.



Set up and configuration in QTM

Add input device

Once an h/p/cosmos treadmill is connected to the computer, the device can be
added and configured in QTM.

1. Open the Input Devices page in the QTM Project Options.

2. Click the Add Device button, and select hpcosmos treadmill in the drop
down menu.

3. Check the hpcosmos treadmill item in the Input Devices list. The hpcos-
mos treadmill device should now show up as an input device under the
Generic category.
4. Open the hpcosmos treadmill settings page, see chapter "h/p/cosmos
treadmill" on page 318.
5. Enter the IP address for the treadmill. You can find the IP address under
the External control setting in the treadmill software.
Capturing, viewing and exporting data

To collect data with the h/p/cosmos treadmill, simply start a capture in QTM. The
treadmill data is automatically captured with the start of the capture. The tread-
mill data is captured at 1 Hz, so there is no need for an exact synchronization of
the capture start.
To view the h/p/cosmos data during preview or a capture, open a Data Info
window via the View menu (keyboard shortcut Ctrl + D), right-click in the win-
dow and select Analog data. The h/p/cosmos analog data includes speed, elev-
ation, and heart rate. Note that the slow sample rate means that the plot can
look strange if it includes too few samples. To make the real time plots look
good, it is recommended to increase the Default Real-Time Plot Size option
on the GUI page to around 20-30 seconds.



The treadmill data is exported in the C3D, MAT and TSV exports. When export-
ing to C3D, the analog data will be resampled to the closest integer multiple of
the capture frequency, or higher depending on all analog data stored in the
QTM file, see chapter "C3D file format" on page 728.
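As a rough illustration of what this resampling implies for the 1 Hz treadmill
data, the Python sketch below upsamples three treadmill samples to a 100 Hz
analog rate using simple sample-and-hold, assuming a 100 Hz capture frequency;
the interpolation actually performed by the C3D export may differ.

    import numpy as np

    treadmill_1hz = np.array([1.0, 1.2, 1.4])   # e.g. belt speed, one sample per second
    analog_rate = 100                           # target analog rate in the C3D file (Hz)
    upsampled = np.repeat(treadmill_1hz, analog_rate)
    print(len(upsampled))                       # 300 samples covering the same 3 seconds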



Applications

Analysis Modules
Analysis Modules are predefined applications based on the Project Automation
Framework (PAF) in QTM. They are used to streamline the motion capture work-
flow for specific applications. In QTM, projects with PAF functionality contain a
structured Project view pane guiding the user to collect captures and meta-
data, start analyses, and generate reports, see chapter "PAF Project view" on
page 921.
Qualisys offers Analysis Modules for a range of biomechanical applications,
providing an integrated workflow for collecting, analyzing and reporting data.
The Analysis Modules generally use Visual3D software by HAS-Motion for per-
forming the biomechanical calculations. Analysis Modules are available for the
following applications:
l Baseball

l Cycling

l Equine lameness

l Functional Assessment

l Gait

l Golf

l Running

Other modules available for download that are based on the Project Auto-
mation Framework are:
l CalTester

l Calqulus (see chapter "Calqulus" on page 923)

You can download installers, documentation and demo projects of your pur-
chased Analysis Modules via the Qualisys dashboard. For information about
how to install or update Analysis Modules, see chapter "PAF module install-
ation" below.
For more information about the available analysis modules, please refer to the
product information on the Qualisys website or contact your sales rep-
resentative or [email protected].
The Open Project Automation Framework (Open PAF) is freely available and can
be used to create your own custom projects. For more information, see "Pro-
ject Automation Framework (PAF)" on page 1016.

PAF module installation


Downloading installation files

The required files and licenses are available via the Qualisys client login, see
http://www.qualisys.com/my/. Log in with your Qualisys user account asso-
ciated with your QTM registration.
New installation

Follow these instructions if you are installing the module for the first time. If
you are updating an existing installation, see next section.

1. Download the analysis module installer via the Analysis modules page at
http://www.qualisys.com/my/.
2. Run the installer and follow the instructions.

3. Start QTM and create a new project. Use this project to collect all future
data that you want to analyze with the analysis module:
a. Select File > New Project.

b. Enter a name for the project. Change the location of the project folder
if you wish to; however, we recommend that you keep it in Documents
unless multiple different users need access.
c. If you collected data previously and you want to use the same cam-
era settings, choose “Settings imported from another project” and
select a project that contains the correct camera settings that you
want to use. Otherwise select “Use default settings”.
d. Check Use PAF module and select the correct module from the drop-
down.

4. If you selected to import the settings from another project, choose the set-
tings file in the next dialog.
5. Enter the module license when prompted to do so. Select the name of the
module in the Plug-in drop-down. For Gait modules, always select PAF Gait
Module, regardless of the exact name of the module you are using:

6. Open the Project Options, select Folder Options and specify the loc-
ation of the computational engine. If using Visual3D, locate Visual3D.exe
by clicking the “…” icon (on most computers the path is 'C:\Program
Files\Visual3D v6 x64'). If using QBE, navigate to 'C:\Program Files\Run-
ning_analysis\application'.

7. Check the C3D export settings in Project Options match the screenshot
below:

l Check Exclude unidentified trajectories

l Check Exclude empty trajectories

l Un-check Exclude non-full frames

l For the Label Format, choose De facto standard

l For the Event Output Format, choose Following the c3d.org spe-
cification
l Export units are millimeters

Zero force baseline can be used if the force plate is unloaded at the begin-
ning of the measurement, but note that, depending on which module you
are using, the Visual3D script is set up to apply a zero baseline correction
as well. The Visual3D correction will overwrite the zero frame range that is
set here.

NOTE: Physically zeroing the force plates is also required prior to


data collection. The method for zeroing depends on the force plate
used.

8. Open the Project View by selecting View > Project View (shortcut Ctrl+R).

Upgrading an existing installation

If you have an existing analysis module that needs to be updated, follow these
steps to update to a new version.

1. We recommend that you create a backup of the complete project folder.


In particular, make sure to keep a backup copy of all files that you have
customized since you installed the module (for example report templates
or AIM models).
2. Run the installer and follow the instructions.

3. Start QTM and load the project that you want to update. QTM will ask if
you want to upgrade the project. Click Yes to confirm.

4. If you have modified template files within the PAF project and there is a
conflict because the same files have also been changed in the PAF
installer, the following dialog is shown:

If you check Overwrite, your changed files will be overwritten.


If you un-check Overwrite, you will keep your changed files but will miss
the updated files from the module installer.

NOTE: For AIM files, your modified files are kept by default. If only
AIM files are affected from conflicts, the dialog is not shown.

NOTE: If you have modified a file, but the original file has not
changed between module versions, your modified file will be kept
automatically.

5. The update is complete. You can verify the module version number in
QTM under Help > About.

PAF Project view


The project view in a PAF project consists of three parts:

1. The Project data tree shows all files that are part of the project (i.e. files
contained in the Data subfolder of the project root folder).
You can use the buttons at the bottom of the project tree:

Add
Add new subjects, sessions, other types as defined in the PAF set-
tings.

Open
Open the selected file.

Find
Search files or folders within the PAF project.

Find next
Search for the next occurrence of the current search term.

2. The Details area allows editing the properties of the selected item (for
example personal data, session details, file details).
3. The Project automation area shows contents depending on the selected
item. If the selected item is a session, this area shows buttons that cor-
respond to files and the Go button for data collection.
At the top of the Project automation area there is a breadcrumb trail to
navigate back to the session, subject or root level.
When a session has been selected, colored buttons appear in the Project auto-
mation area:

1. There are three methods to collect files:


a. Click a red field and select Capture.

b. Click a red field, select Edit Settings and Capture. Changes to the dur-
ation of the measurement period will be stored and be used the
next time you make a measurement of the same type.
c. Click the Go button. If you click the Go button, QTM starts at the top
of the file list and records all files of the first measurement type (for
example static). If you click again, it continues with all measurements
of the second measurement type, and so on. Activate
the external trigger button in Project Options > Timing and use it to
control when the captures start. At any point, press the Esc key to
stop the data collection.
2. The following options are available for a recorded file:

a. Add comments by clicking the field next to the green file name.

b. Use the plus/minus button to show more/less file buttons (to make
additional captures).
c. Un-check a file to exclude it from the processing (only checked files
will be exported).
3. Once the minimum number of files has been recorded, the Start Pro-
cessing button can be used. Click the button to run the default processing
step. Depending on the PAF module, there may be multiple processing
steps. Click the triangle on the button to show all processing steps. The
default processing step is topmost in the list.

4. Click the Show guide button to show the marker set guide, if available.

Calqulus
Calqulus uses a cloud-based approach to analyze motion capture data, where
the engine and scripts are stored in the cloud. The engine is called the Calqulus
engine, and the scripts are called Calqulus pipelines, which consist of a
succession of Calqulus steps. The engine can be considered the glue that
combines the pipelines and steps to process the data and create the web reports.
With this cloud-based approach, Web Reports are automatically updated when
pipelines are modified, and the Calqulus pipelines and steps are public. The
pipelines and steps are hosted on the following public GitHub repositories,
which makes them easy to share, track and modify:

l https://github.com/qualisys/Calqulus-Pipelines

l https://github.com/qualisys/Calqulus-Steps

Calqulus functionality is added to a QTM project via the Calqulus module, which
can be downloaded via the Qualisys dashboard. For information about
installing and using Analysis Modules, see chapter "PAF module installation" on
page 916.
The Calqulus module includes predefined sessions for, among others, Baseball,
Cycling, Cricket and Running. The Generic session can be used for collection
of unspecified biomechanical trials.
For more information about Calqulus, please refer to the product information
on the Qualisys website or contact your sales representative or
[email protected].

Qualisys Cloud ecosystem


The Qualisys Cloud ecosystem consists of:
l Web reports

l Report Center

l Calqulus

All modules give the option to visualize the results as a Web Report. Each Web
Report is hosted on Report Center. Demo reports are available on the main
page of our Report Center.
The main Web Report features are:
l Synchronized 3D view, videos and charts

l Interactive selection of charts

l Annotate charts

l Link annotations with recommended exercises

l Export to PDF

l Compare sessions

Report Center, home of all the Web Reports, also offers some valuable features
such as:
l Share Web Reports

l Create custom reference data sets

l Create custom Web Report layouts

l Import pre-processed CGM sessions

To use this Qualisys Cloud ecosystem, a subscription is necessary. For an over-


view of the available options, see the Qualisys website. Please contact your
sales representative or [email protected] for further information.

Technical reference

Qualisys technical reference


Technical reference of Qualisys system hardware. For more information, con-
tact Qualisys support at [email protected].

Overview of camera models and sensor specifications


Qualisys camera sensor specifications (marker mode)

The below table gives an overview of the sensor specifications and sensor
modes for all Qualisys camera models, when in marker mode.

Camera model          Sensor mode         Max capture rate (Hz)   Resolution

Arqus A26             26 MP @ 150 Hz      150                     5120×5120
                      6 MP @ 290 Hz       290                     2560×2560
Arqus A12             12 MP @ 300 Hz      300                     4096×3072
                      3 MP @ 1040 Hz      1040                    2048×1536
Arqus A9              9 MP @ 300 Hz       300                     4224×2160
                      2 MP @ 590 Hz       590                     2112×1080
Arqus A5              5 MP @ 700 Hz       700                     2560×1920
                      1 MP @ 1400 Hz      1400                    1280×960
Miqus M5              4 MP @ 182 Hz       182                     2048×2048
                      1 MP @ 361 Hz       361                     1024×1024
Miqus M3              2 MP @ 340 Hz       340                     1824×1088
                      0.5 MP @ 666 Hz     666                     912×544
Miqus M1              N/A                 250                     1216×800
Miqus Hybrid          2 MP @ 340 Hz       340                     1824×1088
                      0.5 MP @ 666 Hz     666                     912×544
Miqus Video (VC/VM)¹  N/A                 340                     1920×1088
Oqus 7+               12 MP @ 300 Hz      300                     4096×3072
                      3 MP @ 1121 Hz      1121                    2048×1536
Oqus 6+               6 MP @ 460 Hz       460                     3072×1984
                      1.5 MP @ 1662 Hz    1662                    1536×992
Oqus 5+               4 MP @ 179 Hz       179                     2048×2048
                      1 MP @ 355 Hz       355                     1024×1024
Oqus 5                N/A                 180                     2352×1728
Oqus 4                N/A                 476                     1696×1710
Oqus 3+               1.3 MP @ 502 Hz     502                     1296×1024
                      0.3 MP @ 1740 Hz    1740                    648×512
Oqus 3                N/A                 503                     1280×1024
Oqus 1                N/A                 247                     640×480
Oqus 2c¹              2 MP @ 337 Hz       337                     1920×1088
                      0.5 MP @ 666 Hz     666                     960×544

¹ Only in marker mode during calibration. Sensor mode setting accessible in intensity mode.

Qualisys video sensor specifications (in-camera MJPEG)

The below table gives an overview of the sensor specifications and sensor
modes for all Qualisys video camera models when using in-camera MJPEG com-
pression.

Camera type           Sensor mode         Max capture rate (Hz)   Resolution

Miqus Video Plus      N/A¹                100                     1920×1440
                                          120                     1920×1080
Miqus Video (VC/VM)   N/A¹                86                      1920×1080
Miqus Hybrid          N/A¹                86                      1920×1080
Miqus M5              N/A                 30                      2048×2048
Miqus M3              N/A                 30                      1824×1088
Miqus M1              N/A                 30                      1216×800
Oqus 2c               2 MP @ 24 Hz        24                      1920×1088
                      0.5 MP @ 62 Hz      62                      960×544
                      0.13 MP @ 204 Hz    204                     480×272
Oqus 7+               12 MP @ 3 Hz        3                       4096×3072
                      3 MP @ 10 Hz        10                      2048×1536
Oqus 6+               6 MP @ 6 Hz         6                       3072×1984
                      1.5 MP @ 20 Hz      20                      1536×992
Oqus 5+               4 MP @ 13 Hz        13                      2048×2048
                      1 MP @ 50 Hz        50                      1024×1024
                      0.25 MP @ 176 Hz    176                     512×512
                      0.06 MP @ 557 Hz    557                     256×256

¹ Miqus Video cameras automatically switch to a high-speed sensor mode at lower resolution presets.

NOTE: For information about maximum capture rate at different presets


for the Miqus Video/Video+/Hybrid and Oqus 2c see chapter "Maximum
capture rate for streaming video" on page 578.



Device specifications and features
Arqus cameras

General specifications and features

Camera output modes            Marker coordinates / Intensity map / Video preview
Marker support                 Passive and active (NIR) markers
Max. frame rate (reduced FOV)  10000 fps
Max. frame rate video mode     30 fps
External Sync                  Camera Sync Unit for triggering, hardware sync, Video Genlock, PTP, SMPTE and IRIG
Camera body                    Die-cast aluminum, polycarbonate & thermoplastic polyurethane
Camera size                    132×143×126 mm (5.2×5.6×5 inches)
Camera weight                  1.9 kg (4.2 lbs)
Connection                     1-connector daisy-chained Gigabit Ethernet and power
Power                          Daisy-chained, 36–58VDC @ 40W maximum
Strobe                         24 high-power NIR LEDs @ 850nm
Built-in camera display        Graphical high-contrast OLED
Operating temperature          Standard housing: 0–35°C (32–95°F); Protected housing: -15–45°C (5–113°F)
Security attachment            Kensington Lock
Mounting features              Quick release for Manfrotto & Arca Swiss, two ¼” camera mounts
Available housing              Standard (IP10 / NEMA 1), Protected (IP67 / NEMA 6), Underwater (waterproof 10 m, IP68)
Outdoor tracking               Active filtering, Sun filter¹

¹ Sun filter included for protected models, optional for standard models.



Arqus model specifications

Arqus model:                           A26            A12            A9             A5

Normal mode (full FOV)
  Image size                           26 MP          12 MP          9 MP           5 MP
  Pixels                               5120×5120      4096×3072      4224×2160      2560×1920
  Max. frame rate                      150 Hz         300 Hz         300 Hz         703 Hz
  Camera latency                       6.7 ms         3.3 ms         3.3 ms         1.4 ms

High-speed mode (full FOV)
  Image size                           6.5 MP         3 MP           2.5 MP         1 MP
  Pixels                               2560×2560      2048×1536      2112×1080      1280×960
  Max. frame rate                      297 Hz         1040 Hz        591 Hz         1400 Hz
  Camera latency                       3.4 ms         1.0 ms         1.7 ms         0.7 ms

Field of View (FOV)
  Standard                             56°×56°        54°×42°        67°×37°        56°×44°
  Wide                                 77°×77°        70°×56°        82°×48°        77°×62°
  Narrow                               29°×29°        31°×24°        47°×25°        29°×22°

Measurement distance (16 mm markers)¹  32 m (105 ft)  40 m (131 ft)  28 m (92 ft)   26 m (85 ft)
Marker resolution¹ (µm @ 1 m)          3.3            3.9            4.9            6.5
Min marker separation¹ (mm/px @ 1 m)   0.21           0.25           0.31           0.42
Motorized lens                         No             Yes            No             No
Lens mount                             C              EF-M           C              C

¹ For standard lens.
Description of Arqus cameras

Arqus camera: front side

1. LED ring
LED ring for camera identification and indication of status during startup
sequence.
l Green light: Active camera indicator (see "Identifying the cameras
with the identification tool" on page 480).
l Pulsing green light: Camera system being calibrated.

l Amber light: Camera is booting.

l Pulsing amber light: Camera is waiting for IP address.

2. Measurement status indicator


l Green light: The system is ready to start a measurement.

l Yellow light: The system is measuring.



l Flashing green light: The camera is synchronizing to the master
device.
l Flashing red light: Error signal.

3. Camera display
o Display of camera ID.

4. Mounting plate
o Quick release for Manfrotto and Arca Swiss, two 1/4'' camera
mounts.

Arqus camera: back side

1. Left daisy-chain data/power port


Combined Power/Gigabit Ethernet connector.
2. Right daisy-chain data/power port
Combined Power/Gigabit Ethernet connector.



3. Left/right port Link/Activity indicator
Shows the status of the ethernet connection. Fixed yellow light means
that a carrier signal has been sensed and that the connection is up. Flash-
ing yellow light indicates that data is received and/or transmitted.
4. Left/right port Gigabit link indicator
Lit orange when Gigabit Ethernet link is established.
5. Left/right port Power indicator
Lit green when the camera is powered on.
6. Strobe unlock button
Press/release to unlock/lock the strobe mechanics.
7. Kensington Security Slot
Slot for connecting a security cable.
8. Camera identification
This label shows:
l The serial number of the camera.

l The product number.

l The Ethernet Mac address.

Mechanics

Physical specifications

Weight                                       1.9 kg (4.2 lbs)
Physical dimensions                          126¹×132×143 mm
Operating temperature range                  0–35°C (32–95°F)
Operating temperature range Arqus protected  -15–45°C (5–113°F)
Storage temperature range                    -10–55°C (14–131°F)
Housing materials                            Die-cast aluminum, polycarbonate, thermoplastic polyurethane

¹ Can differ depending on lens selection.



Physical dimensions

Mounting

The Arqus camera has an integrated mounting plate that is compatible with the
quick-release mounts from Manfrotto and Arca Swiss.
There are also two UNC 1/4”-20 tripod mounting points.

For industrial mounting, a mounting plate that provides four M6 mounting


points on the back of the camera is available as an option.



Optics and strobe

How to adjust aperture and focus

This is only needed for camera models with manual focus and aperture.
Press the strobe unlock button while pulling the strobe mechanics away from
the camera body until the aperture and focus ring is accessible. For cameras
with protected housing, the strobe mechanics must be pulled all the way out so
that the lens cover can be removed.

How to change the strobe unit

Follow the instructions above but pull the strobe all the way out.

How to add or remove the sun filter

Follow the instructions above but pull the strobe all the way out.
Add or remove the filter and remount the strobe. Make sure that there is
enough room between the filter and flash glass to allow for focus adjustments.

How to change the lens

All lenses can be changed by the user by following the steps above, unscrewing
the lens and replacing it with a new lens.



IMPORTANT: To get accurate data, the camera must be linearized after
changing the lens.

Electrical specifications

Power supply

Arqus cameras are powered by 48 VDC (R1 power supply) or 56 VDC (R2 power
supply) through the Data/Power connectors. The operating voltage range is 36-
58VDC. The power supply should be dimensioned for a minimum of 40W for
each camera.
The external AC/DC converter available for the Arqus camera is capable of deliv-
ering 200W@48VDC or 250W@56VDC and supplying up to five Arqus cameras.
The maximum total cable length for a full chain with Arqus cameras is 50 m per
power supply.
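As a rough cross-check of these figures (an illustration, not an additional specification): five cameras at the 40 W dimensioning figure correspond to 5 × 40 W = 200 W, which matches the rating of the 48 V converter.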

Power consumption

Mode Average power consumption [W]

Idle 13

Measuring 300fps, no flash 16

Measuring 300fps, 166us 27

Measuring 300fps, 333us 36

The power consumption is dependent on the duty cycle of the flash.
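As an illustration of this dependence (an estimate, not part of the specification table): the duty cycle is approximately the exposure time multiplied by the capture rate, so 166 µs × 300 Hz ≈ 5% and 333 µs × 300 Hz ≈ 10%, consistent with the higher average power shown for the longer exposure.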



Communication

The Arqus camera can communicate with a host computer through the Gigabit
Ethernet interface. For detailed information, see section "Ethernet (Gigabit)" on
page 977.

Digital IO

The Arqus camera does not provide any digital inputs or outputs. For more
information about digital inputs and outputs to a Qualisys system, see section
"Camera Sync Unit" on page 949.
Miqus cameras

General specifications and features

Camera output modes            Marker coordinates¹ / Intensity map / Video preview / Streaming video²
Marker support                 Passive and active (NIR)¹ markers
Max. frame rate (reduced FOV)  10000 fps¹
In-camera video compression    Yes (MJPEG)
Max. frame rate video mode     30 fps³
External sync                  Camera Sync Unit for triggering, hardware sync, Video Genlock, PTP, SMPTE and IRIG
Camera body                    Convection cooled, custom die-cast aluminum and polycarbonate
Camera size                    140×84×84 mm (5.5×3.3×3.3 inches)
Weight                         ~0.7 kg (1.54 lbs)
Connection                     1-connector daisy-chained Gigabit Ethernet and power
Power                          Daisy-chained, 36–58VDC @ 20W maximum
Strobe light                   M1, M3, M5, Hybrid: Invisible near-infrared (850 nm); VC+, VC, VM: White high-power LEDs
Operating temperature          0–35°C (32–95°F)
Security attachment            Kensington lock
Mounting features              Two ¼” camera mounts
Lens mount                     C mount
Available housing              Standard (IP10 / NEMA 1), Underwater (waterproof 10 m, IP68)
Outdoor tracking               Active filtering

¹ Applies only to marker cameras, including Miqus Hybrid.
² Applies only to Miqus Video and Hybrid cameras.
³ Applies only to marker cameras. For Miqus Video and Hybrid, see "Miqus Video specifications" on page 942.



Miqus model specifications

Miqus model:                           M5            M3            M1            Hybrid

Normal mode (full FOV)
  Image size                           4 MP          2 MP          1 MP          2 MP
  Pixels                               2048×2048     1824×1088     1216×800      1824×1088
  Max. frame rate                      183 Hz        340 Hz        250 Hz        340 Hz
  Camera latency                       5.5 ms        2.9 ms        2.1 ms        2.9 ms

High-speed mode (full FOV)
  Image size                           1 MP          0.5 MP        N/A           0.5 MP
  Pixels                               1024×1024     912×544       N/A           912×544
  Max. frame rate                      362 Hz        667 Hz        N/A           667 Hz
  Camera latency                       2.8 ms        1.5 ms        N/A           1.5 ms

Field of View (FOV)
  Standard                             54°×42°       64°×41°       58°×40°       61°×37°
  Wide                                 N/A           80°×53°       N/A           83°×53°
  Narrow                               25°×25°       44°×27°       41°×27°       46°×27°

Measurement distance (16 mm markers)¹  18 m (59 ft)  15 m (49 ft)  10 m (33 ft)  13 m (43 ft)
Marker resolution¹ (µm @ 1 m)          6.9           10.7          14.3          9.6
Min marker separation¹ (mm/px @ 1 m)   0.44          0.68          0.91          0.61

¹ For standard lens.
Miqus Video specifications

                                     VC+                    VC / VM / Hybrid

Video output (resolution and
max frame rate)
  1440p                              1920×1440 @ 100 fps    -
  Full HD                            1920×1080 @ 120 fps    1920×1080 @ 85 fps
  720p                               960×720 @ 400 fps      1280×720 @ 180 fps
  540p                               960×540 @ 440 fps      960×540 @ 330 fps
  VGA                                -                      640×480 @ 550 fps

Max. frame rate                      480 fps (540p 4:3)     714 fps (480p 1:1)

Field of View
  Standard                           51°×40°                61°×37°
  Wide                               72°×57°                83°×53°
  Narrow                             40°×30°                47°×28°

                                     VC+          VC           VM           Hybrid

Filter (built in)                    IR cut-off   IR cut-off                Dual band pass
Color                                Yes          Yes          No           Yes
Streaming video                      Yes          Yes          Yes          Yes
In-camera video compression (MJPEG)  Yes          Yes          Yes          Yes
Auto exposure                        Yes          Yes          Yes          Yes
Auto white balance                   Yes          Yes          N/A          Yes
Video overlay (through regular
wand calibration)                    Yes          Yes          Yes          Yes
Description of Miqus cameras

Miqus camera: front side

1. LED ring
LED ring for camera identification and indication of status during startup
sequence.
l Green light: Active camera indicator (see "Identifying the cameras
with the identification tool" on page 480).
l Pulsing green light: Camera system being calibrated.

l Amber light: Camera is booting.

l Pulsing amber light: Camera is waiting for IP address.

2. Measurement status indicator


l Green light: The system is ready to start a measurement.

l Yellow light: The system is measuring.

l Flashing green light: The camera is synchronizing to the master


device.
l Flashing red light: Error signal.



3. Lock lever
o Lock/unlock the strobe mechanics.

Miqus camera: back side

1. Left daisy-chain data/power port


Combined Power/Gigabit Ethernet connector.
2. Right daisy-chain data/power port
Combined Power/Gigabit Ethernet connector.
3. Left port Link/Activity indicator
Shows the status of the ethernet connection. Fixed yellow light means
that a carrier signal has been sensed and that the connection is up. Flash-
ing yellow light indicates that data is received and/or transmitted.
4. Right port Link/Activity indicator
Identical to left indicator.
5. Left port Gigabit link indicator
Lit orange when Gigabit Ethernet link is established.
6. Right port Gigabit link indicator
Identical to left indicator.
7. Left port Power indicator
Lit green when the camera is powered on.



8. Right port Power indicator
Lit green when the camera is powered on.
9. Camera identification
This label shows:
l The serial number of the camera.

l The product number.

l The Ethernet Mac address.

10. Kensington Security Slot


Slot for connecting a security cable.

Mechanics

Physical specifications

Weight                         0.7 kg (1.5 lb)
Physical dimensions            140¹×87×84 mm
Operating temperature range    0–35°C
Storage temperature range      -10–55°C
Housing materials              Die-cast aluminum and polycarbonate

¹ Can differ depending on lens selection.



Miqus dimensions

Mounting

The Miqus camera has two UNC 1/4”-20 tripod mounting points on the bottom
of the camera.

Optics and strobe

How to adjust aperture and focus

Put the lock lever in the open position, then move the strobe mechanics away
from the camera body until the aperture and focus ring is accessible.



How to change the strobe unit

Follow the instructions above but pull the strobe all the way out. Make sure
that the new strobe unit is mounted correctly (see up-label on strobe rails).

How to change the lens

All lenses can be changed by the user by following the steps above, unscrewing
the lens and replacing it with a new lens.

IMPORTANT: To get accurate data, the camera must be linearized after


changing the lens.

Electrical specifications

Power supply

Miqus cameras are powered by 48 VDC (R1 power supply) or 56 VDC (R2 power
supply) through the Data/Power connectors. The R2 power supply only works
with Miqus cameras and Camera Sync units with serial number 28123 or
higher. The operating voltage range is 20-58VDC. The power supply should be
dimensioned for a minimum of 20W for each camera.
The external AC/DC converter available for the Miqus camera is capable of deliv-
ering 200W@48VDC or 250W@56VDC and supplying up to ten Miqus cameras
and one Camera Sync Unit. The maximum total cable length for a full chain with
Miqus cameras is 120 m per power supply.

Power consumption

Mode Average power consumption [W]

Idle 8

Measuring 100fps, no flash 8

Measuring 100fps, 500us 12

Measuring 100fps, 1000us 16

The power consumption is dependent on the duty cycle of the flash.



Communication

The Miqus camera can communicate with a host computer through the Gigabit
Ethernet interface. For detailed information, see section "Ethernet (Gigabit)" on
page 977.

Digital IO

The Miqus camera does not provide any digital inputs or outputs. For more
information about digital inputs and outputs to a Qualisys system, see section
"Camera Sync Unit" below.
Camera Sync Unit

The Camera Sync Unit (CSU) is an optional accessory to a Qualisys camera sys-
tem that provides trigger and sync inputs and outputs.

Specifications and features

Inputs                       Trig NO, Trig NC, Event, Sync in, SMPTE, Genlock
Outputs                      Measurement time, Out 1, Out 2
Signal level                 TTL, 5V (up to 12V tolerant)
Input/Output impedance       50Ω
Communication                Gigabit Ethernet through camera daisy chain
Power                        48VDC @ 200mA through camera daisy chain
Weight                       900g (1.98lbs)
Dimensions                   172×137×55mm (6.76”×5.40”×2.15”)
Operating temperature range  0–35°C (32–95°F)
Security attachment          Kensington lock
Housing                      Standard (IP30 / NEMA 1)

Description of Camera Sync Unit

Camera Sync Unit: front side

1. Trig NO indicator
Turns on for 0.5 s when the Trig NO input transitions from high to low.
2. Trig NO input
Trigger input (TTL, 0-5 Volt, normally open). The base voltage of the port is
5 Volt (high).



3. Trig NC indicator
Turns on for 0.5 s when the Trig NC input transitions from low to high.
4. Trig NC input
Trigger input (TTL, 0-5 Volt, normally closed). The base voltage of the port
is 0 Volt (low).
5. Event indicator
Turns on for 0.5 s when the Event input transitions from high to low.
6. Event input
Input for creating events in QTM. The base voltage of the port is 5 Volt
(high).
7. Sync indicator
Turns on for 0.5 s when the Sync input transitions from low to high.
8. Sync input
Input for TTL synchronization signal (0-5 Volt). The base voltage of the
port is 0 Volt (low).
9. SMPTE indicator
Turns on when a valid SMPTE signal is present at the SMPTE input.
10. SMPTE input
Input for SMPTE time code signal, e.g. from a MOTU sound device.
11. Genlock indicator
Turns on 0.5 s for each vsync frame when a valid video signal is present at
the Genlock input.
12. Genlock input (video)
Genlock input for synchronization to a video signal. Compatible with Com-
posite video (s-video) and Component video (YPbPr / GBR).
13. Measurement time indicator
Turns on 0.5 s at the start of a measurement.
14. Measurement time output
Outputs a pulse that lasts for the duration of the measurement.
15. Output 1 indicator
Turns on for 0.5 s when the Output 1 port transitions from low to high.
16. Output 1
Programmable synchronization output (TTL, 0-5 Volt).



17. Output 2 indicator
Turns on for 0.5 s when the Output 2 port transitions from low to high.
18. Output 2
Programmable synchronization output (TTL, 0-5 Volt).
19. Measurement status indicator
l Green light: The system is ready to start a measurement.

l Yellow light: The system is measuring.

l Flashing green light: The device is synchronizing to the master


device.
l Flashing red light: Error signal.

Camera Sync Unit: back side

1. Kensington Security Slot


Slot for connecting a security cable.
2. Daisy-chain data/power port
Combined Power/Gigabit Ethernet connector.
3. Power/Gigabit Link indicator
Lit green when the Sync unit is powered on. An additional orange LED is lit
when a Gigabit link is established.



4. Link/Activity indicator
Shows the status of the Ethernet connection. Fixed yellow light means
that a carrier signal has been sensed and that the connection is up. Flash-
ing yellow light indicates that data is received and/or transmitted.

Mechanical and electrical specifications

Physical specifications

Weight 0.8 kg (1.8 lbs)

Physical dimensions 172×137×55 mm

Operating temperature range 0–35°C

Storage temperature range -10–55°C

Housing material Aluminum

Mounting

The CSU housing is designed to be placed on a desktop. There are no mounting


points.

Electrical specifications

Mode Average power consumption [W]

Idle 8

Measuring 8

Digital IO

Trigger inputs



The CSU can receive a trigger signal from an external TTL source. Depending on
the selected function, the external trigger can be used to start or stop a capture
or generate an event. For an overview of the available trigger port settings, see
chapter "Trigger ports" on page 273.
The CSU has two trigger inputs: Trig NO (normally open) and Trig NC (normally
closed). The Trig NO input is pulled high and can be used with the Qualisys trig-
ger button. On the Trig NC input the signal must be driven by the source.

Event/IRIG input

The Event input is dedicated for generating events. The Event input is pulled
high and can be used with the Qualisys trigger button. For an overview of the
available event port settings, see chapter "Event port (Camera Sync Unit)" on
page 276.
This input doubles as IRIG input. It is possible to use the IRIG timecode as a syn-
chronization input source and/or to timestamp the data frames with the IRIG
timecode.

NOTE: IRIG cannot be used when there are any Oqus cameras included
in the system.

Synchronization input

The CSU can be synchronized by an external source and configured to accept


various types of synchronization signals. The SYNC input can be used to syn-
chronize to external periodic or non-periodic TTL signals. For an overview of set-
tings and use of external timebase, see chapters "External timebase" on
page 278 and "How to use external timebase" on page 494.

SMPTE input

This input is dedicated for SMPTE time code signals. It is possible to use the
SMPTE timecode as a synchronization input source and/or to timestamp the
data frames with the SMPTE timecode. For an overview of the settings and syn-
chronization scenarios, see chapters "Timestamp" on page 284, "External
timebase" on page 278 and "Using SMPTE for synchronization with audio
recordings" on page 512.



Genlock input

This input is dedicated for Genlock signals. It is possible to use this input as a
synchronization input source. For an overview of the settings, see chapter
"External timebase" on page 278.

Synchronization outputs

The CSU has three synchronization outputs Measurement time (MEAS. TIME),
Output 1 (OUT1) and Output 2 (OUT2). For an overview of the available set-
tings, see chapters "Synchronization output" on page 285 and "Measurement
time (Camera Sync Unit)" on page 290.
All outputs are fused and capable of driving 50 Ohm transmission lines.

Oqus cameras

General specifications

Camera output modes¹         Marker coordinates / Intensity map / Video preview / Streaming video / high speed video
Marker support               Passive and active (NIR) markers
Synchronization options      Internal 1ppm clock / ext. freq. output / ext. word clock input / SMPTE input, PTPv2
Camera body                  Convection cooled, custom die-cast aluminum
Camera size                  5+/6+: 185×110×124 mm (7.3×4.3×4.9 inches); 7+: 200×145×155 mm (7.9×5.7×6.1 inches)
Camera weight                5+/6+: 1.9 kg (4.2 lbs); 7+: 2.1 kg (4.6 lbs)
Cabling                      Hybrid cable with Ethernet and power
Wired communication          Hub-less daisy-chained Ethernet 802.3 @ 100 Mbps
Wireless communication²      WiFi 802.11b/g @ 54 Mbps
Power                        Daisy-chained, 36–72VDC @ 30W maximum
Strobe                       150 NIR LEDs @ 870nm
Built-in camera display      128×64 graphical high contrast OLED
Operating temperature        5+: 0–35°C (32–95°F); 6+/7+: 0–30°C (32–86°F)
Available camera housing     Standard (IP10), Weather protected (IP67), Underwater (IP68), MRI (EMI shielded)
Position data noise level    ±1 sub-pixels
Maximum frame buffer size    1152 MB (Oqus high-speed video)

¹ Depending on camera model and configuration
² Optional feature


Oqus model specifications and features

Oqus model:                            7+             6+             5+             5

Normal mode (full FOV)
  Image size                           12 MP          6 MP           4 MP           4 MP
  Pixels                               4096×3072      3072×1984      2048×2048      2352×1728
  Max. frame rate                      300 Hz         450 Hz         179 Hz         178-182 Hz²
  Camera latency                       3.3 ms         2.2 ms         5.6 ms         5.6 ms

High-speed mode (full FOV)
  Image size                           3 MP           1.5 MP         1 MP           N/A
  Pixels                               2048×1536      1536×992       1024×1024      N/A
  Max. frame rate                      1121 Hz        1662 Hz        355 Hz         N/A
  Camera latency                       0.9 ms         0.6 ms         2.8 ms         N/A

Field of View (FOV)
  Standard                             54°×42°        56°×39°        50°×50°
  Wide                                 70°×56°        N/A            70°×70°
  Narrow                               31°×24°        37°×25°        25°×25°

Max frame rate (fps, reduced FOV)      10000          10000          10000          10000
In camera MJPEG compression            Yes            Yes            Yes            No
Max. frame rate streaming video
(full resolution)                      3 fps          6 fps          13 fps         N/A
Measurement distance¹ (16 mm markers)  35 m (115 ft)  28 m (92 ft)   25 m (82 ft)
Marker resolution¹ (µm @ 1 m)          3.9            5.4            6.9
Min marker separation¹ (mm/px @ 1 m)   0.25           0.34           0.44
Active filtering                       Yes            Yes            Yes            No
Motorized lens                         Yes            No             No             No
Lens mount                             EF-M           C              C              C

¹ For standard lens.
² For Oqus 5 the maximum capture frequency is dependent on the exposure time.
Oqus model:                            4              3+             3              1

Normal mode (full FOV)
  Image size                           3 MP           1.3 MP         1.3 MP         0.3 MP
  Pixels                               1696×1710      1296×1024      1280×1024      640×480
  Max. frame rate                      476 Hz         502 Hz         503 Hz         247 Hz
  Camera latency                       2.1 ms         2.0 ms         2.0 ms         4.0 ms

High-speed mode (full FOV)
  Image size                           N/A            0.3 MP         N/A            N/A
  Pixels                               N/A            648×512        N/A            N/A
  Max. frame rate                      N/A            1740 Hz        N/A            N/A
  Camera latency                       N/A            0.6 ms         N/A            N/A

Field of View (FOV)
  Standard                             46°×46°        41°×33°        41°×33°        41°×31°
  Wide                                 N/A            58°×48°        58°×48°        N/A
  Narrow                               32°×32°        20°×16°        20°×16°        29°×22°

Max frame rate (fps, reduced FOV)      10000          10000          10000          1000
In camera MJPEG compression            No             No             No             No
Measurement distance¹ (16 mm markers)
Marker resolution¹ (µm @ 1 m)          7.8            9.1            9.1            18.1
Min marker separation¹ (mm/px @ 1 m)   0.50           0.58           0.58           1.16
Active filtering                       Yes            Yes            Yes            Yes
Motorized lens                         No             No             No             No
Lens mount                             C              C              C              C

¹ For standard lens.

Oqus video specifications

Streaming video

For an overview of Oqus cameras that can be used for streaming video and their sensor specifications, see
"Qualisys video sensor specifications (in-camera MJPEG)" on page 927.

High-speed video

A selection of Oqus camera types can be equipped to capture full-frame, full-speed, full-resolution high-speed
video. In this configuration the camera is equipped with a large 1.1 GB buffer memory and a clear front glass to
get the best possible performance out of the image capture.

The below table gives an overview of the sensor specifications, sensor modes and video buffer capacity for all
Oqus high-speed cameras, when in video mode. The specifications are only applicable to uncompressed video
(8-bit raw images). Note that the number of frames that can be stored in the buffer can be increased by cropping
the image.
The maximum measurement duration (in seconds) at full resolution and maximum capture rate is indicated in
the table below. For other combinations of image size (X, Y in pixels) and capture frequency use the following for-
mula to calculate the maximum capture duration:
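A reconstruction of this formula, consistent with the 8-bit raw image format (one byte per pixel) and the buffer capacities listed in the table, is:

    maximum capture duration [s] ≈ buffer size [bytes] / (X × Y × capture frequency [Hz])

For example, an Oqus 3+ at 1296×1024 pixels and 502 Hz with a 1152 MB buffer (1 207 959 552 bytes) gives approximately 1.8 s, in line with the table.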

Camera type   Sensor mode          Max capture rate (Hz)   Resolution   Max video buffer capacity (frames / s @ max capt. rate)

Oqus 2c       2 MP @ 337 Hz        337                     1920×1088    540 / 1.6
              0.5 MP @ 666 Hz      666                     960×544      1080 / 1.6
              0.13 MP @ 1300 Hz    1300                    480×272      2100 / 1.6
Oqus 5+       4 MP @ 179 Hz        179                     1920×2048    280 / 1.6
              1 MP @ 355 Hz        355                     960×1024     560 / 1.6
              0.25 MP @ 694 Hz     694                     480×512      1100 / 1.6
              0.06 MP @ 1328 Hz    1328                    240×256      2200 / 1.6
Oqus 5        N/A                  180                     2352×1728    290 / 1.4
Oqus 3+       1.3 MP @ 502 Hz      502                     1296×1024    900 / 1.8
              0.3 MP @ 1740 Hz     1740                    648×512      3600 / 2.0
Oqus 3        N/A                  503                     1296×1024    900 / 1.8
Oqus 1        N/A                  247                     640×480      3800 / 15.2

NOTE: When using Oqus high-speed video, take into account prolonged data fetch times after a capture.
The data transfer time for a full buffer memory (1.1 GB) can be several minutes.
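As a rough illustration of why (an estimate, not a specification): 1.1 GB corresponds to roughly 9 gigabits, which takes on the order of 90 seconds even at the theoretical 100 Mbps line rate of the Oqus Ethernet interface; with protocol and per-frame fetch overhead the practical transfer time grows to several minutes.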

Description of Oqus devices

Oqus camera display

The Oqus camera has a large graphical OLED display and three LEDs on the
front to inform the user of the current status of the camera. The display shows,
among other things, the camera number and the number of markers currently
seen by the camera.

NOTE: The display will be turned off when the camera enters stand-by
mode, i.e. if the camera has not been in use for 2 hours. Start a preview
in QTM to light up the display again.

1. Measurement status indicator


Green light - The camera is ready to start a measurement
Yellow light - The camera is measuring
Flashing green light - Waiting for trigger to start measurement
Flashing yellow light - Waiting for trigger to switch from pre-trigger to
post-trigger measurement



2. Error indicator
A red light indicates that an error has occurred. The LED is blinking when
a software error occurs and is lit constantly if a hardware error occurs.
3. IR receiver
The IR receiver is used for synchronization with certain active markers. It
detects modulated light with a frequency of 33 kHz and is sensitive to light
with wavelengths between 800 and 1100nm.
4. Synchronization status
During the synchronization phase this symbol is flashing. When the cam-
era is synchronized with the master camera in the system it becomes
stable.
5. WLAN indicator
This symbol is displayed when the WLAN of the camera is activated.
6. Master/Slave indicator
An M indicates that the camera is the master of the system and thereby
controls, for example, the internal synchronization. An S indicates that the
camera is a slave. The indicator can also be a rotating + sign, which means
that the camera is looking for the Master camera.
7. Camera number
The area to the right usually shows the camera number that the camera
has in QTM. The camera number can be changed with the Reorder tool in
the 2D view window. This number is stored in the camera so it is shown at
the next camera startup.

NOTE: If the camera has never been connected to QTM the last
three digits of the serial number (upper part) and the last octet of
the IP-number assigned to the camera (lower part) will be shown
instead. This can also be activated from the QDS menu, see "QDS"
on page 462.

8. Marker area
During a marker measurement this area shows the number of markers
currently seen by the camera. When the camera is idle or is collecting
video, this area shows ’-----’.
9. Text area
This area is used for scrolling text messages, for example during startup.



Oqus camera connectors

The back of the camera holds six connectors for power, data and control con-
nections. The view differs slightly depending on the type of camera. The image
below shows the standard version of the camera. The water protected version
uses different connectors and lacks the LEDs found on the standard version.

1. Left Data port (light blue)


Ethernet connector. 100BaseTX/802.3i, 100 Mbps, Fast Ethernet.
2. Right Data port (light blue)
Identical to the left data port.
3. Left Ethernet activity indicator
Shows the status of the Ethernet connection. Fixed green light means that
a carrier signal has been detected and that the connection is up. Flashing
green light indicates that data is received and/or transmitted.
4. Right Ethernet activity indicator
Identical to the left indicator.
5. Battery port (white)
Used to supply the camera with power from an Oqus compatible battery.
6. Battery status indicator
Lit green when the camera is supplied through the BATTERY port.
Lit red when a voltage outside the specified range (10-16V) is connected



to the port.
7. Power supply status
Lit green when the camera is powered through one of the POWER ports.
A red light indicates internal power supply error.
8. Right power supply port (black)
Daisy-chain power port. Supplies the camera with 48VDC and can be daisy
chained to supply cameras further down the chain with power.
9. Left power supply port (black)
Identical to the right power supply connector.
10. Control port (light grey)
The control port is used to synchronize the camera with external sources,
and contains pins for among other things external trigger in, external sync
in and external sync out. Splitter cables are needed to connect one or
more BNC cables to this port, for more information see "Control con-
nections" on page 973.
11. Camera identification
This label provides information on:
l The serial number of the camera.

l The product number.

l The Ethernet Mac address.

l The WLAN Mac address.



Oqus Sync Unit

1. Oqus connector
Connector for control port on an Oqus camera.
2. Trig In indicator
Lit when in Trig in mode.
3. SMPTE indicator
Lit when in SMPTE mode.
4. Sync In indicator
Lit when in Sync in mode.
5. Video In indicator
Lit when in Video/Genlock mode.
6. Sync out activity indicator
Lit green for 0.5 s when output transitions from high to low.
7. Sync in activity indicator
Lit green for 0.5 s when signal transitions from high to low.
8. Sync output
Programmable synchronization output (TTL, 0-5 Volt).



9. Sync input/Genlock (video) input
Mode dependent on Signal source selected in QTM (see chapter
"External timebase" on page 278).
l Sync input (TTL, 0-5 Volt) when Signal source set to Control port.

l Video input when Signal source set to Video sync. Compatible with
Composite video (s-video) and Component video (YPbPr / GBR).
10. Trig input
Mode dependent on QTM project settings.
l Trigger input (TTL, 0-5 Volt) when in Trig in mode (default).

l SMPTE timecode input (e.g. from a MOTU sound device) when in


SMPTE mode (Signal source set to SMPTE or Use SMPTE
timestamp activated, see chapters "External timebase" on page 278
and "Timestamp" on page 284).
11. Trig in activity indicator
Lit green for 0.5 s when signal transitions from high to low, or con-
tinuously lit when a valid SMPTE signal is present.

Mechanics

The Oqus camera is made of a three-piece die-cast aluminum assembly. The fin-
ish is anodized and powder painted with "Structured black" and "Clear silver".
Some versions of the Oqus camera come in other colors: for example, the MRI
camera is white.

Physical specifications

Weight                         1.9–2.4 kg (4.2–5.3 lb)
Physical dimensions            Without strobe: 185×110×84; With strobe: Depends on lens and Oqus series.
Operating temperature range    Oqus 1-5 series: 0–35°C; Oqus 6+ and 7+ series: 0–30°C
Storage temperature range      -10–55°C



Mounting

Mount Type Mounting

Tripod Standard 1× UNC 1/4”

Industrial Standard 3× M6

Back-side mounting rails (optional) Accessory 4× M4

Oqus bottom mounting:

Oqus back-side mounting rails:



Optics and strobe

The native camera mount is C-mount, but adapters to accommodate Nikon F-mount,
Pentax K-mount, Minolta MD-mount and others are available. CS-mount optics
cannot be used due to their shorter back focal length compared to the C-mount.
Contact Qualisys for available lens options that fit a specific Oqus
camera series.

How to adjust aperture and focus

All optics available to the Oqus camera have adjustable aperture and focus. In
the standard camera the optics is easily accessible by turning the strobe part
counter-clockwise. After adjusting aperture/focus the opening is closed by turn-
ing the strobe clockwise. In the IP-classified version the locking screws on the
side must be loosened and the strobe unit pulled out.

How to change strobe unit

Follow this procedure to change the strobe unit.

1. Make sure the power is turned off and the strobe is in its closed position.

2. Remove the strobe unit.



a. On a standard housing Oqus, turn the strobe counter-clockwise, pla-
cing it in the completely open position. Now, find the exit track and
turn the strobe so that it clicks into the exit track. Pull the strobe care-
fully out of the housing.
b. On an IP-classified housing, loosen the three locking screws on the
side and pull out the strobe carefully. The gasket might make it dif-
ficult to pull the strobe out. Turning it side-to-side while pulling
helps.
3. Disconnect the strobe cable. There is either a connector on the cable or
three strobe connectors on the strobe unit. Always pull the connectors,
not the cable!
4. Attach the strobe cable to the new strobe. For the connectors on the
strobe unit make sure the red cable is attached to the pin marked '+' and
that the black cable is attached to the pin marked '-'.
5. Re-attach the strobe unit
a. On a standard housing Oqus, insert the strobe with the connectors
facing down and the opening aligning with the opening on the cam-
era. Align the tracks in the strobe unit with the thrust screws on the
camera housing and simply push the strobe in.
b. On an IP-classified housing, insert the strobe with the connectors to
the left and align the holes on the side of the flash with the locking
screws on the side of the camera housing. Then tighten the locking
screws.

How to change lens

All lenses on the Oqus cameras can be changed by the user by following the
steps below.

1. Remove the strobe unit.

2. Unscrew or dismount the lens and replace with the new lens.

3. Re-attach the strobe unit.



IMPORTANT: To get accurate data, the camera must be linearized after
changing the lens.

Electrical specifications

Power supply

The Oqus camera can be powered by either 48VDC through the connectors
marked POWER or by 12VDC through the connector marked BATT. The power
supply should be dimensioned for a minimum of 30W for each camera. The sup-
ply should be able to deliver higher peak currents during the short periods
when the strobe is lit.
The external AC/DC converter available for the Oqus camera is capable of deliv-
ering 240W@48VDC and supplying up to five Oqus cameras.

Power consumption

The below table specifies the maximum power consumption of Oqus 7+ cam-
eras. The power consumption is dependent on the duty cycle of the flash. The
values for other models may be lower.

Mode                                         Average power consumption [W]

Idle                                         15
Measuring (max. duty cycle of strobe 10%)    30

Digital IO

The control port of the Oqus camera provides an interface for digital I/O. For
the available control connections, see below.
The following digital I/O signals are available on the Oqus control port:
Trigger input: The Trigger port on the Oqus camera is mainly similar to
the TRIG NO input on the Camera Sync Unit, see chapter "Trigger inputs"
on page 953.



Synchronization input: The Synchronization input on the Oqus camera is
mainly similar to the SYNC input on the Camera Sync Unit, see chapter
"Synchronization input" on page 954.

Synchronization output: The Synchronization output on the Oqus cam-


era is mainly similar to the Synchronization output ports on the Camera
Sync Unit, see chapter "Synchronization outputs" on page 955.

Control connections

The following splitter cables can be used for connecting to the control port.
They all have BNC connectors where BNC cables can be connected to extend
the length.
Sync out/Sync in/Trig splitter
This splitter has three connectors. One BNC female for Sync out, one BNC
female for Sync in and one BNC male for Trigger in.

Sync out/Trig splitter


This splitter has two connectors. One BNC female for Sync out and one
BNC male for Trigger in.

Sync in splitter
This splitter has one connector, a BNC female for Sync in.

Alternatively, an Oqus Sync Unit can be used as a connector to the control


port, featuring opto-isolated signal connectors and support for SMPTE time
code. For more information, see chapter "Oqus Sync Unit" on page 967.
Analog boards

USB-2533

The USB A/D board (USB-2533) is a portable A/D board that can easily be con-
nected to any computer with a USB port. The board has 64 analog channels
and is distributed in a case with BNC connections for the analog signals and the
synchronization signal. For instructions how to install the board see chapter
"Installing the USB-2533 board" on page 748.



The following connections are available on the front view of the board:
CH. 1 - CH. 64
BNC connections for the 64 analog channels.

DIGITAL I/O A and DIGITAL I/O B


Ports for controlling a Kistler force plate, the cable can be ordered from
Qualisys.

NOTE: The port can also be used to control other applications. Pin
1-12 on each port is then the digital I/O and pin 13-16 is ground.

EXTERNAL TRIGGER
BNC connection for synchronous start of the analog capture.

The following connections are available on the rear view of the board:
SYNC
BNC connection for frame synchronization of the analog capture.

INTERFACE

CAUTION: Do not use this connection. If there are connections


both on the front and on this interface then the board can be dam-
aged.

POWER
Connection for the power supply, which is supplied by Qualisys.

USB
USB connection to the measurement computer. The cable is supplied by
Qualisys.



USB LED
The USB LED is lit when the board is connected to a computer.

POWER LED
The Power LED is lit when the board has power. The power can come
either from the USB port or from an external power supply. To use the
external power supply it must be plugged in before the USB connection,
however the Power LED will still not be lit until the USB cable is con-
nected.

USB-1608G

The USB-1608G is a portable A/D board that can easily be connected to any
computer with a USB port. The board has 16 analog channels and is distributed
in a case with BNC connections for the analog signals and the synchronization
signal. For instructions how to install the board see chapter "Installing the USB-
1608G board" on page 750.

The following connections are available on the front view of the board:
CH1 - CH16
BNC connections for the 16 analog channels.

The following connections are available on the rear view of the board:
SYNC
BNC connection for frame synchronization of the analog capture.



TRIG
BNC connection for synchronous start of the analog capture.

DIGITAL OUT
Not in use.

USB
USB connection to the measurement computer. The cable is supplied by
Qualisys.

STA. LED
The Status LED turns on when the device is detected and installed on the
computer.

ACT. LED
The Activity LED blinks when data is transferred, and is off otherwise.

USB-1608G specifications

Materials Extruded and die-cast Aluminum housing


Weight 910g
Channels 16 single ended
ADC resolution 16 bits
Input voltage range ±10V, ±5V, ±2V and ±1V software selectable per channel
Trigger input 5V TTL
Synchronization input 5V TTL
Digital outputs Not in use
Data/power connection Compatible with USB 1.1, USB 2.0 and USB 3.x ports

Communication
Arqus and Miqus

Arqus and Miqus cameras communicate with a host computer through the Gig-
abit Ethernet interface.



Ethernet (Gigabit)

Ethernet (IEEE 802.3) is a low-level communications protocol, which normally


carries IP traffic in Local Area Networks (LANs). The physical transmission
media is generally twisted pair cables. The standard used by the cameras is
1000BaseT/802.3ab, Gigabit Ethernet, with a communications speed of 1 Gbps.
To comply with the standard, cables classified as cat 5e or better should be
used and cable length between each node should be limited to 100m. For best
performance, it is recommended to keep cable length to 50m or shorter.
Arqus and Miqus cameras are equipped with daisy-chained Ethernet, which
means that a maximum number of 20 cameras can be connected in a data
chain without the requirement of an external hub or switch. It is however pos-
sible to connect the system in a star configuration which could improve per-
formance in very large systems.
Arqus and Miqus cameras use both TCP/IP and UDP/IP to communicate with
a host computer and other cameras within a system.
Oqus

The Oqus camera can communicate with a host computer through the fol-
lowing interfaces.

Ethernet (Oqus)

Ethernet (IEEE 802.3) is a low level communications protocol which normally car-
ries IP traffic in Local Area Networks (LANs). The physical transmission media is
generally twisted pair cables. The standard used by the Oqus cameras is
100BaseTX/802.3i, Fast Ethernet, with a communications speed of 100 Mbps.
To comply with the standard, cables classified as cat 5 or better should be used
and cable length between each node should be limited to 100m. For best per-
formance, it is recommended to keep cable length 50m or shorter.
Oqus cameras are equipped with daisy-chained Ethernet which means that a
maximum number of 15 cameras can be connected in a chain without the
requirement of an external hub or switch. It is however possible to connect the
system in a star configuration which could improve performance in very large
systems.
Oqus cameras use both TCP/IP and UDP/IP to communicate with a host com-
puter and other cameras within a system.



WLAN (Oqus)

The Oqus system can run with wireless communication from the camera sys-
tem to the computer. The camera uses the 802.11b/g @ 54 Mbps standard.
However, the communication speed can be reduced depending on the signal
strength or the presence of other wireless networks.
A wireless configuration requires one camera in the system to be equipped
with WLAN. The cameras are connected to each other by cables and the host
communicates wirelessly with the entire system by using the WLAN camera as a
gateway to the other cameras.
Setting up Oqus for wireless communication requires that the wireless adapter
of the computer is set up as a hosted network. This requires a computer run-
ning Windows 7. This feature is no longer supported for most wireless adapters
in Windows 10 or higher. The configuration requires a special version of QDS;
contact [email protected] for more information.

Environmental protection
Qualisys cameras are available with a several types of environmental pro-
tection. For an overview and a general description, see chapter "Qualisys cam-
era types" on page 432.
Underwater



Cameras and specifications

Marker cameras

Arqus underwater

                                            Arqus A9u              Arqus A12u

Resolution (pixels)                         4224×2160 (9 MP)       4096×3072 (12 MP)
Max. capture rate at full resolution (Hz)   300                    300
Default FOV                                 61°×33°                40°×31°
Optional FOV                                N/A                    N/A
Motorized lens                              No                     Yes
Max range with 19 mm marker (clear water)   23 m                   30 m
Connection                                  1-connector Gigabit Ethernet and Power
Cable                                       Qualisys Gigabit Ethernet and Power Cable (underwater)
Connection unit                             Underwater connection unit 24 V
Cameras per UW connection unit              3
Strobe light                                24 high-power Deep Blue LEDs @ 455 nm
Underwater housing                          Marine grade stainless steel (A4), PC
Size                                        Length 166 mm, Diameter 180 mm
Weight                                      4.0 kg
Buoyancy                                    Slightly negative
Classification                              IP68, pressure tested to 5 bar (40 m depth)
Operating temperature range                 0–35°C
Operating voltage                           24 VDC
Miqus underwater

                                            Miqus M5u              Miqus M3u

Resolution (pixels)                         2048×2048 (4 MP)       1824×1088 (2 MP)
Max. capture rate at full resolution (Hz)   183                    340
Default FOV                                 51°×51°                58°×40°
Optional FOV                                N/A                    N/A
Motorized lens                              No                     No
Max range with 19 mm marker (clear water)   17 m                   14 m
Connection                                  1-connector Gigabit Ethernet and Power
Cable                                       Qualisys Gigabit Ethernet and Power Cable (underwater)
Connection unit                             Underwater connection unit 24 V
Cameras per UW connection unit              3
Strobe light                                12 high-power Deep Blue LEDs @ 455 nm
Underwater housing                          Stainless steel and acrylic
Size                                        Length 262 mm, Diameter 110 mm
Weight                                      2.5 kg
Buoyancy                                    Neutral
Classification                              IP68, pressure tested to 5 bar (40 m depth)
Operating temperature range                 0–35°C
Operating voltage                           24 VDC
Oqus underwater

                                            Oqus 7+u               Oqus 5+u

Resolution (pixels)                         4096×3072 (12 MP)      2048×2048 (4 MP)
Max. capture rate at full resolution (Hz)   300                    179
Default FOV                                 40°×31°                38°×38°
Optional FOV (narrow)                       24°×18°                19°×19°
Motorized lens                              Yes                    No
Max range with 19 mm marker (clear water)   27 m                   23 m
Connection                                  1-connector 100Mb Ethernet and Power
Cable                                       Qualisys Ethernet and Power Cable (underwater)
Connection unit                             Underwater connection unit 48 V
Cameras per UW connection unit              3
Strobe light                                12 high-power Royal Blue LEDs @ 448nm
Underwater housing                          Hard anodized aluminum
Size                                        Length 223 mm, Diameter 220 mm
Weight                                      8.4 kg
Buoyancy                                    Neutral
Classification                              IP68, pressure tested to 5 bar (40 m depth)
Operating temperature range                 0–30°C                 0–35°C
Operating voltage                           48 VDC
Video cameras

Miqus VMu | Miqus VCu

Resolution: 1920×1088 (full HD) | 1920×1088 (full HD)
Max. frame rate at full resolution (fps): 86 | 86
Default FOV (under water): 60°×40° | 60°×40°
Color: No | Yes
Auto exposure: Yes | Yes
Auto white balance: N/A | Auto
Connection: 1-connector Gigabit Ethernet and Power
Cable: Qualisys Gigabit Ethernet and Power Cable (underwater)
Connection unit: Underwater connection unit 24 V
Cameras per UW connection unit: 3
Strobe light: 12 high power Blue LEDs | 12 high power White LEDs
Max calibration distance: 15 m | 10 m
Underwater housing: Stainless steel and acrylic
Size: Length 250 mm, Diameter 110 mm
Weight: 2.5 kg
Buoyancy: Neutral
Classification: IP68, pressure tested to 5 bar (40 m depth)
Operating temperature range: 0–35°C
Operating voltage: 24 VDC

How to connect

Arqus and Miqus

Each camera has a high-quality underwater connector, which connects to a


water-protected connection unit placed on land. The cable carries both power
and data. A connection unit drives up to three underwater cameras from one
power supply and several connection units can be connected, either daisy-
chained or in a star configuration through an Ethernet switch.
Arqus and Miqus underwater cameras can be freely combined in a system and
share the same connection unit.
The schematic below shows an example of how to connect an underwater sys-
tem with 24V connection units and power supplies. A Camera Sync Unit can be
added to the system through an Ethernet switch.



Oqus

An Oqus system is connected in a similar way, but make sure that you use
the correct 48 V Oqus connection units and power supplies.

Mechanics and physical specifications

Arqus physical specifications

Weight: 4.0 kg

Physical dimensions:
  Length: 166 mm
  Diameter: 180 mm
  Width (including mounting bracket): 192 mm
  Height (including mounting bracket): 229 mm

Operating temperature range: 0–35°C

Storage temperature range

Housing materials: Marine grade stainless steel (A4), PC
  Connector: Chrome plated brass and Cu alloy

Housing and mount foot


Drawings of the dimensions of the housing and the integrated mount foot.



Miqus physical specifications

Weight: 2.5 kg

Physical dimensions:
  Length: 262 mm
  Diameter: 110 mm
  Width (including mounting bracket): 143 mm

Operating temperature range: 0–35°C

Storage temperature range

Housing materials: Marine grade stainless steel (A4), Acrylic (PMMA), PC (Lexan)



Housing
Drawings of the dimensions of the housing.

Connection unit

Materials: Powder coated Aluminium
Weight: 2.6 kg
Dimensions (excluding connectors):
  Width: 210 mm
  Height: 77 mm
  Depth: 221 mm
Classification: IP65
Cameras per unit: 3
Operating voltage: 24 VDC

Mechanics
Dimensions of the 24V Underwater Connection Unit.



Underwater mounting options

There are two options for underwater wall mounts:


l Fixed wall mount for Arqus and Miqus, respectively. The wall mount is a
single piece that is bolted directly to the wall and the camera is attached
to the wall mount. This option is useful when the mounting is permanent
and there is no need to move cameras or take them out of the water.
l Quick attach mounts for Arqus and Miqus. This option is useful when the
cameras will be taken out of the water or when several mounting point
options are desired.

Arqus wall mount

Drawing of the mechanics and dimensions of the Arqus wall mount.



Miqus wall mount

Mount foot

Wall mount

Quick Attach mount

A Quick Attach (QA) mount is available for both Arqus and Miqus underwater
cameras. The Quick Attach mount consists of two parts:
l A Quick Attach mount base, which is similar for Arqus and Oqus Quick
Attach mounts.
l A Quick Attach camera mount for Arqus and Miqus cameras, respectively.



Quick Attach mount base
The QA mount base is mounted to the wall through four 10 mm holes. Use one
QA mount base per camera position.



Arqus Quick Attach mount
The QA Arqus mount is easily mounted to the QA base and locked into place
using the Wing grip. At the bottom of the mount there is a slot for an optional
M6x10mm screw to fixate the mount to the base.
Use one QA Arqus mount per camera.

Miqus Quick Attach mount


The QA Miqus mount is easily mounted to the QA base and locked into place
using the Wing grip. At the bottom of the mount there is a slot for an optional
M6x10mm screw to fixate the mount to the base.
Use one QA Miqus mount per camera.



Qualisys accessories specifications and features
Qualisys AB provides a wide range of accessories, for example markers, mocap
suits, computers, analog data acquisition devices, mounting equipment, etc.
For more information, see the Qualisys website at
https://www.qualisys.com/accessories/.
The following chapters contain technical information about Qualisys calibration
kits and active markers.
Qualisys calibration kits

Qualisys offers a range of wand calibration kits for a variety of applications.


Standard calibration kits
The carbon fiber 300 mm and 600 mm calibration kits are recommended
for standard capture volumes with camera distances ranging from about
3 to 35 m. For larger capture volumes or volumes that contain force
plates, the carbon fiber 600 mm is recommended. For smaller capture
volumes the carbon 300 mm kit is recommended.

Small calibration kit


The 120 mm calibration kit is intended for small volumes with camera dis-
tances ranging from about 1 to 3 m. The 120 mm kit is especially recom-
mended for small capture volume applications that require high accuracy.

Active calibration kits


Two types of active calibration kits are available. The active calibration kits
are especially recommended for outdoor capture volumes. The long
range active marker (LRAM) calibration kit (1000 mm) is recommended for
large capture volumes with camera distances of 10 m or larger. The active
500 mm calibration kit is recommended for standard capture volumes
with camera distances of 5 to 35 m, and can be used for both marker and
video cameras.

The following chapters contain practical information for selected calibration


kits. For more detailed technical information, contact [email protected].



Carbon fiber 600 mm wand kit

The carbon fiber wand kit consists of a carbon fiber wand with a length of 600
mm and an L-frame where the long arm is about 600 mm long.
The wand is attached to the handle by pressing and turning it in the track until
the handle locks. Make sure that you lock the handle so that the wand is not
dropped and the markers damaged.
The L-frame rests on three points: a static corner point, which is the origin when
calibrating in QTM, and two resting points on the arms that are adjustable with
the adjustment wheels. Check the spirit levels on the frame and adjust the points
so that the L-frame is level.
There are also force plate positioning plates on the side of the L-frame, so that
the L-frame can be placed on the same position on the force plate for every cal-
ibration. Loosen the screws to fold down the positioning plates, then tighten
the screws before placing the L-frame on the force plate with the positioning
plates on the sides of the force plate.
When folded, the L-frame is held together by a magnet. To open the L-frame,
pull the arms apart and unfold them to the maximum position. The arms are then
locked; to fold the frame again, pull the red locking pin away from the center.

NOTE: The origin of the L-frame is automatically translated to the corner


point (X=-10, Y=-10, Z=-49, relative to the corner marker).

Active 500 mm wand kit

Indicators and buttons



1. Power button
Press the button to start the device.
To power off, keep the button pressed for about one second until both indicators turn off.
2. Battery indicator
Green light - Battery status ok
Red/yellow light - Charge battery
3. Status indicator
Green light - Idle
Yellow light - Triggered
Red light - Battery charging
4. USB-C connector
Used to charge battery and configure device
5. IR detector
Used for synchronizing the LEDs to the camera system
6. Set screws
Use the included T-handle to lock the two parts together rigidly
7. Release button
See disassembly instruction below.
8. Reset
Access to reset button.

Assembly and disassembly

It is recommended to store the L-frame in one piece. In case the L-frame needs
to be disassembled, for example for transport, follow these instructions.

1. Press the release button and pull gently on the short arm of the L-struc-
ture to separate it into two pieces. Make sure the locking setscrews are
disengaged.



2. Disconnect the connector.

3. The two pieces are now separated. Assembly is done in the reverse order.

How to use the active 500 mm calibration kit

The active 500 mm calibration kit has compound white and IR LEDs, and can be
used for the calibration of both marker and video cameras. To use the active
calibration kit, the cameras need to be in active marker mode, see chapter
"Marker mode" on page 229. The recommended exposure time of the cameras
is 400-500 μs.
For outdoor use, it is recommended to use active filtering, see chapter "Active fil-
tering" on page 246.



The markers on the L-frame and the wand are only visible on one side, unlike
spherical markers. When moving the wand, it is important to be aware that the
view angle of the markers is limited to max 90 degrees relative to the view dir-
ection of the cameras. Make sure that there are always cameras that can see
the markers while moving the wand. For example, if all cameras are mounted
in high positions, the marker side of the wand should never be pointed towards
the floor.
The L-frame can be leveled using the level adjustment screws at the end of
both arms while monitoring the spirit levels.
The L-frame also contains positioning plates for reproducible placement of the
L-frame on a force plate.

NOTE: When using a system with only Miqus video cameras for mark-
erless motion capture, the cameras should be equipped with strobes that
contain one IR LED. Contact [email protected] if you need to
upgrade the strobes of your Miqus video cameras.

Configuration

The active 500 mm L-frame and wand can be configured via the USB-C port.
The options that can be configured are: triggering mode (triggered,
untriggered), LED activation (IR, white or both), and intensity (default power
and high power). The active 500 mm calibration kit is by default configured in
triggered mode, with both IR and white LEDs activated and a flash time of 400
μs.
The untriggered mode can be used if the L-frame needs to be visible for non-
Qualisys cameras, e.g. standard video cameras.
For more information about how to configure the active calibration kit, contact
[email protected].

NOTE: The battery time is significantly reduced when using untriggered


mode.



Battery and charging

The L-frame and wand can be charged using any standard USB-C charger. The
charging time is less than 2 hours. The battery time for the default con-
figuration (triggered, white and IR LEDs) is 10 hours or more for the L-frame
and 20 hours or more for the wand.
Active marker types

The following types of Qualisys active markers are available.


Active Traqr
Compact and lightweight trackable object with integrated active markers.
The markers of the Active Traqr use sequential coding for reliable rigid
body identification of multiple Active Traqrs. For more information about
the hardware, see chapter "The Active Traqr" below.

Naked Traqr
A component kit for embedding Qualisys active tracking into custom
objects or props. For more information about the hardware, see chapter
"The Naked Traqr" on page 1003.

Short range active marker (SRAM) (legacy)


Lightweight active markers with sequential coding for automatic iden-
tification of the markers. For more information about the hardware, see
chapter "The Short Range Active Marker" on page 1005.

Long range active marker (LRAM)


Active marker solution for long distance ranges and outdoors, see chapter
"The Long Range Active Marker" on page 1010.

The Active Traqr

The Active Traqr is a compact lightweight trackable object for 6DOF tracking.
The Active Traqr has four active markers that are identified using sequential
coding. This facilitates robust real-time rigid body tracking and identification of

multiple Traqrs. The Active Traqr can be easily attached to various types of
mounts, for example a screw mount that can be fixed to an object, or strap
mounts for use on hands or feet.
Examples of applications in which the Active Traqr is especially recommended
are:
Location Based Virtual Reality and gaming
The robust real-time tracking and identification makes the Active Traqr
ideal for tracking multiple players and props. The Active Traqr can be
used for real-time skeleton solving with only 6 Traqrs per player. In addi-
tion, the Naked Traqr allows for seamless embedding of Qualisys active
marker technology into props, such as HMDs or VR weapons.

Real time control


The Active Traqr is ideal for real-time control applications in engineering
that require robust real-time tracking of the position and orientation of
rigid objects. Furthermore, the use of active markers helps to avoid the
interference of extra reflections otherwise caused by the strobes of the
cameras when tracking reflective objects or in an environment with many
reflective elements.

Active Traqr description



1. Power button
Push button to start the device. For powering off the device keep the but-
ton pressed for about one second until both indicators turn off.
2. Status indicator
l Green light: The device is ready for measurement.

l Yellow light: The device is triggered.

l Red light: The battery is charging.

3. Battery indicator
o Green light: Battery status 16-100%.
o Yellow light: Battery status 6-15%.
o Red light: Battery status 0-5%.

4. USB-C connector
Connector for charging and configuration of the device.
5. Active markers
Four active markers with diffusor.
6. IR eye
Infrared detector for synchronization of the device.
7. Reset button
Recessed button for resetting the device.
8. Mount
Bayonet mount for attaching several types of mounts. A screw mount is
included with the active Traqr.

Active Traqr specifications

Range: >35 m¹
Battery time: 20 hours @ 100 fps continuous measurement
Charging time: 2 hours
Connector: USB-C
LEDs: 4 wide angle, NIR LEDs (850 nm) with diffusion
Maximum frequency: 500 Hz
Maximum number of Traqrs: 185
Synchronization: Optical
Material: Polyamide, silicone rubber
Size (incl. markers): 70 x 85 x 21 mm
Weight: 70 g
Operating temperature range: 0-50°C

¹ Depending on camera resolution and LED separation. For example, about 35 m with Arqus A12, or 16 m with Miqus M3.

The Naked Traqr

The Naked Traqr consists of a circuit board and a set of components that can
be embedded in an object that needs to be tracked. The Naked Traqr works
just like the Active Traqr with the added benefit that it allows you to seamlessly
integrate Qualisys active marker technology into custom objects. A single
Naked Traqr unit supports up to eight sequence coded markers.
The Naked Traqr can be used in a wide variety of applications. Two typical
examples are:
Location Based Virtual Reality
The Naked Traqr can be embedded into props, such as HMDs or VR
weapons. This way the props can be reliably tracked without the need to
add markers that are visible to the eye.

Engineering applications
The Naked Traqr can be embedded in moving objects for reliable real-
time tracking. The active markers can be used on both rigid or flexible
objects, making use of the sequential coding for identification of single
markers or markers in a rigid body configuration. An additional advantage
of the use of active markers is that it helps to avoid the interference of
extra reflections otherwise caused by the strobes of the cameras when

tracking reflective objects or in an environment with many reflective ele-
ments.

Naked Traqr description

The Naked Traqr package includes the following components:


l Naked Traqr PCB

l IR detector

l IR LEDs (8x)

l 150mm cables (20x)

l 8-pin connectors for IR LEDs (2x)

l 3-pin connector for IR detector

A short manual with instructions on how to assemble and power the Naked
Traqr is included in the package.

NOTE: Optionally, a battery can be used to power the Traqr. The battery
is not included in the package.



Naked Traqr specifications

Range: >35 m¹
Battery time: 20 hours @ 100 fps continuous measurement with 4 markers
Charging time: 2 hours
Connector: USB-C
LEDs: up to 8 wide angle, NIR LEDs (850 nm)
Sequence coding: 0-8 LEDs²
Synchronization: Optical
Input voltage: 5 V³
Size (PCB): 45 x 30 x 6.3 mm
Weight: 6 g
Operating temperature range: 0-50°C

¹ Depending on camera resolution and LED separation. For example, about 35 m with Arqus A12, or 16 m with Miqus M3.
² Maximum number of unique sequence coded LEDs in one system: 740
³ If it is not possible to supply 5 V or use the battery, contact Qualisys support.

The Short Range Active Marker

The Short Range Active Marker consists of a driver unit capable of driving up to
32 markers through four daisy-chained outputs. The markers are small and
lightweight and can be attached to the skin with simple double adhesive tape.
The driver has an integrated battery which can either be charged in the driver
unit or replaced with another battery and charged in a standalone charger. The
battery is selected to last a day of normal measurements.
The use of the Short Range Active Marker (SRAM) can be beneficial in the fol-
lowing situations:



Tracking of reflective objects or in a reflective environment
When tracking reflective objects or in an environment with many reflect-
ive elements, the use of active markers helps to avoid the interference of
extra reflections otherwise caused by the strobes of the cameras.

Simple marker identification


The sequential coding makes the identification very simple. The AIM func-
tionality for passive markers can be just as effective as sequential coding,
but in some setups the active markers are easier to use. For example,
irregular movements of single markers cannot be handled by AIM, in which
case active markers are the better solution.

Outdoors
The high intensity of the active marker allows for effective tracking when
outdoors at frequencies up to 500 Hz.

The active marker hardware consists of the following parts:


l A driver for up to 32 markers.

l Selectable number of active markers on different chain lengths, and up to


4 marker chains with a maximum of 8 markers per chain.
l IR eye

l 2 batteries with power supply and an external battery charger.

l Belt clip for driver

l Sync cable [Not included in the standard kit]

Short range driver



The driver is the center of the short range active marker. It has to be attached
in a secure way to the subject to make sure that there is no strain on the
cables. The active marker kit includes a belt clip that can be attached on the
driver to fasten it for example on a belt.
The driver controls and powers the active markers and has the following con-
nections and indicators.
Markers 1-32
4 connectors for the active marker chains. There can be up to 8 active
markers on each chain and therefore there can be up to 32 markers on
one driver.

IR in/Sync In
Connector for the IR eye or Sync in. The standard use is to connect an IR
eye to the input. The eye is triggered by a modulated pulse from the Oqus
camera.
The sync in signal must be coded to be correct and must therefore be
sent either from another active marker driver or from an Oqus camera
with the sync out signal set to Wired synchronization of active
markers.

Sync out
Use the Sync out connector to daisy chain several active marker drivers to
one IR eye. Connect the Sync out signal from one driver to the Sync in con-
nector on the next driver.

Charge/ 9 VDC
Use this connection to charge the battery with the supplied power supply.
The driver can still be used while it is being charged.

CAUTION: Do not use the driver with the power connected while
the driver is mounted on a person.

ON/OFF
Turn on the driver by pressing the button. Turn off by holding down the
button a couple of seconds.



Driver ID
Change the driver ID to specify which sequential range to use for the
driver. The possible ranges are 1-5, which correspond to the IDs 1-32 up to
129-160. 0 means that the sequential ID is turned off.

Battery indicator
When pressing the power button, the four LEDs indicate the battery
status. One LED means that you soon have to charge the battery.

Status indicators
The three status indicators at the top display the following information:
Power/Batt Low
The LED is lit green when the power is on and starts flashing green
when the driver goes into the power save mode. The battery is low
when the LED is lit or flashing red.

Charging
The LED is lit green when the driver is being charged; if something is
wrong, the LED is lit red.

Sync active
The sync is active when the LED is flashing orange.

Active markers and IR eye

The active markers are mounted on a daisy-chained wire that can have up to 8
markers and a length of 3 meters. The size of the markers is comparable to 12
mm passive markers. The ID of the markers is decided by the connector on the
driver. When a chain has fewer than 8 markers, the remaining IDs of that connector
are skipped; e.g. if you have chains of 4 markers connected to the first two
connectors, these will have IDs 1-4 and 9-12.



The IR eye is mounted on a short wire and can be mounted anywhere to detect
the pulse from the camera. It has an adaptable sensitivity so it can easily detect
the pulse in almost any setup, even outdoors.

Battery and charging

The battery in the short range active marker is a rechargeable Li-Polymer bat-
tery. It can power the active marker for a day of normal measurements. The bat-
tery can be recharged either in the driver with the power supply or separately
in the battery charger. It takes about 3 hours to fully charge the battery.

CAUTION: Do not use the driver with the power connected while the
driver is mounted on a person.

To remove the battery, press the button on the side of the driver and lift the
bottom lid. Make sure that you insert the battery according to the image inside
the driver.

Short range active marker specifications

Driver: 83 x 52 x 15 mm, 99 g
IR marker: D 16 x H 11 mm, 3 g
Frequency range: 1 - 500 Hz
Number of markers per driver: Up to 32 markers
Total number of markers: Up to 5 drivers can be used, up to 160 sequential coded markers
Maximum measurement distance: more than 25 m
Battery time, 200 fps, 16 markers: 4.5 h
Time to fully charge battery: 2-3 h



The Long Range Active Marker

The Long Range Active Marker is usually used for industrial applications, for
example marine measurements. The marker is synchronized with the cameras
via a pulsed strobe signal. The same signal is used for the spherical and reference
markers. Because of the synchronization, the marker can be lit only during the
exposure time, which means that the LEDs can be brighter. For more inform-
ation about the long range marker please refer to the information included
with the marker or contact Qualisys AB.
Marker maintenance

The passive markers should be attached to the object by double adhesive tape.

WARNING: Do not use glue or other types of chemical substances to


attach the markers to the object as it may damage the markers.

The passive markers can be cleaned from grease and dirt with soap and water.
Do not use very hot water when cleaning the markers and do not use a brush
when cleaning the markers.

NOTE: Be careful when cleaning the markers, otherwise the reflective


material on the markers can be damaged. If part of the reflective material
is damaged on the passive marker, it should be discarded because the
central point calculation of this marker will be indeterminable.

Rotation angle calculations in QTM


The following chapters contain information about rotation angle calculations in
QTM for 6DOF data and skeleton data.



6DOF tracking output

The 6DOF tracking function uses the rigid body definition to compute P_origin,
the positional vector of the origin of the local coordinate system in the global
coordinate system, and R, the rotation matrix which describes the rotation of
the rigid body.
The rotation matrix (R) can then be used to transform a position P_local (e.g. x'1,
y'1, z'1) in the local coordinate system, which is translated and rotated, to a
position P_global (e.g. x1, y1, z1) in the global coordinate system. The following
equation is used to transform a position:
P_global = R · P_local + P_origin

If the 6DOF data refer to another coordinate system than the global coordinate
system, the position and rotation are calculated in reference to that coordinate
system instead. That reference coordinate system for rigid body data is in turn
expressed relative to the global coordinate system.
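
As an illustration (not part of the QTM software), the transformation above can be written out in a few lines of Python; the rotation matrix and vectors below are hypothetical example values.

import numpy as np

# Hypothetical 6DOF output for a rigid body: rotation matrix R and
# origin of the local coordinate system expressed in global coordinates
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])          # 90 degree rotation about the Z-axis
p_origin = np.array([100.0, 50.0, 0.0])   # origin of the local system (global coordinates, mm)
p_local = np.array([10.0, 0.0, 0.0])      # point expressed in the local (body) system

# P_global = R * P_local + P_origin
p_global = R @ p_local + p_origin
print(p_global)                           # [100.  60.   0.]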

Calculation of rotation angles from the rotation matrix


(Qualisys standard)
The rotation angles are calculated from the rotation matrix (R) by expressing it
in the three rotation angles: roll (θ), pitch (φ) and yaw (ψ).
To begin with, the rotations are described with individual rotation matrices: Rx,
Ry and Rz. The rotations are around the X-, Y- and Z-axis, respectively, and
positive rotation is clockwise when looking in the direction of the axis.

The individual rotation matrix can be derived by drawing a figure, see the example
below. To get an individual rotation matrix, express for example the new
orientations x' and y' in the coordinates x, y and z, and then make a matrix of the
equations. The example below is of rotation around the Z-axis, where positive
rotation of yaw (ψ) is clockwise when the Z-axis points inward.

The resulting three rotation matrices are then:
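
The matrices appear as figures in the original layout; a reconstruction in standard notation, consistent with the sign convention described above (not a verbatim copy of the original figures), is:

$$
R_x(\theta)=\begin{pmatrix}1 & 0 & 0\\ 0 & \cos\theta & -\sin\theta\\ 0 & \sin\theta & \cos\theta\end{pmatrix},\qquad
R_y(\varphi)=\begin{pmatrix}\cos\varphi & 0 & \sin\varphi\\ 0 & 1 & 0\\ -\sin\varphi & 0 & \cos\varphi\end{pmatrix},\qquad
R_z(\psi)=\begin{pmatrix}\cos\psi & -\sin\psi & 0\\ \sin\psi & \cos\psi & 0\\ 0 & 0 & 1\end{pmatrix}
$$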

The rotation matrix (R) is then calculated by multiplying the three rotation
matrices. The order of the multiplications below means that roll is applied
first, then pitch and finally yaw.
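
Under the same reconstructed convention, with roll applied first, the product can be written as:

$$
R = R_x(\theta)\,R_y(\varphi)\,R_z(\psi)=
\begin{pmatrix}
\cos\varphi\cos\psi & -\cos\varphi\sin\psi & \sin\varphi\\
\cos\theta\sin\psi+\sin\theta\sin\varphi\cos\psi & \cos\theta\cos\psi-\sin\theta\sin\varphi\sin\psi & -\sin\theta\cos\varphi\\
\sin\theta\sin\psi-\cos\theta\sin\varphi\cos\psi & \sin\theta\cos\psi+\cos\theta\sin\varphi\sin\psi & \cos\theta\cos\varphi
\end{pmatrix}
$$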

The following equations are then used to calculate the rotation angles from the
rotation matrix:
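
Expressed in the elements r_ij of the reconstructed matrix above (and equivalent to the arcsin/arccos formulation discussed in the following paragraphs):

$$
\varphi=\arcsin(r_{13}),\qquad
\theta=\operatorname{atan2}(-r_{23},\,r_{33}),\qquad
\psi=\operatorname{atan2}(-r_{12},\,r_{11})
$$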

The range of the pitch angle is -90° to 90°, because of the nature of the arcsin
function. The range of the arccos function is 0° to 180°, but the range of roll
and yaw can be expanded by looking at the r23 and r12 elements of the rotation
matrix (R), respectively. The roll and yaw will have the opposite sign compared
to these elements, since cos(φ) is always positive when φ is within ±90°.
This means that the range of roll and yaw is -180° to 180°.
A problem with the equations above is that when pitch is exactly ±90° then the
other angles are undefined, because of the division by zero, i.e. singularity. The
result of a pitch of exactly ±90° is that the Z-axis will be positioned at the pos-
ition that the X-axis had before the rotation. Therefore yaw can be set to 0°,
because every rotation of the Z-axis could have been made on the X-axis before
pitching 90°. When yaw is 0° and pitch is ±90°, the rotation matrix can be sim-
plified to:
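
For the case pitch = +90° and yaw = 0°, the reconstructed matrix above reduces to (again a reconstruction, not the original figure):

$$
R=\begin{pmatrix}0 & 0 & 1\\ \sin\theta & \cos\theta & 0\\ -\cos\theta & \sin\theta & 0\end{pmatrix},
\qquad \theta=\operatorname{atan2}(r_{32},\,r_{22})
$$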

From the matrix above the roll can be calculated in the range ±180°.

IMPORTANT: With the definitions above, roll, pitch and yaw are unam-
biguous and can describe any orientation of the rigid body. However,
when the pitch (φ) is close to ±90°, small changes in the orientation of the
measured rigid body can result in large differences in the rotations
because of the singularity at φ=±90°.

Calculation of other rotation matrices


This chapter describes how other rotation matrices are calculated. The cal-
culations are not described in the same detail for each matrix as with the
Qualisys standard, but you should be able to calculate the exact rotation matrix
from these descriptions.

NOTE: If you use rotations around global axes, the order of multiplication
of the individual rotation matrices is reversed. If you use a left-hand
system, change the positive direction to counterclockwise, which means
that the sign of the angle is swapped.

First there are two types of rotation matrices: those with three different rota-
tion axes and those with the same rotation axis for the first and third rotation.



The first type is of the same kind as the one used as the Qualisys standard. This
means that for this type the same individual rotation matrices (Rx, Ry and Rz)
are used as for the Qualisys standard. The individual rotation matrices are then
multiplied in different orders to get the different rotation matrices. Once you
have the rotation matrix, the same kind of formulas as for the Qualisys standard,
but with other indexes and signs, are used to get the rotation angles.

You have to look at the rotation matrix to see which indexes and signs should
be used. The singularity will always be at ±90° for the second rotation, and at
the singularity the third rotation is always set to 0°.
In the second type, the third rotation is around the same axis as the first rotation.
This means that one of the individual rotation matrices is used again as the last
rotation, depending on which axis is repeated.

The rotation matrix (R) is then calculated by multiplying two of the individual
rotation matrices from the first type and then the repeated matrix. In the example
below the rotations are around the X-, Y- and then the X-axis again.
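
For this x–y–x example the composition takes the form (a reconstruction under the same convention as above):

$$
R = R_x(\theta_1)\,R_y(\theta_2)\,R_x(\theta_3)
$$

where the second rotation follows from θ2 = arccos(r11).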

The rotation angles can then be calculated according to the equations below.
These equations are similar for other rotation matrices of this type; only the
indexes and signs are changed.



As for the first type of rotation matrices, there will be a singularity for the
second rotation angle, but now it is at 0° and 180°. The third rotation angle is
then set to 0° and the following rotation matrix can be used to calculate the
rotation. It will of course change for other matrices, but the principle is the
same.

Developers' resources
The following chapters contain information about the possibilities for
developers to communicate with QTM's interface and extend its capabilities.

Real-time protocol
The real-time protocol makes it possible to write external applications that can
control QTM and receive real-time data. The real-time protocol is implemented
in a range of SDKs and QTM Connect modules for external software for real-
time control and communication with QTM. All resources are available on
GitHub (https://github.com/qualisys) and the Qualisys website
(https://www.qualisys.com/downloads/).
SDKs are available for:
l C++

l C# (.NET)

l Python

l JavaScript

Resources for engineering:



l QTM Connect for Matlab

l QTM Connect for LabVIEW

l ROS resources

l Qualisys Drone SDK

Resources for animation:


l QTM Connect for MotionBuilder

l QTM Connect for Unreal

l QTM Connect for Unity

l QTM Connect for Maya

QTM also supports the Open Sound Control (OSC) protocol for sound and mul-
timedia applications and devices.
Full documentation of the real-time protocol is included with the QTM installer
and is available online at https://docs.qualisys.com/qtm-rt-protocol/.
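
As a brief illustration of the real-time protocol, the sketch below streams 3D marker data from QTM using the Python SDK. It assumes the qtm_rt module from the Qualisys Python SDK (earlier SDK versions used the module name qtm) and that QTM is running on the same computer with real-time output enabled; check the SDK documentation for the exact package, function names and signatures.

import asyncio
import qtm_rt  # Qualisys Python SDK (assumed module name; older versions use "qtm")

def on_packet(packet):
    # Called for every received real-time frame; print frame number and 3D markers
    header, markers = packet.get_3d_markers()
    print("Frame", packet.framenumber)
    for marker in markers:
        print("\t", marker)

async def setup():
    # Connect to QTM on this computer and start streaming the 3D component
    connection = await qtm_rt.connect("127.0.0.1")
    if connection is None:
        return
    await connection.stream_frames(components=["3d"], on_packet=on_packet)

if __name__ == "__main__":
    asyncio.ensure_future(setup())
    asyncio.get_event_loop().run_forever()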

Project Automation Framework (PAF)


The Project Automation Framework (PAF) can be used to automate workflows
for structured data collection in QTM and subsequent data analysis and report-
ing using external software or online resources.
In QTM, projects with PAF functionality contain a structured Project view pane
guiding the user to collect captures and meta-data, start analyses, and gen-
erate reports. For more information about the PAF project view, see chapter
"PAF Project view" on page 921. PAF packages typically include the following
components:
l A Settings.paf file coded in YAML that defines the structure of the data col-
lection and analysis workflow.
l A Templates folder with predefined templates that are instantiated when
running the analysis.



l Optionally, other QTM project subfolders can be included, for example an
AIM folder containing one or more AIM models that are used for auto-
matic labeling of the captures, or a Data folder containing example data.
PAF functionality can simply be added to QTM projects by copying the content
of a PAF package into the QTM project folder. PAF packages can also be added
to the Packages subfolder in the QTM program folder (e.g. under C:\Program
Files\Qualisys\Qualisys Track Manager). In that case, they will be included in the
list of PAF modules that can be selected when creating a new QTM project, see
chapter "Creating a new project" on page 69.
PAF resources are available on GitHub at https://github.com/qualisys/paf-resources,
including full PAF documentation and several examples of PAF projects, for
example for Visual3D, Theia Markerless, Matlab, Python, etc.
Qualisys also provides full PAF implementations in the form of Analysis Mod-
ules. For more information, see chapter Applications.

QTM Scripting Interface


The QTM Scripting Interface provides scripting support for QTM. Scripting is
supported for both Python and Lua, whose interpreters are embedded in QTM.
Additionally, the QTM Scripting Interface is also exposed through a REST API.
Scripting allows the user to extend the functionality of QTM, for example:
l Automation of regularly performed complex tasks

l Extraction and display of information from recorded data

The QTM Scripting Interface provides, amongst others, the following func-
tionality:
l Access and modification of recorded data (captures)

l Access and modification of settings (at project and measurement level)

l Trajectory, rigid body and skeleton editing

l Extending the main menu, adding new QTM commands and keyboard
shortcuts
l Rendering of text and graphical elements in the 3D View window



For a comprehensive overview of the modules and functions, please refer to
the QTM Scripting Interface documentation on GitHub at
https://github.com/qualisys/qtm-scripting.
Scripting components in QTM

The components in QTM related to scripting are shown in the above figure:
Terminal button: Show/hide the Terminal window.

Reload scripts button: Reload script files that are used in the project
(keyboard shortcut F5).

Scripting settings: Terminal settings and script files used in the project.

Terminal window: Terminal for access to the selected interpreter


(Python or Lua).

The settings for scripting can be accessed via Project Options > Mis-
cellaneous > Scripting. In the settings, you can select the interpreter language
that is used for the terminal and add scripts that are loaded into the project,
see chapter "Scripting" on page 431 for more information.



The terminal window can be used to write commands in the selected inter-
preter language and to display text output. The terminal window can be
opened via the View menu, or by pressing the Terminal button in the toolbar.
At startup, the terminal displays information about the used interpreter. The
help() command can be used to get basic help information about the QTM
Scripting Interface functions.
When using the Python interpreter, you will first need to import the qtm mod-
ule to get access to the qtm commands:
>>> import qtm

TIP: The import step can be automated in a startup script that can be
added to the Script files in the QTM project.

Scripting resources

For complete documentation and advanced examples, it is recommended to
refer to the resources that are available on GitHub at
https://github.com/qualisys/qtm-scripting. The qtm-scripting repository includes:
QTM Scripting Interface documentation
Comprehensive documentation of the QTM Scripting Interface, including
code examples for Python, Lua and REST.

Demo scripts
Collection of scripts that demonstrate various capabilities of the scripting
engine.

Tools
Collection of tools that can be helpful add-ons to QTM, or can serve as
examples for developing your own tools.
Using scripts

Scripts can be used in QTM by simply adding them to the Script files list in the
QTM project. To load scripts files:

1. Open the Scripting page under Project Options > Miscellaneous.

2. Press the Add button and locate the script file to be added in the file dia-
log.



3. Make sure that the check box next to the script file is enabled.

4. Press Apply or OK to load the enabled scripts.

The qtm-scripting resources on Github include a collection of ready-to-use tools


for trajectory editing, marker set and rigid body operations that can be added
to QTM. For instructions on how to add these tools to QTM, refer to the README
file of the qtm-scripting repository. Once loaded, each set of operations is rep-
resented by a menu in QTM. For a detailed description of the functionality of
the tools, see the script documentation, which is displayed in the terminal
when pressing the Help button in the respective tools’ menus.
Creating your own scripts

You can also add your own scripts to QTM. You can write Python or Lua scripts
in any text editor, for example Visual Studio Code.
Here is a simple example of a Hello World! script, showing how you can add a
menu to QTM with a button that displays “Hello world!” in the terminal and the
3D View window.

'''Example of a simple Python script using the QTM Scripting Interface.

Content of the script:
* Add a custom QTM command that outputs "Hello World!"
  to the terminal and the 3D View window
* Add a menu to QTM with a button that executes it
'''

# Import modules
import qtm

# Definition of execute function for custom QTM command
def _echo_hello():
    # Use QTM Scripting Interface method for writing to terminal
    qtm.gui.terminal.write("Hello world!")

# Function defining the rendering in the 3D View window
def _display_hello(measurement_time):
    qtm.gui._3d.draw_text_2d([60, 30], 12, "Hello world!",
                             {"horizontal": "left", "vertical": "top"},
                             {"horizontal": "left", "vertical": "top"},
                             qtm.utilities.color.rgb(0, 1, 0))

# Set _display_hello as draw callback function for the 3D View window
qtm.gui._3d.set_draw_function(_display_hello)

# Function for adding custom commands and setting up the menu in QTM
def add_menu():
    # Add QTM command, associate it with the _echo_hello() function
    qtm.gui.add_command("echo_hello")
    qtm.gui.set_command_execute_function("echo_hello", _echo_hello)

    # Add menu and button
    menu_id = qtm.gui.insert_menu_submenu(None, "My menu")
    qtm.gui.insert_menu_button(menu_id, "Hello world", "echo_hello")

# Call add_menu() function when running the script
if __name__ == "__main__":
    add_menu()

When your script is ready, you can add it to the script files list under Project
Options > Miscellaneous > Scripting. The script will be loaded when pressing
Apply or OK, and when starting the project.
When you make modifications to your script, you can reload it in QTM using the
Reload button.
Use of external packages

Installing external packages for Python

You can add external packages for Python using pip install, following these
steps:
l Open cmd.exe (as administrator).

l Change the current path to the folder in which QTM is installed (usually
C:\Program Files\Qualisys\Qualisys Track Manager).
l Type the command: .\python.exe -m pip install <name of the
package to install> (replace <> with the package you want to
install).
For example for installing numpy: .\python.exe -m pip install
numpy

The package will be installed to the subfolder \Lib\site-packages.

NOTE: Some external packages may not work properly in QTM, for
example packages with GUI functionality.



External packages for Lua

The use of external packages for Lua is currently not supported.

QDevice API
The QDevice API can be used to integrate external data acquisition devices with
QTM. The SDK including documentation can be downloaded from the Qualisys
website at https://www.qualisys.com/downloads/.

Troubleshooting QTM

Troubleshooting connection

Symptoms

QTM cannot find the connection to the camera system or does not find the
whole system.

Resolutions
l Check that the camera system is turned on.

l Check that the cable between the computer and the master camera
is connected.
l Check that the cables between the cameras in the camera system
are connected.
l Check that the cameras are working and that the cables are not dam-
aged.

NOTE: Do not remove more than one component at a time


when looking for the error.



l Check that the network interface has the correct settings, see "Net-
work card setup" on page 461.
l For wireless Oqus systems, check that the computer is connected to
the correct wireless network (SSID).
l Restart the system in the following order: exit QTM, shut down the
camera system, turn on the camera system, wait until all cameras
are powered up and then start QTM.

Symptoms

The camera does not get an IP address (Arqus/Miqus: status ring keeps
pulsing yellow; Oqus: status bar on the display gets stuck at about 75%).

Resolutions
l Restart the computer.

l Check that QDS is running.

l Check that the cable between the computer and the master camera
is connected.
l If the computer has two network cards remove the other network
cable and go to Network connections in Windows and check that the
correct network is connected.
l Check that the network has the correct static IP-address settings,
see "Network card setup" on page 461. Run the QDS wizard if you do
not know how to change the IP-address settings.
l Check that QDS is not blocked by a firewall. Most importantly check
that the Windows firewall allows the program.

NOTE: The exceptions in the Windows firewall can change if


you disconnect another network card than the one you use for
the Oqus cameras. In that case restart QDS to get a new mes-
sage from the firewall.



l Make sure that you are using the built-in wired Ethernet port. QDS
does not work with some Ethernet adapters.

Troubleshooting calibration

Symptoms

The calibration is inexact, i.e. too few points or large residuals.

Resolutions
l Check that the Exact wand length on the Calibration page in the
Project options dialog is correct.
l Check that all cameras can see the L-shaped reference structure.

NOTE: This does not apply to an extended calibration.

l Check that the calibration wand has not been moved too fast.

l Check that the calibration wand has been moved inside the cal-
ibration volume during the calibration, i.e. check that there are at
least 500 points used for each camera in the Calibration results dia-
log.

NOTE: If the wand is larger than the calibration volume,


change to a smaller wand calibration kit.

l Check that the camera positioning is OK, see chapter "Camera pos-
itioning" on page 439.

Symptoms

The field of view of the cameras is too small.



Resolutions
l Move the cameras out from the measurement volume.

l Change to a wider lens option.

Symptoms

The calibration result of a camera says Unused camera.

Resolutions
l The camera is deactivated on the Linearization page in the Project
options dialog. Activate the camera and perform the calibration
again or reprocess the calibration, see chapter "Recalibration" on
page 563.

Troubleshooting capture

Symptoms

There are no visible markers in the 2D view window in the preview mode.

Resolutions
l Check that the cameras are focused.

l Check that the aperture is open enough.

l Check the Exposure time and Marker threshold settings, see


chapter "Tips on marker settings in QTM" on page 483.

Symptoms

The markers in the 2D view are too small (check the size in the Data info
window).



Resolutions
l Check that the aperture is open enough.

l Move the camera system closer to the markers.

l Change to a larger marker on the measurement object.

l Check that the marker is not worn, if it is replace it with a new one.

Symptoms

The markers in 2D are not stable.

Resolutions
l Check the Exposure time and Marker threshold settings, see
chapter "Tips on marker settings in QTM" on page 483.

Symptoms

The intensity of the markers in the 2D view is too low.

Resolution
l Check the aperture, see chapter "Tips on setting aperture and focus"
on page 481.

Symptoms

There are fewer markers in the 3D view than in the 2D view.

Resolutions
l Check that the Bounding box setting on the 3D Tracking page in
the Project options dialog is large enough.



Symptoms

QTM is waiting for external trigger, but no trigger is connected.

Resolution
l Check that the Use external trigger option is not selected on the
Synchronization page in the Project options dialog.

Symptoms

A camera is indicated as Not used for tracking in the 2D view and the 3D
view window.

Resolutions
l The camera is deactivated on the Linearization page in the Project
options dialog. Activate the camera and retrack the measurement if
you have already made measurements, see chapter "Reprocessing a
file" on page 601.

NOTE: If the camera was deactivated during the calibration,


the calibration must be reprocessed if you want to use the
camera, see chapter "Recalibration" on page 563.

Symptoms

There are error messages saying that the camera did not reply three
times in a row.

Resolutions
l Check that you are using the built-in wired Ethernet. Sometimes the
communication does not work with Ethernet adapters.
l Check the Ethernet cables so that they are not broken.

l Reboot the camera system.



Troubleshooting tracking

Symptoms

There is too much segmentation in the capture file.

Resolutions
l Check the tracking parameters, see chapter "3D Tracker parameters"
on page 325.
l Make a new calibration.

l Check that the camera positioning is OK, see chapter "Camera pos-
itioning" on page 439.
l Check that the calibration wand has been moved in the area where
the segmentation occurs, e.g. close to the floor when the seg-
mentation occurs in gait measurements.

Symptoms

The trajectories are swapping.

Resolutions
l Check if the markers on the measurement object can be smaller.

l Check if the markers on the measurement object are too close to


each other. Move them further apart to reduce swapping.
l Lower the Max frame gap setting on the 3D Tracking page in the
Project options dialog. A large Max frame gap will make it more
likely that some trajectories will swap.
l Check the tracking parameters, see chapter "3D Tracker parameters"
on page 325.

Symptoms

There are large gaps between two 3D points in trajectories.



Resolutions
l Check that the capture frequency is high enough.

Symptoms

AIM cannot find a solution or finds the wrong solution.

Resolutions
l Make sure that it is the correct AIM model that is applied.

l Make sure that the model was generated from a file where all of the
trajectories could be viewed during the whole measurement and
where there were no erratic data, see chapter "Generating an AIM
model" on page 625.

Symptoms

The markers in the 3D view window are located in one plane.

Resolutions
l 2D tracking has been used instead of 3D tracking. If you want 3D
tracking in the following captures change to 3D on the Processing
page in the Project options dialog. To 3D track a saved capture file
with 2D tracked data, you have to Batch process the file with 3D
tracking, see chapter "Batch processing" on page 605.

Troubleshooting reflections

Symptoms

The cameras capture unwanted reflections.



Resolutions
l Check that there are no other sources of infrared light in the meas-
urement volume, e.g. reflections of the sun, lamps etc.

TIP: You can use the Video mode to locate the reflections.

l Check that the cameras cannot see each other's IR flashes or the
reflection of the IR flashes.
l Check that there are no very glossy materials in the measurement
volume.
l Use exposure delay, see chapter "Delayed exposure to reduce reflec-
tions from other cameras" on page 534.
l Use the marker masking function if you cannot remove the reflec-
tions, see chapter "Marker masking" on page 536.

Troubleshooting force calculation

Symptoms

The forces are too high or too low.

Resolutions
l Check that the calibration parameters for the force plate have been
entered correctly, for information about the settings see chapter
"Force plate settings" on page 362.

Symptoms

QTM does not calculate the force data.



Resolutions
l Check that the Calculate force option has been selected for the
force plate on the Force data page in the Project options dialog.

Symptoms

The force vector is pointing in the wrong direction.

Resolutions
l Check the settings for the force plate location on the Force plate
page in the Project options dialog.

Symptoms

The area representing the force plate is not placed at the correct place in
the 3D view window.

Resolutions
l Change the Force plate location settings on the Force plate page
in the Project options dialog.

Troubleshooting 6DOF

Symptoms

There is no 6DOF body in the 3D view window.

Resolutions
l Check that Calculate 6DOF is activated on the Processing page in
the Project options dialog. If you have a file, reprocess it with Cal-
culate 6DOF activated and with the correct 6DOF bodies.
l Check that the 6DOF body is within the measurement volume.



l Check that the 6DOF body definition includes at least three points,
which are not in one plane.
l Check that at least three of the markers on the 6DOF body can be
captured at the beginning of the measurement.
l Check that at least three of the markers can be seen in all frames
during the measurement. Otherwise the 6DOF tracking function can-
not find the body in those frames.

Symptoms

There is no 6DOF body data in the Data info window even though it is dis-
played in the 3D view window.

Resolutions
l Check if the 6DOF body uses the Use the current position of this
rigid body option in the Coordinate system for rigid body data
dialog. If it does it will not be displayed in the Data info window
when the other body is not visible.

Troubleshooting update

Symptoms

You are using an old version of QTM and you want to update the soft-
ware.

Resolutions
l Look for the latest software on this web page: https://www.qualisys.com/my/
and follow the instructions for installing the QTM software,
see chapter "Software installation" on page 54.

Symptoms

QTM needs to update the firmware and you cannot find it on your com-
puter.



Resolutions
l The firmware should be located in the Camera_Firmware folder in the
Qualisys Track Manager folder. If it is not there, install the latest QTM
version again to get the correct firmware.

Troubleshooting other

Symptoms

There are fewer analog channels displayed in the Data info window than
there should be.

Resolutions
l Check that all of the analog channels have been activated on the
Analog board (...) page in the Project options dialog.

Symptoms

The Status bar shows the message ”No analog trigger received” after a
motion capture.

Resolutions
l Check that the synchronization cable between the camera and the
analog board is connected.

Symptoms

There is a Too fast pacer rate error for the USB-2533 board even though
the analog capture rate is low.

Resolution
l Check the sync cable that is connected between the camera system
and the analog board.



l Make sure that the sync cable is the correct type of BNC cable (50
Ohm).

Symptoms

One or more trajectories are missing in the Trajectory info windows.

Resolution
l Retrack the file with new tracking parameters.

NOTE: This will delete all processing that has been made to
the capture file.

Symptoms

You cannot open a new window.

Resolutions
l There can be at most 30 View windows open for a qtm-file; this also
includes the Plot windows. If you have 30 windows open, close
one to open a new window.



Glossary

2D
Two dimensions

2D tracking
Tracker that uses the 2D data of a single camera to calculate trajectories in a plane.

2D view window
Window with the 2D views of the cameras.

3D
Three dimensions

3D point
Point that is specified with the three coordinates of the 3D space.

3D tracking
Tracker that uses the 2D data of all cameras in the system to calculate marker pos-
itions in three dimensions.

3D view window
Window with a 3D view calculated from the 2D data.

6DOF
Six degrees of freedom

6DOF body (rigid body)


Body that is defined by fixed points on the body and a local coordinate system, i.e. loc-
ation and rotation.

6DOF tracking
Tracker that calculates the position and rotation of a rigid body in the 3D view.

A/D board (analog board)


Analog/Digital board, which converts an analog signal to a digital signal.

AIM (Automatic Identification of Markers)


Process that automatically identifies the trajectories as being part of a defined tra-
jectory.

Analog capture
QTM can capture analog voltage data in synchronization with the motion capture
data, if you have an analog board.

Analog output
With analog output, 6DOF data can be used as feedback to an analog control system,
if you have an analog output board.

Analog output board


Board which converts a digital value to an analog signal.

Analog synchronization cable


Cable that connects the camera system with the analog board for synchronization of
the start.

Aperture
The size of the opening in the camera’s lens. This opening can be adjusted with the
adjustment ring.


Bit
Computer unit, which can be either 0 or 1.

BNC
Type of contact for coaxial cables.

Bone
Visible connection between two trajectories in the 3D view.

Byte
Computer unit. 1 byte = 8 bit

C3D
Standard file format in motion capture

Calibration
Process that defines the position of the cameras in the 3D space. The calibration is
used for the 3D reconstruction.

Calibration kit
Equipment that is needed for a wand calibration, e.g. calibration wand and L-shaped
reference structure.

Calqulus
Online platform for biomechanical data analysis hosted by Qualisys AB.

Camera ray
The 2D position of a marker projected into the 3D space based on the position and ori-
entation of the camera.

Capture
Measurement which collects several frames at a fixed frame rate.

Capture file
A qtm-file with motion capture data (.qtm).

Capture rate
Frame rate in Hz that is used for the motion capture.

Capture view
View that is used during motion capture.

Coordinate system
A system of axes which describes the position of a point. In QTM all of the 3D coordin-
ate systems are orthogonal, right hand systems.

Coordinate system of the motion capture


The coordinate system which is defined with the calibration process.

D/A board
Digital/analog board, which converts a digital value to an analog signal.

Data info window


Window with data for the current frame.

Discarded trajectories window


Window with deleted trajectories.


Extended calibration
Method that extends the motion capture volume, when using Wand calibration.

External timebase
Device that controls the frame rate of the camera system.

External trigger
Device that triggers the motion capture system to start the measurement.

FBX
FBX (filmbox) is a widely used file format for exchange of 3D and skeleton data for
animation software.

Field of view (FOV)


The camera's view, vertical and horizontal on a specific distance from the camera.

File view (File mode)


View that is used when a motion capture is finished and for saved files.

Fill level
The percentage of the capture period in which a trajectory has 3D data (been visible).

Fixed camera calibration


Calibration method which is used for large systems with fixed cameras.

Focus
Changes the focal length of the camera to achieve a clear image.

Force plate
Plate that can measure the force that is applied on top of it.

Frame
Single exposure of the camera system.

Frame rate
Frequency of the motion capture.

Gap
Missing part within a trajectory.

Gap fill
Function that calculates a probable path between trajectory parts to associate them.

Global coordinate system


In QTM it is the same as the coordinate system of the motion capture, which is
defined by the calibration.

IR
Infrared

IR marker
A marker which reflects or transmits IR light.


Label
Name of a trajectory in the Identification windows.

Labeled trajectories window


Window with identified trajectories.

LED
Light Emitting Diode

Linearization
Correction data which is needed for each camera to make the capture as good as pos-
sible.

Local coordinate system


Coordinate system of a 6DOF body.

Marker
Item that is attached to the moving object to measure its position.

Marker – Active
Marker with an infrared LED that is activated by the camera’s flash in each frame.

Marker – Passive
Marker with reflective material.

Marker (3D view)
Sphere that represents a trajectory in 3D views.

Marker discrimination
Option that filters out unwanted reflections or markers of unwanted sizes during the capture.

Marker mode
The default mode in 2D view windows, which shows the markers captured by the cam-
era.

Markerless mocap
Video-based motion tracking without the use of markers.

Max residual
Maximum distance allowed between a 2D marker ray and a 3D point for the ray to be included in that point during tracking.

Measurement computer
Computer that is connected to the camera system and has the QTM application installed.

Measurement range
The range that is set with the boxes on the Timeline control bar. It defines the frames that are included in the analysis.

Mesh
Wavefront 3D object (.obj) files and associated .mtl and texture files for visualization of 3D objects.

Motion capture
Measurement which records a motion.

Motion glove
Glove used for tracking finger motions.

O

OpenGL
Standard graphics API used for rendering 2D and 3D graphics.

PAF
Project Automation Framework for structured data collection, analysis and reporting.

Pitch
Rotation around the Y-axis in Qualisys standard rotation.

Plot window
Window with data plots.

Pretrigger
Setting in QTM where the frames before the trigger are saved in the camera’s memory
and are then added to the measurement when a trigger event occurs.

Preview mode
Mode in which QTM shows the measured data before the start of a capture.

QDevice API
API for the integration of external data acquisition devices into QTM.

QFI
Program for installing firmware in the Qualisys cameras.

QTM Scripting Interface
API providing scripting support for QTM, implemented for Python, Lua and REST.

Real-time output (RT output)
Function which exports 3D, 6DOF and analog data in real time to a remote computer.

Reference marker
Special kind of active marker which is used in fixed camera systems and is visible at
long distances.

Reference structure
The L-shaped part of the calibration kit used for Wand calibration.

Remote computer
Computer which receives 6DOF data from the RT output.

Residual
In most cases in QTM this is the minimum distance between a 2D marker ray and its
corresponding 3D point or an average of this measure.

Residual (3D)
The average of the different residuals of the 2D marker rays that belong to the same 3D point.
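Expressed as a formula (the notation is introduced here for illustration and is not the manual's own), with $d_i$ the residual of marker ray $i$ and $n$ the number of rays contributing to the 3D point:

$$ \text{residual}_{3D} = \frac{1}{n} \sum_{i=1}^{n} d_i $$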

Residual (6DOF)
The average of the errors of each measured marker compared to the 6DOF body
definition.

Residual (calibration)
The Average residual in the Calibration results dialog is the average of the 3D resid-
uals of all the points measured by the camera during the calibration.

Rigid body (6DOF body)
Body that is defined by points and a local coordinate system, i.e. location and rotation.

Roll
Rotation around the X-axis in Qualisys standard rotation.

SAL
Skeleton Assisted Labeling.

Scripting
Possibility to extend QTM's functionality through Python or Lua scripts, or via the REST API.

Skeleton
Series of segments organized in joint chains with hierarchical relationships.

Skeleton calibration
Function that creates a skeleton including degrees of freedom, joint boundaries, positions of segment origins and positions of markers expressed in their respective segment coordinate systems.

Skeleton solver
Function that fits a calibrated skeleton definition to measured 3D trajectories.

Smooth
Operation applied to data in order to reduce noise while preserving important pat-
terns.

Spike
Discontinuity between consecutive frames within a trajectory.

Subpixel
Unit used to express marker position and size in the 2D data of a Qualisys camera.
The number of subpixels is obtained by multiplying the number of pixels in each
dimension by a factor 64.
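As a small worked example (the coordinate values are invented for illustration): a marker centre detected at pixel position (512.25, 300.5) corresponds to (512.25 × 64, 300.5 × 64) = (32784, 19232) subpixels.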

TCP/IP
Protocol for communication between computers.

Trace
Display of the path of a trajectory over a range of frames in the 3D view.

Trace range
Range of frames for which the trace is shown.

Tracking
Process that calculates 3D, 6DOF or skeleton data.

Trajectory
3D data of a marker in a series of frames.

Trajectory editor window
Window for graphical display and editing of trajectories.

Trajectory info windows
Windows with trajectories and 3D data.

Translate
Move the center of rotation and zoom within the current 2D plane of the 3D view.

Traqr
Compact lightweight object with active or passive markers for rigid body tracking, part
of the Qualisys Traqr ecosystem.

Trigger
Signal that starts a measurement.

Tripod
A very stable three-legged stand.

TSV (Tab Separated Values)
File format where the data is separated with the TAB character.

Twin system
Setup that connects two separate camera systems and merges their 3D data into one file, for example with above-water and underwater systems.

Unidentified trajectories window
Window in QTM with unidentified trajectories.

USB
Hardware interface for connecting peripheral units to a computer.

View window
Window in QTM which shows 2D, 3D or Video views.

Virtual trajectory
Artificial trajectory added during post processing without a direct relation to meas-
ured 2D data.

Volume
The defined measurement’s height, length, depth.

Wand
T-shaped object that is used in Wand calibration to scale and define the axes of the
coordinate system of the motion capture.

Wand calibration
Calibration method that uses a wand and an L-shaped reference structure.

WLAN
Wireless local area network.

Yaw
Rotation around the Z-axis in Qualisys standard rotation.
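For reference, these are the usual right-handed elementary rotation matrices that the Roll, Pitch and Yaw glossary entries refer to (rotations by the angles φ, θ and ψ about the X, Y and Z axes respectively; the order in which QTM combines them into the Qualisys standard rotation is described in the manual's section on rotation angles, not defined here):

$$ R_x(\phi) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi & \cos\phi \end{pmatrix}, \quad R_y(\theta) = \begin{pmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{pmatrix}, \quad R_z(\psi) = \begin{pmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{pmatrix} $$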

Zoom
Zoom in to get a close-up of the 3D view, or zoom out to see more of the scene at a reduced size.

Index

2D data

graphic display 84

plot 170

TSV export 711

TSV file format 713

viewing data 169

2D tracking 618

description 618

settings 330

2D view window 84

3D data 138

C3D export 727

graphic display 116

Matlab export 729

Matlab file format 730

plot 151

TSV export 711

TSV file format 713

viewing data 138

3D motion capture 566

camera placement 440

outline 567

3D tracking 614

description 614

Maximum residual 326

Minimum ray count 327

Minimum trajectory length 327

parameters 325

Prediction error 326

Ray length limits 328

settings 325

test 616

troubleshooting 1028

3D view window 109

bones 119

change appearance 112

contents 109

menu 131

rigid bodies 122

Skeletons 123

trajectories 116

6DOF analog output 665

how to use 665

settings 389

6DOF bodies (Rigid bodies)

creating 650

definition 653

graphic display 122

local coordinate system 658

referring coordinate system 354

settings 346

6DOF capture 566

analog output 665

real-time output 665

troubleshooting 1031

6DOF data 170

6DOF versus 3D 649

calculation 1011

Euler angles (rotation angles) 709

graphic display 122

Matlab export 729

Matlab file format 730

rotation angles 663

TSV export 711

TSV file format 716

viewing data 170

6DOF real-time output 665

description 665

settings 387

6DOF tracking 649

description 659

introduction 649

parameters 345

A/D board 752

channels 297

connect 752

drivers 747

settings 291

USB-1608G 975

USB-2533 973

Active markers

Active Traqr 1000

Long Range Active Marker 1010

Naked Traqr 1003

Passive vs active 529

Settings 531

Short Range Active Marker 1005

AIM 624

apply model 634

generate model 626

settings 339

Skeleton 682

Analog data 174

capture 752

plot 175

TSV file format 722

viewing data 174

Analyze 153

Aperture

Arqus 447

Miqus 451

Oqus 458

Tips 481

Arqus

Aperture and focus 447

Automatic ordering 481

Camera back side 934

Camera front side 933

Camera identification 480

Camera Sync Unit (back) 952

Camera Sync Unit (front) 950

Communication 976

Description 933

Firmware update 471

How to connect 445

Specifications and features 929

Batch capture 571

Batch processing 605

description 605

settings 322

Blackmagic Design 899

connecting video source (Decklink Mini Recorder) 901

connecting video source (Intensity Pro) 900

installing 899

settings 901

Bones 119

create 119

C3D export 727

export 727

file format 728

settings 400

C3D import 187

Calibration 543

calibration kit 547

error messages 560

extended calibration 549

fixed camera calibration 557

fixed camera settings 257

outline (wand calibration) 547

perform 543

perform settings 545

reprocess 563

result 558

settings 253

transformation 259

troubleshooting 1024

wand calibration 547

wand calibration tips 548

wand settings 253

Camera positioning 439

Camera rays 130

Camera Sync Unit 950

Back side 952

Description 950

Digital IO 953

Electrical specifications 953

Front side 950

Mechanics 953

Specifications and features 949

Cameras

Communication 976

Environmental protection 978

How to set up 441

Sensor specifications 926

Sensor specifications (streaming video) 927

Specifications and features 929

Types 432

Underwater 978

Capture 566

6DOF real-time output 665

batch capture 571

capture period 569

capture rate 221

guidelines 479

how to 567

start capture dialog 569

troubleshooting 1025

Connection 477

Locate system 222

Locating the system 477

troubleshooting 1022

Coordinate system 110

for rigid body data 354

global (defined by calibration) 253

local 658

transformation 259

view in 3D views 109

D/A board 665

Data export 710

Batch exporting 710

C3D export 727

C3D file format 728

C3D settings 400

description 710

FBX export 742

FBX settings 409

MAT export 729

MAT file format 730

Matlab settings 402

TSV export 711

TSV settings 397

Data info window 167

data types 169

menu 168

plot 168

Data types 169

2D data 169

3D data 138

6DOF data 170

analog data 174

force data 176

Skeleton data 172

Discarded trajectories 137

delete trajectories 152

window 137

E

EMG device 803

Cometa 838

Delsys (API integration) 813

Delsys (SDK legacy) 826

Euler angles (Rotation angles) 709

calculation from the rotation matrix 1011

custom definition 1013

description 709

example 663

Events 706

Camera Sync Unit 276

External trigger 273

Wireless trigger 267

Exposure time 228

Tips 483

External timebase 494

how to use 494

settings 278

with bursts of signals 499

External trigger 492

how to use 492

F

FBX export 742

Settings 409

Firmware update 470

QFI 471

when locating system 470

when opening a new file 471

Fixed camera calibration 557

how to use 557

settings 257

Focus

Arqus 447

Miqus 451

Oqus 458

Tips 481

Force calculation 703

export 727

settings 362

troubleshooting 1030

viewing force 704

Force plates

AMTI Digital 756

analog 790

Arsalis 759

Bertec digital 764

digital 756

force plate control (Kistler) 301

Kistler digital 768

settings (COP) 386

settings (force plate types) 362

settings (location) 382

settings (use) 360

Frame rate 220

Gap fill 641

How to 642

Kinematic 645

Linear 643

Methods 642

Polynomial 643

Relational 643

settings 338

Static 643

Virtual 644

Generic devices

h/p/cosmos treadmill 912

Hardware

A/D board 752

Arqus 933

D/A board 665

EMG device 803

external timebase 494

external trigger 492

force plates 756

Generic devices 912

Instrumented treadmills 797

Miqus 944

Pretrigger 493

video 909

High-speed video (Oqus) 579

description 579

Settings (advanced) 238

Settings sidebar 91

Identification 624

automatic (AIM) 624

manual 620

settings (AIM) 339

Instrumented treadmills

h/p/cosmos Gaitway3D 797

Label list 158

Labeled trajectories window 137

Linearization 485

how to 485

settings 249

Marker (3D view) 109

data 117

Marker (IR) 529

Active marker types 1000

maintenance 1010

passive/active 529

placement 530

settings 227

sizes 529

Tips on settings 483

Marker discrimination 235

Marker threshold 228

Tips 483

Markerless mocap

How to set up the system 452

Matlab export 729

MAT export 729

MAT file format 730

Measurement range (timeline) 133

Mesh

Rigid body settings 358

Static 424

Static mesh settings 425

Miqus

Aperture and Focus 451

Automatic ordering 481

Camera back side 945

Camera front side 944

Camera identification 480

Camera Sync Unit (back) 952

Camera Sync Unit (front) 950

Communication 976

Description 944

Firmware update 471

How to connect 449

Specifications and features 939

Mixed system

How to connect 459

Motion glove

Manus 889

Settings 344

StretchSense 892

Oqus

Aperture and Focus 458

Communication 977

connectors 965

display 963

firmware update 471

high-speed 579

How to connect 455

Oqus Sync Unit 967

Specifications and features 955

Oqus Sync Unit 967

How to use 509

Overlapping trajectories 142

PCI boards

A/D board 752

Pitch 1011

definition 1013

example 663

view 170

Plot 179

3D data 151

data in Data info window 168

Mouse gestures 215

Panning 182

Plot menu 181

Shortcuts 215

User interaction 182

window 179

Window layouts 183

Zooming 182

Pretrigger 277

hardware with analog capture 493

How to use 493

Settings 277

Processing 600

AIM 624

available steps 600

batch processing 605

calculate force data 703

data export 710

settings 322

tracking 6DOF 659

Project Automation Framework

Analysis Modules 915

Description 1016

Resources 1016

Project options 217

Project view 62

Projects 60

backup 73

creating 69

presets 74

using 66

PTP sync 501-502

how to use 501-502

QDevice

QDevice API 1022

Resources 1022

QDS 462

advanced 467

menu 462

Qualisys DHCP server message 470

wizard (network) 465

QFI 471

Qualisys video 574

3D data overlay 587

calibration 576

high-speed video (Oqus) 579

How to connect 452

how to use 575

In-camera MJPEG 576

Markerless mocap 452

Sensor specifications 927

streaming video capture 576

Rays 130

Real-time output 590

description 590

how to use 596

Resources 1015

settings 387

Recalibration 563

Reference marker 557

Reference structure 547

Retracking 601

Rigid body (6DOF body) 650

creating 650

definition 653

graphic display 122

local coordinate system 658

referring coordinate system 354

settings 346

Roll 1011

definition 1013

example 663

view 170

Rotation angles (Euler angles) 709

calculation from the rotation matrix 1011

custom definition 1013

description 709

example 663

Safety notices 44

Scripting

Description 1017

QTM Scripting Interface 1017

Resources 1019

Skeleton solver

Calibration 690

Data 172

Export and streaming 702

Extra markers 679

FBX export 409, 742

Graphic display 123

How to measure 699

Kinematic gap fill 645

Marker label mapping 681

Processing 700

Segment names 679

Settings 341

Skeleton Assisted Labeling (SAL) 700

T-pose 690

TSV export format 720

View data 172

Skeleton tracking 671

AIM 682

Introduction 671

Skeletons 123

Smoothing 646

Butterworth 647

Methods 646

Moving average 647

SMPTE

How to use 512

Synchronization 512

Software masks

How to use 611

Spikes 645

Detection 646

Smoothing 646

Threshold 646

Store real-time data 572

Streaming video 576

Settings (advanced) 238

Settings sidebar 91

Synchronization 266

Audio 512

Oqus Sync Unit 509

Pretrigger 493

Settings 266

SMPTE 512

Synchronization output 285

Measurement time 290

settings 285

Synchronizing external hardware 505

System

requirements 48

Timeline control bar 133

Timestamp 284

settings 284

Timing

external timebase 494

external trigger 492

Trace 111

display (all) 131

display (individual) 144

range 133

Tracking

definition (2D) 618

definition (3D) 614

definition (6DOF) 649

Definition (skeleton) 671

parameters (3D) 325

parameters (6DOF) 345

retracking 601

settings (2D) 330

settings (3D) 325

settings (6DOF) 345

test 616

troubleshooting 1028

Trajectories 137

analyze data 153

delete 152

display 144

in 3D views 116

label lists 158

overlapping 142

plot data 151

select and move 141

split part 151

windows 137

Trajectory Editor window 159

Gap fill 641

Menu 165

Points of Interest sidebar 163

Settings sidebar 164

Shortcuts 211

Smoothing 645

Spikes 645

Toolbar 161

Trajectory info windows 137

data in 138

menu 144

Trajectory Overview window 166

Shortcuts 214

Traqr

Active 1000

Naked 1003

Trigger

External 273

Keyboard 267

Software 267

UDP start/stop 270

Wireless 267

Trigger port(s) 273

External trigger 273

Settings 273

TSV export 711

export 711

file format (3D) 713

file format (6DOF) 716

file format (Analog) 722

settings 397

Skeleton data 720

Twin systems 514

calibration 519

files 522

settings 333

setup 514

Upgrade firmware

QFI 471

USB A/D board 973

USB-1608G 975

USB-2533 973

Video 898

Blackmagic Design 899

compression 910

how to use 898

settings 304

view window 100

View window 84

2D 84

3D 109

video 100

Virtual trajectories 648

W

Wand calibration 547

extended calibration 549

how to use 547

settings 253

tips 548

Window layout 82

Wireless trigger 267

Settings 267

Yaw 1011

definition 1013

example 663

view 170

