DMX in Virtual Production
for Film and Stage Performance

Title: DMX in Virtual Production for Film and Stage Performance
Project Period: February 2021 – June 2021
Semester Theme: Master Thesis
Author: Igor Akoulinine
Supervisor(s):

Copyright © 2006. This report and/or appended material may not be partly or completely published or copied without prior written approval from the authors. Neither may the contents be used for commercial purposes without this written approval.
Table of contents
Table of contents 3
Attached files 7
Glossary 9
Abstract 10
Preface 10
Report outline. 10
GitHub repository. 10
Introduction 11
Motivation 12
Medialogy and this project. 13
Methodology 13
Project Management. 14
Tools. 14
Background. 15
Light field 15
Lighting control 15
Before electricity 15
Electric lamps 17
Dimmers 17
Analog 0-10v 18
DMX 19
RS485 19
DMX512 Data Protocol. 20
Lighting console 22
Art-net and ACN 24
Lighting for Cinematography 25
Virtual lighting 26
Visual Effects 26
Computer Generated Imagery in Cinematography 26
Real time cinematography. 27
Initial Problem Statement 28
Analysis 29
Previous work 29
Art-net and DMX for Unity 29
Lighting in Virtual Video Production 30
Virtual production types. 30
LED screen lighting. 34
Camera, photography and lighting. 34
Properties of the good cinematographic lighting 35
Use cases. 36
Requirements engineering. 38
Software requirements. 39
Nonfunctional 39
Functional 39
Final Problem Definition 41
Methodology. 41
Interview 41
Survey 43
Questionnaire structure. 44
Observations 44
Design 46
Design Goals 46
Data model 46
Class structure 47
UI design. 48
Color probe. 49
Implementation 50
General Unity notes. 50
Editor Window 51
Scriptable Object 56
Art-net 57
DMX USB Pro 58
Lights and Discoverability. 59
Group Controller 61
Cue Editor 62
Color Probe 64
Testing 67
Testing Software and Equipment. 67
Testing Set Up 68
The Stage 69
The Corridor 69
Color Probe 70
Observations. 71
Demo session. 71
Survey. 73
Interview. 73
Findings 73
Discussion 74
DMX for UE4 74
Personal evaluation. 75
Conclusion 75
Future Work 75
Research 75
Development 76
Acknowledgments 76
References 77
Appendix. 85
Random Figures. 85
Manual 90
Installation 90
Operations 91
Color probe(experimental). 94
Survey results 96
List of figures (extract)
Figure 14 VFX studio DNEG has launched its own virtual production and previs outfit (FAILS, 2020) 30
Figure 34 Fixtures selection pane 57
Figure 45 The corridor has 30 RGBW lights. Scene view (left) and main camera view (right) 66
Figure 48 Scene from the episode filmed featuring the DMX system 69
Attached files
Audio\ExpertInterview.wav
DMXtoolsSourceCodeAndPackage\
Images\Artifacts\AnalogConsole.jpg
Images\Artifacts\AnalogDimmer.jpg
Images\Artifacts\Automata.png
Images\Artifacts\cornelius.png
Images\Artifacts\corridor.png
Images\Artifacts\stage.png
Images\Artifacts\System in action.png
Images\Artifacts\The stage.png
Images\Diagrams\Art-net.jpg
Images\Diagrams\DMX (1).png
Images\Diagrams\Project schedule.png
Images\Diagrams\RS485 Topology.png
Images\Diagrams\SMILE (2).png
Images\Diagrams\TAM (1).png
Images\Diagrams\UseCaseDiagram (1).png
Images\Diagrams\UseCaseDiagram (2).png
Images\Diagrams\UseCaseDiagram.png
Video\DMXtoolsDemo.mp4
Video\LightingDemoGroup606.mp4
Video\VideoAbstract.mov
Glossary
DMX - Digital Multiplex. The industry-standard protocol for controlling lighting fixtures
in architectural and entertainment applications.
Fixture, head, lamp - a DMX-controlled lighting device.
Universe - a DMX data link carrying 512 channels.
Console, mixer, controller - a lighting console.
Cue - the levels of all channels saved as a preset.
Cue stack - a collection of cues.
Orthographic projection - a method of projection in which an object is depicted
or a surface mapped using parallel lines to project its shape onto a plane.
Programmer - the memory section of a lighting console that holds the channel
level information to be saved into a cue.
HDRP - High Definition Render Pipeline, Unity’s pipeline for rendering highly
realistic environments.
AV - audio/video.
Dimmer - a device that controls the intensity of incandescent light sources.
Eevee - real-time render engine in Blender.
CCT - Correlated Color Temperature.
HSI - color model which represents every color with three components: hue, saturation
and intensity.
RGB - color model which represents every color with three components red, green, blue.
RDM - Remote Device Management is a protocol enhancement to DMX.
Abstract
Real time virtual production is a state of the art method of capturing the final pixels of
combined live action and visual effects in camera, in real time. Game engine technology
allows instant interaction between the real and virtual worlds during production. However,
on-set lighting is controlled via the DMX protocol, while virtual lighting is controlled by
the game engine's lighting tools. The author explores the advantages and ease of use of
DMX as a common lighting control method. This report reflects on an analysis of the
realities and challenges of lighting in virtual production in order to design and implement
a system that unifies lighting control. The system also lets users choose whether to control
all lighting from the game engine or from an external lighting console. A number of experts
were then surveyed to draw conclusions about the potential role of the DMX protocol in
virtual production pipelines. The responses indicated strong interest in this method of
lighting control; however, some respondents did not find it convenient enough. Guidelines
for further research into the method and further development of the system are suggested.
Preface
This report reflects on a Master Thesis research and development project in Medialogy.
The system developed during the project is experimental and might suffer from instability.
Report outline.
This report is structured in the following way: first I will introduce the topic, myself and
the project structure, and also supply valuable historical background information on the
topic. Then I will analyse the current workflows of real time cinematography and possible
use cases for the future system, compile a set of requirements and prepare some tests.
Afterwards I will take you on the journey of design and implementation. When the system
is ready we will test it. Finally I will present the findings and conclusions.
GitHub repository.
In addition to the materials attached to this report, the latest versions of the source code
and the Unity package file can be found at:
https://github.com/igolinin/DMXtools
Introduction
Figure 1 Typical real time virtual film production set up. Unnamed Netflix production(Nishida,
2021)
In recent years the convergence of technologies has made it possible to capture final-pixel
visuals in camera in real time. The hardware, game engines, camera tracking / motion
control and computer-generated imagery have not only matured but have also been packaged
into user friendly interfaces.
Now film directors can tell stories that were not feasible to film only a few years ago.
Big productions use large LED volumes for visual effects (VFX), for example Disney's
Mandalorian (Disneyplus, 2021), paving the road for others. Before, filmmakers were
adding VFX to the filmed live action; now they are placing the live action in the middle of
the VFX.
Broadcast productions have already adopted different variations of real time virtual
production, including in-camera LED walls and blended-in rendered backgrounds or foregrounds.
Real time game engines brought more to cinematography than just rendering. They also
offer physics and interactivity. The behavior of the virtual environment and the objects
within it can be scripted to follow the story or to be interacted with by actors or the crew
in real time (Visual Effects Society, 2021).
With the acceptance of real time technology, production pipelines have changed
drastically. What used to be “pre” and “post” has simply become “production”. Many
technological processes of the production which were sequenced can now be executed in a
so-called asynchronous manner - without one waiting for another to be completed. This
makes real time cinematography centered around collaboration between different
departments, teams and individual talents.
In order for all parties to collaborate efficiently, new tools and techniques have been and
still have to be created. All the techniques that cinematographers have gained over the
years of filming reality should also be kept.
One of the most important elements of cinematography is lighting. New methods of
filming bring new challenges in lighting control. Now it is not only the on-set lighting
that must be considered but also the virtual lighting rendered by the real time engine.
Being able to unify the source and method of lighting control is an important condition for
successful collaboration.
Unity3D(Unity Technologies, 2020) is one of the gaming engines which is extensively
used for virtual productions for films. Titles like Jungle Book(IMDB, 2016), Lion
King(IMDB, 2019), Blade Runner 2049(IMDB, 2017) were made with Unity.
Unity3D is not only used by productions with Hollywood budgets. Small and medium
size video production studios are using it as well. User friendliness, high availability of
“how to” information and flexible licensing make it accessible to almost everyone.
The latest High Definition Render Pipeline (HDRP) brings photorealistic rendering and
advanced lighting control within the virtual space. However, there is a gap that doesn't let
users control real lighting devices from the engine - either manually or
programmatically. On the other hand, virtual lighting cannot be controlled from the
legacy lighting consoles which are used to control the on-set lighting.
This project aims to unify the control of virtual and real lighting in virtual production
volumes. In addition, I will explore the possibility of applying the same method to
stage performance acts, which could also use the power of real time engines to achieve
great visuals during live shows.
Motivation
First, let me introduce myself and explain how many of the topics covered in this report
are related to my previous occupations and interests.
I was born in Leningrad, USSR, and already during my school days (1989) learned my first
programming language (Fortran). After school I entered the Physics of Metals faculty of Saint
Petersburg Technical University with a specialization in Laser and Plasma Methods of
Material Processing. However, due to the collapse of the USSR and personal reasons, I dropped
out of university and began following my other passion - music. As a double bass player I
participated in many successful international bands and collaborated with many famous
artists on a number of projects, including soundtracks for feature films. I migrated to
Denmark in 1996.
In 2002-2004 I studied computer science at Niels Brock Business College, and when in
2005 I decided to end my artistic career as a musician, I entered a period of freelancing in
coding and lighting, both as a technician and as a lighting designer.
From 2010 to 2017 I was employed full time by Comtech (which no longer exists and is now
part of Nordic Rentals), one of Europe's leading rental and production companies specializing
mainly in lighting equipment and outdoor stages.
I received a Bachelor degree in Software Development from Copenhagen School of Design
and Technology (KEA) in 2019.
In autumn 2020, as part of my internship, I passed a game development course at DADIU -
the Danish Film Institute guided game development school. That gave me great experience
in scripting in Unity.
All the facts listed above explain my constant interest in making technical solutions within the
realm of art. This time I will exploit my skills and knowledge in lighting, programming
and computer graphics.
Methodology
This report covers applied research on lighting control which can be divided
into two stages. Both stages are exploratory, and the first prepares the ground
for the second.
● Stage one: secondary research on lighting control and virtual production based
on published books, articles, interviews and my personal experience. This
research will help me shape the application which, in my opinion, will help
resolve the problem.
● Stage two: primary research based on the experience of using the developed
application, including expert interviews, a survey and observations of users.
The qualitative data received in this stage will then be analysed to form a conclusion
on the current situation and guidelines for further research.
Project Management.
This project followed the iterative nature of the agile methodology (Alliance, 2015) and
was not initially planned step by step. Instead it was divided into 4 sprint periods (Figure 2),
each approximately 1 month long. The first sprint was mainly coding the proof of concept
of the future application; that was done to make technical limitations more visible. It was
followed by a sprint in which no coding was done, but the achievements of the previous
sprint were analysed along with the background information and the current situation in
the field. Sprint three was split between coding and testing. The final sprint was dedicated
to analysing the findings and writing this report.
Tools.
To achieve this project's goals a number of free helper tools were used. I would like to
credit all of them here.
Background.
In this section I will provide the reader with essential background information on the
topics of lighting and cinematography. Since the beginning of cinematography, which is
basically the recording of light on film, these two technologies have developed simultaneously,
but sometimes one had to wait for the other. Very often light that worked on the theater
stage could not be used in film, and sometimes it was the other way around. I will start with
light, because people learned to control it long before they learned to record it. Then I will
line up the technological achievements in order of their appearance.
Light field
Ancient philosophers (Gallardo, 2000, p4) had different theories on the nature of light and
vision. Some thought that light was emitted from the eyes, others that objects were
emitting it.
Skipping the evolution of theories over time, we can define light as electromagnetic radiation
that can be perceived by the human eye. That places light waves between 400 nm and
700 nm in wavelength. Shorter waves (ultraviolet) and longer waves (infrared) cannot be seen
by the human eye.
Light is emitted by light sources which can vary in intensity (radiosity) and
wavelength (color). When it hits an object, its further behavior is defined by the
physical (optical) properties of the object's material. Light can fully or partially
reflect (bounce back), refract (bend), be absorbed, polarize (selectively penetrate) or
disperse (prism effect) (Gershun, 1939).
When light cannot penetrate through an object due to the material's properties, the object
casts a shadow - an area on the surface with lower illumination.
The size of the light source determines the quality of the shadow's boundary - whether it is
sharp or spread out. Light that creates sharp shadows is called hard light, and light that
creates shadows with washed-out boundaries is called soft light.
The combination of lights and shadows perceived by our eyes forms the image that we
recognize as our environment.
Cameras can record the image in a similar manner, but even though they are getting better
every year, they still cannot achieve the light sensitivity and flexibility of the human eye.
To improve the quality of the recorded image, and to communicate a meaningful visual
message to the viewer, filmmakers use a number of lighting techniques.
Lighting control
From very early days of civilization people used lighting and shadows to play a role in
their religious, cultural and artistic acts and performances. In the following subchapters we
will follow the evolution of the lighting control.
Before electricity
The first attempts to control and take advantage of light consisted of placing the objects
that were supposed to be lit in the rays of natural light sources such as sunlight and fire.
F. Penzel (Penzel, 1978), in his book "Theatre Lighting Before Electricity", demonstrates the
evolution of lighting methods from ancient theaters, which were built so that the
stage was lit by sunlight during the performance hours. Bonfires and torches were also
widely used for all kinds of religious and cultural ceremonies. Later, oil lamps and dip
candles were used for many centuries. The French chemist Lavoisier (Brown, 2018) in 1781
suggested attaching movable reflectors to oil lamps to achieve directional lighting.
From around 1815 gas lanterns began to be used in European theaters. Figure 3 and
Figure 4 show a gas era lighting control system and light fixture (limelight). It is
important to mention that all the fire-based methods of illumination were very unsafe and
often caused fire accidents.
Figure 4 Limelight (Penzel, 1978)
Electric lamps
The electric carbon arc lamp (Whelan, 2016) was first invented around 1800 by Sir
Humphry Davy in England and by Vasily V. Petrov in Russia. Practical usage of these
lamps began later in the 19th century. Spot projectors based on arc technology could be
controlled by shutters.
Carbon arc lamps were later replaced by incandescent lamps when those became brighter
and improved their initially short life span. These properties were achieved by
replacing the carbon filament with a metal (eventually tungsten) one and by using inert gas
inside the bulb instead of a vacuum. The other big advantage of incandescent lamps was the
ability to turn them on and off and to dim them up and down.
Dimmers
The device that controls the intensity of a light fixture is called a dimmer (McCandless,
1958). The first commercial dimmers appeared in the 1860s-70s. They were
placed backstage or under the stage and could not be operated remotely. Operators had
to be informed of events on the stage to perform lighting changes. Later, audio
intercom lines were used for this purpose.
Until the mid 20th century all dimmers were of the variable resistor type and basically
burned off the excess power when dimming. Figure 5 shows the leaflet for the
theater switchboard that was deployed in one of the famous New York theaters in the 1920s.
We can see that in those days the dimmers and the control surface were actually one unit.
Figure 5 (Foreman & Motter, 2014)
The situation changed after the first remotely controlled dimmers became mass produced
in the 1950s: first the magnetic amplifier dimmer and then silicon-controlled rectifier (SCR)
dimmers. This allowed the control console to be moved to the "front of house" (FOH) - a
publicly occupied area of the venue (McCandless, 1958).
Analog 0-10v
The 0-10V control system was widely used in the days before digital control took over.
The method is very simple: changes in the control circuit between 0 and 10 volts DC result in
proportional changes in the output of the dimmer. So 5 V in the control circuit would set the
dimmer to output 50% of its full voltage capacity.
The biggest disadvantage of this setup was the requirement of at least one control cable per
dimmer, which made it necessary to use expensive multicore cables. The digital era opened
new horizons for multichannel control.
DMX
DMX stands for Digital Multiplex and is now the industry-standard protocol for controlling
lighting fixtures in architectural and entertainment applications. The protocol (USITT, 2018)
was created by the United States Institute for Theatre Technology (USITT) in 1986 and
was updated in a 1990 revision. DMX's purpose was to create a communication standard
for interoperability between fixtures and lighting controllers designed and produced by
different manufacturers. DMX inherited its physical layer from the electrical interface
defined by the RS-485 specification.
RS485
RS-485 (sometimes referred to as TIA-485 or EIA-485) was made to allow communication
between electrical devices in electrically noisy environments over long cables. RS-485,
along with RS232, RS422 and RS423, belongs to the family of serial communication methods
for computers and devices. As you can see in Table 1, it beats its sibling methods in the
number of drivers and receivers, maximum cable length and transfer rate.
                                     RS232     RS423     RS422     RS485
Max driver output voltage            ±25 V     ±6 V      ±6 V      –7..12 V
Min driver output voltage (loaded)   ±5 V      ±3.6 V    ±2.0 V    ±1.5 V
Table 1. Comparison of RS-family serial methods. (Lammert Bies © 1997-2019, 2019)
The main reason for this is that RS-485 uses differential signals sent over a twisted pair of
wires. RS-485 is also half duplex, meaning that data can only be transferred in one
direction at a time. This is not a problem for DMX but becomes a challenge for bi-directional
RDM.
The network topology of RS-485 (Figure 6) is a long bus line with multiple drops for the
devices. Terminating resistors are used, one on each end of the line, to eliminate reflections.
Figure 7. DMX frame.(“DMX Explained; DMX512 and RS-485 Protocol Detail for Lighting Applications,”
2017)
Originally the NULL START CODE was also called the dimmer code and was reserved for
dimmer class data only, but now it is used for all types of devices.
DMX512 requires no parity between devices. Each device must know its own address and the
number of channels it reads data from.
Figure 8 shows how DMX works in practice. The DMX controller sends DMX data packets
up to 44 times a second. A packet is a byte array of up to 512 values, each of which
can range from 0 to 255. A device reads only the bytes it uses, starting from
its address index, and executes the appropriate function according to the value of the byte
at each index.
Figure 8 This is how DMX works in practice. img:I.Akoulinine
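To make this concrete, here is a minimal C# sketch of that per-device read logic; the method and its parameter names are illustrative, not part of the DMXtools code.

// Minimal sketch: a fixture extracts its own slice of a 512-byte DMX packet.
// 'address' is 1-based, as is conventional in DMX.
byte[] ReadMyChannels(byte[] packet, int address, int numChannels)
{
    byte[] values = new byte[numChannels];
    for (int i = 0; i < numChannels; i++)
        values[i] = packet[address - 1 + i];   // e.g. red, green, blue, white
    return values;
}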
Lighting console
As was already mentioned earlier, the first consoles were the surfaces of the dimmer and
switch units. Operators would interact with the handles and dials of what were, back then,
mechanical switches and dimmers.
The first remote consoles appeared in the 1950s (Holloway, 2010 p.4) and were fader
boards mapped 1:1 to the dimmer channels. At the same time consoles began to be
equipped with crossfading functionality. Figure 9 demonstrates the method of crossfading
two presets A and B. The operator could set the intensity levels of (in this case) 24 dimmers
independently for each of the two presets and crossfade between them smoothly by gradually
and simultaneously increasing one and decreasing the other.
Figure 9 Crossfading technique demonstrated. src: (Holloway, 2010)
A preset of channel intensity values is commonly known as a cue. Later, computer
and microcontroller based consoles could store cues in memory and play them back in
order (a cue list) or individually. Each cue can have its own delay time as well as fade in
and fade out times.
The variety of effects which could be programmed on a computer based console was
getting more advanced, but the connection between the console and the dimmers was still
the analog 0-10V covered above.
Soon after DMX became a standard in 1986, the first consoles and dimmers, and shortly
after other devices, became DMX driven.
Multichannel devices such as moving lights, strobes and color scrollers could not be
conveniently controlled by the fader board. To fix the issue, soft patching was
introduced (Holloway, 2010). Soft patching is a method of mapping a device's channel
functions (pan, tilt, red, green, blue…) to the appropriate DMX channels. In this way an
operator can select a fixture or group of fixtures and manipulate the values of named
properties - not the numbered channels.
In the beginning of the DMX era 512 channels seemed to be a large number. With the
growing popularity of multichannel devices, 512 channels began to look very humble.
To increase the precision of their fixtures, some manufacturers started to use a 16-bit
approach. The idea is to combine the 8-bit values of two neighboring channels to extend
the number of steps available between 0 and 100 percent of the total attribute value. That
improved accuracy, but made one universe of DMX512 too small to accommodate all the
fixtures for large shows. To address the problem, console makers introduced products that
can output 2 or more universes.
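As an illustration (not taken from any particular console), the coarse and fine channels combine as follows:

// Two neighbouring 8-bit channels combined into one 16-bit attribute value:
// 'coarse' is the high byte, 'fine' the low byte, giving 65536 steps instead of 256.
int Combine16Bit(byte coarse, byte fine)
{
    return (coarse << 8) | fine;   // 0 .. 65535
}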
Lighting for Cinematography
The very poor exposure properties of early film did not let cinematographers use artificial
lighting sources. Sunlight was the only option back then. That is why the Black
Maria (Figure 11) - the first film production studio in history, built by Thomas Edison in 1893 -
had a roof that could be opened and stood on a revolving platform to follow the sun.
Over more than 100 years of filmmaking history, lighting technology has overcome many
technical challenges.
Incandescent lamps were in the beginning too dim to be used for cinema lighting, therefore it
was common to use mercury vapor tubes and carbon arc lamps. Before 1927, due to the
insensitivity of film emulsion to red light, tungsten bulbs with their large red
component could not be used (Brown, 2018 p4). After this was resolved, tungsten and carbon
arc light sources competed with, but often complemented, each other on different filming sets. A real
breakthrough for tungsten bulbs in brightness was the invention of halogen-tungsten lamps in
the early 1960s. At about the same time a new type of source - HMI (Hydrargyrum
medium-arc iodide), also known as metal-halide - was developed for television. It had a
pleasant, balanced daylight spectrum but at first could not be used for filmmaking
due to the flicker caused by AC current; legacy carbon arc lamps used DC instead. The
issue was promptly resolved by engineers.
Fluorescent tubes with a high CRI (color rendering index) are often used to save watts of
power.
LED panels have become very popular in the past decade because of their compact size,
powerful output and full color control.
Figure 11 Black Maria studio 1893 (Holz, 2014)
Virtual lighting
Virtual or 3D lighting (Gallardo, 2000) is the technique of placing and controlling virtual
light sources rendered by 3D rendering engines or real-time game engines. These
engines render a virtual 3D space to create a 2D representation. Lee Lanier (Lanier, 2018, p16)
places virtual lighting in the same evolutionary line as the lighting techniques of the arts.
Visually, virtual light is rendered to look exactly the same as real light would look in a
photograph or video. When controlled through scripts or software tools, virtual light retains all
the properties of real light but also adds many others which do not exist in reality. For
example, you can light an object so that it casts no shadow, or cast a shadow while
making the object itself invisible. You can do all kinds of magic tricks in the 3D world.
We will continue the discussion of the similarities and differences between real and virtual
lighting in later chapters.
Visual Effects
Visual effects, or VFX, is the term used to cover a wide range of techniques for
combining or augmenting live action footage with artificially created or manipulated
imagery. VFX have been used in filmmaking from its very early days. They were very
primitive but efficient enough to impress the audience, and many of them are still in use.
Later, with the development of technology, a great technique called rotoscoping was
added to the list. Puppets and scale models of all kinds are also widely used.
Figure 12. From left to right: Henri Gouraud’s face 1971, A ComputerAnimated Hand by Edwin
Catmull 1972, Rebecca Allen’s Kraftwerk’s video 1983, StyleGAN synthesised face (Karras, Laine,
& Aila, 2019).
Figure 13. Image from the Mandalorian LED volume(Disneyplus, 2020).
Many believe that the future belongs to real time virtual production filmmaking.
Filmmakers say that this way of filming lets crew members in different roles
collaborate in real time. More on the challenges of real time cinematography will be
discussed later in this report.
Analysis
In the chapters of this section I will share my findings on previous work
done in the direction of unifying the way lights are controlled in virtual and physical
space. Then, weighing the pros and cons, I will analyse the workflow of mixed video
production and give an overview of possible use cases. Based on that I will create a set of
requirements for a possible solution and finally formulate the question for my
exploratory research.
Previous work
Despite the young age of virtual cinematographic production, many tools have been
developed to make the life of filmmakers easier. Some of these tools were developed
to address specific problems and scenarios, and some were actually made for the sibling
game industry and then adapted by technicians for use on film shooting sets. In fact,
real time virtual cinematography became available mainly because of game engines.
The rendering engines which were used for animated films and VFX could not do the job in
real time. Now some of the 3D modeling and rendering engines have expanded their
functionality, mainly in the form of extensions or add-ons. For example, Blender recently
introduced its users to Eevee (Eevee — Blender Manual, 2021), while Maya's users are
arguing ("r/Maya - Will we get real time rendering like Eevee in Maya 2020?," 2020) whether
it should have a similar solution. However, here I will limit the scope of the previous work
review to lighting control in real time game engines.
Lighting in Virtual Video Production
Figure 14 VFX studio DNEG has launched its own virtual production and previs
outfit(FAILS, 2020)
Figure 15 Diego Prilusky’s volumetric video capture volume. (TED, 2020)
● Hybrid Green Screen Virtual Production takes its roots from very early days,
with the only difference that the screen was black. Later filmmakers used the bluescreen
and the similar sodium vapor process (Figure 16), also known as yellowscreen. In
fact, which color to use for the screen doesn't make much difference, and the
choice of green or blue is only because they are furthest away from skin color.
In the beginning, mainly live footage was used to fill the screen areas
with the background, but with the development of CGI the background came to be
rendered by computers. This technology can also be used to remove an actor or
parts of their body from the shot and replace them with CGI rendered animation. There are
many different ways the technology can be used. Sometimes it is also referred to
as chroma keying.
Figure 16 The sodium vapor process used in the production of Mary Poppins (1964). (Bedard,
2020)
Now hybrid virtual production variations are also going real time, and they require
very delicate lighting of both the real foreground and the virtual background. One of
the best known lighting issues in this type of production is that shadows thrown
by actors on the screen can make it hard to key out the background color. On the other
hand, reflections from the screen can spill the background color onto the actor.
● Full LED Wall productions became possible due to improving LED panel specs
and falling prices. We can say that this is the product of merging chroma
keying with the back projection screen, which was also used by filmmakers for
decades. There are different configurations of these setups, from the gigantic 75-foot
diameter, 270 degrees curved volume of the Mandalorian to 65” 4K
LED screens positioned outside train windows (Figure 17).
Figure 17 Train set made with 40 4K monitors - one outside each train window. Note
the many light projector panels used to produce hard light. Stargate Studios. Src: (Unreal
Engine, 2020b)
With the sole exception of performance capture, real time control of either virtual
or real lighting must be applied.
Let's try a little experiment instead. Look at a corner of your room which is
perhaps not very well illuminated - maybe under your desk or table. Can you see all the
details? Move your head a little to change the angle of view. Did it help you collect more
information? Turn on the light or use the torch of your phone. Better. Come closer. Did
you get to see all that you wanted?
If you film it the way your eye captured it in the beginning and play it back to someone on
a TV screen or mobile phone, the viewer will not be able to do all these helping
manipulations. Moving their head, turning on a light or moving closer to the screen will in
most cases not give them any extra information.
The above example shows that lighting and the composition of the frame should be done in a
way that delivers the most information to the viewer, in accordance with the narrative.
The narrative - the story to be told to the viewers - should be the most important thing here
and should drive the technology, not the other way around, where the story becomes a slave
to the technology or to a particular set or volume. A good example of this technological
story dependency is Disney's Mandalorian, where it seems that many narrative lines are
there to make use of the gigantic LED volume. In my opinion the story suffers from the monotony
of the desert scenes.
● Exposure
○ Visual balance (enough light but not too much)
○ Contrast
● Avoid white clothing and white walls. White surfaces give a lot of uncontrollable
reflections.
Use cases.
Exploring the virtual production types and the properties of quality film lighting lets us
create a set of use case scenarios which will be the solid ground on which to build the future
system requirements, functional and non-functional. We can also set our imagination loose
and add some use cases that are not related to cinematography or stage performance. There
is probably a need for coordinated control of lighting in virtual and real space on music
and theater stages and in museums or educational institutions. Further in this chapter I will
list all the use cases found, with descriptions and diagrams.
1. Larger virtual video production (Figure 18). In this scenario we assume that the
team already has physical lights and a lighting console on set, and either an LED or a green
screen is present. Rendering happens in real time. The rendered material can be
displayed on the LED wall or on a virtual monitor to be composited with the foreground image.
The lighting console is present and network-connected to the Unity workstation. Practical
lights are connected to the console either by DMX or Art-Net.
2. Small studio or VP enthusiast production (Figure 19). This use case covers smaller
filming crews with low budgets. For these teams, controlling the real on-set DMX lights is
a rather hard task if they don't have a DMX console on site. Also, in some situations they
would like to match the color in the scene with the color of the light that falls on the actor
or on elements of the practical set. For this reason, translating the color in the virtual scene to
DMX values and sending it to the real lights would be a very useful scenario. The lack of
a console to save light presets, also known as cues, can be compensated with a feature
for saving presets to be played back later. In this way a little extension for Unity can
replace the real console and save some budget resources which could be used elsewhere in
the production.
3. Live show (Figure 20). This setup could be used in live music shows or theater
plays. Projection on the backdrop of the stage, or an LED wall positioned upstage, is not
exactly new, but before, the image or video used for this purpose was pre-made. In the last
decades media servers, DMX or MIDI controlled, were used for this. Now, as everything is
going real time rendered, we can deploy Unity or its likes to create the image. Here only
your imagination is the limit for all the different combinations of lights on the stage
and environments, lights and VFX on the screen. The biggest advantage of real time
compared to pre-recorded content is that things often do not go exactly to
the timeline. This type of lighting during live shows, when the operator doesn't know
what is coming next, is called busking and is a very common technique in the lighting of
live shows.
4. Museum installations. Real time game engines are used more and more often in
museums. Large and small LED screens and projection screens display historic and
fictional scenes. For a better immersion effect, physical lights must interact with the events
in the virtual space. Considering the range of DMX fixtures available on the market,
DMX will be the best choice for controlling these lights. Of course, there are more and
more new light sources on the market which can be controlled by wireless protocols such as
WiFi, Bluetooth, ZigBee and EnOcean, but the range of devices supporting these protocols
is very small compared to DMX.
Requirements engineering.
Based on the above use cases, a set of requirements can be generated for a system that
addresses the gap in lighting control during virtual production. Some
of these requirements are derived from more than one use case and, vice versa, some
use cases generate more than one requirement. In my opinion this fits the spirit
of cinematography, in which any means are good for the shot that is perfect in the sense
of the storytelling. The same applies to the non-cinematographic use
cases.
Software requirements.
Nonfunctional
1. System should consist of
a. Unity editor console application
b. Unity editor preset recorder application
c. Prefabs of virtual lighting fixtures
d. Prefabs of twinned real fixture personality
2. Compatibility
a. Software and OS
i. System should be able to run on Unity.
ii. System should be able to integrate into the HDRP project.
b. Hardware
i. System should support USB DMX device
Functional
1. System should hold the DMX data for a number of the universes.
a. Amount of universes should be either pre-defined or user defined.
b. DMX data should be accessible by virtual lights and sent to real DMX
fixtures.
c. DMX data for all universes should be accessible for reading and writing
by public API method
f. Should have public interface to change universe and address values
g. RGBW lights
i. Should change RGBW color components according to the DMX
value of the appropriate channels.
ii. Should be a point light
h. Moving Lights
i. Should change RGBW color components according to the DMX
value of the appropriate channels.
ii. Should be a spot light
iii. Should update pan and tilt orientation of the light beam
c. System should send color information to update the real DMX fixture
connected to the DMX-out line.
d. Users should be able to correct the color information.
Methodology.
To explore the applicability of DMX in virtual production, a number of methods were
used to collect feedback from experts and potential users. The reason for
the decision to use more than one method is not just a desire to try it all. My research
touches on subjects that are emerging and developing at the moment. Yes, there is nothing
new in controlling lights with DMX - the technology has been there for 35 years now. Yes,
real time rendering (game) engines have also been there for a couple of decades. Virtual
production is also becoming a more and more common way of making films. I presumed
that it would be very difficult to find experts equally versed in both real time engines and
DMX lighting. Therefore I chose to tackle it in three parallel directions in the hope that
even minimal results from each of them would deliver some valuable information or
innovative ideas.
Interview
To compose my interview I chose to use the approach called the Problem Centered
Interview (PCI) (Döringer, 2021) (Figure 21). This approach is often used in social science
to interview experts. It is based on the researcher's prior knowledge and takes the
form of a dialog, almost a discussion. Follow-up questions are a must-have for this
method. This method fits the current situation because the author can be considered
a domain expert to a certain degree.
Figure 21 Problem Centered Interview(Döringer, 2021)
10. Would you prefer to control real lights from Unity or a standalone console?
11. Would you prefer to control virtual lights from Unity or an external console?
12. What do you think is the future of lighting control of mixed video production?
(mix between virtual and real sets)
13. Do you think DMX has a future in the virtual space of mixed video production?
14. If you had to buy such a solution what would be consideration points?
a. Price
b. Integration with other systems
c. Ease to use
d. Unique tech features
e. Support
15. Follow ups.
Even though PCI recommends having topics instead of directly formulated questions, I
decided to keep them this way to discipline the researcher and avoid falling into chaos.
Survey
As mentioned above, the lack of experts covering the entire realm of my research topic
made me search for opinions outside of my circle. The community of virtual filmmakers
on Facebook would be a great audience to get opinions from. That was also a great
stimulus for me to develop the application to a production, or near production, level of
readiness.
When my application was ready for usability testing, I published a post with a short
description of the application and links to my GitHub repository and the Google Forms
questionnaire.
Inside the application I embedded a button with an annoying red font; pressing it opens a
browser window with my survey.
The button is never displayed again after it has been pressed. According to my installation
manual, the CueEditor window functionality is presented towards the end, so by then users
would have already tried the console and have an opinion with which to answer the questions.
Questionnaire structure.
The published Google Forms questionnaire was divided into two sections: one to collect
opinions regarding my research question and the other to collect feedback on the usability
and quality of the software.
First of all I would like to emphasize that I was not interested in knowing or considering
any of the respondents' demographic data. Therefore there are no questions about age,
nationality or education.
Instead, it was very important for me to know the background people were coming from -
specifically, whether they had a background in rendering engines, virtual lighting, practical
lighting and DMX control.
Then there is a group of questions targeting the preferred placement of lighting control: in
the rendering engine or on an external console.
Then there were two questions which support or disprove some of my design decisions.
● Would you like to have an art-net out option?
● Would you like to have the DMX-in option?
Then there were two questions which support or disprove the presumed use cases.
Section two consisted of a pretty standard set of software evaluation questions.
● Ease of installation
● Ease of use
● Ease to learn how to use
● Hardware compatibility
● Operating system compatibility
● Ability to integrate with other apps
● Consistency with Interface
● Overall reliability
● Overall performance
● Software intuitiveness
● How likely are you to recommend our software to others?
At the end there are questions that would both help me to improve the software and
show new angles on the problem.
Observations
Here I must include not only observations of people using the software but also of people
showing their interest in the topic when I demonstrated the software without their direct
participation.
Here is an example case.
Halfway through development I was contacted by a group of students whose project
supervisor was also Henrik Schønau Fog. Their topic was "Lean-Back Machina:
Attention-Based Skippable Segments in Interactive Cinema" (Rasmussen, Persson, Raursø,
& Petersen, 2021). They needed rapid light changes in the scene on both sides of the LED
screen.
This sounded like the right task for my application. We agreed to schedule a demo session
where I would demonstrate the program in action and teach them about DMX. They did not
have any prior knowledge of DMX. The entire session was recorded on video.
Observing the session during playback helped me to discover new design ideas for my
application, but also to see where DMX is hard to grasp for a novice.
Design
Design Goals
When approaching the design phase of any software project it is important to define a set
of design goals that will make design decisions easier later on. The process of software
development is full of tradeoffs, and these goals play the role of road signs when you
need to choose between, for example, looks and performance.
Data model
DMX data by itself has a very simple structure. As was already mentioned earlier, a
DMX universe is just an array of bytes with a length of 512 elements. That means that
the entire DMX data construction is an array of arrays of bytes, or simply a two-dimensional
array of bytes. So in pseudocode we can declare the DMX data as:
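// Reconstructed sketch of the declaration (field names in DMXtools may differ):
// one 512-byte array per universe.
byte[][] dmxData = new byte[numberOfUniverses][];
for (int u = 0; u < numberOfUniverses; u++)
    dmxData[u] = new byte[512];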
Class structure
From an object oriented point of view, the system's architecture follows both the mediator
and the observer pattern. In Figure 23 you can find the ArtNetData class (green outline) being
aggregated by almost every entity on the diagram.
The console, however, doesn't send the DMX values directly to the fixtures, but updates the
main data object instead, as in the mediator pattern. The fixtures are all subscribed to changes
of the main DMX data, exactly as in the observer pattern.
We can also see inheritance, as all the different DMX devices derive from the DMXFixture
class - this really helps to discover them all in the scene.
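A minimal sketch of that relationship (member names such as dmxDataMap and dmxEvent are simplified assumptions, not necessarily the exact DMXtools code): the console writes into the shared data object, which raises an event that every fixture listens to.

// Console side (mediator): write a channel value and notify subscribers.
artNetData.dmxDataMap[universe][channel] = value;
artNetData.dmxEvent.Invoke();

// Fixture side (observer): subscribe once, re-read the relevant channels on every update.
void OnEnable()  { artNetData.dmxEvent.AddListener(UpdateFromDmx); }
void OnDisable() { artNetData.dmxEvent.RemoveListener(UpdateFromDmx); }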
Figure 23. Entire system’s class diagram.
UI design.
When designing the user interface I had to keep in mind that Unity doesn't have all the
GUI elements that are available for Windows or HTML. For example, the dropdown box is
missing, so I had to replace it with a Select Grid, which is a bar of buttons and takes up a
lot of screen space.
Therefore I had to use the breadcrumbs UI pattern instead of the dropdown menus I had
planned in the beginning. There were some other challenges, like a missing toggle group.
Unity also has another GUI system for editor windows called UIElements, but
to me it looked too complex.
I have used a hierarchical design approach. In the first row there are general purpose
buttons; they affect the entire system. Below there is a universe switch row. Under that is a
row of universe options, such as making the universe the DMX out universe or having it
receive Art-Net. When the 'All' universes button is selected, the elements which are specific
to a single universe are hidden.
Below that there are two view select buttons, 'Heads' and 'Channels'. If 'Channels' is selected
while a single universe is selected, a list of sliders, one for each DMX channel, is displayed.
If the universe is enabled as an Art-Net in universe, a grid of labels with the current values of
the DMX channels is displayed instead of the sliders.
If the user selects the 'Heads' view, all the heads which belong to the selected universe are
displayed. Each head is selectable.
If the user selects one or more heads, the side panel will display the list of sliders for
all the attributes the selected fixtures have.
Figure 24 Mockup of the console’s UI.
Color probe.
When this project was initiated, one of the first feature requests I received from the
project's supervisor was matching the color of the real DMX lights to the light color in the
virtual scene. At first I was not sure how to measure the color of virtual light falling on an
object in the environment.
During my literature research on methods of illumination measurement I came across
an interesting one called mean room surface exitance (MRSE) (Cuttle, 2015).
The same author also introduces the cubic illumination concept, which describes the spatial
illumination distribution around a point, characterized by the illumination value on each of a
cube's facets.
This method was developed for real interior illumination and is meant to help interior and
lighting designers (Duff, Antonutto, & Torres, 2015).
I can employ this method in the virtual environment by using virtual cameras to capture
the surface and then analyse the color information. Figure 25 demonstrates the conceptual
design. More on the color probe in the implementation chapter.
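As a rough sketch of that idea (not the final implementation; all names here are illustrative), a probe camera can render one facet view into a small RenderTexture whose pixels are then averaged to estimate the incident color:

// Average the color seen by a probe camera rendering into a small RenderTexture.
Color SampleProbe(Camera probeCamera, RenderTexture rt)
{
    probeCamera.targetTexture = rt;
    probeCamera.Render();                                  // render one facet view

    RenderTexture.active = rt;
    Texture2D tex = new Texture2D(rt.width, rt.height, TextureFormat.RGB24, false);
    tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
    tex.Apply();
    RenderTexture.active = null;

    Color sum = Color.black;
    Color[] pixels = tex.GetPixels();
    foreach (Color p in pixels)
        sum += p;
    return sum / pixels.Length;                            // mean color seen by the probe
}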
Implementation
This chapter reflects on the implementation phase of the application development. I
will go through all the challenges and solutions. I will also discuss possible alternative
solutions and why I did not use them. Readers who are not interested in the coding details
can skip this chapter. All the source code fragments in this chapter are in C# and use the
Unity3D scripting API (Unity Technologies, 2020). I will start with general issues and then
take a detailed look at the individual elements.
● Reloads the domain - resets the scripting state, so the project starts with a fresh
one.
● Reloads the scene - destroys all the GameObjects.
Not only does this destroy everything, it also takes a very long time for Unity to
perform these two operations.
Luckily, Unity introduced a new option in the project settings under the Player menu
section.
Figure 26 Disabling the reload in the project settings.(Unity Technologies, 2020b)
The effect of this setting can be seen in the diagram (Figure 26) below.
These modifications to the project helped to achieve more control over the data and
references. In addition, all the classes that were intended to run in editor mode were
equipped with the [ExecuteAlways] attribute.
Editor Window
When I started to implement the window for my console I had a choice between two
systems that Unity offers for editor window programming. There is the legacy OnGUI(),
which is just a rendering member function of the EditorWindow class. Recently Unity
introduced the newer UIElements as a three layer compound system. It also has a class that
inherits from the EditorWindow class, plus a UXML file for markup and a USS file for style.
Unity tried to copy common web app architecture - backend, frontend and style. I found
that UIElements would take longer to learn and master and went down the OnGUI() path.
I made the console window the central class of my system; it displays all the system's
controls and settings and aggregates the communication and data objects. It also has some of
the control functionality, but big portions of it are delegated to other classes.
It has a dynamic UI and hides irrelevant elements from the user. It also hides the
controls if the user chooses to pass control to an external console.
Unity allows developers to add menu items to the editor's menu, hence the attribute
MenuItem above the Init() function.
ArtNetConsole.cs
[MenuItem("Window/Art-net/Console")]
static void Init()
{
//...
}
The above code results in a new item added to the menu, which opens the console
window.
The script inside the Init() function makes sure that there is only one window of this type
in the editor.
ArtNetConsole.cs
ArtNetConsole window =
(ArtNetConsole)EditorWindow.GetWindow(typeof(ArtNetConsole));
window.titleContent.text = "ArtNet Console";
window.Show();
The OnGUI() function, which is responsible for rendering the window, delegates to the
elements' render functions.
ArtNetConsole.cs
void OnGUI()
{
DrawLayouts();
DrawHeader();
DrawSidePanel();
DrawBody();
}
In the above code, DrawLayouts() is the one which holds the proportions of the window
zones, which are rendered by DrawHeader(), DrawSidePanel() and DrawBody().
ArtNetConsole.cs
void DrawLayouts()
{
header.x = 0;
header.y = 0;
header.width = Screen.width;
header.height = 65;
side.x = 0;
side.y = header.height;
side.width = 400;
side.height = Screen.height - header.height;
body.x = side.width;
body.y = header.height;
body.width = Screen.width - side.width;
body.height = Screen.height - header.height - 20;
subHeader.x = 0;
subHeader.y = 0;
subHeader.width = Screen.width;
subHeader.height = 20;
}
You can see that in this way we can control the layout with relative and absolute values.
Basically this function configures the x, y, width and height of 4 private rectangle objects.
ArtNetConsole.cs
Rect header;
Rect side;
Rect body;
Rect subHeader;
Figure 29. Window divided into layout areas.
That was my implementation of the layout; as you can see, it was not very complicated
compared to implementing a UXML layer. However, it is not as convenient with styles,
as you need to change the style of every component individually.
Each area component contains a number of nested elements - buttons, toggles, sliders and
text or integer fields. Unity has two classes for the control elements: GUILayout and
EditorGUILayout.
The first one was made for in-game layouts before Unity introduced the Canvas API. The other
one is actually made for the purpose I am using it for - editor windows. They both work in the
editor window and I am using both, because some of the elements exist only in one and not
in the other.
There are no event handlers attached to the GUI or EditorGUI elements, so, for example:
ArtNetConsole.cs
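// Reconstructed example (exact labels and fields in ArtNetConsole.cs may differ):
// SelectionGrid returns the index of the pressed button on every repaint.
selectedUniverse = GUILayout.SelectionGrid(selectedUniverse,
    new string[] { "All", "1", "2", "3", "4", "5", "6", "7", "8" }, 9);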
This function returns the integer index of the pressed button every time OnGUI()
repaints.
The same is true for buttons and toggles, with the only difference that they return
boolean values.
ArtNetConsole.cs
In a similar manner we can chain conditions to hide or display toggles in accordance with
the user’s choice.
ArtNetConsole.cs
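// Reconstructed example (field names assumed): one toggle gates the rendering of the next.
dmxOutUniverse = EditorGUILayout.ToggleLeft("DMX out universe", dmxOutUniverse);
if (dmxOutUniverse)
{
    sendDmxSerial = EditorGUILayout.ToggleLeft("Send DMX to USB device", sendDmxSerial);
}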
You can see how enabling one option opens up the rendering of the other. In this case, if we
choose to set this universe as the DMX out universe, we can then enable DMX serial
communication with the DMX fixture connected to the DMX line of the USB device.
Similar, but not exactly the same, is the situation with sliders: they display the value of the
variable they represent in the window. We should always check if the returned value is
equal to the input value, and if it is not, we need to assign the new value to the variable the
slider represents. A good example is the DMX channel sliders.
ArtNetConsole.cs
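// Reconstructed example (field names assumed): the slider displays the current
// channel value; if the returned value differs, write it back and notify fixtures.
int newValue = (int)EditorGUILayout.Slider((channel + 1).ToString(),
    artNetData.dmxDataMap[universe][channel], 0, 255);
if (newValue != artNetData.dmxDataMap[universe][channel])
{
    artNetData.dmxDataMap[universe][channel] = (byte)newValue;
    artNetData.dmxEvent.Invoke();
}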
Scriptable Object
Unity promotes the ScriptableObject class as a perfect data container. It derives from
the same base object as the MonoBehaviour class but does not have some of the event
functions, for example Update(). A scriptable object aims to save memory, as all the
objects referencing it access the same copy.
ArtNetData.cs
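// Reconstructed sketch of the data container (requires UnityEngine and
// UnityEngine.Events; member names may differ slightly from the actual file):
[CreateAssetMenu(fileName = "ArtNetData", menuName = "DMXtools/ArtNetData")]
public class ArtNetData : ScriptableObject
{
    public byte[][] dmxDataMap;   // one 512-byte array per universe
    public UnityEvent dmxEvent;   // raised whenever any channel value changes

    public void SetData(int universe, byte[] data)
    {
        dmxDataMap[universe] = data;
        dmxEvent.Invoke();
    }
}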
However, in this case the fact that the data could not be saved to disk was not critical. For
me it was good enough that the data stayed intact after I cancelled the domain reload, so I
could retain the DMX values when the Play button was pressed. For stability reasons I have
included a check that the ArtNetData object actually exists in the most critical places.
As you can also see in Figure 31, there is one more public member in this class, and it is
of type UnityEvent. Unity has a built-in event system, and it is often promoted together
with ScriptableObject for decoupling the components in the scene. Here I must explain my
intuition about the use of the ScriptableObject and UnityEvent in my system.
The idea was to use it like the wire used in real DMX; all the lamps are “connected” to it
by aggregating it.
Figure 32. ScriptableObject aggregated by the light.
The event, called dmxEvent, is raised every time there is something to update; it
helps to trigger functionality in Edit mode, when the Update() function is not called.
Art-net
To implement the Art-Net functionality I have used some of the code from (sugi-cho, 2018). As
was already mentioned, the ArtNetSocket object is aggregated in the console. After its
Open() function is called, it creates the socket and starts listening on the specified local IP.
ArtNetConsole.cs
void OpenArtNet()
{
if (artnet != null)
artnet.Close();
artnet = new ArtNetSocket();
artnet.Open(IPAddress.Any, null);
ArtnetReceiver(CallUpdate);
}
The default port for Art-Net is 6454. After the socket is open, artnet listens for
incoming packets, and if it receives valid DMX packets it calls the delegate
function.
ArtNetConsole.cs
void ArtnetReceiver(Action callback)
{
    // Reconstructed from the surviving fragment; event and packet type names
    // follow the Art-Net library used and may differ slightly.
    artnet.NewPacket += (sender, e) =>
    {
        var packet = e.Packet as ArtNetDmxPacket;
        if (packet != null)
        {
            var universe = packet.Universe;
            artNetData.SetData(universe, packet.DmxData);
        }
        callback();
    };
}
This code uses principles of functional programming: the delegate (sometimes
called a callback) function is used as an argument of the caller function. You can also see
another example of functional programming in the use of the so-called lambda function with
the event as an argument.
DMX.cs
That helped me get the DMX out data directly from my ScriptableObject and thereby let me
implement Art-Net in and DMX out working simultaneously on a given universe.
DMX.cs
58
// The enclosing send loop is assumed; the original listing shows only its body.
while (sending)
{
    if (serialPort != null && serialPort.IsOpen)
    {
        serialPort.Write(TxBuffer, 0, TX_BUFFER_LENGTH);
    }
    Thread.Sleep(200);
}
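For context, a sketch of how TxBuffer could be filled whenever the shared data object raises its event. The framing follows the Enttec USB Pro widget message format (start byte 0x7E, label 6 for "Output Only Send DMX", 16-bit payload length, DMX start code plus channel data, end byte 0xE7); the field names outUniverse and artNetData are assumptions:

    void UpdateTxBuffer()
    {
        byte[] dmx = artNetData.dmxDataMap[outUniverse];    // 512 channel values
        int payload = dmx.Length + 1;                       // +1 for the DMX start code
        TxBuffer[0] = 0x7E;                                 // message start delimiter
        TxBuffer[1] = 6;                                    // label 6: Output Only Send DMX
        TxBuffer[2] = (byte)(payload & 0xFF);               // payload length LSB
        TxBuffer[3] = (byte)(payload >> 8);                 // payload length MSB
        TxBuffer[4] = 0;                                    // DMX start code
        System.Array.Copy(dmx, 0, TxBuffer, 5, dmx.Length);
        TxBuffer[5 + dmx.Length] = 0xE7;                    // message end delimiter
    }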
Lights and Discoverability
I have developed three types of lights. Two of them, the RGBW light and the Moving light,
are virtual lights and have a Unity light component attached to the transform. The RGBW
light is set up as a point light and receives 4 DMX channels for changing its RGBW values.
The moving light is a 6-channel fixture and, on top of RGBW, can move within the pan and
tilt planes.
The last DMXFixture child is a non-virtual light; it is a placeholder for the real DMX
fixture connected to the line. Users can assign a DMX address to it, which should be
the DMX address of the external fixture. The universe number should be that of the DMX out
universe. Users can edit the list of attribute names to be displayed in the console.
All of the DMX devices have the ArtNetData ScriptableObject aggregated and therefore have
direct access to the data. They simply use the values of the channels that the fixture's
address is set to.
The discoverability of the fixtures is done with Unity's FindObjectsOfType() function, which
finds all the objects derived from DMXFixture.
ArtNetConsole.cs
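A sketch of what this discovery could look like; SetArtNetData() and the heads field are illustrative names, not necessarily the ones used in ArtNetConsole.cs:

    void FindHeads()
    {
        // Collect every component in the open scene that derives from DMXFixture.
        heads = FindObjectsOfType<DMXFixture>();
        foreach (DMXFixture head in heads)
        {
            // Hand each fixture the shared data object so it can read its channels.
            head.SetArtNetData(artNetData);
        }
    }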
The operation triggered by the 'Find heads' button results in all the heads found in the
scene being displayed in the body area of the console (Figure 34).
Figure 34 Fixtures selection pane.
A fixture can then be selected for control. The DMX address of the fixture can be
changed in the console. Pressing the 'auto' button automatically assigns the first
available address in the universe.
Before I move on to the fixture control I must mention that all the fixtures are packed as
prefabs, and placing them in the scene is just a matter of dragging them from the project
window to the hierarchy. The script unpacks the prefabs right away. Otherwise, after
entering and then exiting Play mode, all the updated values such as address and universe
would be reset to the values of the prefab.
Group Controller
The group controller is a small construct with big responsibilities. Figure 35 shows the
class diagram of this very important component of my system.
Its functionality lets users manipulate not the numbered channels but the attribute names.
For example, if the user selects a fixture in the fixtures pane, the side panel of the
console displays a list of sliders, one for each attribute the fixture has.
So if, for example, we select a 4-channel RGBW fixture, we get the sliders shown in
Figure 36.
If we move the Red slider, the light will start increasing the value of the red channel and
will shine red. If we select one more light, the newly selected light will also turn red.
If we move the Green slider, the color changes on both of them. If, in addition to these
two, we select the 6-channel Moving Light, the side panel will display more sliders, as in
Figure 37.
If we move the color sliders, the color changes on all of them: the RGBW and the moving
lights. If we move the pan or tilt slider, the beam of the moving light moves. In other
words, a slider affects an attribute of a light only if the light has that attribute.
This is the most essential function of any lighting console. It translates the channel
numbers into attribute names for the user to control the light. Note that this feature is
usually not available on budget consoles, which do not support soft patching of
multichannel fixtures.
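A sketch of the underlying idea, assuming each fixture exposes its attribute names in channel order; the member names (selectedHeads, channelNames, dmxAddress) are illustrative:

    public void SetAttribute(string attribute, byte value)
    {
        foreach (DMXFixture head in selectedHeads)
        {
            // Offset of the attribute within the fixture's channel list, e.g. "red" -> 0.
            int offset = System.Array.IndexOf(head.channelNames, attribute);
            if (offset < 0)
                continue;   // this fixture does not have the attribute
            int channel = head.dmxAddress - 1 + offset;
            artNetData.dmxDataMap[head.universe][channel] = value;
        }
        artNetData.dmxEvent.Invoke();
    }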
Cue Editor
Now we can control the color, position and whatever other attributes the fixtures have.
We can adjust the look until we are happy with it and want to save it to be recalled
later. As already mentioned in the lighting vocabulary, such a preset is called a cue. To
achieve this functionality I have developed another editor window called the Cue Editor.
Figure 38 shows both the class structure of the component and the way it looks in the
Unity editor.
Many features of this window are similar to those of the console. It has its own menu
entry and always executes, in both Edit and Play mode. It is also connected to the ArtNetData
object.
It can save cues: named snapshots of all the art-net data the system holds. The collection
of cues is usually called the cue stack. On large lighting consoles you can play cues in
many different ways: users can play them in sequence, set fade-in and fade-out times,
crossfade or combine them (Chamsys, 2014). My humble application can only record them and
play them back one by one. The cue stack can be edited, cues can be removed, and the stack
can be cleared and saved to disk. Right now only one stack per scene is supported, and the
user cannot edit its name; it gets the scene's name.
As you can see in Figure 38, the CueStack is a dictionary, and saving it to disk was
rather challenging; it could not be achieved with Unity's built-in serialization. Instead
I used the Json.NET library ("Json.NET - Newtonsoft," 2021) to serialize my stack.
The serialization itself was straightforward and could be achieved in a few lines of code.
CueEditor.cs
System.IO.File.WriteAllText(path, json);
Reading the data back from disk and deserializing it was not as smooth, and I had to copy
the data byte by byte in the constructor of the Cue.
CueStack.cs
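A sketch of the save/load round trip with Json.NET, assuming CueStack is (or wraps) a Dictionary<string, Cue> as suggested by Figure 38; cueStack and path are assumed local names:

    using Newtonsoft.Json;

    // Saving: serialize the whole stack and write it to disk.
    string json = JsonConvert.SerializeObject(cueStack, Formatting.Indented);
    System.IO.File.WriteAllText(path, json);

    // Loading: Json.NET rebuilds the dictionary; its values are then copied
    // into fresh Cue objects in the Cue constructor, as described above.
    CueStack loaded = JsonConvert.DeserializeObject<CueStack>(System.IO.File.ReadAllText(path));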
Color Probe
The color probe is a tool that extracts light information from a point in the virtual
scene. As I already mentioned in the design overview, to achieve this I use the cubic
illuminance concept (Cuttle, 2015), modified to the requirements of my application.
Imagine an actor standing in front of the LED wall that you want to illuminate with. The
LED wall displays a virtual environment and you want to match the colors. I propose to use
a reduced version of the cubic illuminance measuring method. Since we are only interested
in the light falling on the facets of the cube that can be seen by the camera, and we
usually have two lighting points that would affect the light, we are left with two facets.
And because the angle between them is not always 90 degrees, we can simply use two plane
surfaces to capture the light. Figure 39 shows the shape of the probe; the angle can be
adjusted.
Figure 39. Color probe in the scene.
There is a camera attached in front of each facet. The camera is set to orthographic
projection and only renders the TransparentFX layer, to which the probe facets are
assigned. At the same time, I exclude this layer from the culling mask of the Main Camera.
The probe cameras render into render textures, a special type of texture that can be
created at run time. The idea of the method was adapted from a YouTube video (Bospear
Programming, 2018).
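A sketch of that camera setup, assuming each facet has a child camera and a render texture assigned in the inspector; the names facet and facetRenderTexture are illustrative:

    int fxLayer = LayerMask.NameToLayer("TransparentFX");

    // Each facet camera is orthographic and sees only the TransparentFX layer.
    Camera facetCamera = facet.GetComponentInChildren<Camera>();
    facetCamera.orthographic = true;
    facetCamera.cullingMask = 1 << fxLayer;
    facetCamera.targetTexture = facetRenderTexture;

    // The main camera must not see the probe itself.
    Camera.main.cullingMask &= ~(1 << fxLayer);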
The render texture is then referenced by the placeholder object which I have made for the
real DMX lights. Figure 40 shows how the render texture is used by the fixture.
Then we copy the render texture to a Texture2D from which we can retrieve the RGB values.
GenericDMXFixture.cs
RenderTexture.active = previous;
RenderTexture.ReleaseTemporary(tempTexture);
Color32[] colors = tempTexture2D.GetPixels32();
color = colors[0];
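For context, a sketch of the complete read-back around the lines above, assuming the probe's output lives in a RenderTexture field called renderTexture and tempTexture2D is a matching Texture2D:

    RenderTexture previous = RenderTexture.active;
    RenderTexture tempTexture = RenderTexture.GetTemporary(renderTexture.width, renderTexture.height);
    Graphics.Blit(renderTexture, tempTexture);

    // Make the temporary texture active and read its pixels into the Texture2D.
    RenderTexture.active = tempTexture;
    tempTexture2D.ReadPixels(new Rect(0, 0, tempTexture.width, tempTexture.height), 0, 0);
    tempTexture2D.Apply();

    RenderTexture.active = previous;
    RenderTexture.ReleaseTemporary(tempTexture);
    Color32[] colors = tempTexture2D.GetPixels32();
    Color32 color = colors[0];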
This color information is then written to the appropriate channels of the ArtNetData and
afterwards transmitted to the real DMX fixture.
It is also possible to correct each color channel's value before it is sent to the
DMX device.
GenericDMXFixture.cs
The sliders in Figure 40 let the user add to or subtract from the DMX value.
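A sketch of that correction, assuming a per-channel offset slider (redOffset here) and the probed Color32 value from above:

    // Add the correction offset and clamp the result to the valid DMX range.
    byte red = (byte)Mathf.Clamp(color.r + redOffset, 0, 255);
    artNetData.dmxDataMap[universe][dmxAddress - 1] = red;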
I have added an automatic positioning feature to the probe. When the probe is placed in the
scene, it automatically finds the Main Camera and places itself in front of it. The offset
distance to the camera can be adjusted.
Testing
This chapter explains how the testing procedures were coordinated and executed. I will
start with the technical testing of my system and then move on to the observations, the
survey and the interview, which help me answer my research question. Not everything
went as planned; the details are given below.
In the very early stages of development I added the serial DMX and art-net
functionality to the project. To test art-net connectivity I used ChamSys MagicQ on
PC ("ChamSys MagicQ Downloads," 2020) and the ArtNetominator (The ArtNetominator,
2019). The first is a computer-based simulator of ChamSys's large hardware consoles.
The software comes with almost no limitations compared to the commercial products;
the limitations are the number of output universes and not being able to use USB DMX
devices.
The second tool is an art-net data monitor which listens on all available network
interfaces on port 6454.
To test DMX serial communication I used the DMX USB Pro Mk2 (Enttec, 2021). It
is a very advanced device, with much richer functionality than I needed for my
experiments.
On the fixture side I used:
● NanLite MixPanel 60 Bicolor Hard and Soft CCT and RGBWW Light
Panel(“NanLite MixPanel 60 RGBW LED Panel,” 2019)
● Eurolite LED THA-60PC Theater-Spot(Eurolite, 2017)
Figure 42 MixPanel, Eurolite, DMX USB Pro ("NanLite MixPanel 60 RGBW LED Panel,"
2019) (Eurolite, 2017) (Enttec, 2021)
Testing Set Up
All the testing sessions took place at the SMILE lab ("Samsung Media Innovation
Lab (SMILE)," 2017) at the Aalborg University campus in Copenhagen, Denmark. The
lab has a large LED wall consisting of 9 (3 x 3) 55" Samsung LED screens. Figure 43
shows the layout of the testing set-up. As you can see, it is convenient enough for
even one person to operate the test.
The actor who performed in all the tests was the SMILE lab's mascot, Cornelius.
Figure 43 Test Set-up in the SMILE lab at AAU.
Testing Environments.
To test my system, a number of virtual testing environments were constructed. These
environments did not have any specific quality or resolution requirements. I will show two
examples here and explain what they were meant to test.
The Stage
This scene is a playground for my system (Figure 44). It is a large box made out of 5
planes (one side is open). The walls use the default HDRP material.
The stage contains a large number of RGBW and Moving lights. Its main purpose is to test
art-net control of the virtual lights with shows made on an external lighting console. The
same stage was used to test the color probe in different sceneries, one of them flying
through dancing light beams.
The Corridor
This environment is built from assets taken from Unity's demo game project
(Unity-Technologies, 2021). I joined together some parts of the spaceship to make a little
space station; this corridor is one part of it. I placed 30 RGBW lights, assigned DMX
addresses to all of them and made a number of presets. In Figure 45 you can see that all
the lamps are set to different colors.
Figure 45. The corridor has 30 RGBW lights. Scene view(left) and main camera
view(right)
Color Probe
To test the performance of the color probe tool, the following test conditions were set up.
The environments mentioned in the previous chapter were equipped with the probe: the probe
was added to the scene, and the two render textures, one for each facet of the probe, were
referenced by two MixPanel 60 placeholders (Figure 40).
To move the camera through the environment, a simple FlyCamera.cs script was attached
to the main camera. It allowed the user to move the camera with the WASD keys that are
standard in gaming.
Figure 46. Color probe visible(left) and invisible (right)
The topology of the set was similar to the one in Figure 43, but only the MixPanel 60s
were used. The test was made in Unity's Play mode. For the first run, the camera's culling
mask was set to render all layers, which made the probe visible. The camera was then
moved by the operator across the environment. The point of the test was to compare the
color of the probe's facets with the light output of the DMX-controlled fixture.
The second run was made under the same conditions but with the culling mask set to
exclude the TransparentFX layer. This made the color probe invisible.
The testing procedure was repeated multiple times in the corridor and on the stage and was
filmed for future analysis.
Observations.
Demo session.
Near the end of sprint 2, a group of undergraduate students from AAU contacted me.
They were working on a real time virtual production project; the topic of their research
was interactive storytelling. As part of their experiment they had to shoot a remake of one
of the scenes from the sci-fi film Ex Machina (Garland, 2015). The scene is set in a
room with mainly white walls and white furniture, lit by white lighting. Two actors
are having a conversation in the room, and halfway through the episode the lighting
changes to red. The episode was planned to be filmed in the SMILE lab, where the
actors would perform in front of the LED screen.
The on-set lighting consisted of the DMX fixtures mentioned above. Some of the
group members had photography and lighting experience as well as good knowledge of
CGI in general and Unity in particular. They had also developed a camera tracking
system based on HTC Vive products (Vive, 2021). However, they did not have any
knowledge of DMX control and asked me to help them with the task.
We agreed to have a demo session where I could demonstrate my system to them and
we could talk about different scenarios and use cases. The session was recorded;
later playback and analysis helped me improve the system and make it more convenient
for novice DMX users. Figure 47 shows the session, after which the group took a
timeout to decide how to organize their pipeline.
Figure 47 Live demo session.
Deployment assistance.
After a short period of time the group returned and asked me to assist them in integrating
my tools into their project. An appointment was made for a live session in the lab. After a
discussion which resulted in a number of possible solutions, the group finally accepted
the idea of using my system.
Notes taken during the session helped with further improvements and gave a better
understanding of why DMX is sometimes hard for a novice to grasp.
The group successfully filmed their episode; to make it work for them, they had to make
some pragmatic modifications. The results can be seen in Figure 48.
Figure 48 Scene from the episode filmed featuring the DMX system.
Survey.
As already mentioned above, a survey link was attached to a button in the Cue Editor
window. At the moment of writing I have received 7 qualified answers, meaning that these
participants already had some knowledge of either Unity or DMX control.
Interview.
To capture the importance of the topic and to validate my design assumptions, I arranged
an interview with Sebastian Bülow. Sebastian is one of Denmark's most innovative visual
technical artists. He has made a large number of live visual performances together with
different music bands, is an expert in the history of cinematography and teaches the AV
production course at Aalborg University. Not to mention that he designs all the pipelines
for his shows himself.
Since we both have knowledge and experience in AV and lighting, we both got very excited
about the matter. The interview's structure turned out to be slightly different from what
was planned, but the essential information and a strong interest in the technology were
captured. The interview was audio recorded.
Findings
The observations of the peer group deploying and using the system showed scepticism and
confusion towards DMX in the beginning and total acceptance at the end. The feedback I
received from the group made me modify the system, and the multiple iterations of
explaining the technology helped me make it more intuitive.
The expert interview did not answer the research question directly, but it inspired both
the researcher and the respondent to investigate the topic further. Among the interesting
ideas were video mapping and the use of DMX for virtual scenography or scene reload.
Another interesting idea was to extend the beams of the real lights into virtual space to
create even more depth.
The survey answers from experts indicated strong interest in the technology, but the
respondents who were not familiar with DMX beforehand remained confused.
The software was mainly evaluated as stable and compatible, but not as intuitive and easy
to use by users with no DMX experience.
There is evidence of a correlation between proficiency in either DMX or Unity and the
opinion on how easy the other one is to use.
The survey also registered a high level of support for all proposed use cases. Two features
which had been down-prioritised by the developer, art-net out and DMX in, were voted as
important. One user stated that art-net out is a must-have for him because his whole setup
is art-net based.
The color probe was pointed out by one user as progress in the right direction.
Discussion
DMX for UE4
Unreal Engine recently introduced native DMX support in version 4.26 ("DMX Overview," 2021;
Unreal Engine, 2020a). It includes:
● The protocol itself, including Art-net and sACN, but no USB DMX support
● DMX fixtures package
● DMX engine
● Pixel mapping engine
The implementation is feature-rich and therefore seems difficult to grasp for a user
who is new to DMX. Before I came across this solution I had already decided to do my
implementation in Unity, which I am much more familiar with. I have not studied UE's
implementation in detail, to minimize bias in the hope of finding an even better solution.
However, by watching the tutorial videos I noticed one issue that I have resolved in
my solution. You can only see the output of the lamps during Play mode, which makes it very
difficult to position and focus the lights. You have to enter Play mode every time you want
to check how it looks, and all the changes made during Play mode are not saved, so
sometimes the user has to switch between Edit and Play mode multiple times just to find the
right position, color or intensity. I think it is not a very big problem if you want to set
up a virtual live show, but for cinematography very precise lighting is critical, and it
makes a big difference to be able to view the light in Edit mode.
Personal evaluation.
Here I would like to make a personal statement about my system and DMX in Unity in
general. Due to the low number of respondents with skills and experience in both lighting
and real time engines, I think that I, who spent years in both of these technologies
separately and months working to join them together, have the right to be heard.
I think the overall reliability of the system is high. It works in different scenarios, is
quite robust, and can run in both Edit and Play mode. For most scenarios I would use the
system; it would save me set-up time and also simplify the control. Even with the cue
editor being so primitive, it is possible to change the look of the stage instantly.
I think Unity should make its own DMX engine part of the platform and make it possible to
assign different properties of every light, or even of any object, as DMX channels,
sourcing the data from a common DMX data object similar to how it is done in my
implementation. The part of DMX which usually causes confusion among novices is
addressing. Perhaps there are better ways to hide this complexity behind an easy UI or
API.
Conclusion
DMX is a good candidate as a method of control not only for lights but for other
objects, within both virtual and physical space. The asynchronous nature of DMX makes
instant state changes possible. The high availability of DMX equipment, for both
control and output, and the high number of trained lighting personnel should support
the acceptance of DMX within real time engines. DMX has shown itself to be a reliable
interaction bridge between legacy lighting equipment and modern real time game
engines. Future work is necessary in research as well as in the development of even
more convenient tools.
Future Work
Research
To conduct further research on the possible acceptance of DMX in real time productions, the
Technology Acceptance Model (TAM) could be used. TAM is an information systems
theory that models how users come to accept and use a technology (Davis, 1989). I would
suggest augmenting the general TAM with some external factors leading to the two main
constructs which indicate the intention to use DMX (Chuttur, 2009). Figure 49
shows how the results and quality of the application's use would correlate with
perceived usefulness, and years of previous experience would correlate with ease of use.
Figure 49 TAM. proposed model.
If applied to a larger audience, this model could be used in comparative research of
DMX against a non-DMX solution completing the same tasks. In such a study the
results and outputs of the two solutions could be compared. Years of experience with
DMX could be the determining factor for perceived ease of use.
Development
There are many improvements that could be made to the system. Some of them were
known beforehand but not implemented due to lower priority; others were discovered
during usability testing.
● Art-net output.
● Colors to DMX with the color picker
● Extended use of other color and tone notations than RGB(CCT, HSI).
● Tooltips could be added to all the controls.
Another interesting idea would be to create a light version of the DMX framework in the
form of a prefab that controls 1-4 DMX fixtures with an easy set-up scenario. This light
version could then be compared with the current full-featured system in completing the same
tasks.
Acknowledgments
I would like to express my special thanks to this project’s supervisor Henrik Schønau Fog,
great A/V expert Sebastian Bülow and everyone who gave me valuable feedback.
Above all I would like to thank my family for their great help and support during this
project and my entire Master's studies.
References
Ali-A. (2020). Fortnite Travis Scott *LIVE* FULL CONCERT Event! (HD) [YouTube
Alliance, A. (2015, June 29). What is Agile Software Development? Retrieved May 17,
basstronix. (2020, June 11). basstronix/UnityArtNet. Retrieved January 22, 2021, from
Bedard, M. (2020, September 20). Rotoscoping: The Perfect Marriage of Live-Action with
https://www.studiobinder.com/blog/what-is-rotoscope-animation-definition/
Bospear Programming. (2018). [Unity] Get Light Intensity On Player [YouTube Video].
Chamsys. (2014). Chapter 15. Cue Stacks. Retrieved May 14, 2021, from Chamsys.co.uk
website: https://secure.chamsys.co.uk/help/documentation/magicq/cue-stacks.html
ChamSys MagicQ Downloads. (2020). Retrieved May 14, 2021, from ChamSys website:
https://chamsyslighting.com/pages/magicq-downloads
Chuttur, M. Y. (2009). Overview of the Technology Acceptance Model: Origin, developments
and future directions. Sprouts: Working Papers on Information Systems, 9(37), 9–37.
CodePlex Archive. (2021). Retrieved May 3, 2021, from CodePlex Archive website:
https://archive.codeplex.com/?p=acn
Cooper, T. (2012, October 25). Unity Serialization - Unity Technologies Blog. Retrieved
https://blogs.unity3d.com/2012/10/25/unity-serialization/?_ga=2.193225901.1400
471804.1612794762-572226612.1606124811
https://workspace.google.com/u/0/marketplace/app/cross_reference/26911403334
7?hl=en&pann=docs_addon_widget
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of
davivid. (2016, January 30). davivid/Unity-DMX. Retrieved May 3, 2021, from GitHub
website: https://github.com/davivid/Unity-DMX
Diagrams.net. (2021). Diagram Software and Flowchart Maker. Retrieved May 17, 2021,
Disneyplus. (2021). Mandalorian, Technology S:1, E: 4. Retrieved January 11, 2021, from
Disneyplus.com website:
https://www.disneyplus.com/en-gb/video/8112cabc-5ed9-4176-b0f8-b4baa0ab15f
DMX Explained; DMX512 and RS-485 Protocol Detail for Lighting Applications. (2017,
https://www.element14.com/community/groups/open-source-hardware/blog/2017/
08/24/dmx-explained-dmx512-and-rs-485-protocol-detail-for-lighting-applications
DMX Overview. (2021). Retrieved May 3, 2021, from Unrealengine.com website:
https://docs.unrealengine.com/en-US/WorkingWithMedia/DMX/Overview/index.h
tml
265–278. https://doi.org/10.1080/13645579.2020.1766777
Duff, J., Antonutto, G., & Torres, S. (2015). On the calculation and measurement of mean
https://doi.org/10.1177/1477153515593579
Eevee — Blender Manual. (2021). Retrieved May 3, 2021, from Blender.org website:
https://docs.blender.org/manual/en/latest/render/eevee/index.html
Enttec. (2021). DMX USB Pro interface Mk2, the new industry standard | ENTTEC.
https://www.enttec.com/product/controls/dmx-usb-interfaces/dmx-usb-pro-interfac
e/
Eurolite. (2017). LED THA-60PC Theater-Spot bk. Retrieved May 14, 2021, from
https://www.thomann.de/gb/eurolite_led_tha_60pc_theater_spot_bk.htm
FAILS, I. (2020, April). How Previs Has Gone Real-Time. Retrieved May 4, 2021, from
https://www.vfxvoice.com/how-previs-has-gone-real-time/
http://backstagefox1929.blogspot.com/2014/03/part-i-hub-switchboard-operating-
manual.html
Gallardo, A. (2000). 3D lighting: history, concepts, and techniques.
Garland, A. (2015, January 21). Ex Machina. Retrieved May 18, 2021, from IMDb
website: https://www.imdb.com/title/tt0470752/?ref_=fn_al_tt_1
Gershun, A. (1939). The light field. Journal of Mathematics and Physics, 18(14), 51–151.
Git. (2021). Retrieved May 17, 2021, from Git-scm.com website: https://git-scm.com/
GitHub. (2021). Build software better, together. Retrieved May 17, 2021, from GitHub
website: https://github.com/
Google Docs. (2021). Retrieved May 20, 2021, from Google.com website:
https://docs.google.com
Holz, L. (2014). Jersey Indie. Retrieved March 17, 2021, from Jersey Indie website:
https://www.jerseyindie.com/the-black-maria-film-festival/
igolinin. (2021, April 29). igolinin/DMXtools. Retrieved May 26, 2021, from GitHub
website: https://github.com/igolinin/DMXtools
IMDB. (2016, April 7). The Jungle Book. Retrieved May 21, 2021, from IMDb website:
https://www.imdb.com/title/tt3040964/
IMDB. (2017, October 4). Blade Runner 2049. Retrieved May 21, 2021, from IMDb
website: https://www.imdb.com/title/tt1856101/?ref_=nv_sr_srsg_0
IMDB. (2019, July 12). The Lion King. Retrieved May 21, 2021, from IMDb website:
https://www.imdb.com/title/tt6105098/?ref_=nv_sr_srsg_0
Json.NET - Newtonsoft. (2021). Retrieved May 14, 2021, from Newtonsoft.com website:
https://www.newtonsoft.com/json
Karras, T., Laine, S., & Aila, T. (2019). A style-based generator architecture for
generative adversarial networks. In Proceedings of the IEEE/CVF Conference on Computer
Vision and Pattern Recognition (CVPR).
Lammert Bies. Retrieved March 11, 2021, from Lammert Bies website:
https://www.lammertbies.nl/comm/info/rs-485
Lanier, L. (2018). Aesthetic 3D Lighting: History, Theory, and Application. Taylor &
Francis.
MyBib. (2021). MyBib Citation Manager. Retrieved May 20, 2021, from MyBib website:
https://www.mybib.com
NanLite MixPanel 60 RGBWW LED Panel. (2019). Retrieved May 14, 2021, from
Kamerahuset.dk website:
https://www.kamerahuset.dk/nanlite-mixpanel-60-rgbww-led-panel
Nishida, S. (2021, April 15). Netflix released, “DX at the shooting site” that is no longer
on location ... Next-generation studio infiltration. Retrieved May 10, 2021, from
Paint3D. (2018). Get Paint 3D - Microsoft Store. Retrieved May 17, 2021, from Microsoft
Store website:
https://www.microsoft.com/en-us/p/paint-3d/9nblggh5fv99#activetab=pivot:overv
iewtab
r/Maya - Will we get real time rendering like Eevee in Maya 2020? (2020). Retrieved May
https://www.reddit.com/r/Maya/comments/d2rnhq/will_we_get_real_time_renderi
ng_like_eevee_in/
Rasmussen, M. E., Persson, M. K., Raursø, N. E., & Petersen, T. A. (2021). Lean-Back
RocketJump Film School. (2016). Lighting 101: Quality of Light [YouTube Video].
Retrieved from
https://www.youtube.com/watch?v=Jw066PBZe60&list=PLw_JAmvzR_MBd6rU
WTpVma4demi1pnS9S&index=5
Samsung Media Innovation Lab (SMILE). (2017, May 3). Retrieved May 14, 2021, from
Cph.aau.dk website:
https://www.en.cph.aau.dk/collaboration/students/labs-campus/labs/samsung-medi
a-innovation-lab--smile-.cid314851
from https://www.britannica.com/art/stagecraft/Electrification#ref1004215
TED. (2020). How volumetric video brings a new dimension to filmmaking | Diego
https://www.youtube.com/watch?v=iwUkbi4_wWo&t=569s
The ArtNetominator. (2019). The ArtNetominator - Free ArtNet DMX Monitoring and
website: https://www.lightjams.com/artnetominator/
Unity Technologies. (2020a). Unity - Manual: Configurable Enter Play Mode. Retrieved
https://docs.unity3d.com/Manual/ConfigurableEnterPlayMode.html
Unity Technologies. (2020b). Unity - Scripting API: Retrieved May 12, 2021, from
Unity Technologies. (2020c). Unity - Unity. Retrieved May 17, 2021, from Unity website:
https://unity.com/
Unity-Technologies. (2021). Unity-Technologies/SpaceshipDemo. Retrieved from GitHub
website: https://github.com/Unity-Technologies/SpaceshipDemo
Unreal Engine. (2018). Rendering to Multiple Displays with nDisplay. Retrieved May 12,
https://docs.unrealengine.com/en-US/WorkingWithMedia/nDisplay/index.html
Unreal Engine. (2019). Virtual Production Hub - Unreal Engine. Retrieved January 14,
https://www.unrealengine.com/en-US/virtual-production?sessionInvalidated=true
Unreal Engine. (2020a). Control Lighting With DMX In 4.26 | Inside Unreal [YouTube Video].
Retrieved from https://www.youtube.com/watch?v=p9XUd4TQl2Y
https://tsp.esta.org/tsp/documents/docs/ANSI-ESTA_E1-11_2008R2018.pdf
Visual Effects Society. (2021, May 20). Intersection Between Camera, VFX & Lighting
for Virtual Production. Retrieved May 22, 2021, from Vimeo website:
https://vimeo.com/483297338?utm_campaign=5370367&utm_source=affiliate&ut
m_channel=affiliate&cjevent=b303ef85b22311eb814933300a180510&clickid=b3
03ef85b22311eb814933300a180510
Visual Studio Code. (2016, April 14). Retrieved May 17, 2021, from Visualstudio.com
website: https://code.visualstudio.com/
Vive. (2021). VIVE European Union | Discover Virtual Reality Beyond Imagination.
Whelan, M. (2016). Arc Lamps - How They Work & History. Retrieved March 9, 2021,
Appendix.
Random Figures.
Here I would like to present some interesting images from virtual productions. Most
of them do not need any comments.
(Unreal Engine, 2020b)
Note the LED lights positioned right above the lens. Stargate Studios (Unreal Engine,
2020b)
Diego Prilusky’s volumetric video, hundreds of cameras that capture light and motion
from every angle.
Unnamed Netflix production, Japan (Nishida, 2021)
Unnamed Netflix production, Japan (Nishida, 2021)
“A night view seen through a champagne glass. It's hard to make such a video by
"synthesis", but it's easy with virtual production.”(Nishida, 2021)
Manual
Installation
1. Download the package from
https://github.com/igolinin/DMXtools/blob/main/DMXtools/DMXtools.unitypackage
2. Open a Unity HDRP project (or create a new one).
3. Select Menu/Assets/Import Package/Custom Package and browse to
DMXtools.unitypackage. Click “Import”.
4. Navigate to Menu/Edit/Project Settings/Player and set
a. Api Compatibility Level: .Net 4.x,
5. Navigate to Menu/Edit/Project Settings/Editor and set
a. Enter Play Mode Settings: true
b. Reload Domain: false
c. Reload Scene: false
Operations
!!! In case of any errors that might appear after transitions between Play and Edit
mode, please use the "Find Heads" button. It fixes the connections and rebuilds the
data objects if they are missing.
2. The user can select one of 8 built-in art-net universes or choose all (heads only).
3. The selected universe can be marked as "DMX Out Universe" to send this universe's
data to the real DMX fixtures connected via a DMX USB dongle (only the Enttec USB
Pro Mk2 has been tested, but it should also work with an Open DMX dongle).
a. When the universe is enabled as “DMX out universe”, the option “Serial
DMX” becomes visible.
b. USB DMX devices must be connected to the PC before “serial DMX” is
enabled.
c. Only one physical DMX universe is supported.
4. To get familiar with the system's features, please open the demo scene "Playground".
5. A number of virtual DMX fixtures are already placed in the scene. By selecting the
universe and choosing the heads view the user can:
a. "Find heads", which detects all the universe's DMX fixtures present in the
scene. The same button can be used in case of the error which might occur
when exiting Play mode.
b. "Reset Selection", which resets the selection of the heads.
6. When one or more heads are selected, the side panel displays sliders for the
fixture’s common attributes.
7. Each universe, including the DMX out universe, can be set to "Receive Art-net"; in
this case art-net data is received from an external art-net controller. This feature
only works in real time in Play mode, but pressing "Update Art-net Data" updates the
virtual fixtures' output with the latest received art-net data while in Edit mode.
8. A cue, a snapshot of all the DMX data values for all universes, can be saved with
the basic cue editor.
9. Navigate to Menu/Window/Art-net/Cue Editor.
10. To save the cue simply enter the cue name and click the “Record Cue” button.
11. To remove the cue click “Remove Selected”
12. To save the entire cue stack click “Save Stack”. The stack is saved on a per scene
basis.
13. To play back the cues enable the “Playback” option. It is recommended to reset
selection and black out on the console before beginning the playback.
14. To add more fixtures to the scene drag and drop prefabs from
Assets/Prefabs/Lights/HDRP.
a. Set universe and address numbers in the inspector.
b. Set address number in console.
c. Click ‘auto’ to set the first available(not used) DMX address.
d. Press “Find heads” if there are any errors.
15. To control practical DMX fixtures via USB dongle, the placeholder object must
be placed in the scene. You can find prefabs of these objects in
Assets/Prefabs/Lights/Practical.
a. Drag and drop GenericDMXFixture to the scene
b. Choose the number of channels in Channels/Size
c. List all the channel attribute names under “Element1, ... Element N”
d. Select Universe and DMX address.
e. Select the color probe render texture if you want this fixture to follow the
color of the probe.
f. Send DMX must be enabled to control the fixture by probe’s values.
g. Move sliders up and down to make color correction.
i. You can add or subtract 0-255 units to/from any color component. The
clamped result will still be within the valid DMX range (0-255).
Color probe (experimental).
21. To place the color probe in the scene, drag and drop the ColorProbeSet prefab
from Assets/Prefabs/Tools/ColorProbSet into the scene; it will position itself in
front of the camera. The offset can be changed in the inspector.
22. Please exclude the TransparentFX layer from the culling mask of your
Main Camera to make the color probe invisible.
Survey results