
Umeå University June 13, 2022

Department of Physics

3D Cloud Visualization In Real-Time

Filip Nilsson, [email protected]

Master Thesis
Master of Science Program in Engineering Physics
Master Thesis, Master of Science Program in Engineering Physics, Umeå University, Umeå, Sweden

Examiner: Patrik Norqvist, [email protected]


Internal supervisor: Eva Krämer, [email protected]
External supervisor: Izze Niemi, [email protected]


Acknowledgments
First of all, I want to thank Tactel AB for the opportunity to carry out this project with them; they have been very open-minded and generous in letting me use their premises and their software. It has been a beneficial project for me, giving an insight into what the life of a software engineer can look like. I also want to thank Izze at Tactel for her availability and her fast, helpful replies when I have had problems or needed someone to discuss ideas with. Further, I want to thank Eva at the university for her fast and great feedback on my writing. Lastly, I want to thank my partner Vilma for her patience when I have entered my programming mode, where I can lose perception of the outside world.


Abstract
The simulation of clouds can make virtual environments appear more realistic. This project produces an algorithm that visualizes 3D clouds in real-time. The algorithm consists of two processes, an initialization process and a visualization process. The initialization process initializes clouds that are static in shape, where each shape is composed of a system of semi-transparent spherical particles. The visualization process moves, colors and draws the clouds on the screen. The positions of the clouds change with time to simulate a simple cloud motion. The cloud particles are colored by solving the light transport equation under the assumption that the sunlight scatters once with a cloud particle before reaching the camera that captures the scene. By solving the light transport equation, the clouds change their color and brightness depending on the direction of the incident sunlight. The algorithm can be described as a trade-off between computational performance and quality of the generated visual result. However, the algorithm proved to have frame rates that can be categorized as real-time performance. By modeling each cloud out of 100–800 particles, the algorithm requires a few minutes of initialization to produce a virtual cloud scene that runs at around 60 FPS when simulating up to 1000 cloud particles. For the same initialization time, the algorithm can produce a virtual cloud scene with frame rates of 24 FPS or better when simulating up to 4000 cloud particles.

Contents

1 Introduction
2 Theory
  2.1 Overview of clouds
  2.2 Related work in the cloud simulation area
    2.2.1 The light transport equation
    2.2.2 Other work on cloud simulations
  2.3 OpenGL - Open Graphics Library
  2.4 Performance measures for computer programs
3 The Implementation
  3.1 Modeling phase
  3.2 Animation phase
  3.3 Rendering phase
    3.3.1 Solving the light transport equation over the clouds
    3.3.2 Evaluating the solved light transport equation
4 Result
  4.1 Performance study
    4.1.1 Initialization process
    4.1.2 Visualization process
  4.2 Visual Results
5 Discussion
  5.1 Trade-off between computational performance and quality of the visual results
  5.2 Limitations of the algorithm
  5.3 Future work
    5.3.1 Optimize the initialization process
    5.3.2 Optimize the visualization process
    5.3.3 Improving the performance study
6 Conclusion

1 Introduction
Visualization of natural phenomena such as clouds can be the difference that makes artificial environments appear more realistic. On the other hand, realistic clouds can be challenging and computationally complex to visualize on computers. This is mainly due to the complicated underlying physics of clouds, which needs to be resolved over millions of simulated cloud particles. In addition, artificial environments are ever-changing and set high demands on the visualizations: for example, the visualization may need to run in real-time and adapt to changes in the environment. To this end, various methods with different computational complexity have been developed to visualize realistic clouds. A high complexity generally indicates that the method resolves more of the underlying physics, which typically results in a more realistic appearance. However, a high complexity can compromise the real-time performance of the visualization [1].

The digital interaction agency Tactel AB is a company that mainly develops software applications, several of which target the aircraft industry. One of those is the Panasonic Avionics ARC. The ARC can be described as an interactive 3D flight map that uses a 3D engine. This 3D engine is a real-time, high-resolution moving map of the Earth, written in the programming language C++ and using the computer graphics library OpenGL [2]. Although the aim of the engine is to visualize the real environment, it lacks realistic clouds. Therefore, the purpose of this project is to implement an algorithm that visualizes realistic 3D clouds in real-time in Tactel's engine. The goal is that the algorithm can adapt to changes in the dynamic environment of the engine, such as the camera or the sun changing position.

Further, for Tactel to benefit from this work, the algorithm must have good real-time performance. Hence, the computational complexity of the algorithm must be reasonable in order to run alongside other processes implemented in the engine. To ensure this, some simplifications are made: parts of the underlying physics are neglected, such as multiple scattering of light in the clouds and the processes involved in the formation and dissolution of clouds. Hence, the algorithm does not need to be a completely physically correct simulation; instead, it can be considered a simulation inspired by physics. Furthermore, this project is limited to visualizing only one type of cloud, namely cumulus clouds.

The method to build this algorithm can roughly be divided into three separate steps. The first step is to form clouds consisting of spherical particles. The second is to add motion to the clouds, making their appearance more realistic. The last step is to color the clouds by capturing how they are affected by light sources in their surroundings. Finally, with the algorithm implemented, a performance study is made to determine the real-time performance of the algorithm.

2 Theory

2.1 Overview of clouds

An atmospheric cloud is a mass of water particles floating in the atmosphere. These particles consist of water droplets, ice crystals, or a combination of the two. There are always water particles in the atmosphere, but most of them exist as invisible vapor. The vapor originates from water sources like the ocean, where part of the water has evaporated. Clouds form when vapor is cooled and condenses into water droplets or even freezes into ice crystals. For example, a cloud can form when vapor rises in the atmosphere; the vapor then condenses into water droplets due to decreased temperature and pressure. On the other hand, if a cloud is heated or the pressure increases, the water droplets vaporize and the cloud dissolves [3].

Clouds can be categorized in different ways, most commonly by shape and height. These factors mainly depend on wind velocity, temperature and pressure [4]. The height categories are low clouds (0-2 km), middle clouds (2-7 km) and high clouds (5-13 km). In addition to the height categories, there are three main types distinguished by shape, namely Stratus clouds, Cumulus clouds, and Cirrus clouds [3].

Stratus clouds, Cumulus clouds, and Cirrus clouds each have their own characteristics. Stratus clouds belong to the low clouds and consist of water droplets. They are uniform in color and have indistinguishable structures; they appear as a blanket covering the sky [5]. Cumulus clouds also belong to the low clouds, and thereby consist primarily of water droplets. Unlike Stratus clouds, Cumulus clouds have clear structures: a fluffy, cauliflower-like top and a flat bottom. The fluffy top arises from non-uniform wind velocities in the atmosphere, while the flat bottom is due to the condensation level being approximately at the same height over the whole cloud [6]. Cirrus clouds, on the other hand, belong to the high clouds and consist primarily of ice crystals. They have clear structures that can be described as wispy and thin [7].


2.2 Related work in the cloud simulation area

Various methods have been developed to simulate realistic clouds in artificial environments. Different environments set different requirements for the simulations. Such a requirement can be that the clouds need to adapt to changes in a dynamic scene; alternatively, the clouds can be stationary in a static scene [1]. Some environments demand that the clouds have a 3D shape, which makes the clouds appear different depending on the viewing angle. Another requirement may be that it must be possible to move through the clouds in a realistic way. Some environments need the simulation to run in real-time [12]. Consequently, the chosen simulation method must reflect the demands that the environment sets.

Generally, the various cloud simulation methods fall into two categories, namely physics-based approaches and procedural approaches. The physics-based approaches build on the underlying physics: for example, they simulate cloud formation, cloud motion and how nearby light sources affect the clouds. The procedural approaches simulate the same processes but without connection to the physics; instead, they use noise functions, textures, and pictures to form, move, and color the clouds. Generally, the physics-based approaches can be more computationally complex because they often involve solving partial differential equations. However, the physics-based approaches are usually not physically accurate even though they are based on physics: they still simplify and make assumptions about the underlying physics to increase the computational performance of the simulations. Beyond the two categories above, each simulation can roughly be divided into three phases: a modeling phase, an animation phase, and a rendering phase. The modeling phase defines how the cloud shape is represented, the animation phase determines the cloud motion, and the rendering phase determines the color and the appearance of the cloud. These phases are computed either before run-time, during run-time, or a combination of the two, because moving computations to before run-time can increase the overall real-time performance [1].

2.2.1 The light transport equation

When light interacts with a medium, in this case clouds, the light intensity is reduced due to absorption and scattering with the cloud particles. Absorption extracts energy from the incident light, so the light intensity decreases as the light travels through the cloud. Scattering is when the incident light collides with cloud particles and gets scattered in different directions. The scattering can be categorized as elastic or inelastic, where elastic implies that the energy of the incident light is conserved after the collision; inelastic scattering is the opposite. One type of elastic scattering is Mie scattering [9]. Mie scattering occurs when the colliding particles are similar in size to, or larger than, the wavelength of the incident light. Typical for Mie scattering is that it scatters all wavelengths of light approximately equally, in a forward direction. For cloud particles Mie scattering is dominant in the visible spectrum because the water droplets are about the same size as the wavelengths of visible light; thus white sunlight gives the cloud its white appearance [8]. In clouds, scattering is more dominant than absorption because clouds have high albedo values [11]. Albedo is a measure of reflectivity, where a high albedo means high reflectivity.

Mathematically, the process of light interacting with a medium can be modeled with the light transport equation (LTE). The light transport equation is derived by N. Max [14] and takes the form

I(d, \vec{x}) = I(0, \vec{x})\, T(0, d) + \int_0^d g(s)\, T(s, d)\, ds.  (1)

The LTE can be interpreted as a line integral over a ray through a medium in the direction \vec{x}, where s parameterizes the line along \vec{x}. Thus s = 0 is where the ray enters the medium, while s = d is where the ray exits it. I(d, \vec{x}) represents the intensity of light at distance d along the ray in direction \vec{x}. The term I(0, \vec{x}) then represents the intensity of the light before the ray has entered the medium; that is, the intensity of light coming from behind the cloud from direction \vec{x}. The term T(s_0, s_1) represents the transmittance, which measures the ratio of light intensity preserved after traveling a certain distance in a medium [14]. The transmittance is defined as
as

T(s_0, s_1) = e^{-\tau(s_0, s_1)},  (2)

where \tau is the optical depth of the medium. The optical depth is a dimensionless quantity that describes how opaque the medium is. Mathematically, the optical depth is defined as

\tau(s_0, s_1) = \int_{s_0}^{s_1} k(s)\, ds,  (3)

where k(s) is the attenuation coefficient, which can be defined as

k(s) = \sigma \eta(s),  (4)


where \sigma is the attenuation cross section and \eta(s) is the number density of the medium at distance s. The term g(s) is a source term [14] that accounts for multiple scattering and is defined as

g(s) = \alpha k(s) \int_{4\pi} P(\theta(\vec{x}, \vec{x}'))\, I(s, \vec{x}')\, d\vec{x}'.  (5)

Here \alpha is the albedo of the medium. The integral is over the whole solid angle of a sphere; in other words, it captures intensity contributions from all incident light directions \vec{x}' at the current distance s in the medium. The angle \theta is the scattering angle between the directions \vec{x} and \vec{x}'. The phase function P defines the characteristics of the scattering. For clouds the phase function should characterize Mie scattering, and an approximation of Mie scattering is the Henyey-Greenstein phase function

P(\theta) = \frac{1}{4\pi} \frac{1 - q^2}{(1 - 2q\cos\theta + q^2)^{3/2}}.  (6)

The constant q ∈ [−1, 1] is a symmetry parameter defining the direction of the scattering. If q > 0, the phase function peaks in the forward scattering direction; if q < 0, it peaks in the backward direction. Lastly, q = 0 implies scattering independent of the scattering angle, namely isotropic scattering [14]. To conclude, the first term in the LTE represents how the light intensity from behind the medium decreases due to absorption, while the second term represents how the incident light intensity from other directions decreases due to scattering and absorption. An illustration of the different terms in the LTE can be seen in Figure 1, where blue atmospheric light from behind the cloud interacts with a cloud particle, and yellow sunlight is scattered multiple times before interacting with the cloud particle from different directions.

Figure 1: Illustration of the light transport equation, where a particle is exposed to light
from the background (a blue sky) and light that has been scattered multiple times (sunlight)
before interacting with the particle from different directions.
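As a concrete illustration of Equations (2), (3) and (6), the transmittance, a Riemann-sum approximation of the optical depth, and the Henyey-Greenstein phase function can be sketched in C++. This is a minimal sketch with illustrative names and sample spacing, not the thesis' implementation.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

constexpr double kPi = 3.14159265358979323846;

// Equation (3) as a Riemann sum: tau = sum over samples of k(s_i) * ds.
double opticalDepth(const std::vector<double>& k, double ds) {
    double tau = 0.0;
    for (double ki : k) tau += ki * ds;
    return tau;
}

// Equation (2): fraction of light intensity preserved along the path.
double transmittance(const std::vector<double>& k, double ds) {
    return std::exp(-opticalDepth(k, ds));
}

// Equation (6): Henyey-Greenstein phase function with symmetry parameter q.
double henyeyGreenstein(double cosTheta, double q) {
    double denom = std::pow(1.0 - 2.0 * q * cosTheta + q * q, 1.5);
    return (1.0 / (4.0 * kPi)) * (1.0 - q * q) / denom;
}
```

Note that for q = 0 the phase function reduces to the isotropic value 1/(4π), and for q > 0 it peaks at cos θ = 1, matching the forward-scattering behavior described above.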


2.2.2 Other work on cloud simulations

This section reviews some previous methods developed to simulate clouds, each taking a different approach. A fully procedural method was developed by M. Hasan et al. [10], in which the volume representation and rendering of the clouds are made with textures generated by Perlin noise. Unlike fully random noise (Figure 2b), which has no noticeable pattern, Perlin noise (Figure 2a) has a noticeable pattern. In the method by M. Hasan et al. [10], Perlin textures are applied to polygons to form the clouds. Lastly, the method simulates motion by shifting the position of the textured polygons. This method performed well in real-time.

Another, more physics-based method was developed by T. Nishita et al. [11]. This method uses metaballs as the volume representation of the clouds. Metaballs are a particle system where nearby particles merge and form a uniform structure. This method neglects motion but solves the light transport equation (Equation 1), which captures how nearby light sources affect the cloud; here it is used to calculate multiple scattering in the clouds. The method solves the equation by exploiting the fact that cloud particles scatter strongly in the forward direction, together with numerical integration and the Monte Carlo method. Based on these calculations, the cloud is colored. This method requires several minutes of computational time to render a stationary cloud image.

A similar method that also solves the light transport equation for multiple forward scattering, but with real-time performance, was developed by M. Harris et al. [12]. The method approximates the light transport equation by Riemann sums and uses static clouds with particle systems to represent the cloud volumes.

The last method to be mentioned is also developed by M. Harris et al. [13]. This method represents the cloud volume by textures and illuminates the textures using a procedure similar to that of M. Harris et al. [12]. In addition, the method by M. Harris et al. [13] also has physics-based motion. The motion is calculated by discretizing the fluid flow equation, namely the Navier-Stokes equation, together with thermodynamics, water condensation and evaporation. This method proved to have real-time performance.


(a) Perlin noise. (b) Fully random noise.

Figure 2: Illustration of Perlin noise and fully random noise.

2.3 OpenGL - Open Graphics Library

Open Graphics Library (OpenGL) is a graphics library that can be utilized to render computer graphics [15]. To render an object on the screen, there are some basic components that need to be implemented. Firstly, a mesh of the object is needed. The mesh represents the shape of the object and consists of connected vertices [17]. An example of a triangular mesh of a 3D sphere is illustrated in Figure 3. With the shape of the object complete, the next step is to position, resize, and orient the object on the screen, which is done through transformations. Transformations are matrix-vector multiplications between a transformation matrix and a position vector [18]. To change the position, the translation matrix

T = \begin{pmatrix} 1 & 0 & 0 & t_x \\ 0 & 1 & 0 & t_y \\ 0 & 0 & 1 & t_z \\ 0 & 0 & 0 & 1 \end{pmatrix}  (7)

is used, where t_i represents the translation in direction i. To resize the object, the scaling matrix

S = \begin{pmatrix} s_x & 0 & 0 & 0 \\ 0 & s_y & 0 & 0 \\ 0 & 0 & s_z & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}  (8)


is used. Here s_i represents the scaling for the component in direction i. Lastly, to determine the orientation of the object, the rotation matrix

R = \begin{pmatrix} \cos\Theta + n_x^2(1-\cos\Theta) & n_x n_y(1-\cos\Theta) - n_z\sin\Theta & n_x n_z(1-\cos\Theta) + n_y\sin\Theta & 0 \\ n_y n_x(1-\cos\Theta) + n_z\sin\Theta & \cos\Theta + n_y^2(1-\cos\Theta) & n_y n_z(1-\cos\Theta) - n_x\sin\Theta & 0 \\ n_z n_x(1-\cos\Theta) - n_y\sin\Theta & n_z n_y(1-\cos\Theta) + n_x\sin\Theta & \cos\Theta + n_z^2(1-\cos\Theta) & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}  (9)

can be used, where \Theta represents the rotation angle and [n_x, n_y, n_z] is the axis to rotate around [18].
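The translation and scaling transforms of Equations (7) and (8) can be sketched as 4x4 homogeneous matrices applied to a point (x, y, z, 1). This is an illustrative sketch; a real implementation would typically use the engine's math library, and the type names here are assumptions.

```cpp
#include <array>
#include <cassert>

using Mat4 = std::array<std::array<double, 4>, 4>;
using Vec4 = std::array<double, 4>;

// Equation (7): identity matrix with the translation in the last column.
Mat4 translation(double tx, double ty, double tz) {
    Mat4 m{};
    for (int i = 0; i < 4; ++i) m[i][i] = 1.0;
    m[0][3] = tx; m[1][3] = ty; m[2][3] = tz;
    return m;
}

// Equation (8): scale factors on the diagonal.
Mat4 scaling(double sx, double sy, double sz) {
    Mat4 m{};
    m[0][0] = sx; m[1][1] = sy; m[2][2] = sz; m[3][3] = 1.0;
    return m;
}

// Matrix-vector multiplication: applies the transform to a position.
Vec4 apply(const Mat4& m, const Vec4& v) {
    Vec4 out{0.0, 0.0, 0.0, 0.0};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            out[i] += m[i][j] * v[j];
    return out;
}
```

The rotation matrix of Equation (9) would be built the same way, filling in the axis-angle entries before calling `apply`.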

Figure 3: A triangular mesh of a 3D sphere.

With both the mesh and the transformations of the object complete, the object can be colored in a shader. A shader is a program operating on the graphics processing unit (GPU) rather than on the central processing unit (CPU), where for example the C++ code runs. A shader is partly responsible for the process of rendering the graphics [16]. The simplest shader pipeline consists of a vertex shader and a fragment shader. The vertex shader calculates the positions of the vertices in the mesh after the transformations. The positions are then passed to the fragment shader, which colors the vertices and interpolates the wanted color between the vertices [17].

Colors in OpenGL consist of four channels, namely RGBA. RGB denotes the red, green and blue color channels, while A represents the opacity channel [17]. All four channels are clamped to [0, 1], where for example RGB = [1, 0, 0] yields red, RGB = [0, 1, 0] yields green, RGB = [0, 0, 1] yields blue, RGB = [0, 0, 0] yields black, and RGB = [1, 1, 1] yields white. An opacity value of zero corresponds to fully transparent and a value of one to fully opaque. There are occasions when a color spans a range outside of [0, 1]; to avoid losing detail in such scenarios, a method called tone mapping can be utilized. Tone mapping methods transform intervals to the range [0, 1]. One such method is gamma correction [19]. It maps the range [0, C^{-1/\gamma}] to the range [0, 1] by the formula

I_{out} = C I_{in}^{\gamma},  (10)

where \gamma, C > 0 are constants, I_{in} is the uncorrected value and I_{out} is the corrected value [20].
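A gamma correction step of the kind in Equation (10) can be sketched as follows, with a final clamp mirroring how OpenGL limits color channels to [0, 1]. The parameter values in the usage note are illustrative assumptions, not the thesis' choices.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Equation (10): I_out = C * I_in^gamma, clamped to the displayable range.
double gammaCorrect(double in, double C, double gamma) {
    double out = C * std::pow(in, gamma);
    return std::clamp(out, 0.0, 1.0);  // keep the channel value in [0, 1]
}
```

For example, with C = 1 and gamma = 0.5 the function brightens dark values, while inputs that map above 1 are clamped to full intensity.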

2.4 Performance measures for computer programs

In order to determine the performance of a computer program, the term computational complexity, or just complexity, is commonly used. The complexity of a program contains information about its efficiency, where a high complexity implies poor efficiency. Generally, the complexity of a program can be divided into time complexity and space complexity, where time complexity is often referred to simply as complexity. Time complexity represents the order of magnitude of the number of elementary operations required, while space complexity represents the order of magnitude of the memory required. Both can be expressed in big-O notation, where for example O(1), O(n), and O(n^2) represent constant, linear and quadratic complexity [21].

When writing computer programs, different data structures are used, such as vectors and hash tables. Both are containers where values can be stored. In a vector, each value is stored at a specific index, while values in hash tables are stored under a specific key. Both vectors and hash tables have a space complexity of O(n), where n is the number of elements in the container. The data structures support operations of varying complexity: looking up an index in a vector or a key in a hash table is, for example, an O(1) operation, while looping through a vector or a hash table is O(n). Looping through a two-dimensional vector, namely an array of dimension n × m, has complexity O(n × m). In addition, vectors may need to be kept in a particular order, for which sorting algorithms can be used. These sorting algorithms can have different complexity. Table 1 shows the complexity of two sorting algorithms, insertion sort and merge sort. Insertion sort traverses a vector and sorts one element at a time according to the wanted sorting order, while merge sort divides the vector into a number of sorted segments and merges the segments in pairs until all segments have been merged; the merge step combines two sorted segments such that the merged segment is also sorted [21].
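A minimal insertion sort of the kind summarized in Table 1 can be sketched as follows: it is O(n) on already-sorted input and O(n^2) in the worst case. Sorting by a key, such as camera distance, would replace the plain `>` comparison.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Insertion sort: grows a sorted prefix one element at a time.
void insertionSort(std::vector<double>& v) {
    for (std::size_t i = 1; i < v.size(); ++i) {
        double key = v[i];
        std::size_t j = i;
        // Shift larger elements one step right until key fits.
        while (j > 0 && v[j - 1] > key) {
            v[j] = v[j - 1];
            --j;
        }
        v[j] = key;
    }
}
```

The inner while-loop does no work when the input is already sorted, which is why insertion sort reaches its O(n) best case on nearly-ordered data.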

Another performance measure, especially useful for graphics programs, is the frame rate. The frame rate is measured in frames per second (FPS). For example, a program running at 10 FPS shows 10 scenes per second. There is no such thing as one optimal frame rate, since different artificial

Table 1: Summary of the best, average and worst case complexity, together with space complexity, for the sorting algorithms insertion and merge sort.

Sorting algorithm | Best       | Average    | Worst      | Space complexity
Insertion Sort    | O(n)       | O(n^2)     | O(n^2)     | O(1)
Merge Sort        | O(n log n) | O(n log n) | O(n log n) | O(n)

environments have different optimal frame rates. For example, movies are usually shown at 24 FPS, while games and sports, where the motion is faster, are shown at higher rates of around 30–60 FPS. A higher frame rate generally leads to smoother motion and a more detailed visualization. Too high a frame rate can lead to an unnatural visualization, as there may be too many details for the eye to capture, while too low a frame rate can make the visualization appear jagged and unrealistic [22]. If a visualization is to be rendered in real-time, the frame rate limits the rendering time that each frame can require. This time is mapped to frame rate in Table 2.

Table 2: The maximum rendering time a frame can require to reach certain frame rates.

frame rate [FPS] | rendering time per frame [ms]
1                | 1000
24               | 42
30               | 33
60               | 17
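The budgets in Table 2 follow directly from dividing one second by the target frame rate, with the result rounded to whole milliseconds. A one-line sketch:

```cpp
#include <cassert>
#include <cmath>

// Maximum per-frame rendering budget in ms for a target frame rate,
// rounded to the nearest whole millisecond as in Table 2.
int maxFrameTimeMs(int fps) {
    return static_cast<int>(std::lround(1000.0 / fps));
}
```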

3 The Implementation
The method to visualize clouds developed in this work is a combination of a physics-based and a procedural approach. The modeling and animation phases are categorized as procedural, while the rendering phase is physics-based. If all of the modeling, animation and rendering phases ran during run-time, the implementation would be slow and have poor real-time performance. Therefore the implementation is divided into an initialization process and a visualization process. The initialization process, shown in Algorithm 2, runs once before run-time, while the visualization process, shown in Algorithm 3, runs during run-time for every frame. Generally, the initialization process contains the parts that only need to run once or that are constant during the simulation; the remaining parts run in the visualization process. Further, the implementation is constructed to work with the dynamic environment of the 3D engine. The focus is to visualize the clouds such that they reflect the environment where they are positioned. This is achieved by taking into account the camera's viewing angle, the atmospheric light, and the sunlight.

3.1 Modeling phase

A particle system is used to represent the volume of each cloud. Each cloud is an independent particle system where the positions of the particles are fixed relative to the other particles in the same cloud, which implies that the shape of each cloud is static during the simulation. The shape of each particle is modeled as a sphere, where the mesh of a sphere is built once and then reused for all particles. The mesh can be seen in Figure 3. Each particle is given five attributes: a radius r, a position ([x, y, z]), a particle number, a color ([r, g, b]) and an opacity value (a). For simplicity, the radius is chosen to be equal for all particles, while the other attributes vary.

To form the shape of a single cloud, a random number generator is used to generate the positions of the particles. To control the shape of the cloud, a number N_p of larger spheres of radius R are positioned manually to form a basic cloud structure around the origin. The random number generator is then used to position smaller particles inside the larger spheres. The positions are stored in vectors. Every time a new particle is added, a test is run to ensure that the new particle does not overlap already existing particles by more than a factor c ∈ [0, ∞) of the particle radius. When c = 0 the particles can be placed at the same position, while for c = 1 the particles can be placed no closer than one radius from each other. The larger the factor is, the fewer particles are needed to fill out the volume. To approximate the number of particles n_p needed to fill out a cloud, the formula

n_p = \frac{N_p R^3}{c r^3}  (11)

can be used. The formula is derived by approximating that the total volume of the larger spheres equals the total volume of the smaller particles. Conversely, the radius r of the cloud particles can be approximated from Equation 11 by instead explicitly defining n_p. The implementation uses c = 0.6. This value is selected so that the clouds do not appear sparse, while at the same time the particles are not allowed to fully overlap, to reduce the total number of particles needed to fill out the cloud.
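The particle-count estimate of Equation (11) and the overlap test used when placing a new particle can be sketched as follows. Function names and the squared-distance form of the test are illustrative assumptions.

```cpp
#include <cassert>

// Equation (11): approximate number of particles n_p needed to fill
// Np guide spheres of radius R with particles of radius r, given the
// overlap factor c (the implementation uses c = 0.6).
int particleCount(int Np, double R, double r, double c) {
    return static_cast<int>(Np * R * R * R / (c * r * r * r));
}

// Overlap test for a candidate particle: its center must lie at least
// c * r away from an existing particle's center (c = 1 corresponds to
// one particle radius of separation).
bool tooClose(double dx, double dy, double dz, double r, double c) {
    return dx * dx + dy * dy + dz * dz < (c * r) * (c * r);
}
```

In a placement loop, a randomly generated candidate position would be tested with `tooClose` against every accepted particle and rejected on the first hit.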

Two manually designed cloud styles are used for the positions of the larger spheres. These are seen in Figures 4a-4b. They are designed to have a relatively flat bottom and a fluffy top, to look like cumulus clouds. Figures 4c-4d illustrate the cloud styles with smaller randomly generated particles inside, showing a more detailed cloud structure. In addition, every time a new particle is added to a cloud, the particle gets a particle number, which is an integer labeling the particles in each cloud.

(a) Basic structure of cloud style 1. (b) Basic structure of cloud style 2. (c) Cloud style 1 with particles generated inside of it. (d) Cloud style 2 with particles generated inside of it.

Figure 4: Illustration of the cloud styles that the implementation uses. All particles are white and fully opaque.

When the structure of a cloud is complete, the cloud is moved from the origin to a position above the Earth's surface. Since the Earth is spherical, the cloud needs to be rotated so that its bottom faces the Earth, ensuring that the flat bottom of the cloud is parallel with the Earth's surface. Such a rotation is achieved with the rotation matrix of Equation 9. After the rotation, the structure is shifted from the origin such that its center of mass lies at the chosen position above the Earth's surface.

To form many clouds, the algorithm generates positions for all the clouds by selecting a center
position and randomly generating positions around it for the wanted number of clouds. Similar to
the forming of the cloud shapes, a factor determines how close each cloud can be to one another,
and an additional factor determines how far away from the center position the clouds can be placed.
With the cloud positions generated, the algorithm then randomly generates the shapes, rotates, and
positions each cloud around the Earth as described in the previous paragraphs. In addition, each cloud
gets a random rotation around its center of mass. This gives each cloud a more arbitrary appearance
even though only two different cloud styles are in use. The process of
building up the shape of the clouds is complete. This process runs in the initialization process because
the shape of the clouds is static during the simulation.

The last part of the modeling phase determines the display order of the clouds on the screen. This
is part of the visualization process because the display order changes in every scene depending on
the position of the camera. The particles are displayed one at a time, and the last displayed object is
drawn in front of everything else; this implies that the particles closest to the camera must be displayed
last. To avoid a faulty visualization, the particles are therefore sorted by decreasing distance to the
camera, so that the particles with the largest distance to the camera are displayed first. The sorting
algorithm implemented for this is an insertion sort, selected because it is simple to implement. To
optimize the sorting, the positions of the clouds are sorted first, and then the particles in each cloud
are sorted. This approach is viable because no two clouds occupy the same position, and it is faster
because the particles are sorted per cloud instead of sorting the particles of all clouds at once. Then,
for each sorted particle, the algorithm constructs the translation matrix, Equation 7, based on the
particle's position, and the scaling matrix, Equation 8, based on the particle's radius. These matrices,
together with the mesh, a color and an opacity, are given to the shader. The shader then illuminates,
scales and places the mesh of each particle on the screen. How the color and the opacity are calculated
is described in the rendering phase in Section 3.3.
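The back-to-front ordering can be sketched as an insertion sort on the (squared) camera distance; this is an illustrative Python version, not code from the engine:

```python
def dist2(p, camera):
    # Squared distance is enough for ordering and avoids a sqrt per particle.
    return sum((a - b) ** 2 for a, b in zip(p, camera))

def insertion_sort_back_to_front(particles, camera):
    """Insertion sort by decreasing camera distance: the furthest particles
    end up first and are drawn first, so nearer particles are drawn on top."""
    order = list(particles)
    for i in range(1, len(order)):
        current = order[i]
        d = dist2(current, camera)
        j = i - 1
        while j >= 0 and dist2(order[j], camera) < d:
            order[j + 1] = order[j]
            j -= 1
        order[j + 1] = current
    return order
```

Insertion sort is O(m^2) in the worst case but close to O(m) on nearly sorted input, which fits the situation where the camera moves only slightly between frames.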

3.2 Animation phase

The motion of the clouds is generated by uniformly shifting the positions of the particles with time.
A random direction of motion is generated such that the center position of the clouds moves parallel
with the Earth's surface, while the magnitude of the motion is defined as a fraction of the radius
R. This ensures a visible motion that scales with the size of the clouds. If the clouds were allowed
to move in the generated direction indefinitely, they would drift out into space due to the Earth's
spherical shape. To prevent this, the motion is reset after some minutes, and the clouds start over at
their initial positions. The direction of motion is set in the initialization process, while applying the
motion occurs in the visualization process just before the translation matrix is set.
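The periodic, uniform shift can be sketched as follows (illustrative; the names and the parametrization are assumptions, not the engine's code):

```python
def particle_position(initial_pos, direction, speed, t, period):
    """Shift a position uniformly along `direction` with time, restarting
    every `period` seconds so the clouds return to their initial positions.
    In the text, `speed` would be a fraction of the cloud radius R per unit
    time and `direction` a vector parallel to the Earth's surface."""
    t_mod = t % period  # reset the motion after `period` seconds
    return [p + d * speed * t_mod for p, d in zip(initial_pos, direction)]
```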


3.3 Rendering phase

3.3.1 Solving the light transport equation over the clouds

As in the studies [11, 12], the clouds in this project are illuminated by solving the LTE, Equation
1, over the clouds, but with a different approach. Instead of solving for multiple scattering between
the cloud particles, a single scattering approximation in the viewing direction is used. This implies
that the scattering by one particle does not affect the scattering by any other particle. The light
sources considered in the implementation are the light from the atmosphere and the sun. The
atmospheric light is assumed to interact with the particles in the viewing direction, while the sunlight
is assumed to be scattered once into the viewing direction. Depending on the viewing angle, the light
from the different light sources travels different distances in the clouds before interacting with a
particle: the light from the atmosphere travels a distance d_b in the cloud before interacting with the
particle, while the sunlight travels a distance d_s. After the light has interacted with the particle, it
travels a distance d_f before it exits the cloud. A cloud particle exposed to light from the atmosphere
and the sun, with the single scattering approximation, is illustrated in Figure 5.

Figure 5: Illustration of the light transport equation with the single scattering approximation,
where a particle is exposed to light from the atmosphere and light from the sun that
scatters in the viewing direction.

Based on the single scattering approximation, the integral in the source term g(s) in Equation 5 can
be neglected except for the contribution from the scattering in the viewing direction. By defining
the intensity I(s, \vec{x}') in the source term as the intensity of the sunlight I_s, which decreases as it
moves through the cloud, the source term becomes

g(s) = α k(s) P(θ) I_s T(0, d_s).    (12)


An albedo value of α = 0.9 is used; this value is chosen high because clouds have high albedo
values. By representing each cloud as a particle system, the LTE, Equation 1, including the source
term, can be rewritten for each particle as

I(d_b, d_f, d_s, \vec{x}) = I_b T(0, d_b + d_f) + \int_0^{d_b} α k(s) P(θ) I_s T(0, d_s) T(s, d_b + d_f) \, ds.    (13)

The first term accounts for how the light intensity from the atmosphere, I_b ≡ I(0, \vec{x}), decreases
in the cloud before and after interacting with the particle, while the second term samples the
scattered sunlight along the path before it interacts with the particle and accounts for how the
intensity of the sunlight decreases in the cloud after interacting with the particle.

By approximating the scattering cross section σ as a fraction c_k of the particle cross section, and
the number density as the number of particles divided by the total volume of all particles, the
attenuation coefficient k(s) in Equation 4 can be evaluated as

k(s) = σ η(s) = c_k π r(s)^2 \frac{n_p}{n_p \frac{4}{3} π r(s)^3} = \frac{3 c_k}{4 r(s)}.    (14)

The positive constant c_k is introduced to control the attenuation of the light intensity as it travels
through the cloud. For c_k < 1, the light traveling through the cloud undergoes less attenuation than
for c_k > 1; less attenuation implies that the light intensity decreases less in a cloud, and thus the
cloud will generally be brighter. Therefore, a value of c_k = 0.5 is used to make the clouds appear
brighter. As mentioned earlier, the radius r(s) of the particles is modeled as constant, so the
attenuation coefficient k(s) is constant and independent of the distance s. Therefore, the optical
depth integrals τ, Equation 3, in the transmittance expressions T in Equation 2 are trivial to evaluate as

T(s_0, s_1) = e^{-τ(s_0, s_1)} = e^{-\int_{s_0}^{s_1} k \, ds} = e^{-k(s_1 - s_0)}.    (15)

By expanding Equation 13 with the evaluated transmittance expressions, the equation takes the form

I(d_b, d_f, d_s, \vec{x}) = I_b e^{-k(d_b + d_f)} + α k P(θ) I_s e^{-k d_s} \int_0^{d_b} e^{-k(d_b + d_f - s)} \, ds.    (16)

In Equation 16 there is only one integral left to solve; it can be solved analytically, which results in
an expression of the LTE given as

I(d_b, d_f, d_s, \vec{x}) = I_b e^{-k d_b} e^{-k d_f} + I_s α P(θ) e^{-k d_s} e^{-k d_f} (1 - e^{-k d_b})    (17)

or as

I(d_b, d_f, d_s, \vec{x}) = I_b T(0, d_b) T(0, d_f) + I_s α P(θ) T(0, d_s) T(0, d_f) (1 - T(0, d_b))    (18)

without the transmittance expressions evaluated.
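The closed-form solution in Equation 17 is cheap to evaluate per particle; a direct Python transcription (illustrative, not the engine's code) is:

```python
import math

def lte_intensity(I_b, I_s, k, alpha, phase, d_b, d_f, d_s):
    """Equation 17: an attenuated atmospheric term plus a single-scattered
    sun term, with T(0, d) = exp(-k d) from Equation 15."""
    T = lambda d: math.exp(-k * d)  # transmittance over a distance d
    return (I_b * T(d_b) * T(d_f)
            + I_s * alpha * phase * T(d_s) * T(d_f) * (1.0 - T(d_b)))
```

Note that for d_b = 0 the integral in Equation 16 runs over an empty interval, so the sun term vanishes and only the attenuated atmospheric term remains.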

3.3.2 Evaluating the solved light transport equation

The attributes left to determine are the color and opacity of each particle. This is done by evaluating
Equation 18 for all particles. The intensity of the sun is defined relative to the intensity of the
atmosphere as I_s = 30 I_b, to simulate sunlight that is stronger than the atmospheric light. To
evaluate Equation 18, the phase function P(θ) and the distances d_b, d_f and d_s must be determined.
The scattering angle θ is determined by calculating the angle between the incident sunlight and the
viewing direction. To approximate Mie scattering in the clouds, the Henyey-Greenstein phase
function, Equation 6, is used, with the symmetry parameter q = 0.3. For values closer to one, the
peak in the phase function is sharp and large, which is not wanted because in the engine the viewing
angle, and thereby the scattering angle, can change quickly. Quick changes in the scattering angle
lead to drastic differences in the value of the phase function, which imply rapid changes in the
appearance of the cloud; such rapid changes can be perceived as flickering, and flickering in visual
scenes can be perceived as erroneous. To counteract this, a low symmetry parameter is selected,
because for low values the phase function is smoother, so the appearance of the clouds changes
smoothly and no flickering occurs.
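For reference, the Henyey-Greenstein phase function (Equation 6) with the symmetry parameter used here can be sketched as:

```python
import math

def henyey_greenstein(cos_theta, g=0.3):
    """Henyey-Greenstein phase function. With g = 0.3, as in the text, the
    forward-scattering peak is gentle, which keeps the cloud appearance
    smooth under quick viewing-angle changes; g close to 1 gives a sharp peak."""
    return (1.0 - g * g) / (4.0 * math.pi
                            * (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5)
```

For g = 0 the function reduces to the isotropic value 1/(4π) for every angle, and for g > 0 forward scattering (cos θ = 1) is favored over backscattering (cos θ = −1).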

The last parameters required to evaluate the LTE are the distances the different rays travel through the
cloud. These distances are approximated by an algorithm that, for each particle, takes steps in a ray's
direction and updates the total step length until the distances to all other particles are longer than a
particle radius. The algorithm then assumes that the stepped position is outside the cloud, and returns
the total step length. The algorithm uses 20% of the particle radius as step length, chosen to ensure
that the step length scales with the size of the
cloud particles; the step length will be discussed further in Section 5.3.1. The ray-distance algorithm
is described with pseudocode in Algorithm 1.

Algorithm 1: Pseudocode describing the method for calculating ray distances

dist = rayDist(stepDir, stepLength, particlePos, index, radius)
Input: stepDir - unit direction to step in
       stepLength - length of the steps
       particlePos - the particle positions for a certain cloud
       index - index of the particle to calculate the ray distance for
       radius - radius of the particles
Output: dist - ray distance for the current particle
insideCloud ← true
dist ← 0
while insideCloud == true do
    point ← particlePos[index] + dist · stepDir
    for i = 0 to length(particlePos) do
        if ||particlePos[i] − point|| < radius then
            dist ← dist + stepLength
            insideCloud ← true
            break
        else
            insideCloud ← false
        end
    end
end
return dist
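A directly runnable Python version of Algorithm 1 (illustrative; the engine's own implementation may differ in details) is:

```python
def ray_dist(step_dir, step_length, particle_pos, index, radius):
    """March from particle `index` along the unit vector `step_dir` in steps
    of `step_length` until the sampled point lies outside every particle of
    the cloud, then return the distance travelled (Algorithm 1)."""
    start = particle_pos[index]
    dist = 0.0
    while True:
        point = [s + d * dist for s, d in zip(start, step_dir)]
        inside = any(
            sum((p - q) ** 2 for p, q in zip(pos, point)) < radius * radius
            for pos in particle_pos
        )
        if not inside:
            return dist
        dist += step_length
```

For a single particle of radius 1 at the origin, marching along the x-axis with the 20% step length of the text (0.2) exits after a distance of about one radius; overlapping particles extend the returned distance accordingly.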

With Algorithm 1 the distances d_b, d_f and d_s are approximated, and thereby the solution of the
LTE, Equation 18, can be calculated for each particle. The calculated intensity I(d_b, d_f, d_s, \vec{x})
lies in [0, ∞), while the color channels in OpenGL are clamped to [0, 1]. To base the color of the
particles on Equation 18, the intensities are therefore tone mapped to the range [0, 1] by Equation 10.
Then each particle is assigned the color [r, g, b] = [I, I, I] with I = I(d_b, d_f, d_s, \vec{x}). In other
words, the colors of the particles belong to a grayscale, where intensities close to zero are black and
intensities close to one are white.

Evaluating the LTE, Equation 18, for each particle yields a new flickering behavior in the visualization.
This time the flickering arises because the distance d_b that the atmospheric light travels in the cloud
before interacting with a particle changes drastically and rapidly. The particles closest to the viewer
are the most visible ones, and those particles have the largest d_b values. When the shape of the cloud
is irregular, the parameter d_b changes rapidly when the viewing angle changes; thereby, the
appearance changes fast, and a flickering behavior arises.
The distance d_f that the light travels after interacting with the particle is, on the other hand, always
small for the particles closest to the viewer. Thereby, changes in d_f do not imply as drastic a
transition in the appearance as changes in d_b. To counteract the flickering behavior due to changes
in d_b, the
parameter db is instead modeled as constant for all particles.

In addition to the tone map, the color is also linearly mapped from the range [0, 1] to [Imin, Imax] =
[0.70, 0.95]; these values are chosen to simulate bright white clouds. Rain clouds could, on the other
hand, be simulated by using [Imin, Imax] = [0.25, 0.50]. The opacity value is chosen to be uniform
for all particles, because defining it as a value depending on the ray distance also leads to a flickering
behavior. The opacity value is chosen as a = 0.1 because this gives the clouds a semi-transparent
appearance.
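A sketch of this color pipeline is given below. The exact tone map, Equation 10, is defined earlier in the thesis; the Reinhard-style operator I/(1 + I) used here is only a stand-in assumption:

```python
def tone_map(intensity):
    # Stand-in for Equation 10: maps [0, inf) into [0, 1).
    return intensity / (1.0 + intensity)

def linear_map(value, lo=0.70, hi=0.95):
    # Linear map from [0, 1] to [I_min, I_max]; 0.70-0.95 gives bright white
    # clouds, while 0.25-0.50 would give darker, rain-like clouds.
    return lo + (hi - lo) * value

def particle_color(intensity):
    v = linear_map(tone_map(intensity))
    return (v, v, v)  # grayscale: the same value for r, g and b
```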

The rendering phase is divided between the initialization and visualization processes, because running
the ray-distance calculation rayDist, shown in Algorithm 1, for a large number of particles would
result in a huge time complexity. Instead, this algorithm runs in the initialization process for all
particles in different unit directions \vec{xyz}. These unit directions are generated with the parametric
equation of a sphere

\vec{xyz} = [cos ψ sin θ, sin ψ sin θ, cos θ]

by varying the azimuth angle ψ ∈ [0, 360) and the zenith angle θ ∈ [0, 180] in steps of ∆ψ and ∆θ.
Then, for each particle in each cloud, the ray distance is calculated for all unit directions and stored
in hash tables. The keys used in the hash tables consist of integers defined from a particle number, an
azimuth angle, and a zenith angle. This way of defining the keys makes it possible, during the
visualization process, to map an incident ray direction to a key for a certain particle. The key can
then be used to look up an already calculated ray distance with O(1) operations instead of running
Algorithm 1 every time a ray distance is required. The looked-up ray distances are then used to
evaluate the solved LTE, Equation 18, to determine the color of each particle. The implementation
uses ∆ψ = ∆θ = 15; the corresponding unit directions are illustrated in Figure 6. These values
are used because they result in smooth transitions in the cloud appearance when the viewing angle
changes. Lower values could also be used, but that increases the computational time.
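The key construction can be sketched as follows; the thesis does not specify the exact integer packing, so this layout is an assumption:

```python
def make_key(particle_no, psi_deg, theta_deg, d_psi=15, d_theta=15):
    """Pack a particle number and the quantized azimuth/zenith angles into
    one integer key for the ray-distance hash table."""
    n_psi = 360 // d_psi          # 24 azimuth bins for psi in [0, 360)
    n_theta = 180 // d_theta + 1  # 13 zenith bins for theta in [0, 180]
    i_psi = int(round(psi_deg / d_psi)) % n_psi
    i_theta = min(int(round(theta_deg / d_theta)), n_theta - 1)
    return (particle_no * n_psi + i_psi) * n_theta + i_theta

# During initialization, rayDist results are stored under these keys;
# during visualization, an incident ray direction is quantized to the
# nearest precomputed direction and looked up in O(1):
table = {make_key(7, 45, 30): 1.25}  # 1.25 is a made-up distance
```

Any ray whose direction rounds to the same angle bins maps to the same key, so a lookup with a nearby direction such as `make_key(7, 44, 33)` retrieves the distance stored under `make_key(7, 45, 30)`.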


Algorithm 2: Pseudocode describing the initialization process

setParameters(r, R, np, Np, ck, c, α, a, Is, Ib, Imin, Imax, ∆ψ, ∆θ)
centerPos ← setCloudCenter()
velocityDir ← setVelocityDir()
cloudPos ← generateCloudPos(centerPos)
unitDir ← generateUnitDir(∆ψ, ∆θ)
for i = 0 to length(cloudPos) do
    particlePos ← generateParticlePos(cloudPos[i])
    particlePos ← rotateTranslateParticlePos(cloudPos[i], particlePos)
    stepLength ← setStepLength()
    for each j in unitDir do
        for k = 0 to length(particlePos) do
            dist[j][k] ← rayDist(j, stepLength, particlePos, k, r)
        end
    end
    PARTICLE_POS[i] ← particlePos
    DIST[i] ← dist
end

Algorithm 3: Pseudocode describing the visualization process

cloudOrder ← sortClouds(cloudPos)
for i = 0 to length(cloudPos) do
    sortedParticlePos ← sortParticlesPerCloud(PARTICLE_POS[cloudOrder[i]])
    for j = 0 to length(sortedParticlePos) do
        currentParticle ← sortedParticlePos[j]
        T ← setTranslationMatrix(currentParticle, velocityDir)
        S ← setScaleMatrix(currentParticle)
        A ← setOpacity()
        I ← evaluateLTE(currentParticle, DIST[cloudOrder[i]])
        I ← toneMap(I)
        I ← linearlyMap(I)
        RGB ← [I, I, I]
        callShader(T, S, RGB, A)
    end
end


Figure 6: Unit directions generated with the parametric equation of a sphere, by varying the
azimuth angle ψ ∈ [0, 360) and the zenith angle θ ∈ [0, 180] with steps of 15 degrees.

4 Result
The goal of this project is to simulate realistic cumulus clouds that adapt their appearance in real-time
based on the artificial environment of the engine. To evaluate the real-time performance, a
performance study is made. In addition, visual results are produced to evaluate the quality of the
appearance of the visualized clouds. In the study, the number of clouds simulated is varied as
[1, 5, 10, 20, 30, 40], while the number of particles per cloud is varied as [100, 200, 400, 800].
The study is performed on a MacBook Pro (2018) with a 2.2 GHz 6-core Intel Core i7 processor,
16 GB of 2400 MHz DDR4 RAM, and an Intel UHD Graphics 630 graphics card.

4.1 Performance study

The initialization process is studied by measuring its average run time over five runs for each of the
different cloud and particle combinations. The visualization process is studied by measuring the
average run time, the standard deviation of the run time, the total number of particles simulated, and
the overall frame rate for the whole engine. To measure these properties, the visualization process is
run for about a minute for each setup, and data is logged for each frame. This results in about
500-3000 data points for each setup, depending on the frame rate.

4.1.1 Initialization process

Figure 7 shows the mean run-time of the initialization process for the different amounts of clouds.
Figure 7 also shows that the initialization process cannot be categorized as real-time, since real-time
performance requires that a scene can be calculated in milliseconds, as Table 2 states. However, the
goal is not to make the initialization process run in real-time but rather for it to support the
visualization process. As Figure 7 illustrates, the initialization process can initialize 40 clouds in under three
minutes when the clouds contain 100-200 particles per cloud. For the case with 400 particles per
cloud, it takes just over ten minutes to initialize 40 clouds. For 800 particles per cloud, the
initialization of 20 clouds takes 30 minutes; the study is terminated after 20 clouds because an
initialization time of 30 minutes is considered too long. The study shows that the mean time increases
linearly with the number of clouds if the number of particles per cloud is kept constant. Based on this
linearity, the initialization of 40 clouds with 800 particles per cloud would take around an hour.

Figure 7: The mean run-time [min] for the initialization process plotted against the number
of clouds, for different amounts of particles per cloud (100, 200, 400 and 800).

By considering that the initialization process must loop through all the loops in the process, a rough
complexity analysis of the initialization process, shown in Algorithm 2, indicates that its worst-case
complexity T_I is

T_I(n, m, d, k) = O(n^2) + O(n)(O(m^2) + O(m) + O(d)O(m)O(k)O(m))    (19)

where n is the number of clouds, m is the number of particles, d is the number of directions used to
pre-calculate the ray distances, and k is the number of steps taken in each distance calculation. The
different terms in Equation 19 are

• O(n^2) - Generating cloud positions and ensuring that they are not too close to or too far away from
one another.
• O(n) - Generating the structure for each cloud.
• O(m^2) - Generating particle positions and ensuring that they are not too close to one another.
• O(m) - Translating and rotating each particle.


• O(d)O(m)O(k)O(m) - Pre-calculating the ray distance for all directions, all particles and all step
lengths, and checking against all particles that the algorithm has stepped outside the cloud.

By assuming that the number of particles m is larger than the number of clouds n, it follows from
Equation 19 that pre-calculating the ray distances is the dominant contribution to the time complexity
of the initialization process. The second-largest contribution comes from generating the particle
positions for each cloud.

4.1.2 Visualization process

In contrast to the initialization process, the goal of the visualization process is to run in real-time.
There is no strict boundary between real-time and non-real-time performance because, as explained
in Section 2.4, there is no universal optimal frame rate. However, the goal of the visualization process
is to calculate each frame at a rate such that the virtual scene created by the engine does not appear
jagged, which happens if the frame rate is too low. For example, at 60 FPS the virtual scene does not
appear jagged, while it does at less than 24 FPS; between 24 and 60 FPS it varies whether the virtual
scene appears jagged or not. As a guideline, a value closer to 60 FPS is preferred.

Figure 8 illustrates the mean calculation time the visualization process requires for each frame,
including two lines representing the maximum time in milliseconds each frame can take to be
rendered at 60 FPS and 24 FPS. Figure 8 shows that the mean visualization time for each frame grows
approximately linearly when the number of clouds increases and the number of particles per cloud is
kept constant.

Figure 8 also shows that some cloud constellations correspond to calculation times that, in theory,
could run at about 60 FPS. For 100 particles per cloud, 60 FPS can be achieved for up to 20 clouds;
for 200 and 400 particles per cloud, for up to 10 and 5 clouds, respectively. Lastly, for 800 particles
per cloud, the 60 FPS limit is only achieved when simulating 1 cloud. The implementation can
simulate even more clouds at a frame rate of 24 FPS: for 100, 200, 400 and 800 particles per cloud,
the 24 FPS limit is not exceeded for 40, 20, 10 and 5 clouds, respectively. However, in practice, the
engine runs other routines in addition to the clouds. Figure 9 illustrates the mean frame rate of the
engine running all routines,
including the clouds. The figure shows lower frame rates than the calculation times in Figure 8
predicted.

Figure 8: The average time [ms] the visualization process takes to calculate a frame, plotted
against the number of clouds, for different numbers of particles per cloud; the 60 FPS and
24 FPS limits are marked.

Figure 9: The mean frame rate [fps] for the visualization process together with the rest of
the engine's processes, plotted against the number of clouds, for different numbers of
particles per cloud; the 60 FPS and 24 FPS limits are marked.

This is a reasonable result because the program needs to simulate more routines, for example the sun
in Figure 12 or the Earth in Figure 15. Figure 9 shows that for 100, 200, 400 and 800 particles per
cloud, the 60 FPS limit is only achieved when simulating 10, 5, 1 and 1 clouds, respectively, while
the 24 FPS limit is not exceeded for the same cloud constellations as in Figure 8.

In Figure 10 the total number of particles simulated is shown as a function of the number of clouds,
for different numbers of particles per cloud. By combining the results in Figure 9 and Figure 10 it is
possible to determine that the program visualizes the virtual scenes at 60 FPS when the total number
of particles is approximately less than 1000, and at 24 FPS or more when the total number of particles
is approximately less than 4000.
Figure 10: The total number of particles simulated per frame plotted against the number of
clouds, for different amounts of particles per cloud.

Figure 11: The standard deviation of the time [ms] the visualization process takes to
calculate a frame, for different numbers of clouds and different numbers of particles per
cloud.

The standard deviations of the calculation times per frame are visualized in Figure 11. As the figure
shows, the standard deviations increase when the number of clouds and the number of particles per
cloud increase. In other words, the calculation times vary more when the visualization process
simulates more particles or clouds, which implies that the frame rate will also vary more. If the
variation leads to a decrease in frame rate, it can cause the visualization to appear jagged. To
summarize, when the number of simulated particles increases, the possibility that the visual scene
will appear jagged also increases.

By considering that the visualization process must loop through all the loops in the process, a rough
complexity analysis of the visualization process, shown in Algorithm 3, indicates that its worst-case
complexity T_V is

T_V(n, m) = O(n^2) + O(n)(O(m^2) + O(m))    (20)

where n is the number of clouds and m is the number of particles. The different terms in Equation 20
are

• O(n^2) - Sorting the clouds.
• O(n) - Visualizing each cloud.
• O(m^2) - Sorting the particles per cloud.
• O(m) - Setting attributes for each particle in a cloud.

By assuming that the number of particles m is larger than the number of clouds n, it follows from
Equation 20 that sorting the particles is the dominant contribution to the time complexity of the
visualization process. The second-largest contribution comes from the term that assigns attributes,
such as color and position, to each particle.

4.2 Visual Results

Beyond the real-time performance, the implementation should also visualize clouds that appear
realistic. As for the real-time requirement, there is no clear boundary for what counts as a realistic
cloud visualization. However, to make the clouds appear realistic, the focus is on capturing how
surrounding light sources affect the clouds. In addition, a simple cloud motion is implemented. As
this is a report, the motion is impossible to demonstrate; however, it is created by shifting the position
of the clouds with time. This generates a virtual scene in which the cloud sky slowly moves in a
uniform motion over the Earth for some minutes before it is reset to its initial position, and the motion
starts over. The motion makes the clouds appear more alive.


Figure 12 shows a cloud visualized from two different viewing directions: one in the same direction
as the sun (referred to as direct sunlight) and one in the opposite direction (referred to as indirect
sunlight). In Figure 12 the cloud in direct sunlight appears slightly brighter than the cloud in indirect
sunlight. This feature arises from solving the LTE. When the cloud is in direct sunlight, the intensity
of the sunlight has not yet been attenuated for the particles closest to the screen; thereby, these
particles appear bright. The outer parts of the cloud appear darker because the cloud is
semi-transparent, so what is seen are the particles in the back, where the intensity of the sunlight has
decreased. In the indirect case, the particles closest to the screen are those where the sunlight has
been attenuated the most, so they appear darker, while the outer parts of the cloud are a bit brighter
because the particles in the back, which are exposed to direct sunlight, show through the
semi-transparent cloud. Figure 13 shows the same features as Figure 12 but for a landscape view of
20 clouds with 800 particles per cloud; thus it has an initialization time of around 30 minutes, and the
program visualizes the clouds at a frame rate of less than 10 FPS. Figure 13 also demonstrates that
the clouds have a relatively flat bottom and a more irregular top to resemble cumulus clouds.

Figure 12: Both clouds are simulated with 800 particles. (a) Cloud viewed from the same
direction as the sun. (b) Cloud viewed from the opposite direction as the sun.

Figure 14 illustrates close-up views of four clouds generated with different amounts of particles.
When the background of the clouds is dark, the individual cloud particles can be distinguished. The
particles are most visible when the cloud is built up from the fewest particles. As the number of
particles per cloud increases, the individual particles become less visible, but in the outer parts of the
cloud it is still possible to see them. Figure 15 illustrates a distant view of clouds generated with


Figure 13: Landscape view on 20 clouds with 800 particles per cloud. (a) Clouds viewed
from the same direction as the sun. (b) Clouds viewed from the opposite direction as the sun.

different amounts of particles. By looking closely, the individual particles can still be seen, but not
as clearly as in Figure 14. It is also possible to fly through the clouds; when doing so, individual
particles can be distinguished from the inside of the cloud, similar to the close-up view in Figure 14.

Figure 14: Close-up view on clouds with different numbers of particles per cloud: (a) 100,
(b) 200, (c) 400, (d) 800 particles per cloud.

Both Figure 15 and Figure 16 show how many clouds the engine can visualize at 24 FPS for the
different amounts of particles per cloud. As the figures show, the cloudy sky becomes sparser when
the number of particles per cloud increases. In Figure 16 all clouds are viewed in the direction
opposite of the sunlight, and thus the clouds have the same features as the cloud in Figure 12.
However, in Figure 16 the details are smoothed for the clouds with fewer particles per cloud, while
for the clouds with more particles per cloud the details appear more distinct. In Figure 16 the


Figure 15: Distant view on clouds with different numbers of particles per cloud: (a) 40
clouds with 100 particles per cloud, (b) 20 clouds with 200 particles per cloud, (c) 10 clouds
with 400 particles per cloud, (d) 5 clouds with 800 particles per cloud.

shapes of the clouds do not appear the same due to the random rotation of each cloud, even though
only two different cloud styles are in use.

5 Discussion
The purpose of this work is to develop a cumulus cloud algorithm in Tactel's 3D engine. The goal is
to visualize realistic clouds in real-time. The produced algorithm consists of two processes, an
initialization process and a visualization process. The initialization process initializes clouds that are
static in shape. Each cloud consists of a system of semi-transparent spherical particles. The shape of
each cloud is three-dimensional, so the clouds appear different when viewed from different directions.
The visualization process displays the clouds on the screen, shifts the position of the clouds with time
to simulate motion, and evaluates the light transport equation to color the clouds based on the light
from the atmosphere and the sun. Evaluating the light transport equation makes a cloud appear
brighter when viewed from the same direction as the sun and darker when viewed from the opposite
direction, because the intensity of the sunlight decreases as it travels through the cloud.
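The attenuation mechanism can be illustrated with a minimal sketch (this is not the thesis implementation; the function name and optical-depth values are assumptions), where the sunlight decays exponentially with the optical depth accumulated along its path through the cloud:

```python
import math

def attenuated_intensity(i_sun, optical_depth):
    """Exponential extinction: sunlight that has traversed more cloud
    material (larger optical depth) arrives with lower intensity."""
    return i_sun * math.exp(-optical_depth)

# A particle near the sunlit side sees almost the full intensity,
# while a particle on the far side sees a strongly attenuated one.
lit = attenuated_intensity(1.0, 0.1)
shaded = attenuated_intensity(1.0, 3.0)
```

This is why a cloud viewed from the sun's direction appears brighter than the same cloud viewed from the opposite side.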


(a) 40 clouds with 100 particles per cloud.

(b) 20 clouds with 200 particles per cloud.

(c) 10 clouds with 400 particles per cloud.

(d) 5 clouds with 800 particles per cloud.

Figure 16: Landscape view of clouds with different numbers of particles per cloud, viewed
from the direction opposite to the sun.

5.1 Trade-off between computational performance and quality of the visual results

The optimal scenario would be if the produced algorithm could visualize as many particles as possible
with short initialization times and high frame rates. This is not the case: the results indicate that the
initialization times increase and the frame rates decrease as the number of simulated particles
increases. The results also indicate that the details of the clouds become more distinct as the number
of particles per cloud increases. This can be described as a trade-off between initialization time,
frame rate, the number of clouds simulated, and how distinct the clouds' details are. This trade-off
has no single optimal solution, because the optimal solution can differ depending on the context in
which the algorithm is used. One context may require one cloud with distinct details at high frame
rates, another a sky with many clouds with distinct details at low frame rates; these contexts have
different optimal solutions. The optimal solution of the trade-off can also differ between individuals,
based on their preferences for what the optimal virtual cloud sky looks like.


5.2 Limitations of the algorithm

Although the algorithm works and can visualize clouds, it still has some limitations. First of all, as
the trade-off described earlier shows, the number of particles the algorithm can simulate is limited
by the performance. The performance also limits how distinct the details of the clouds are. How the
performance can be improved is described in Section 5.3.

As mentioned earlier, all the clouds are shaped from the two cloud styles shown in Figure 4, where
the cloud styles are randomly rotated for each cloud to make the clouds differ in shape, such that the
cloud sky appears more varied. If the computational performance were not a limitation, there would
be no upper limit on how many clouds could be simulated. Then, for an increasing number of
simulated clouds, the cloud sky would probably begin to appear repetitive once the random rotations
of the cloud styles had been reused multiple times. To prevent the clouds from appearing similar as
the number of simulated clouds grows, more cloud styles can be implemented to reduce the chance
of a rotation being reused.

All simulated clouds move in the same direction and with the same velocity. Depending on individual
preferences, the motion can appear too uniform and unrealistic. This can be improved by adding some
variance and height dependence to the direction and velocity of the cloud motion. However, even
though the motion is simple, it still contributes to a more alive and realistic simulation compared to
completely static clouds. The motion of the clouds is simple largely because there was not enough
time to make it more advanced. The motion could, for example, be extended to simulate gravity by
rotating the direction of motion to always be parallel with the Earth; then the motion would not have
to be restored after a few minutes. However, rotating the motion also implies that all particles need
to be rotated to ensure that the flat bottom of the clouds always stays parallel with the Earth, and
such rotations would degrade the real-time performance.
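The rotation discussed above amounts to keeping the velocity in the plane tangent to a spherical Earth. A minimal sketch (hypothetical names; not part of the thesis implementation) projects the velocity onto that tangent plane by removing its component along the surface normal:

```python
def tangent_velocity(position, velocity):
    """Remove the component of `velocity` along the outward surface
    normal at `position`, leaving only motion parallel to the surface.
    The sphere (the Earth) is assumed to be centered at the origin."""
    norm2 = sum(c * c for c in position)
    dot = sum(v * c for v, c in zip(velocity, position))
    return tuple(v - dot * c / norm2 for v, c in zip(velocity, position))

# At the "north pole" (0, 0, 1), any vertical motion component is removed:
v = tangent_velocity((0.0, 0.0, 1.0), (1.0, 0.0, 1.0))  # -> (1.0, 0.0, 0.0)
```

Applying such a projection every frame would keep the clouds' motion parallel to the ground, at the cost of the extra per-particle rotations noted above.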

Another limitation of the developed algorithm appears when the clouds are colored, because each
cloud is colored independently of the other clouds. This implies that if two clouds align in a row
between the viewer and the sun, the clouds appear approximately equal in color. However, the cloud
closer to the viewer should be darker, because the intensity of the sunlight has already been decreased
in the cloud closest to the sun. Coloring the clouds dependent on other clouds could be implemented,
but it would also increase the time complexity and therefore affect the algorithm's performance. It
should also be mentioned that the particles in each cloud are colored independently of the other
particles due to the single scattering approximation. If the particles are to be colored dependent on
other particles, a new method for solving the light transport equation needs to be developed.

The algorithm only considers the direction of the sunlight and not the position of the sun when the
clouds are colored. This means that the clouds are always affected by the sun, wherever the sun is
positioned. For example, if the sun is on the other side of the Earth compared to a cloud, the cloud is
still affected by the sun, even though the Earth would block the sunlight from reaching the cloud.
This can be improved in future work by also defining the color of the clouds based on the position of
the sun and the distance between the sun and a cloud, such that the clouds appear darker when the
sun disappears behind the horizon, to simulate clouds during the night. This can be further extended
to simulate sunset and sunrise by letting the color of the clouds change from red-orange to white
depending on the sun's position.
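Such an extension could, for instance, interpolate the cloud tint with the sun's elevation. The sketch below is purely illustrative; the thresholds, color values, and function name are assumptions, not from the thesis:

```python
def cloud_tint(sun_elevation_deg):
    """Blend from a red-orange tint at the horizon to white high in the
    sky; use a dark tint when the sun is below the horizon (night)."""
    night = (0.2, 0.2, 0.25)
    sunset = (1.0, 0.5, 0.3)
    white = (1.0, 1.0, 1.0)
    if sun_elevation_deg <= 0.0:
        return night
    # linear blend, fully white above an assumed 20-degree elevation
    t = min(sun_elevation_deg / 20.0, 1.0)
    return tuple(s * (1.0 - t) + w * t for s, w in zip(sunset, white))
```

In the engine, this tint would modulate the result of the light transport evaluation rather than replace it.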

So far, only cumulus clouds have been simulated, so the algorithm is limited to that cloud type. In
future work, other cloud types could be implemented as well. Another limitation is that the clouds
do not cast shadows on the Earth's surface as real clouds do, so cloud shadows are also a feature to
be implemented. However, as mentioned earlier, implementing new features will also affect the
computational performance of the algorithm.

5.3 Future work

As mentioned earlier, there are suggestions for extending the algorithm to visualize clouds that
appear more realistic, such as sunset, sunrise, and cloud shadows. In addition to such features, future
work should also include optimizing the algorithm to increase its computational performance. Better
performance would make it possible to simulate more cloud particles without compromising the
real-time performance. Beyond that, better performance would also make it possible to run the
algorithm on hardware with less capacity than the computer used for testing in this project.

5.3.1 Optimize the initialization process

As mentioned before, running the initialization process can take from a few minutes up to an hour,
depending on how many clouds and how many particles per cloud are simulated. As the worst-case
complexity of the initialization process in Equation 19 states, the most dominant contributor to the
time complexity is the pre-calculation of the ray-distances. Therefore, reducing the time complexity
of the ray-distance calculations will decrease the complexity of the initialization process the most.
Optimizing any process other than the ray-distance calculations would have a negligible effect,
because the complexity of the ray-distance calculations dominates the initialization process.

Several possibilities to reduce the time complexity of the ray-distance calculations are presented in
the following paragraphs. First of all, the unit directions generated with the angle steps ∆ψ = ∆θ = 15°
are denser close to the poles of the spherical parameterization shown in Figure 6. At the poles, some
directions are approximately equal and therefore redundant. Developing a new method for generating
unit vectors that are more evenly distributed can reduce the number of unit directions, and thereby
the time complexity of the initialization process. However, the directions must still be created such
that they can be mapped to keys, so that the algorithm can look up the keys in the visualization
process.
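One common construction for evenly distributed directions is the Fibonacci sphere lattice, which avoids the clustering a (ψ, θ) grid produces near the poles. The sketch below is a suggestion, not the method used in the thesis:

```python
import math

def fibonacci_sphere(n):
    """Generate n roughly uniformly distributed unit directions; the
    index i doubles as a natural look-up key for each direction."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))
    directions = []
    for i in range(n):
        y = 1.0 - 2.0 * (i + 0.5) / n        # evenly spaced in (-1, 1)
        r = math.sqrt(1.0 - y * y)           # radius of the latitude ring
        phi = golden_angle * i               # spiral around the axis
        directions.append((r * math.cos(phi), y, r * math.sin(phi)))
    return directions

dirs = fibonacci_sphere(288)  # comparable count to a 15-degree (psi, theta) grid
```

A look-up from an arbitrary view direction to the nearest generated key would still be needed, for example via a nearest-neighbor search, which is the mapping constraint noted above.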

Another possibility for reducing the time complexity of the initialization process would be to reuse
the ray-distance calculations for each cloud style. This would require a more efficient mapping
between the rotation of a cloud and the look-up keys. This was tested during the project but was
unfortunately not successful. By achieving such a mapping, the ray-distance calculations would be
reusable for clouds with the same cloud style and would therefore only need to be performed for as
many cloud styles as the algorithm uses, currently two. This would constrain the time complexity of
the ray-distance calculations to the number of cloud styles in use. However, it would also add a
mapping between a rotation and a key, so the run-time performance would probably be degraded.

The complexity of the ray-distance calculations can be reduced even further by varying the
algorithm's step length as it iterates through the cloud. For example, the algorithm could initially
take large steps and, as it gets closer to the edge of the cloud, switch to shorter steps. This would
probably reduce the total number of iterations needed to calculate the ray-distance for each direction
and particle, and thereby the time complexity. Section 3.3.2 states that the step length used is 20%
of the radius of the particles, to ensure that the length of the steps scales with the size of the cloud
particles. This ratio of 20% could be investigated further to find a more optimal value. A lower ratio
makes the distance calculations more exact due to smaller steps but requires more iterations, whereas
a higher ratio requires fewer iterations but makes the distance more approximate due to larger steps.
The step length can thus be described as a trade-off between the time complexity and the correctness
of the ray-distance calculation. The ratio of 20% was determined through trial and error as a value
that produces the calculations in reasonable time and with reasonable correctness; a more optimal
value could be determined in future work.
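The coarse-then-fine stepping could look like the following sketch (hypothetical helper names; `inside` stands in for the thesis's test of whether a point lies within the cloud volume):

```python
def march_to_edge(inside, start, direction, coarse, fine):
    """Advance along `direction` from `start` with large steps while the
    next coarse step stays inside the cloud, then refine the remaining
    segment with small steps; returns the distance to the edge."""
    t = 0.0
    while inside(step(start, direction, t + coarse)):
        t += coarse
    while inside(step(start, direction, t + fine)):
        t += fine
    return t

def step(origin, direction, t):
    return tuple(o + d * t for o, d in zip(origin, direction))

# Toy volume: a sphere of radius 5 around the origin.
in_sphere = lambda p: p[0] ** 2 + p[1] ** 2 + p[2] ** 2 < 25.0
d = march_to_edge(in_sphere, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 1.0, 0.1)
```

Per ray, the coarse step could be several particle radii and the fine step the current 20% ratio, trading a few extra comparisons for far fewer total iterations.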

This project did not investigate whether it is possible to store the data from the initialization process
and reuse it in the visualization process. This would let the initialization run once instead of every
time the algorithm is run. For this to be possible, some things need to be investigated in future work,
for example the memory consumption of the stored data, the available memory on the hardware, and
whether loading the stored data is more efficient than running the initialization process.
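A straightforward way to investigate this, sketched here with Python's standard pickle module (the file name and data layout are assumptions), is to serialize the pre-computed data once and reload it on later runs:

```python
import os
import pickle

CACHE_FILE = "ray_distances.pkl"  # hypothetical cache location

def load_or_initialize(initialize):
    """Return cached initialization data if a cache file exists;
    otherwise run the slow `initialize` once and store its result."""
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, "rb") as f:
            return pickle.load(f)
    data = initialize()
    with open(CACHE_FILE, "wb") as f:
        pickle.dump(data, f)
    return data
```

Whether this wins depends on the size of the stored ray-distances and how fast the target hardware reads from storage, which is exactly what the text above suggests measuring.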

5.3.2 Optimize the visualization process

The visualization process can also be optimized to increase its performance. As the worst-case
complexity of the visualization process in Equation 20 states, it is the sorting of the particles that
contributes the most to the time complexity. The algorithm uses insertion sort to order the particles
and the clouds so that they are displayed back to front from the viewer's point of view. By instead
using merge sort, the time complexity of the sorting would be reduced from O(m²) to O(m log(m)),
as Table 1 states. In turn, this would improve the entire time complexity of the visualization process.
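As a sketch of the improved ordering (not the engine's code), the particles can be sorted by distance from the viewer with any O(m log m) comparison sort; Python's built-in `sorted` is used below as a stand-in for merge sort, since it has the same complexity class:

```python
def back_to_front(particles, viewer):
    """Order particle positions by decreasing squared distance to the
    viewer, so the farthest particles are drawn first."""
    def dist2(p):
        return sum((a - b) ** 2 for a, b in zip(p, viewer))
    return sorted(particles, key=dist2, reverse=True)  # O(m log m)

order = back_to_front([(0, 0, 1), (0, 0, 5), (0, 0, 3)], viewer=(0, 0, 0))
# order -> [(0, 0, 5), (0, 0, 3), (0, 0, 1)]
```

Using the squared distance as the sort key avoids a square root per particle without changing the resulting order.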

For each frame, the algorithm performs calculations for every simulated cloud, even when a cloud
is outside the viewer's view. These calculations for clouds outside the view are unnecessary. An
additional routine that visualizes fewer or no particles when clouds are far away from the viewer or
outside the visible view can reduce the complexity, making it possible to simulate even more clouds.
A similar routine that instead simulates more particles when the viewer is close to a cloud can reduce
the number of visible individual particles, as Figure 14 shows. However, simulating more particles
will always compromise the computational performance.
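Such a level-of-detail routine could be sketched as below; the distance thresholds, units, and the `in_view` culling test are assumptions for illustration, not values from the thesis:

```python
def particle_budget(cloud_center, viewer, in_view, base_count,
                    near=1000.0, far=20000.0):
    """Skip clouds outside the view, draw the full particle count up
    close, and fade the count linearly to zero with distance."""
    if not in_view(cloud_center):
        return 0
    d = sum((a - b) ** 2 for a, b in zip(cloud_center, viewer)) ** 0.5
    if d <= near:
        return base_count
    if d >= far:
        return 0
    return int(base_count * (1.0 - (d - near) / (far - near)))
```

Evaluating this once per cloud per frame is cheap compared to drawing hundreds of particles for a cloud the viewer cannot see.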

The visualization process can also be optimized by displaying each cloud's volume as one uniform
mesh structure, which forms the shape of the cloud, instead of as many spherical meshes. A particle
system can then be used as sample points, and based on those points the uniform structure can be
colored. This can increase the frame rate because only one object is drawn on the screen for each
cloud, instead of many objects in the form of particles. This method would also remove the visibility
of individual particles when the viewer is close to a cloud, as shown in Figure 14. However, for this
to be possible, a method to construct a uniform mesh structure that resembles a cloud must be
developed.

In general, several parameters are defined in Section 3.3, for example the intensity of the sunlight,
the intensity of the atmosphere light, the opacity value, and the albedo value. For all these parameters,
there are underlying reasons why they are selected as they are, which is motivated in Section 3.3.
During the development of the algorithm, these parameters were varied, and those presented in
Section 3.3 are the ones that produced a reasonable and realistic visualization. However, in future
work with more available time, the parameters could be varied further to investigate whether some
combination results in an even more realistic visualization.

5.3.3 Improving the performance study

Even though the result of the performance study in Section 4.1 is reliable, it could still be improved
in future work. First of all, performance studies could be made on hardware with better and worse
capacity than the computer used in this project, to investigate how the algorithm would perform on
such hardware. In addition, the average time of running the initialization process was only measured
five times for each cloud constellation. The number of measurements could be extended in future
work to see if the result would change. However, the measured times were consistent during the
study, so the number of measurements was kept low to save time. This saved time was instead used
for studying the real-time performance, namely the visualization process, where 500–3000 data
points were sampled for each cloud constellation. More focus was put on studying the visualization
process because the real-time performance is more connected to the project's goal than the
initialization process.

6 Conclusion
In this project, an algorithm to visualize 3D cumulus clouds has been developed. The algorithm is a
combination of a procedural approach and a physics-based approach: the shape and the motion of
the clouds are procedural, while the coloring of the clouds is physics-based. The algorithm can
visualize objects in the sky that resemble cumulus clouds. The clouds move slowly across the sky in
a uniform motion to make the visualization appear more alive and realistic. The clouds also change
their brightness depending on the direction of the incident sunlight. The developed algorithm can be
described as a trade-off between computational performance and the quality of the visual results:
high visual quality costs computational performance, while low visual quality allows high
computational performance. To conclude, with a few minutes of initialization, the algorithm has
proven able to visualize virtual cloud scenes at frame rates between 24 and 60 FPS, which can
correspond to real-time performance.
