Modelling and Simulating Complex Projectile Motion

A. Joshi and S. Alam

Abstract

1 Intercontinental Projectile Modelling

1.1 Tools to begin with

Before starting a project such as this, it is imperative to take inventory of the tools we have at our disposal before building a solution to the problem statement.

1.1.1 SFML

As with our previous applications, we decided to use SFML as our primary graphics library. We kept using SFML to finish this project faster, and to prevent ourselves from getting bogged down in the details of learning a new library.

Since we were not using OpenGL directly, we render purely on the CPU side, without the help of the GPU for rasterization. This is a trade-off we were willing to make, as we were not looking to make a game, but rather a simulation.

1.1.2 ImGUI

As with SFML, we already had the code infrastructure to use ImGUI, and we decided to use it for the same reasons as SFML.

1.1.3 GLM

The only new addition to our toolset was GLM, which we chose for its vector and matrix operations. Had we written our own matrix and vector classes, we would not have been able to optimise our code as well as GLM does, which would have led to inefficiencies, especially since we are not using the GPU for these calculations.

1.2 Algorithm

1.2.1 Motivation

Our intention is to create a projectile launcher that works on a model of the Earth. We can decompose this problem into the following steps:

1. Be able to render a sphere
2. Be able to map a texture on to the sphere
3. Be able to draw a point on the sphere (as the launch point)
4. Be able to compute and draw the trajectory of the projectile
5. Be able to account for the rotation of the Earth

6. Be able to animate the projectile

1.2.2 Rendering a Sphere

To render a sphere, we must note that we have no way to render one directly (since we do not have access to OpenGL, and so cannot use gluSphere()). SFML does, however, provide a method for rendering polygons, so we thought of rendering a sphere in terms of polygons.

To do so, we must first introduce the idea of triangle subdivision. The idea is to take a triangle and divide it into smaller triangles (as the name would suggest). There are multiple ways of doing this (see [3] for more), and the general reasoning behind it is that we can obtain a better, more refined representation of any polygon without having to store extra information¹. See in figure 1 how, by subdividing a tetrahedron, we can approximate a smooth surface. Usually a few iterations of this process suffice to give a good approximation of the limit surface.

Figure 1: Image shows the tetrahedron being subdivided 0, 1, 2, 6 times

Figure 2: shows the subdivision process on an equilateral triangle

The second idea we must introduce is normalisation with respect to a set distance. Normally, normalisation preserves the direction of a vector but scales it such that its magnitude is 1. Our normalisation is a bit different, however, because we do not end up with magnitude 1, but rather with a magnitude of a set distance.

Here is a two-dimensional example of normalisation with respect to a distance: figure 3 shows two points, A and B, and the line drawn between them. Currently, the distance between A and B is 6 units; suppose, however, that one were tasked to find a point on the line AB that is 12 units away from A (shown in figure 4 as point C). More generally, we can say that this point C will always be collinear with A and B, but is not necessarily on the line segment AB.

Figure 3: shows two points, A and B, and the line drawn between them

Figure 4: shows the point C, which is 12 units away from A

Staying with the two-dimensional story: if we were to draw a set of points P that all lay on a straight line not going through the origin, and we were to normalise them in reference to the origin with a certain distance d, we would construct an arc of a circle with radius d, since all this exercise is, is drawing a set of points on a circle with radius d. It is then trivial to prove that the same would hold in three dimensions².

The reason to go through such an exercise is to realise that we can start with an octahedron and then subdivide it, yielding points on straight lines. We can then normalise these points to get points on a sphere, and obviously we can also control the radius of such a sphere. To keep things simple, we use an octahedron, because it is comprised of 8 equilateral triangles, which are trivial to subdivide.

Now that we have our points that we can render, we need to somehow convert these from 3D to 2D, so that we can render them on to the screen. This is where GLM does most of the heavy lifting, in that we do not have to manually construct the equations for this, but can leave it up to GLM to do it for us.

¹ Though we obviously take a memory and computation penalty for this, we can achieve a smooth limit surface.
² This is an exercise left to the reader.

The Rendering Pipeline

Throughout this procedure, we will be using 4D vectors (x, y, z, w) and 4×4 matrices. The reason for this is that we can use the fourth dimension to store information about the vector. I would recommend using [2] as a guide to further understanding the intricate process defined here. This being said:

• If w = 0, then the vector is a direction vector
• If w = 1, then the vector is a point

To begin, all of the points that describe a sphere are relative to the origin (obviously); however, this origin is not necessarily the origin of the world (the points are rather relative to the origin of the model). To make them relative to the world, we can apply a model matrix transformation. The model matrix consists of:

• A translation matrix – which describes the position of the object in the world relative to the origin of the world.

• A rotation matrix – which describes the orientation of the object in the world relative to the basis vectors of the world.

• A scaling matrix – which describes the size of the object in the world relative to the basis vectors of the world.

After applying the model matrix, our coordinates are in world space (points are defined relative to the origin of the world).

Quote from Futurama:

    'The engines don't move the ship at all. The ship stays where it is and the engines move the universe around it'

For example, if you want to view a mountain from a different angle, you can either move the camera or move the mountain. Whilst not practical in real life, the latter is easier and simpler in CG than the former. Initially, your camera is at the origin of the world space; if you want to move your camera 3 units to the right, this is equivalent to moving the entire world 3 units to the left instead. Mathematically, this amounts to describing everything in terms of the basis vectors defined relative to the camera, rather than in world space. This is the idea behind the view matrix.

Now that we are in camera space, we can start to project our points onto the screen. This is done by the projection matrix. We obviously have to use the x and y coordinates of the points to determine where to place our points on the screen; however, we must also use the z coordinate to determine which point should be drawn in front of the other. The projection matrix converts the frustum of the camera to a cube, and then scales the cube to the screen. Figure 5 shows the steps described here. Once our coordinates have been projected onto the screen, we can render them using SFML: this is done by creating a vertex array and filling it with the projected points.

<<Get UV coordinate for a point xyz>>
<<Rendering a Sphere>>=
<<Get subdivided octahedron>>
<<Map the octahedron onto a sphere>>

sf::Texture texture = sf::Texture();
for (int i = 0; i < triangles.size(); i++) {
    glm::vec3 v1 = triangles[i].v1;
    glm::vec3 v2 = triangles[i].v2;
    glm::vec3 v3 = triangles[i].v3;

    glm::vec4 p1 = MVP * glm::vec4(v1, 1.0f);
    glm::vec4 p2 = MVP * glm::vec4(v2, 1.0f);
    glm::vec4 p3 = MVP * glm::vec4(v3, 1.0f);
    sf::VertexArray triangle(sf::Triangles, 3);
    triangle[0].position = sf::Vector2f(p1.x, p1.y);
    triangle[1].position = sf::Vector2f(p2.x, p2.y);
    triangle[2].position = sf::Vector2f(p3.x, p3.y);

    <<Set UV coordinates>>
    window.draw(triangle, &texture);
}

Figure 5: shows the steps taken to get screen coordinates

1.2.3 Mapping a texture onto the sphere

After the arduous task of getting the triangles we want onto the screen, we can move on to mapping a texture onto the sphere. To do so, we must introduce the idea of uv coordinates. These coordinates specify a location on a 2D source image (or in some 2D parameterized space). We need to find a mapping of a point from a 3D surface (in this case of a sphere) onto uv coordinates.

uv coordinates are defined in the range [0, 1], where u is the horizontal coordinate and v is the vertical coordinate. This range allows them to be used with any texture, regardless of size, since they are relative to the size of the texture.

For spheres, surface coordinates are defined in terms of two angles θ and ϕ, where θ measures the angle made between the y axis and the point, and ϕ is the angle about the y axis³. To begin with, then⁴:

y = −cos(θ)
x = −cos(ϕ) sin(θ)
z = sin(ϕ) sin(θ)

From this we can infer that:

θ = arccos(−y)
ϕ = atan2(z, −x)

where atan2 is the four-quadrant inverse tangent function. This returns values in the range [−π, π]; however, these values go from 0 to π, then flip to −π, proceeding back to 0. While mathematically correct, this cannot be used to map uv coordinates, since we want a smooth transition from 0 to 1.

Fortunately,

atan2(a, b) = atan2(−a, −b) + π

This formulation gives values in the desired smooth range of [0, 2π], therefore

ϕ = atan2(z, −x) + π

Since we have our θ and ϕ values, we can now convert them to uv coordinates. This is done by:

u = ϕ / 2π
v = θ / π

Now that we have our uv coordinates, SFML provides a method of interpolating between the coordinates defined at the vertices of the triangles, so we need not worry about interpolating the uv coordinates ourselves:

³ Annoyingly, many textbook definitions of ϕ and θ are not only swapped, but the axes of measurement are also changed. We consider the "poles" of our sphere to lie on the y axis, whereas many textbooks take the "poles" to be on the z axis, which changes the equations in a subtle, yet frustrating to debug, manner.
⁴ Assuming a unit sphere.

<<Get UV coordinate for a point xyz>>=
glm::vec2 getUV(glm::vec3 xyz) {
    float theta = acos(-xyz.y);
    float phi = atan2(xyz.z, -xyz.x) + M_PI;
    return glm::vec2(phi / (2 * M_PI), theta / M_PI);
}

<<Set UV coordinates>>=
glm::vec2 uv1 = getUV(v1);
glm::vec2 uv2 = getUV(v2);
glm::vec2 uv3 = getUV(v3);

triangle[0].texCoords = sf::Vector2f(uv1.x, 1 - uv1.y);
triangle[1].texCoords = sf::Vector2f(uv2.x, 1 - uv2.y);
triangle[2].texCoords = sf::Vector2f(uv3.x, 1 - uv3.y);

Interestingly, we have had to reverse our v coordinate. This is because SFML reads texture coordinates from the top-left corner, rather than the bottom-left corner. This is a common convention in computer graphics, and is something to be aware of.

1.2.4 Drawing a point on the sphere

We decided that the user would be allowed to select a launch point (as this point would act as the starting point for our projectile). The easiest way for the user to do this was to select latitude and longitude values, as these are the most intuitive.

The process from here is simply the inverse of the process described above. Also note that in our model, latitude/longitude (0, 0) is the point (0, 0, −1).

We can derive these equations by realising that, since our sphere revolves around the y axis, only the latitude component will affect our final y coordinate. Since our sphere is also centered at the origin, we can conclude that our y coordinate will be the sine of the latitude.

By the same logic, we can infer that the x and z coordinates will scale with the cosine of the latitude, since the x and z coordinates are the projection of the latitude onto the xz plane. The longitude will affect the x and z coordinates, since the longitude is the angle about the y axis: the x coordinate will carry the cosine of the longitude, and the z coordinate the sine of the longitude. Therefore the equations are:

y = sin(latitude)
x = cos(latitude) cos(longitude)
z = cos(latitude) sin(longitude)

1.2.5 Computing and drawing the trajectory of the projectile

We gave the user the option to select these configuration items for the projectile:

• Latitude
• Longitude
• Launch velocity
• Launch angle (cardinal)
• Elevation angle

The latitude and longitude are easy to understand, and the launch velocity is the speed at which the projectile is launched. The launch angle is the angle at which the projectile is launched, with reference to the Westerly direction. The elevation angle is the angle at which the projectile is launched, with reference to the horizon.

To visualise the last 3 parameters properly, suppose a local coordinate system where the normal to the sphere is (by definition) orthogonal to the two
other basis vectors of the coordinate system. We can define the z′ axis to be the normal to the sphere, and the x′ axis to be the basis vector 'facing' the Westerly direction. The y′ axis is then the basis vector facing the Northerly direction⁵.

⁵ This is to say that x′ and y′ are a proportional representation of the x and y axes of the world space (since our sphere's poles are through the y axis).

We can create a local coordinate system by applying the cross product two times to the normal of the sphere. The implementation we used is a derivation of the one defined in [4].

<<Get local coordinate system>>=
void CoordinateSystem(const glm::vec3 &v1, glm::vec3 *v2, glm::vec3 *v3)
{
    *v2 = glm::vec3(-v1.z, 0, v1.x) / std::sqrt(v1.x * v1.x + v1.z * v1.z);
    *v3 = glm::cross(v1, *v2);
}

From this, we can define all possible directions in which the projectile can be thrown as a hemisphere, with radius equal to the launch velocity. Further, we can define the launch angle to be the 'longitude' and the elevation angle to be the 'latitude' of this hemisphere. We can then use the formulation given in [5] to find the components of the velocity vector with reference to the local coordinate system⁶:

vx = v cos(elevation) cos(launch)
vy = v cos(elevation) sin(launch)
vz = v sin(elevation)

⁶ Note that vz uses the sine of the elevation, rather than the cosine of the polar angle as in [5]. This is because we want zero elevation to be the horizon, and not the zenith.

Now that we know the velocity vector of the projectile, we can use Verlet integration to find the position of the projectile at any given time. We can infer the direction in which gravity will act, since it will point into the center of the sphere. Since our sphere is centered at (0, 0, 0), we know that the direction is simply the negative of the position vector of the projectile.

Next we can calculate the acceleration due to gravity, using the formula:

a = GM / r²

One inaccuracy of our model is that our mass of the Earth (or Mars, or the Moon) M is scaled, and not accurate to the real mass of the planet. Our scaled mass was calculated as:

M = g r² / G

where g is the acceleration due to gravity on the surface of the planet, r is the radius of the planet (in our scaled version), and G is the gravitational constant. We can then calculate the acceleration due to gravity as:

float distanceFromCenter =
    glm::distance(glm::vec3(0, 0, 0), xyzPosition);
g = launchControlSettings.bigG * planetMass /
    (distanceFromCenter * distanceFromCenter);

acceleration = -g * difference;

This would integrate into the rest of the code as follows:

float g = launchControlSettings.bigG * planetMass /
    (launchControlSettings.radius * launchControlSettings.radius);
glm::vec3 difference = glm::normalize(xyzPosition);
glm::vec3 acceleration = -g * difference;

int numPoints = 0;
int maxPoints = 1000;
float dt = 0.001f;

while (glm::distance(glm::vec3(0, 0, 0), xyzPosition) >=
           launchControlSettings.radius
       && numPoints < maxPoints) {
    // update our position
    xyzPosition += xyzVelocity * dt + 0.5f * acceleration * dt * dt;

    // adjust our position based on rotation
    xyzPosition = adjustforRotation(xyzPosition,
        launchControlSettings.angularVelocity, dt, numPoints);

    // update our velocity
    xyzVelocity += acceleration * dt;

    // update our acceleration: gravity acts along the normalised
    // difference between the position and the center of the earth
    difference = glm::normalize(xyzPosition);
    float distanceFromCenter =
        glm::distance(glm::vec3(0, 0, 0), xyzPosition);
    g = launchControlSettings.bigG * planetMass /
        (distanceFromCenter * distanceFromCenter);
    acceleration = -g * difference;

    points.push_back(xyzPosition);
    numPoints++;
}

Note how we keep track of the number of points, as we do not want to calculate the trajectory of the projectile indefinitely, causing memory and computation issues later on (plus, SFML's draw calls for more than 1000 points are not the most efficient).

1.2.6 Accounting for the rotation of the Earth

We account for the rotation of the Earth by shifting each point on the projectile by the same amount that the Earth has rotated in the time taken for the projectile to reach that point. This is done by:

// This function calculates the current spherical coordinates of the
// projectile, and takes away some component with respect to the
// angular velocity
glm::vec3 adjustforRotation(glm::vec3 currentPos, float angularVel,
                            float dt, int pointIndex) {
    float length = glm::length(currentPos);
    currentPos = glm::normalize(currentPos);
    float theta = std::acos(-currentPos.y) - glm::pi<float>() / 2.f;
    float phi = std::atan2(-currentPos.z, currentPos.x);
    phi -= angularVel * dt * pointIndex;
    return
        getCartesian(glm::degrees(theta), glm::degrees(phi), 1) * length;
}

It is good to note that we could also have simply moved the landing position of the projectile by the same amount the Earth had rotated (as described in [1]), as an alternative method of accounting for the rotation of the Earth.

1.2.7 Animating the projectile

The most trivial process of the entire algorithm is the animation of the projectile. This is done by dividing a fixed time limit (e.g. 5 seconds) by the number of points in the projectile path, and then waiting for that per-point interval to elapse before moving on to the next point.

float timePerPoint = 5.f / projectilePath.size();
if (animatationClock.getElapsedTime().asSeconds() >= timePerPoint)
{
    currentAnimatedPoint++;
    if (currentAnimatedPoint >= projectilePath.size()) {
        launchControlSettings.isAnimated = false;
        currentAnimatedPoint = 0;
    }
    animatationClock.restart();
}

for (int i = 0; i < currentAnimatedPoint; i++)
{
    glm::vec3 transformedPoint =
        glm::vec3(mvp * glm::vec4(projectilePath[i], 1.0f));

    transformedPoint.x = (transformedPoint.x + 1.0f) * 0.5f * RENDER_WIDTH;
    transformedPoint.y = (transformedPoint.y + 1.0f) * 0.5f * RENDER_HEIGHT;

    sf::CircleShape circle(2);
    if (transformedPoint.z > 4.8) {
        circle.setFillColor(sf::Color::Red);
    } else {
        circle.setFillColor(sf::Color::Magenta);
    }
    circle.setPosition(transformedPoint.x, transformedPoint.y);
    window.draw(circle);
}

1.3 Results

Our model is a good approximation of the real world, and can be used to simulate the trajectory of a projectile on the Earth, Mars, and the Moon. The model is not perfect, and there are some inaccuracies, such as the scaled masses of the planets, and the fact that our model accounts for neither the Earth's true shape nor its atmosphere. One big problem with our model is that there seems to be artefacting of the texture on the sphere. We believe this to be an issue with SFML's texture interpolation algorithm, and we are not sure how to fix it. We have tried increasing the resolution of the sphere, but this has not fixed the issue.
References

[1] Department of Applied Mathematics and Theoretical Physics. "Deflection of a Projectile due to the Earth's Rotation". url: https://www.damtp.cam.ac.uk/user/reh10/lectures/ia-dyn-handout14.pdf.

[2] Essence of Linear Algebra. http://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab. (Visited on 08/09/2024).

[3] Jos Stam. "Evaluation of Loop Subdivision Surfaces". In: 2010. url: https://api.semanticscholar.org/CorpusID:8420692.

[4] Vectors. https://pbr-book.org/3ed-2018/Geometry_and_Transformations/Vectors#Coordina. (Visited on 08/12/2024).

[5] Eric W. Weisstein. Spherical Coordinates. https://mathworld.wolfram.com/SphericalCoordinates.html. (Visited on 08/12/2024).
