Extension Writeup

A. Joshi and S. Alam
6. Be able to animate the projectile

1.2.2 Rendering a Sphere

To render a sphere, we must note that we have no way to render one directly (since we do not have access to OpenGL, and so cannot use gluSphere()). SFML does, however, provide a way for us to render polygons, so we thought of rendering a sphere in terms of polygons.

To do so, we must first introduce the idea of triangle subdivision. The idea is to take a triangle and divide it into smaller triangles (as the name would suggest). There are multiple ways of doing this (see [3] for more), and the general reasoning behind it is that we can obtain a better, more refined representation of any polygon without having to store extra information¹. Figure 1 shows how, by subdividing a tetrahedron, we can approximate a smooth surface. Usually a few iterations of this process suffice to give a good approximation of the limit surface.

The second idea we must introduce is normalisation with respect to a set distance. Normally, normalisation preserves the direction of a vector but scales it such that its magnitude is 1. Our normalisation is a bit different, however, because we don't end up with a magnitude of 1, but rather a magnitude equal to a set distance.

Here is a two-dimensional example of normalisation with respect to a distance: figure 3 shows two points, A and B, and the line drawn between them. Currently, the distance between A and B is 6 units; if one were tasked to find a point on the line AB that is 12 units away from A, one would arrive at point C in figure 4. More generally, we can say that this point C will always be colinear with A and B, but isn't necessarily on the line segment AB.

Staying with the two-dimensional story, if we were to draw a set of points P that all lie on a straight line not going through the origin, and we were to normalise them with reference to the origin with a certain distance d, we would construct an arc of a circle with radius d, since all this exercise amounts to is drawing a set of points on a circle of radius d. It is then trivial to prove that the same would hold in three dimensions².

The reason to even go through such an exercise is to realise that we can start with an octahedron and subdivide it, yielding points that lie on its flat faces (the straight lines of the two-dimensional story). We can then normalise these points to get points on a sphere, and obviously we can also control the radius of such a sphere. To keep things simple, we use an octahedron because it is comprised of 8 equilateral triangles, which are trivial to subdivide.
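To make this construction concrete, here is a minimal sketch of one subdivision step and of the "normalise to a set distance" operation that pushes the vertices out onto a sphere of a chosen radius. The Triangle struct with members v1, v2 and v3 mirrors the triangles used later in <<Rendering a Sphere>>, but the function names and layout here are our own illustrative assumptions, not the writeup's actual chunks.

struct Triangle { glm::vec3 v1, v2, v3; };

// One subdivision step: split every triangle into four using its edge midpoints.
std::vector<Triangle> subdivide(const std::vector<Triangle> &in) {
    std::vector<Triangle> out;
    for (const Triangle &t : in) {
        glm::vec3 a = (t.v1 + t.v2) * 0.5f;
        glm::vec3 b = (t.v2 + t.v3) * 0.5f;
        glm::vec3 c = (t.v3 + t.v1) * 0.5f;
        out.push_back({t.v1, a, c});
        out.push_back({a, t.v2, b});
        out.push_back({c, b, t.v3});
        out.push_back({a, b, c});
    }
    return out;
}

// "Normalisation with respect to a distance": keep each vertex's direction,
// but scale it so that its magnitude equals the sphere's radius.
void projectOntoSphere(std::vector<Triangle> &triangles, float radius) {
    for (Triangle &t : triangles) {
        t.v1 = glm::normalize(t.v1) * radius;
        t.v2 = glm::normalize(t.v2) * radius;
        t.v3 = glm::normalize(t.v3) * radius;
    }
}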
Now that we have our points that we can render, we need to somehow convert them from 3D to 2D so that we can render them onto the screen. This is where GLM does most of the heavy lifting, in that we don't have to construct the equations for this manually, but can leave it up to GLM to do it for us.

The Rendering Pipeline

Throughout this procedure we will be using 4D vectors (x, y, z, w) and 4×4 matrices. The reason for this is that we can use the fourth dimension to store information about the vector. I would recommend using [2] as a guide to further understanding the intricate process defined here.

¹ Though we obviously take a memory and computation penalty for this, we can achieve a smooth limit surface.
² This is an exercise left to the reader.
Figure 1: The tetrahedron being subdivided 0, 1, 2 and 6 times.
The convention, illustrated by the short example below, is:

• If w = 0, then the vector is a direction vector.
• If w = 1, then the vector is a point.
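As a small illustration of what this convention buys us (a hypothetical snippet, not taken from the writeup): a translation moves a point, but leaves a direction unchanged.

glm::mat4 T = glm::translate(glm::mat4(1.0f), glm::vec3(3.0f, 0.0f, 0.0f));

glm::vec4 point     = glm::vec4(1.0f, 2.0f, 0.0f, 1.0f);  // w = 1: a position
glm::vec4 direction = glm::vec4(1.0f, 2.0f, 0.0f, 0.0f);  // w = 0: a direction

glm::vec4 movedPoint     = T * point;      // (4, 2, 0, 1): the point is translated
glm::vec4 movedDirection = T * direction;  // (1, 2, 0, 0): the direction is unaffected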
To begin, all of the points that describe a sphere are relative to the origin (obviously); however, this origin is not necessarily the origin of the world, but rather the origin of the model. To make the points relative to the world, we can apply a model matrix transformation. The model matrix consists of:

• A translation matrix – which describes the position of the object in the world relative to the origin of the world.

• A rotation matrix – which describes the orientation of the object in the world relative to the basis vectors of the world.

• A scaling matrix – which describes the size of the object in the world relative to the basis vectors of the world.

After applying the model matrix, our coordinates are in world space (points are defined relative to the origin of the world). To quote Futurama:

'The engines don't move the ship at all. The ship stays where it is and the engines move the universe around it.'

For example, if you want to view a mountain from a different angle, you can either move the camera or move the mountain. Whilst not practical in real life, the latter is easier and simpler in CG than the former. Initially, your camera is at the origin of world space; if you want to move your camera 3 units to the right, this is equivalent to moving the entire world 3 units to the left instead. Mathematically, this amounts to describing everything in terms of the basis vectors defined relative to the camera, rather than in world space. This is the idea behind the view matrix.

Now that we are in camera space, we can start to project our points onto the screen. This is done by the projection matrix. We obviously have to use the x and y coordinates of the points to determine where to place them on the screen, but we must also use the z coordinate to determine which point should be drawn in front of another. The projection matrix converts the frustum of the camera into a cube, and then scales the cube to the screen. Figure 5 shows the steps described here. Once our coordinates have been projected onto the screen, we can render them using SFML; this is done by creating a vertex array and filling it with the points we have projected onto the screen.

Figure 5: The steps taken to get screen coordinates.
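The rendering chunk below multiplies every vertex by an MVP matrix. As a rough sketch of how the model, view and projection matrices described above might be assembled with GLM (RENDER_WIDTH and RENDER_HEIGHT are the render-target dimensions that appear later in the writeup; the sphere position, rotation angle, camera position and field of view are illustrative placeholders, not values taken from the writeup):

// Illustrative placeholders (not from the writeup).
glm::vec3 spherePosition = glm::vec3(0.0f, 0.0f, 0.0f);
glm::vec3 cameraPosition = glm::vec3(0.0f, 0.0f, 5.0f);
float rotationAngle = 0.0f;
float sphereRadius  = 1.0f;

// Model: scale, then rotate, then translate the sphere into world space.
glm::mat4 model = glm::translate(glm::mat4(1.0f), spherePosition)
                * glm::rotate(glm::mat4(1.0f), rotationAngle, glm::vec3(0, 1, 0))
                * glm::scale(glm::mat4(1.0f), glm::vec3(sphereRadius));

// View: describe the world relative to the camera ("move the universe, not the ship").
glm::mat4 view = glm::lookAt(cameraPosition,
                             glm::vec3(0.0f, 0.0f, 0.0f),   // look at the origin
                             glm::vec3(0.0f, 1.0f, 0.0f));  // up is +y

// Projection: map the camera's frustum to a cube (clip space).
glm::mat4 projection = glm::perspective(glm::radians(45.0f),
                                        float(RENDER_WIDTH) / float(RENDER_HEIGHT),
                                        0.1f, 100.0f);

glm::mat4 MVP = projection * view * model;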
<<Get UV coordinate for a point xyz>>

<<Rendering a Sphere>>=
<<Get subdivided octahedron>>
<<Map the octahedron onto a sphere>>

sf::Texture texture = sf::Texture();
for (int i = 0; i < triangles.size(); i++) {
    glm::vec3 v1 = triangles[i].v1;
    glm::vec3 v2 = triangles[i].v2;
    glm::vec3 v3 = triangles[i].v3;

    glm::vec4 p1 = MVP * glm::vec4(v1, 1.0f);
    glm::vec4 p2 = MVP * glm::vec4(v2, 1.0f);
    glm::vec4 p3 = MVP * glm::vec4(v3, 1.0f);
    sf::VertexArray triangle(sf::Triangles, 3);
    triangle[0].position = sf::Vector2f(p1.x, p1.y);
    triangle[1].position = sf::Vector2f(p2.x, p2.y);
    triangle[2].position = sf::Vector2f(p3.x, p3.y);

    <<Set UV coordinates>>

    window.draw(triangle, &texture);
}

1.2.3 Mapping a texture onto the sphere

After the arduous task of getting the triangles we want onto the screen, we can now move on to the task of mapping a texture onto the sphere. To do so, we must introduce the idea of uv coordinates. These coordinates specify a location in a 2D source image (or in some 2D parameterised space). We need to find a mapping of a point on a 3D surface (in this case a sphere) onto uv coordinates.

uv coordinates are defined in the range [0, 1], where u is the horizontal coordinate and v is the vertical coordinate. This range allows them to be used with any texture, regardless of its size, since they are relative to the size of the texture.

For spheres, surface coordinates are defined in terms of two angles θ and ϕ, where θ measures the angle made between the y axis and the point, and ϕ is the angle about the y axis³. To begin with, then⁴:

y = −cos(θ)
x = −cos(ϕ) sin(θ)
z = sin(ϕ) sin(θ)

From this we can infer that:

θ = arccos(−y)
ϕ = atan2(z, −x)

where atan2 is the four-quadrant inverse tangent function. It returns values in the range [−π, π]; however, these values go from 0 to π, then flip to −π, proceeding back to 0. While mathematically correct, this cannot be used to map uv coordinates, since we want a smooth transition from 0 to 1. Fortunately,

atan2(a, b) = atan2(−a, −b) + π

This formulation gives values in the desired smooth range of [0, 2π]; therefore

ϕ = atan2(z, −x) + π

Since we have our θ and ϕ values, we can now convert them to uv coordinates. This is done by:

u = ϕ / (2π)
v = θ / π

Now that we have our uv coordinates, SFML provides interpolation between the coordinates defined at the vertices of the triangles, so we need not worry about the interpolation of the uv coordinates ourselves:

³ Annoyingly, many textbook definitions of ϕ and θ are not only swapped, but the axes of measurement are also changed: we consider the "poles" of our sphere to lie on the y axis, whereas many textbooks consider the "poles" to lie on the z axis, which changes the equations in a subtle, yet frustrating-to-debug, manner.
⁴ Assuming a unit sphere.
<<Get UV coordinate for a point xyz>>=
glm::vec2 getUV(glm::vec3 xyz) {
    float theta = acos(-xyz.y);
    float phi = atan2(xyz.z, -xyz.x) + M_PI;
    return glm::vec2(phi / (2 * M_PI), theta / M_PI);
}

<<Set UV coordinates>>=
glm::vec2 uv1 = getUV(v1);
glm::vec2 uv2 = getUV(v2);
glm::vec2 uv3 = getUV(v3);

triangle[0].texCoords = sf::Vector2f(uv1.x, 1 - uv1.y);
triangle[1].texCoords = sf::Vector2f(uv2.x, 1 - uv2.y);
triangle[2].texCoords = sf::Vector2f(uv3.x, 1 - uv3.y);

Interestingly, we have had to flip our v coordinate; this is because SFML reads texture coordinates from the top-left corner, rather than the bottom-left corner. This is a common convention in computer graphics, and is something to be aware of.

Since the latitude is the angle measured up from the xz plane, we can conclude that our y coordinate will be the sine of the latitude.

By the same logic, we can infer that the x and z coordinates will be scaled by the cosine of the latitude, since they are the projection of the point onto the xz plane.

The longitude will affect the x and z coordinates, since the longitude is the angle about the y axis: the x coordinate will take the cosine of the longitude, and the z coordinate the sine of the longitude. Therefore the equations are:

y = sin(latitude)
x = cos(latitude) cos(longitude)
z = cos(latitude) sin(longitude)
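As a concrete reading of these equations, a conversion helper might look like the sketch below. The name getCartesian matches a call that appears later in the writeup, but this particular body (angles taken in degrees, result scaled by the given radius) is an illustrative assumption rather than the writeup's actual code.

// Sketch only: convert latitude/longitude (in degrees) to a point on a
// sphere of the given radius, using the equations above.
glm::vec3 getCartesian(float latitudeDeg, float longitudeDeg, float radius) {
    float lat = glm::radians(latitudeDeg);
    float lon = glm::radians(longitudeDeg);
    float y = std::sin(lat);
    float x = std::cos(lat) * std::cos(lon);
    float z = std::cos(lat) * std::sin(lon);
    return glm::vec3(x, y, z) * radius;
}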
1.2.5 Computing and drawing the trajectory of the projectile

We gave the user the option to select these configuration items for the projectile:

• Latitude

• Longitude
other basis vectors of the coordinate system. We can define the z′ axis to be the normal to the sphere, and the x′ axis to be the basis vector 'facing' the Westerly direction. The y′ axis is then the basis vector facing the Northerly direction⁵.

We can create a local coordinate system by applying the cross product two times to the normal of the sphere. The implementation we used is a derivation of the one defined in [4].

<<Get local coordinate system>>=
void CoordinateSystem(const glm::vec3 &v1, glm::vec3 *v2, glm::vec3 *v3)
{
    *v2 = glm::vec3(-v1.z, 0, v1.x) / std::sqrt(v1.x * v1.x + v1.z * v1.z);
    // Assumed completion in the style of [4]: the remaining basis vector is
    // the cross product of the first two.
    *v3 = glm::cross(v1, *v2);
}
tion to find the position of the projectile at any given time. We can infer the direction in which gravity will act, since it will always point into the center of the sphere. Since our sphere is centered at (0, 0, 0), we know that the direction is simply the negative of the position vector of the projectile.

Next, we can calculate the acceleration due to gravity by using the formula:

a = GM / r²

One inaccuracy of our model is that our mass of the Earth (or Mars, or the Moon), M, is scaled and not accurate to the real mass of the planet. Our scaled mass was calculated as:

M = g r² / G
(launchControlSettings.radius * launchControlSettings.radius);

planetMass / (distanceFromCenter * distanceFromCenter);
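A minimal sketch of how this gravity model could be applied each frame is given below. The names planetMass and distanceFromCenter follow the code excerpts above, and radius corresponds to launchControlSettings.radius; the function itself, the parameter g and the simple Euler step are assumptions of this sketch rather than the writeup's actual implementation.

// Sketch: one integration step for the projectile under inverse-square gravity.
void stepProjectile(glm::vec3 &position, glm::vec3 &velocity,
                    float radius, float g, float dt) {
    const float G = 6.674e-11f;
    float planetMass = g * radius * radius / G;           // scaled mass, M = g r^2 / G

    float distanceFromCenter = glm::length(position);
    glm::vec3 towardsCentre = -glm::normalize(position);  // gravity points at the centre
    glm::vec3 acceleration = towardsCentre *
        (G * planetMass / (distanceFromCenter * distanceFromCenter)); // a = GM / r^2

    // Simple Euler step (the writeup may integrate differently).
    velocity += acceleration * dt;
    position += velocity * dt;
}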
getCartesian(glm::degrees(theta), glm::degrees(phi), 1) * length;
}

transformedPoint.y = (transformedPoint.y + 1.0f) * 0.5f * RENDER_HEIGHT;
transformedPoint.x = (transformedPoint.x + 1.0f) * 0.5f * RENDER_WIDTH;
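The excerpt above shows coordinates in the range [−1, 1] being remapped to pixels. A self-contained sketch of that final step is given below; the toScreen helper, the explicit perspective divide and the concrete RENDER_WIDTH and RENDER_HEIGHT values are assumptions of this sketch, since the writeup may perform the divide elsewhere.

const float RENDER_WIDTH  = 800.0f;   // placeholder render-target size
const float RENDER_HEIGHT = 600.0f;

glm::vec2 toScreen(const glm::vec4 &clip) {
    // Perspective divide: clip space -> normalised device coordinates in [-1, 1].
    glm::vec3 ndc = glm::vec3(clip) / clip.w;
    // Remap [-1, 1] to pixel coordinates, as in the excerpt above.
    float x = (ndc.x + 1.0f) * 0.5f * RENDER_WIDTH;
    float y = (ndc.y + 1.0f) * 0.5f * RENDER_HEIGHT;
    return glm::vec2(x, y);
}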
References

[1] Department of Applied Mathematics and Theoretical Physics. "Deflection of a Projectile due to the Earth's Rotation". URL: https://www.damtp.cam.ac.uk/user/reh10/lectures/ia-dyn-handout14.pdf.

[2] Essence of Linear Algebra. URL: http://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab. (Visited on 08/09/2024).

[3] Jos Stam. "Evaluation of Loop Subdivision Surfaces". 2010. URL: https://api.semanticscholar.org/CorpusID:8420692.

[4] Vectors. URL: https://pbr-book.org/3ed-2018/Geometry_and_Transformations/Vectors#Coordina. (Visited on 08/12/2024).

[5] Eric W. Weisstein. Spherical Coordinates. URL: https://mathworld.wolfram.com/. (Visited on 08/12/2024).