Lecture XX: OpenGL Programming
Adapted from SIGGRAPH 2012 slides by
Ed Angel
University of New Mexico
and
Dave Shreiner
ARM, Inc
Outline
In the Beginning …
OpenGL 1.0 was released in 1992
Its pipeline was entirely fixed-function
the only operations available were fixed by the implementation
[Figure: fixed-function pipeline: Vertex Data → Transform and Lighting → Primitive Setup and Rasterization → Fragment Coloring and Texturing → Blending; Pixel Data → Texture Store]
An Evolutionary Change
[Figure: programmable pipeline: Vertex Data → Vertex Shader → Primitive Setup and Rasterization → Fragment Shader → Blending; Pixel Data → Texture Store]
[Figure: the same pipeline with a Geometry Shader inserted between the Vertex Shader and Primitive Setup and Rasterization]
More Evolution – Context Profiles
[Figure: pipeline with Tessellation Control and Tessellation Evaluation Shaders and a Geometry Shader between the Vertex Shader and Primitive Setup and Rasterization]
OpenGL ES and WebGL
OpenGL ES 2.0
Designed for embedded and hand-held devices such as cell
phones
Based on OpenGL 2.0
Shader based
WebGL
JavaScript implementation of ES 2.0
Runs on most recent browsers
OpenGL Application
Development
A Simplified Pipeline Model
[Figure: GPU data flow: the Application sends Vertices to Vertex Processing (the vertex shader); the Rasterizer turns vertices into Fragments; Fragment Processing (the fragment shader) produces Pixels for the Framebuffer]
OpenGL Programming in a Nutshell
[Figure: triangle primitive types: GL_TRIANGLES, GL_TRIANGLE_FAN, GL_TRIANGLE_STRIP]
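To make the "nutshell" concrete, here is a minimal sketch of uploading vertex data and drawing one of these primitives. It assumes an existing OpenGL context, the usual GL headers, and a linked shader program named program with a vec2 attribute vPosition; all names and values are illustrative, not from the original slides.

// Four 2D positions forming a quad, drawn as a triangle strip.
GLfloat points[] = {
    0.0f, 0.0f,
    1.0f, 0.0f,
    0.0f, 1.0f,
    1.0f, 1.0f
};

GLuint vao, vbo;
glGenVertexArrays(1, &vao);                 // vertex-array object holds attribute state
glBindVertexArray(vao);

glGenBuffers(1, &vbo);                      // buffer object holds the vertex data
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(points), points, GL_STATIC_DRAW);

GLint loc = glGetAttribLocation(program, "vPosition");
glEnableVertexAttribArray(loc);
glVertexAttribPointer(loc, 2, GL_FLOAT, GL_FALSE, 0, (void*)0);

glUseProgram(program);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);      // two triangles from four vertices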
A First Program
Rendering a Cube
We’ll render a cube with colors at each vertex
Our example demonstrates:
initializing vertex data
organizing data for rendering
simple object modeling
building up 3D objects from geometric primitives
building geometric primitives from vertices
Initializing the Cube’s Data
We’ll build each cube face from individual
triangles
Need to determine how much storage is required
(6 faces)(2 triangles/face)(3 vertices/triangle)
const int NumVertices = 36;
To simplify communicating with GLSL, we’ll use a
vec4 class (implemented in C++) similar to GLSL’s
vec4 type
we’ll also typedef it to add logical meaning
typedef vec4 point4;
typedef vec4 color4;
Initializing the Cube’s Data (cont’d)
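The body of this slide did not survive conversion. As a hedged sketch of what the setup typically looks like (the names vertices, points, colors, and quad() are illustrative, and the point4/color4 four-argument constructors are assumed from the C++ vec4 class mentioned above), the cube can be built from eight corner positions plus a helper that emits two triangles per face:

// Eight corners of a unit cube centered at the origin
point4 vertices[8] = {
    point4(-0.5, -0.5,  0.5, 1.0), point4(-0.5,  0.5,  0.5, 1.0),
    point4( 0.5,  0.5,  0.5, 1.0), point4( 0.5, -0.5,  0.5, 1.0),
    point4(-0.5, -0.5, -0.5, 1.0), point4(-0.5,  0.5, -0.5, 1.0),
    point4( 0.5,  0.5, -0.5, 1.0), point4( 0.5, -0.5, -0.5, 1.0)
};

point4 points[NumVertices];   // 36 positions sent to the GPU
color4 colors[NumVertices];   // 36 per-vertex colors
int Index = 0;

// Emit one face (two triangles) from four corner indices
void quad(int a, int b, int c, int d)
{
    int idx[6] = { a, b, c, a, c, d };
    for (int i = 0; i < 6; i++) {
        points[Index] = vertices[idx[i]];
        colors[Index] = color4(1.0, 0.0, 0.0, 1.0);  // placeholder color
        Index++;
    }
}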
GLSL Data Types
Scalar types: float, int, bool
Vector and matrix types, e.g. vec2, vec3, vec4 and mat2, mat3, mat4, with built-in operators:

mat4 m;
vec4 a, b, c;
b = a*m;
c = m*a;
Components and Swizzling
For vectors, components can be accessed with [ ], or with the names xyzw, rgba, or stpq
Example:
vec3 v;
v[1], v.y, v.g, v.t all refer to the same element

Swizzling:
vec3 a, b;
a.xy = b.yx;
Qualifiers
in, out
Copy vertex attributes and other variables to/from shaders
in vec2 tex_coord;
out vec4 color;

uniform: a variable set by the application
uniform float time;
uniform vec4 rotation;
Flow Control
if
if else
expression ? true-expression : false-expression
while, do while
for
Functions
Built in
Arithmetic: sqrt, pow, abs
Trigonometric: sin, asin
Graphical: length, reflect
User defined
Built-in Variables
gl_Position: output position from the vertex shader
gl_FragColor: output color from the fragment shader
only in ES, WebGL, and older versions of GLSL
current versions use an out variable instead
Simple Vertex Shader for Cube
in vec4 vPosition;
in vec4 vColor;
out vec4 color;

void main()
{
    color = vColor;
    gl_Position = vPosition;
}
The Simplest Fragment Shader
in vec4 color;
out vec4 FragColor;

void main()
{
    FragColor = color;
}
Getting Shaders into OpenGL
Shaders need to be compiled and linked to form an executable shader program
OpenGL provides the compiler and linker
A program must contain vertex and fragment shaders
other shaders are optional

Steps (repeated for each type of shader in the shader program):
Create Program            glCreateProgram()
Create Shader             glCreateShader()
Load Shader Source        glShaderSource()
Compile Shader            glCompileShader()
Attach Shader to Program  glAttachShader()
Link Program              glLinkProgram()
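As a rough sketch of those steps in code (a hypothetical LoadShaders() helper, assuming an active OpenGL context; error handling is kept minimal):

// Hypothetical helper wrapping the steps above
GLuint LoadShaders(const char* vsSource, const char* fsSource)
{
    GLuint program = glCreateProgram();              // Create Program

    GLuint vs = glCreateShader(GL_VERTEX_SHADER);    // Create Shader
    glShaderSource(vs, 1, &vsSource, NULL);          // Load Shader Source
    glCompileShader(vs);                             // Compile Shader
    glAttachShader(program, vs);                     // Attach Shader to Program

    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);  // repeat for the fragment shader
    glShaderSource(fs, 1, &fsSource, NULL);
    glCompileShader(fs);
    glAttachShader(program, fs);

    glLinkProgram(program);                          // Link Program

    GLint linked;
    glGetProgramiv(program, GL_LINK_STATUS, &linked);
    if (!linked) {
        // query glGetProgramInfoLog(program, ...) to see what went wrong
    }
    return program;
}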
GLint idx = glGetAttribLocation(program, "name");
GLint idx = glGetUniformLocation(program, "name");
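A short sketch of how a queried attribute index is typically used, assuming the cube's positions and colors are already loaded back-to-back into the currently bound buffer (the points array name is illustrative):

GLint pos = glGetAttribLocation(program, "vPosition");
glEnableVertexAttribArray(pos);
glVertexAttribPointer(pos, 4, GL_FLOAT, GL_FALSE, 0, (void*)0);

GLint col = glGetAttribLocation(program, "vColor");
glEnableVertexAttribArray(col);
glVertexAttribPointer(col, 4, GL_FLOAT, GL_FALSE, 0,
                      (void*)sizeof(points));   // colors stored after the positions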
Initializing Uniform Variable Values
Uniform Variables
glUniform4f(index, x, y, z, w);

GLboolean transpose = GL_TRUE;  // since we're C programmers
GLfloat mat[3][4][4] = { … };
glUniformMatrix4fv(index, 3, transpose, (GLfloat*)mat);
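Uniforms are set on the program currently in use, typically once per frame. A hedged sketch reusing the time and rotation uniforms from the qualifier examples (elapsedSeconds and angle are assumed application variables, not from the slides):

glUseProgram(program);
GLint timeLoc = glGetUniformLocation(program, "time");
GLint rotLoc  = glGetUniformLocation(program, "rotation");

glUniform1f(timeLoc, elapsedSeconds);            // float uniform
glUniform4f(rotLoc, 0.0f, 1.0f, 0.0f, angle);    // vec4 uniform (axis plus angle here)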
[Figure: camera analogy: viewing volume, camera, model, tripod]
Transformations
" Transformations take us from one “space” to
another
" All of our transforms are 4×4 matrices
Modeling Modeling
Transform" Transform"
Object Coords.
Perspective
Vertex Model-View Projection
Division"
Viewport 2D Window
Data Transform" Transform" Transform" Coordinates
(w)"
Normalized
World Coords. Eye Coords. Clip Coords. Device
Coords.
Camera Analogy Transform Sequence
Modeling transformations
assemble the world and move the objects
Viewing transformations
define position and orientation of the viewing
volume in the world
Projection transformations
adjust the lens of the camera
Viewport transformations
enlarge or reduce the physical photograph
3D Homogeneous Transformations
A vertex is transformed by 4×4 matrices
all affine operations are matrix multiplications
matrices are always post-multiplied; the product of a matrix and a vertex is $Mv$
all matrices are stored column-major in OpenGL (the opposite of what "C" programmers expect)

$$M = \begin{bmatrix} m_0 & m_4 & m_8 & m_{12} \\ m_1 & m_5 & m_9 & m_{13} \\ m_2 & m_6 & m_{10} & m_{14} \\ m_3 & m_7 & m_{11} & m_{15} \end{bmatrix}$$
View Specification
Set up a viewing frustum to specify how much
of the world we can see
Done in two steps
specify the size of the frustum (projection transform)
specify its location in space (model-view transform)
Anything outside of the viewing frustum is
clipped
primitive is either modified or discarded (if entirely
outside frustum)
View Specification (cont’d)
OpenGL projection model uses eye coordinates
the “eye” is located at the origin
looking down the -z axis
Projection matrices use a six-plane model:
near (image) plane and far (infinite) plane
both are distances from the eye (positive values)
enclosing planes
top & bottom, left & right
Viewing Transformations
Position the camera/eye in the scene
To "fly through" a scene, change the viewing transformation and redraw the scene
LookAt(eyex, eyey, eyez, lookx, looky, lookz, upx, upy, upz)
up vector determines unique orientation
careful of degenerate positions
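For reference, the equivalent view transform with GLM's lookAt, using the same headers as the earlier GLM sketch (parameter meaning matches the LookAt call above; the numbers are illustrative):

glm::vec3 eye (0.0f, 2.0f, 5.0f);   // camera position
glm::vec3 look(0.0f, 0.0f, 0.0f);   // point the camera looks at
glm::vec3 up  (0.0f, 1.0f, 0.0f);   // must not be parallel to (look - eye)
glm::mat4 view = glm::lookAt(eye, look, up);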
Translation
&1 0 0 tx #
$ !
$0 1 0 ty !
T (t x , t y , t z ) = $ !
$0 0 1 tz !
$ !
$0 0 0 1 !"
%
Scale
& sx 0 0 0#
$ !
$0 sy 0 0!
S (sx , s y , sz ) = $ !
$0 0 sz 0!
$ !
$0 0 0 1 !"
% Note, there’s a translation applied here to
make things easier to see
Rotation
Rotate coordinate system about an axis in space
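For reference (not on the original slide), rotation about the z axis by an angle $\theta$ has the familiar 4×4 form; rotations about x and y are analogous, and rotation about an arbitrary axis can be built by composing them:

$$R_z(\theta) = \begin{bmatrix} \cos\theta & -\sin\theta & 0 & 0 \\ \sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$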
Lighting Principles
Lighting simulates how objects reflect light
material composition of object
light’s color and position
global lighting parameters
Lighting functions deprecated in 3.1
Can implement in
Application (per vertex)
Vertex or fragment shaders
Modified Phong Model
Computes a color or shade for each vertex using a
lighting model (the modified Phong model) that takes
into account
Diffuse reflections
Specular reflections
Ambient light
Emission
Vertex shades are interpolated across polygons by the
rasterizer
Modified Phong Model
Property    Description
Diffuse     Base object color
Specular    Highlight color
Ambient     Low-light color
Emission    Glow color
Shininess   Surface smoothness
you can have separate materials for front and back
Adding Lighting to Cube
// vertex shader
in vec4 vPosition;
in vec3 vNormal;
out vec4 color;

uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct;
uniform mat4 ModelView;
uniform mat4 Projection;
uniform vec4 LightPosition;
uniform float Shininess;
Adding Lighting to Cube
void main()
{
    // Transform vertex position into eye coordinates
    vec3 pos = (ModelView * vPosition).xyz;

    vec3 L = normalize(LightPosition.xyz - pos);
    vec3 E = normalize(-pos);
    vec3 H = normalize(L + E);

    // Transform vertex normal into eye coordinates
    vec3 N = normalize(ModelView * vec4(vNormal, 0.0)).xyz;
Adding Lighting to Cube
    // Compute terms in the illumination equation
    vec4 ambient = AmbientProduct;

    float Kd = max(dot(L, N), 0.0);
    vec4 diffuse = Kd * DiffuseProduct;

    float Ks = pow(max(dot(N, H), 0.0), Shininess);
    vec4 specular = Ks * SpecularProduct;

    if (dot(L, N) < 0.0)
        specular = vec4(0.0, 0.0, 0.0, 1.0);

    gl_Position = Projection * ModelView * vPosition;

    color = ambient + diffuse + specular;
    color.a = 1.0;
}
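On the application side, the *Product uniforms are the componentwise products of the light and material properties. A hedged sketch (it assumes the earlier color4 type is four contiguous floats with a componentwise operator*, and the specific values are illustrative):

color4 light_ambient (0.2, 0.2, 0.2, 1.0);
color4 light_diffuse (1.0, 1.0, 1.0, 1.0);
color4 light_specular(1.0, 1.0, 1.0, 1.0);

color4 material_ambient (1.0, 0.0, 1.0, 1.0);
color4 material_diffuse (1.0, 0.8, 0.0, 1.0);
color4 material_specular(1.0, 0.8, 0.0, 1.0);

color4 ambient_product  = light_ambient  * material_ambient;   // componentwise products
color4 diffuse_product  = light_diffuse  * material_diffuse;
color4 specular_product = light_specular * material_specular;

glUseProgram(program);
glUniform4fv(glGetUniformLocation(program, "AmbientProduct"),  1, (GLfloat*)&ambient_product);
glUniform4fv(glGetUniformLocation(program, "DiffuseProduct"),  1, (GLfloat*)&diffuse_product);
glUniform4fv(glGetUniformLocation(program, "SpecularProduct"), 1, (GLfloat*)&specular_product);
glUniform1f (glGetUniformLocation(program, "Shininess"), 100.0f);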
Shader Examples
Fragment Shaders
A shader that’s executed for each “potential” pixel
fragments still need to pass several tests before making it to
the framebuffer
There are lots of effects we can do in fragment shaders
Per-fragment lighting
Bump Mapping
Environment (Reflection) Maps
Per Fragment Lighting
Compute lighting using the same model as per-vertex lighting, but for each fragment
Normals and other attributes are sent to the vertex shader and output to the rasterizer
The rasterizer interpolates them and provides the inputs to the fragment shader
Shader Examples
Vertex Shaders
Moving vertices: height fields
Per vertex lighting: height fields
Per vertex lighting: cartoon shading
Fragment Shaders
Per vertex vs. per fragment lighting: cartoon shader
Samplers: reflection map
Bump mapping
Height Fields
A height field is a function y = f(x, z) where the y value represents a quantity such as the height above a point in the x-z plane.
Height fields are usually rendered by sampling the function to form a rectangular mesh of triangles or rectangles from the samples y_ij = f(x_i, z_j).
Displaying a Height Field
Form a quadrilateral mesh:

for (i = 0; i < N; i++)
    for (j = 0; j < N; j++)
        data[i][j] = f(i, j, time);   // sample the function on an N x N grid

// four corners of the quad for grid cell (i, j); each quad is flat at height data[i][j]
vertex[Index++] = vec3((float)i/N,     data[i][j], (float)j/N);
vertex[Index++] = vec3((float)i/N,     data[i][j], (float)(j+1)/N);
vertex[Index++] = vec3((float)(i+1)/N, data[i][j], (float)(j+1)/N);
vertex[Index++] = vec3((float)(i+1)/N, data[i][j], (float)j/N);
Texture Mapping
[Figure: a 2D image with (s, t) texture coordinates is mapped onto geometry in (x, y, z) and then projected to the screen]
Texture Mapping in OpenGL
Images and geometry flow through separate
pipelines that join at the rasterizer
“complex” textures do not affect geometric
complexity
[Figure: vertices flow through the geometry pipeline and pixels through the pixel pipeline; the two join at the rasterizer, which feeds the fragment shader]
Applying Textures
Three basic steps to applying a texture
1. specify the texture
read or generate image
assign to texture
enable texturing
2. assign texture coordinates to vertices
3. specify texture parameters
wrapping, filtering
Applying Textures
1. specify textures in texture objects
2. set texture filter
3. set texture function
4. set texture wrap mode
5. set optional perspective correction hint
6. bind texture object
7. enable texturing
8. supply texture coordinates for each vertex (see the code sketch after this list)
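The numbered steps above map roughly onto the following sketch. It assumes image points to width × height RGB texel data already loaded by the application, and a linked program whose sampler uniform is named tex; steps 3, 5, and 7 belong to the fixed-function pipeline and are omitted here.

GLuint tex_obj;
glGenTextures(1, &tex_obj);                       // step 1: create a texture object
glBindTexture(GL_TEXTURE_2D, tex_obj);            // step 6: bind it

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, image);    // step 1: specify the image

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);       // step 4: wrap mode
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);   // step 2: filters
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glActiveTexture(GL_TEXTURE0);                     // texture unit 0
glUseProgram(program);
glUniform1i(glGetUniformLocation(program, "tex"), 0);  // sampler reads unit 0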
Texture Objects
Have OpenGL store your images
one image per texture object
may be shared by several graphics contexts
[Figure: texture coordinates in (s, t) space, e.g. (0.4, 0.2) and (0.8, 0.4), mapped onto a triangle's vertices]
Applying the Texture in the Shader
// Declare the sampler
uniform sampler2D diffuse_mat;
// GLSL 3.30 has overloaded texture();
// Apply the material color
vec3 diffuse = intensity * texture2D(diffuse_mat, coord).rgb;
Texturing the Cube
// add texture coordinate attribute to quad function
void main() {
color = vColor;
texCoord = vTexCoord;
gl_Position = vPosition;
}
Fragment Shader
in vec4 color;
in vec2 texCoord;
out vec4 FragColor;
uniform sampler2D tex;   // texture sampler

void main() {
    FragColor = color * texture(tex, texCoord);
}
Next class: Visual Perception
" Topic:
How does the human visual system?
How do humans perceive color?
How do we represent color in computations?
" Read:
• Glassner, Principles of Digital Image Synthesis,
pp. 5-32. [Course reader pp.1-28]
• Watt, Chapter 15.
• Brian Wandell. Foundations of Vision. Sinauer
Associates, Sunderland, MA, pp. 45-50 and
69-97, 1995.
[Course reader pp. 29-34 and pp. 35-63]