
Reflection mapping

In computer graphics, reflection mapping or environment mapping[1][2][3] is an efficient image-based lighting technique for approximating the appearance of a reflective surface by means of a precomputed texture. The texture is used to store the image of the distant environment surrounding the rendered object.

Several ways of storing the surrounding environment have been employed. The first technique was sphere mapping, in which a single texture contains the image of the surroundings as reflected on a spherical mirror. It has been almost entirely surpassed by cube mapping, in which the environment is projected onto the six faces of a cube and stored as six square textures or unfolded into six square regions of a single texture. Other projections that have some superior mathematical or computational properties include the paraboloid mapping, the pyramid mapping, the octahedron mapping, and the HEALPix mapping.

[Figure: An example of reflection mapping]

Reflection mapping is one of several approaches to reflection rendering, alongside e.g. screen-space reflections or ray tracing, which computes the exact reflection by tracing a ray of light and following its optical path. The reflection color used in the shading computation at a pixel is determined by calculating the reflection vector at the point on the object and mapping it to a texel in the environment map. This technique often produces results that are superficially similar to those generated by ray tracing, but is less computationally expensive: the radiance value of the reflection comes from calculating the angles of incidence and reflection followed by a texture lookup, rather than from tracing a ray against the scene geometry and computing the radiance of the ray, simplifying the GPU workload.
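The shading step described above amounts to reflecting the view direction about the surface normal and using the result to index the environment map. A minimal sketch in Python (the function name and tuple-based vectors are illustrative, not from any particular graphics API):

```python
def reflect(d, n):
    """Reflect incident direction d about the unit surface normal n: r = d - 2(d.n)n.
    The returned vector is what a renderer would use to index the environment map."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# A view ray travelling straight down onto an upward-facing surface
# reflects straight back up.
r = reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0))
```

In a real renderer the same formula is typically provided by the shading language itself (e.g. a built-in reflect function), and the resulting vector is passed directly to the environment-map sampler.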

However, in most circumstances a mapped reflection is only an approximation of the real reflection.
Environment mapping relies on two assumptions that are seldom satisfied:

1. All radiance incident upon the object being shaded comes from an infinite distance. When this holds, no parallax is seen in the reflection; when it does not, the reflection of nearby geometry appears in the wrong place on the reflected object.
2. The object being shaded is convex, so that it contains no self-interreflections. When this is not the case, the object does not appear in the reflection; only the environment does.
Environment mapping is generally the fastest method of rendering a reflective surface. To further increase rendering speed, the renderer may calculate the reflection direction only at each vertex and then interpolate it across the polygons to which the vertex is attached. This eliminates the need to recompute the reflection direction at every pixel.
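The per-vertex scheme can be sketched as below; because linearly blended unit vectors shorten, a renormalization step is included (the function name and the barycentric-weight convention are illustrative):

```python
import math

def interpolate_reflection(r0, r1, r2, w):
    """Blend three per-vertex reflection vectors with barycentric weights
    w = (w0, w1, w2), then renormalize, since linear interpolation
    shortens unit vectors."""
    v = [w[0] * a + w[1] * b + w[2] * c for a, b, c in zip(r0, r1, r2)]
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)
```

The renormalization is what keeps the interpolated direction usable as a map index; without it, texels between vertices would be fetched with a too-short vector.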
If normal mapping is used, each point on a polygon can have its own perturbed normal (the direction that point appears to face), which can be used in tandem with an environment map to produce a more realistic reflection. In this case, the angle of reflection at a given point on a polygon takes the normal map into consideration. This technique is used to make an otherwise flat surface appear textured, for example corrugated metal or brushed aluminium.
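Combining the two techniques means rotating the tangent-space sample from the normal map into the surface's frame before reflecting. A hedged sketch, assuming the common tangent/bitangent/normal (TBN) frame convention; names are illustrative:

```python
import math

def perturb_normal(t, b, n, sample):
    """Rotate a tangent-space normal-map sample into the surface's frame.
    t, b, n are the unit tangent, bitangent, and geometric normal;
    sample is the decoded map value (e.g. 2*rgb - 1)."""
    sx, sy, sz = sample
    v = [sx * ti + sy * bi + sz * ni for ti, bi, ni in zip(t, b, n)]
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def reflect(d, n):
    """r = d - 2(d.n)n, here fed the perturbed normal rather than the face normal."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))
```

With a flat map sample of (0, 0, 1) the perturbed normal equals the geometric normal and the reflection is unchanged; any other sample tilts the reflection locally, which is what makes the flat surface read as corrugated or brushed.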

Types

Sphere mapping
Sphere mapping represents the sphere of incident illumination as though it were seen in the reflection of a reflective sphere through an orthographic camera. The texture image can be created by approximating this ideal setup, by using a fisheye lens, or by prerendering a scene with a spherical mapping.
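The lookup itself maps an eye-space reflection vector to a point on the sphere map; a minimal sketch following the classic OpenGL GL_SPHERE_MAP formula:

```python
import math

def sphere_map_uv(r):
    """Sphere-map texture coordinates for an eye-space reflection vector r,
    per the classic OpenGL GL_SPHERE_MAP formula:
    m = 2*sqrt(rx^2 + ry^2 + (rz+1)^2), u = rx/m + 0.5, v = ry/m + 0.5."""
    rx, ry, rz = r
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5
```

Note that the divisor m vanishes as r approaches (0, 0, -1), the direction pointing directly away from the virtual camera; this is the singular point behind the "black hole" artifact discussed below.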

The spherical mapping suffers from limitations that detract from the realism of resulting renderings.
Because spherical maps are stored as azimuthal projections of the environments they represent, an
abrupt point of singularity (a "black hole" effect) is visible in the reflection on the object where texel
colors at or near the edge of the map are distorted due to inadequate resolution to represent the points
accurately. The spherical mapping also wastes pixels that are in the square but not in the sphere.

The artifacts of the spherical mapping are so severe that it is effective only for viewpoints near that of
the virtual orthographic camera.

Cube mapping
Cube mapping and other polyhedron mappings address the severe distortion of sphere maps. If cube
maps are made and filtered correctly, they have no visible seams, and can be used independent of the
viewpoint of the often-virtual camera acquiring the map. Cube and other polyhedron maps have since
superseded sphere maps in most computer graphics applications, with the exception of acquiring
image-based lighting. Image-based lighting can be done with parallax-corrected cube maps.[4]

Generally, cube mapping uses the same skybox that is used in outdoor renderings. Cube-mapped reflection is done by computing the view vector from the camera to the point on the object and reflecting it about the surface normal at that point. The resulting reflected ray is used to index the cube map, and the texel it selects provides the radiance value used in the lighting calculation. This creates the effect that the object is reflective.
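Indexing a cube map amounts to picking the face whose axis dominates the reflection vector and projecting the other two components onto that face. A sketch of one common convention (exact u/v orientations per face differ between APIs, so the signs here are illustrative):

```python
def cube_map_lookup(r):
    """Select a cube face and (u, v) in [0, 1] for a reflection vector r.
    The face is chosen by the component of largest magnitude; the other
    two components, divided by it, give the in-face coordinates."""
    rx, ry, rz = r
    ax, ay, az = abs(rx), abs(ry), abs(rz)
    if ax >= ay and ax >= az:                      # +X or -X face dominates
        face = '+x' if rx > 0 else '-x'
        u, v = (-rz if rx > 0 else rz) / ax, -ry / ax
    elif ay >= az:                                 # +Y or -Y face dominates
        face = '+y' if ry > 0 else '-y'
        u, v = rx / ay, (rz if ry > 0 else -rz) / ay
    else:                                          # +Z or -Z face dominates
        face = '+z' if rz > 0 else '-z'
        u, v = (rx if rz > 0 else -rx) / az, -ry / az
    return face, 0.5 * (u + 1.0), 0.5 * (v + 1.0)  # remap [-1, 1] to [0, 1]
```

In practice GPUs perform this face selection in hardware; the shader simply samples the cube-map texture with the unnormalized reflection vector.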

HEALPix mapping
HEALPix environment mapping is similar to the other polyhedron mappings, but can be hierarchical,
thus providing a unified framework for generating polyhedra that better approximate the sphere. This
allows lower distortion at the cost of increased computation.[5]

History
In 1974, Edwin Catmull created an algorithm for "rendering images of bivariate surface patches"[6][7] which worked directly with their mathematical definition. Further refinements were researched and documented by Bui Tuong Phong in 1975, and later by James Blinn and Martin Newell, who developed environment mapping in 1976. These developments, which refined Catmull's original algorithm, led them to conclude that "these generalizations result in improved techniques for generating patterns and texture".[6][8][9]

Gene Miller experimented with spherical environment mapping in 1982 at MAGI.

Wolfgang Heidrich introduced paraboloid mapping in 1998.[10]

Emil Praun introduced octahedron mapping in 2003.[11]

Mauro Steigleder introduced pyramid mapping in 2005.[12]

Tien-Tsin Wong, et al. introduced the existing HEALPix mapping for rendering in 2006.[5]

[Figure: An image used in early reflection mapping, created in 1976 by James F. Blinn.]

[Figure: A diagram depicting an apparent reflection being provided by cube-mapped reflection. The map is actually projected onto the surface from the point of view of the observer. Highlights, which in raytracing would be provided by tracing the ray and determining the angle made with the normal, can be "fudged" if they are manually painted into the texture field (or if they already appear there, depending on how the texture map was obtained), from where they will be projected onto the mapped object along with the rest of the texture detail.]

[Figure: Example of a three-dimensional model using cube-mapped reflection.]

See also

Skybox (video games)

References

1. "Higher Education | Pearson" (http://www.pearsonhighered.com/samplechapter/0321194969.pdf) (PDF).
2. http://web.cse.ohio-state.edu/~whmin/courses/cse5542-2013-spring/17-env.pdf
3. http://www.ics.uci.edu/~majumder/VC/classes/BEmap.pdf
4. "Image-based Lighting approaches and parallax-corrected cubemap" (http://seblagarde.wordpress.com/2012/09/29/image-based-lighting-approaches-and-parallax-corrected-cubemap/). 29 September 2012.
5. Tien-Tsin Wong, Liang Wan, Chi-Sing Leung, and Ping-Man Lam. Real-time Environment Mapping with Equal Solid-Angle Spherical Quad-Map (http://appsrv.cse.cuhk.edu.hk/~lwan/paper/sphquadmap/sphquadmap.htm), ShaderX4: Lighting & Rendering, Charles River Media, 2006.
6. Blinn, James F.; Newell, Martin E. (October 1976). "Texture and reflection in computer generated images" (https://dl.acm.org/doi/10.1145/360349.360353). Communications of the ACM. 19 (10): 542–547. doi:10.1145/360349.360353. ISSN 0001-0782.
7. Catmull, E.A. Computer display of curved surfaces. Proc. Conf. on Comptr. Graphics, Pattern Recognition, and Data Structure, May 1975, pp. 11–17 (IEEE Cat. No. 75CH0981-1C).
8. "Computer Graphics" (http://www.comphist.org/computing_history/new_page_6.htm).
9. "Reflection Mapping History" (http://www.debevec.org/ReflectionMapping/).
10. Heidrich, W., and H.-P. Seidel. "View-Independent Environment Maps". Eurographics Workshop on Graphics Hardware 1998, pp. 39–45.
11. Emil Praun and Hugues Hoppe. "Spherical parametrization and remeshing". ACM Transactions on Graphics, 22(3):340–349, 2003.
12. Mauro Steigleder. "Pencil Light Transport". A thesis presented to the University of Waterloo, 2005.

External links

The Story of Reflection mapping (http://www.pauldebevec.com/ReflectionMapping/) by Paul Debevec
NVIDIA's paper (http://www.nvidia.com/object/feature_cube.html) Cube Environment Mapping
Approximation of reflective and transparent objects with environmental maps (http://sunandblackcat.com/tipFullView.php?l=eng&topicid=16)

Retrieved from "https://en.wikipedia.org/w/index.php?title=Reflection_mapping&oldid=1214873768"
