UNIT 5 Visible Surface Detection Method
Nipun Thapa (Graphics)
A major consideration in generating realistic graphics displays is identifying the parts of a scene that are visible from a chosen viewing position. Numerous algorithms have been devised for efficient identification of visible objects for different types of applications. These various algorithms are referred to as visible-surface detection methods. Sometimes they are also called hidden-surface elimination methods.

Visible-surface detection algorithms are broadly classified according to whether they deal with object definitions or with their projected images:

Object-Space methods:
• Compare objects and parts of objects to each other within the scene definition to determine which surfaces, as a whole, we should label as visible.
• Deal with object definitions.
• E.g. the back-face detection method.

Image-Space methods:
• Decide visibility point by point at each pixel position on the projection plane.
• Deal with the projected images of the surfaces.
• E.g. the depth-buffer (z-buffer) method and the scan-line method.

{ Note: Most visible-surface detection algorithms use image-space methods, but in some cases object-space methods are also used. }
Back – Face Detection Method
A fast and simple object-space method for identifying the back faces of a polyhedron is based on "inside-outside" tests. TWO METHODS:

First Method:
• A point (x, y, z) is "inside" a polygon surface with plane parameters A, B, C and D if
      Ax + By + Cz + D < 0
  When an inside point is along the line of sight to the surface, the polygon must be a back face.
• We can simplify this test by considering the normal vector N to the polygon surface, which has Cartesian components (A, B, C). In general, if V is a vector in the viewing direction from the eye (or "camera") position, then this polygon is a back face if
      V.N > 0
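To make the test concrete, here is a minimal C sketch of the first method. It assumes the plane coefficients (A, B, C) give the outward normal N and that V is the viewing direction from the eye toward the scene, as above; the names Vec3, dot and is_back_face are illustrative, not from the slides.

    #include <stdio.h>

    typedef struct { double x, y, z; } Vec3;

    static double dot(Vec3 a, Vec3 b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    /* A polygon with plane equation Ax + By + Cz + D = 0 has outward normal
     * N = (A, B, C).  With V the viewing direction from the eye toward the
     * scene, the polygon is a back face when V.N > 0.                      */
    int is_back_face(double A, double B, double C, Vec3 V) {
        Vec3 N = { A, B, C };
        return dot(V, N) > 0.0;
    }

    int main(void) {
        /* Viewing along the negative z axis: V = (0, 0, -1), so V.N = -C
         * and the polygon is a back face exactly when C < 0.              */
        Vec3 V = { 0.0, 0.0, -1.0 };
        printf("%d\n", is_back_face(0.0, 0.0, -1.0, V));  /* prints 1: back face  */
        printf("%d\n", is_back_face(0.0, 0.0,  1.0, V));  /* prints 0: front face */
        return 0;
    }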
Back – Face Detection Method
Second Method:
A view vector V is constructed from any point on the surface to the viewpoint. The dot product of this vector and the outward normal N indicates visible and hidden faces as follows:

Case-I (FRONT FACE): If V.N > 0, the face is turned toward the viewer and is visible.
Case-II (BACK FACE): If V.N < 0, the face is turned away from the viewer and is hidden.
Case-III: For other objects, such as the concave polyhedron in the figure below, more tests need to be carried out to determine whether there are additional faces that are totally or partly obscured by other faces.
(Figure: Concave polyhedron)
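A corresponding sketch of the second method, assuming V runs from a point on the surface to the viewpoint and N is the outward normal; face_is_visible and the other names are illustrative, not from the slides.

    #include <stdbool.h>

    typedef struct { double x, y, z; } Vec3;

    static double dot(Vec3 a, Vec3 b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    /* surface_point: any point on the face, normal: outward normal N,
     * viewpoint: eye position.  V runs from the surface to the viewpoint,
     * so the face is a front (visible) face when V.N > 0 and a back
     * (hidden) face when V.N < 0.                                        */
    bool face_is_visible(Vec3 surface_point, Vec3 normal, Vec3 viewpoint) {
        Vec3 V = { viewpoint.x - surface_point.x,
                   viewpoint.y - surface_point.y,
                   viewpoint.z - surface_point.z };
        return dot(V, normal) > 0.0;
    }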
• The test can be carried out in a viewing coordinate system with its origin at the viewpoint and three mutually perpendicular coordinate axes, OX, OY and OZ (see Figure).
Depth Buffer Method (Z-Buffer Method)
• A commonly used image-space approach to detecting visible surfaces is the depth-buffer method, which compares surface depths at each pixel position on the projection plane.
• It is also called the z-buffer method, since object depth is usually measured from the view plane along the z-axis of the viewing system.
• Two buffer areas are required:
  - Depth buffer: stores the depth (z) value of the nearest surface found so far at each pixel position.
  - Frame buffer (refresh buffer): stores the surface-intensity values or color values for each pixel position.
Depth Buffer Method (Z-Buffer Method)
Initially, all positions in the depth buffer are set to 0 (minimum depth), and the refresh buffer is initialized to the background intensity. Each surface listed in the polygon tables is then processed, one scan line at a time, calculating the depth (z-value) at each (x, y) pixel position and comparing it with the value currently stored in the depth buffer:

1. Initialize the buffers so that for all positions (x, y):
      depth(x, y) = 0
      refresh(x, y) = Ibackground
   (where Ibackground is the value for the background intensity.)
2. Process each polygon surface:
   2.1. Calculate the depth z for each (x, y) position on the polygon.
   2.2. If z > depth(x, y), then set
            depth(x, y) = z
            refresh(x, y) = Isurf(x, y)
        (where Isurf(x, y) is the intensity value for the surface at pixel position (x, y).)
3. After all pixels and surfaces have been compared, display the scene from the refresh buffer, which now holds the intensity of the nearest (visible) surface at every pixel.
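A minimal C sketch of steps 1–3, assuming a fixed WIDTH x HEIGHT raster and depths normalized so that a larger z means nearer to the view plane; the names zbuffer_init and zbuffer_plot are illustrative, and the per-pixel depth z and intensity are assumed to come from the polygon scan-conversion code.

    #define WIDTH  640
    #define HEIGHT 480

    static double depth[HEIGHT][WIDTH];    /* depth buffer: nearest z so far      */
    static double refresh[HEIGHT][WIDTH];  /* refresh (frame) buffer: intensities */

    /* Step 1: depth(x, y) = 0 (minimum depth), refresh(x, y) = Ibackground. */
    void zbuffer_init(double i_background) {
        for (int y = 0; y < HEIGHT; y++)
            for (int x = 0; x < WIDTH; x++) {
                depth[y][x]   = 0.0;
                refresh[y][x] = i_background;
            }
    }

    /* Step 2: called for every pixel (x, y) covered by a polygon, with the
     * polygon's depth z and intensity i_surf at that pixel.  A larger z is
     * nearer to the view plane, so the nearer surface wins the pixel.      */
    void zbuffer_plot(int x, int y, double z, double i_surf) {
        if (z > depth[y][x]) {            /* step 2.2 */
            depth[y][x]   = z;
            refresh[y][x] = i_surf;
        }
    }

    /* Step 3: after all surfaces are processed, refresh[][] holds the
     * intensities of the visible surfaces and can be sent to the display.  */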
Depth Buffer Method (Z-Buffer Method)
• After all surfaces have been processed, the depth buffer contains depth values for the visible surfaces and the refresh buffer contains the corresponding intensity values for those surfaces.
• Depth values for a surface position (x, y) are calculated from the plane equation of that surface:
      z = (–Ax – By – D) / C ………..(i)
• For adjacent horizontal positions across a scan line, x changes by 1, so the depth z' at (x + 1, y) is
      z' = [–A(x + 1) – By – D] / C
  or
      z' = z – A/C ………..(ii)
• The ratio A/C is constant for each surface, so succeeding depth values across a scan line are obtained from the preceding values with a single subtraction.
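A small sketch of this incremental calculation across one scan line, reusing the hypothetical zbuffer_plot routine from the previous sketch; span_start and span_end mark the polygon's span on scan line y and are illustrative names.

    /* Depths across one scan line y of a polygon with plane coefficients
     * A, B, C, D: start from equation (i) at the left end of the span,
     * then step with equation (ii), z' = z - A/C, for each unit move in x. */
    void scanline_depths(int y, int span_start, int span_end,
                         double A, double B, double C, double D,
                         double i_surf) {
        double z = (-A * span_start - B * y - D) / C;   /* equation (i)        */
        for (int x = span_start; x <= span_end; x++) {
            zbuffer_plot(x, y, z, i_surf);              /* z-buffer test above */
            z -= A / C;                                 /* equation (ii)       */
        }
    }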
A – Buffer Method
• The A-buffer (anti-aliased, area-averaged, accumulation buffer) is an extension of the ideas in the depth-buffer method (the other end of the alphabet from "z-buffer").
• A drawback of the depth-buffer method is that it deals only with opaque surfaces: it can find only one visible surface at each pixel position and cannot accumulate intensity values for more than one surface, as is necessary if transparent surfaces are to be displayed.
• The A-buffer expands on the depth-buffer method to allow transparencies. The key data structure in the A-buffer is the accumulation buffer.
A – Buffer Method
Each pixel position in the A-buffer has two fields:
• Depth field: stores a positive or negative real number.
• Surface data (intensity) field: stores surface-intensity information or a pointer value.

If depth >= 0, the depth field stores the depth of the single surface at that pixel position, as before, and the surface data field stores its intensity information (SINGLE SURFACE).
If depth < 0, the surface data field stores a pointer to a linked list of surface data (MULTIPLE SURFACES).

(A negative depth field indicates multiple-surface contributions to the pixel intensity. The intensity field then stores a pointer to a linked list of surface data, as in the second figure. The data for each surface in the linked list include: RGB intensity components, opacity parameter (percent of transparency), depth, percent of area coverage, surface identifier, other surface-rendering parameters, and a pointer to the next surface.)
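A minimal C sketch of this per-pixel organization (type and field names are illustrative, not from the slides): a non-negative depth means the pixel holds one surface's data directly, while a negative depth means the surface-data field holds the head of a linked list of contributing surfaces.

    /* Per-surface data kept in the A-buffer's linked list. */
    typedef struct SurfaceData {
        float rgb[3];               /* RGB intensity components          */
        float opacity;              /* opacity (percent of transparency) */
        float depth;                /* depth of this surface fragment    */
        float coverage;             /* percent of pixel area covered     */
        int   surface_id;           /* surface identifier                */
        struct SurfaceData *next;   /* pointer to next surface           */
    } SurfaceData;

    /* One A-buffer pixel: the two fields described above. */
    typedef struct {
        float depth;                /* >= 0: depth of a single surface   */
                                    /* <  0: multiple surfaces present   */
        union {
            float        rgb[3];    /* single-surface intensity          */
            SurfaceData *list;      /* linked list of surface data       */
        } surface_info;
    } APixel;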
A – Buffer Method
• The A-buffer can be constructed using methods similar to those in the depth-buffer algorithm. Scan lines are processed to determine surface overlaps of pixels across the individual scan lines. Surfaces are subdivided into a polygon mesh and clipped against the pixel boundaries.
• Using the opacity factors and the percent of surface overlap, the intensity of each pixel is calculated as an average of the contributions from the overlapping surfaces.
A – Buffer Method
• The algorithm proceeds just like the depth-buffer algorithm.
• Scan lines are processed to determine surface overlaps of pixels across the individual scan lines.
• The depth and opacity values are used to determine the final colour of a pixel.
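As one possible way to realize this last step, the sketch below blends a pixel's sorted surface list back to front using each surface's opacity and coverage; it reuses the hypothetical SurfaceData structure from the earlier sketch and is only one of the averaging schemes an A-buffer implementation could use.

    #include <stddef.h>

    /* Blend one pixel's surface list into a final colour.  Assumes the list
     * is already sorted back to front (farthest surface first) and that
     * opacity and coverage are stored as fractions in [0, 1].              */
    void resolve_pixel(const SurfaceData *list,
                       const float background[3], float out[3]) {
        for (int k = 0; k < 3; k++)
            out[k] = background[k];
        for (const SurfaceData *s = list; s != NULL; s = s->next) {
            float a = s->opacity * s->coverage;   /* weight of this surface */
            for (int k = 0; k < 3; k++)
                out[k] = a * s->rgb[k] + (1.0f - a) * out[k];
        }
    }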
Depth Sort Method (Painter's Algorithm)
• Surfaces are sorted in order of decreasing depth from the viewer.
• The sorted surfaces are then scan converted in order, starting with the surface of greatest depth.
• The conceptual steps that are performed:
  1. Sort all polygons according to the smallest (farthest) Z co-ordinate of each.
  2. Resolve any ambiguity this may cause when the polygons' Z extents overlap, splitting polygons if necessary.
  3. Scan convert each polygon in ascending order of smallest Z co-ordinate (i.e., back to front), so that each new polygon may partly or completely obscure the previously displayed surfaces.
• Essentially, we are sorting the surfaces into priority order such that surfaces with lower priority (lower z, far objects) are drawn first and can later be obscured by surfaces nearer to the viewer.
• The method is called the painter's algorithm because it mimics the way a painter works: starting with the background and then progressively adding new (nearer) objects to the canvas.
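A minimal C sketch of steps 1 and 3 (the ambiguity tests of step 2 are omitted); the Polygon fields and the draw callback are illustrative placeholders, not from the slides.

    #include <stdlib.h>

    typedef struct {
        double min_z;   /* smallest (farthest) z co-ordinate of the polygon */
        /* ... vertex list, plane coefficients, colour, etc. ...            */
    } Polygon;

    /* Sort key: the polygon with the smallest (farthest) z comes first. */
    static int by_farthest_z(const void *a, const void *b) {
        double za = ((const Polygon *)a)->min_z;
        double zb = ((const Polygon *)b)->min_z;
        return (za > zb) - (za < zb);
    }

    /* Steps 1 and 3: sort back to front, then paint each polygon over the
     * ones already drawn; 'draw' is whatever routine scan-converts one
     * polygon into the frame buffer.                                      */
    void painters_algorithm(Polygon *polys, size_t n,
                            void (*draw)(const Polygon *)) {
        qsort(polys, n, sizeof(Polygon), by_farthest_z);
        for (size_t i = 0; i < n; i++)
            draw(&polys[i]);
    }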
Depth Sort Method (Painter's Algorithm)
• The basic method runs into trouble with polygons that pierce one another or overlap cyclically, since no single back-to-front ordering exists for such surfaces. As shown in fig. below.
• For intersecting polygons, we can split one polygon into two or more polygons, which can then be painted from back to front. This needs more time to compute.
Scan-Line Method
• This image-space method is an extension of the scan-line algorithm for filling polygon interiors: instead of filling a single surface, it deals with all the surfaces crossing each scan line.
• As each scan line is processed, every polygon surface intersecting that line is examined. Across the scan line, depth calculations are made for each overlapping surface to determine which is nearest to the view plane.
• When the visible surface has been determined, the intensity value for that position is entered into the refresh buffer.
Scan-Line Method
I. Initialize the necessary data structures (a code sketch of these structures follows the list):
   1. Edge table, containing end-point coordinates, the inverse slope of each edge, and a pointer to the polygon (surface) it bounds.
   2. Surface table, containing the plane coefficients and the surface intensity.
   3. Active edge list, containing only the edges that cross the current scan line, sorted in order of increasing x.
   4. A surface flag for each surface, turned on or off to indicate whether a position along the current scan line is inside or outside that surface.
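A minimal C sketch of these data structures; all field names and the fixed capacity of the active edge list are illustrative, not from the slides.

    /* Edge table entry: one polygon edge. */
    typedef struct {
        double y_upper, y_lower;  /* end-point y coordinates                 */
        double x;                 /* current x intersection on the scan line */
        double inv_slope;         /* 1/m: x increment per scan line          */
        int    surface_id;        /* pointer (index) into the surface table  */
    } Edge;

    /* Surface (polygon) table entry. */
    typedef struct {
        double A, B, C, D;        /* plane coefficients                      */
        double intensity;         /* surface intensity / colour              */
        int    flag;              /* on/off while scanning across the line   */
    } Surface;

    /* Active edge list: edges crossing the current scan line, kept sorted
     * by increasing x.                                                      */
    typedef struct {
        Edge *edges[64];          /* illustrative fixed capacity             */
        int   count;
    } ActiveEdgeList;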
Scan-Line Method
• For scan line 1:
  - The active edge list contains edges AB, BC, EH and FG.
  - Between edges AB and BC, only the flag for surface S1 is on, and between edges EH and FG, only the flag for surface S2 is on.
  - No depth calculation is needed, and the corresponding surface intensities are entered in the refresh buffer.
• For scan line 2:
  - The active edge list contains edges AD, EH, BC and FG.
  - Between edges AD and EH, only the flag for S1 is on; between edges BC and FG, only the flag for S2 is on.
  - Between edges EH and BC, the flags for both surfaces are on, so a depth calculation using the plane coefficients of the two surfaces is needed in this interval, and the intensity of the nearer surface is entered in the refresh buffer.
Scan-Line Method
Problem:
• Dealing with cut-through surfaces and cyclic overlap is problematic when coherence properties are used, because the depth ordering of such surfaces changes along and between scan lines.

Unit 5 Finished