
EC9580: COMPUTER VISION

IMAGE FORMATION & REPRESENTATION
Ms. Sujanthika M
Lecturer
Department of Computer Engineering
[email protected]
REFERENCES
1. Reinhard Klette, Concise Computer Vision: An Introduction into Theory and Algorithms, 1st edition, Springer.
2. Richard Szeliski, Computer Vision: Algorithms and Applications, 2nd edition, Springer.
CHAPTER OVERVIEW
▪ Lecture – 02 hours
▪ Lab – 00 hours
▪ Assignment – 00 hours

IMAGE FORMATION STEPS
▪ World - reality
▪ Optics - focuses light from the world onto a sensor
▪ Sensor - converts light to electrical energy
▪ Signal - representation of incident light as continuous electrical energy
▪ Digitizer - converts the continuous signal to a discrete signal
▪ Digital Representation - final representation of reality in computer memory

FACTORS AFFECTING IMAGE FORMATION
▪ Geometry
▪ concerned with the relationship between points in the three-dimensional world and their images
▪ Radiometry
▪ concerned with the relationship between the amount of light radiating from a surface and the amount incident at its image
▪ Photometry
▪ concerned with ways of measuring the intensity of light
▪ Digitization
▪ concerned with ways of converting continuous signals (in both space and time) to digital approximations

PHOTOMETRIC IMAGE FORMATION
▪ Beyond geometric features, this focuses on how image formation gives rise to the discrete color and intensity values of an image
▪ How are these values related to the lighting in the environment, surface properties and geometry, camera optics, and sensor properties?

IT ALL STARTS WITH LIGHT
▪ To see any 3D scene, our eyes or a digital camera must capture the light reflected from the scene's surfaces
▪ To produce an image, the scene must be illuminated with one or more light sources

LIGHT SOURCES
1. Point light source
o Originates at a single location (e.g., the sun or a light bulb)
o Has an intensity and a color spectrum
o The intensity of a point light source falls off with the square of the distance between the source and the object being lit, because the same light is spread over a larger (spherical) area (see the sketch after this list)
2. Area light source
o More complicated to model
o A simple area light source, such as a fluorescent ceiling light fixture with a diffuser, can be modeled as a finite rectangular area emitting light equally in all directions
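A minimal Python sketch of this inverse-square falloff; the source power and distances are illustrative assumptions, not values from the slides:

import math

def irradiance(distance_m, source_power_w=100.0):
    # Inverse-square law: the same radiant power spreads over a sphere
    # of area 4*pi*d^2, so the received irradiance scales as 1/d^2.
    return source_power_w / (4.0 * math.pi * distance_m ** 2)

for d in [1.0, 2.0, 4.0]:
    # Doubling the distance quarters the irradiance.
    print(f"d = {d:.0f} m -> E = {irradiance(d):.3f} W/m^2")
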
VISIBLE LIGHT - PART OF THE ELECTROMAGNETIC SPECTRUM

FACTORS THAT AFFECT IMAGE FORMATION
▪ The strength and direction of the light emitted from the source
▪ The material and surface geometry along with other nearby surfaces
▪ Sensor capture properties

REFLECTANCE AND SHADING
▪ When light hits a surface:
1. Some light gets absorbed, depending on a factor called ρ (the surface's reflectance)
2. Some light is reflected diffusely, independent of the viewing direction
3. Some light is reflected specularly, depending on the viewing direction

▪ Several models describe the reflection – the diffuse, specular, and Phong shading models, and the most general form, the BRDF (Bidirectional Reflectance Distribution Function)

THE BIDIRECTIONAL REFLECTANCE DISTRIBUTION FUNCTION (BRDF)
▪ The BRDF is a four-dimensional function that describes how much of each wavelength arriving from an incident direction is emitted in a reflected direction (see the definition after this list)
▪ It specifies the relationship between the incoming light direction and the outgoing light direction, depending on the surface's properties
▪ For a mirror-like surface, light leaves along a single reflected direction, while with increasing roughness the light tends to scatter into all possible directions. In the limit, an object appears equally bright throughout the outgoing hemisphere if its surface is perfectly diffuse
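In the standard radiometric notation (an assumption here; the slides do not give the formula), the BRDF is the ratio of reflected radiance to incident irradiance:

f_r(\theta_i, \phi_i, \theta_r, \phi_r; \lambda) = \frac{dL_r(\theta_r, \phi_r; \lambda)}{dE_i(\theta_i, \phi_i; \lambda)}

where (\theta_i, \phi_i) is the incident direction, (\theta_r, \phi_r) the reflected direction, and \lambda the wavelength; the four angles are what make the BRDF four-dimensional.
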
THE BRDF

▪ Properties (stated formally below):
o Always positive
o Helmholtz reciprocity (the incident and reflected directions i and r can be exchanged)
o Energy conservation
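Using the same conventional notation (assumed, not from the slides), the three properties read:

f_r \ge 0 \quad \text{(positivity)}
f_r(\hat{v}_i, \hat{v}_r) = f_r(\hat{v}_r, \hat{v}_i) \quad \text{(Helmholtz reciprocity)}
\int_{\Omega} f_r(\hat{v}_i, \hat{v}_r)\,\cos\theta_r \, d\omega_r \le 1 \quad \text{(energy conservation)}
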
THE BRDF
▪ The BRDF of a given surface can be obtained through:
o Physical modeling
o Heuristic modeling
o Empirical observation

▪ In practice, BRDFs are a combination of diffuse and specular components

DIFFUSE REFLECTION
▪ The diffuse component scatters light uniformly in all directions
▪ Diffuse reflection also often imparts a strong body color to the light, since it is caused by selective absorption and re-emission of light inside the object's material
▪ The amount of light observed depends on the angle between the incident light direction and the surface normal (see the Lambertian sketch after this list)
▪ This is because the surface area exposed to a given amount of light becomes larger at oblique angles, the surface becoming completely self-shadowed as the outgoing surface normal points away from the light
▪ E.g., light reflecting off paper
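A minimal Python sketch of Lambertian (diffuse) shading under the cosine law; the albedo and direction values are illustrative assumptions:

import numpy as np

def lambertian_shade(normal, light_dir, albedo=0.8, light_intensity=1.0):
    # Lambert's cosine law: observed intensity is proportional to the
    # cosine of the angle between surface normal and light direction,
    # clamped at zero for surfaces facing away (self-shadowed).
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    return albedo * light_intensity * max(0.0, float(n @ l))

# Head-on light gives full intensity; oblique light gives less.
print(lambertian_shade(np.array([0., 0., 1.]), np.array([0., 0., 1.])))  # 0.8
print(lambertian_shade(np.array([0., 0., 1.]), np.array([1., 0., 1.])))  # ~0.566
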
DIFFUSE REFLECTION
▪ Diffuse reflections give surfaces a matte or non-glossy appearance

SPECULAR REFLECTION
▪ Specular reflection strongly depends on the direction of the outgoing light
▪ The amount of light reflected in a given direction depends on the angle between the view direction and the specular (mirror) direction (see the formula after this list)
▪ The reflected light is concentrated around the mirror reflection direction of the light, creating shiny spots
▪ E.g., light reflecting off a polished metal surface
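This view-direction dependence is commonly modeled as a cosine power around the mirror direction (a Phong-style lobe); the symbols below are the conventional ones, assumed rather than taken from the slides:

I_s = k_s \, (\hat{v} \cdot \hat{r})^{n_s} \, I_{light}, \qquad \hat{r} = 2(\hat{n} \cdot \hat{l})\,\hat{n} - \hat{l}

where \hat{v} is the view direction, \hat{l} the light direction, \hat{n} the surface normal, and the exponent n_s controls how tight the shiny spot is.
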
SPECULAR REFLECTION
▪ Specular reflections give surfaces a shiny, reflective appearance

PHONG SHADING
▪ This model combines the diffuse and specular components of reflection with ambient illumination
▪ It models the interaction of light with a surface by computing the color of each pixel from three components: diffuse, specular, and ambient (see the sketch after this list)
▪ Objects are generally illuminated not only by point light sources but also by a general diffuse illumination corresponding to inter-reflections or distant sources (like the blue sky)
▪ The ambient component does not depend on surface orientation, but depends on the color of both the ambient illumination and the object
▪ The diffuse component depends on the angle of the incoming light source
▪ The specular component depends on the relative angle between the viewer and the specular reflection direction
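A minimal Python sketch combining the three Phong terms for a single surface point; the coefficients k_a, k_d, k_s and the shininess exponent are illustrative assumptions:

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def phong_shade(n, l, v, k_a=0.1, k_d=0.6, k_s=0.3, shininess=32.0):
    n, l, v = normalize(n), normalize(l), normalize(v)
    # Ambient term: independent of surface orientation.
    ambient = k_a
    # Diffuse term: Lambert's cosine law, clamped for self-shadowed surfaces.
    n_dot_l = float(n @ l)
    diffuse = k_d * max(0.0, n_dot_l)
    # Specular term: cosine power around the mirror reflection direction.
    specular = 0.0
    if n_dot_l > 0.0:
        r = normalize(2.0 * n_dot_l * n - l)
        specular = k_s * max(0.0, float(r @ v)) ** shininess
    return ambient + diffuse + specular

# Light and viewer directly above the surface: full highlight.
print(phong_shade(np.array([0., 0., 1.]),
                  np.array([0., 0., 1.]),
                  np.array([0., 0., 1.])))  # 1.0
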
PHONG SHADING
Advantages
▪ Smooth specular highlights, because it computes the illumination at each pixel
▪ Realistic rendering: the interpolation of normals produces smooth lighting transitions across surfaces, which creates a natural look

Disadvantages
▪ Computationally expensive: computing the illumination at each pixel carries a high cost
▪ Interpolation issues on very large polygons
▪ Can cause inaccuracies on highly curved surfaces

DICHROMATIC REFLECTION MODEL
▪ This model states that the apparent color of a uniform material lit from a single source is the sum of two terms (see the equation after this list):
▪ The radiance of the light reflected at the interface
▪ The radiance reflected by the surface body

▪ The model is used in computer vision to segment specular colored objects with large variations in shading
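In Shafer's standard formulation of this model (notation assumed here, not quoted from the slides), each of the two terms factors into a wavelength-dependent color and a geometry-dependent magnitude:

L(\hat{v}_r; \lambda) = L_i(\hat{v}_r; \lambda) + L_b(\hat{v}_r; \lambda) = c_i(\lambda)\, m_i(\hat{v}_r) + c_b(\lambda)\, m_b(\hat{v}_r)

where the subscripts i and b denote the interface (specular) and body (diffuse) reflection, respectively.
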
GLOBAL ILLUMINATION
▪ Light sources can be shadowed by occluders, and rays can bounce multiple times around a scene on their trip from a light source to the camera
▪ If the scene is mostly specular – use ray tracing or path tracing (see the sketch after this list)
▪ If the scene is composed mostly of illuminators and surfaces with uniform albedo and simple geometry – use radiosity
▪ The ray tracing algorithm associates a light ray with each pixel in the camera image and finds its intersection with the nearest surface
▪ Radiosity associates lightness values with rectangular surface areas in the scene
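A minimal Python sketch of the per-pixel ray casting step for a single sphere; the scene, camera placement, and sphere parameters are illustrative assumptions:

import numpy as np

def intersect_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the
    # nearest positive t; returns None if the ray misses the sphere.
    # Assumes direction is unit length, so the quadratic's a = 1.
    oc = origin - center
    b = 2.0 * float(direction @ oc)
    c = float(oc @ oc) - radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# One ray per pixel; here, a single ray through the image center.
ray_origin = np.array([0.0, 0.0, 0.0])
ray_dir = np.array([0.0, 0.0, 1.0])  # unit length
t = intersect_sphere(ray_origin, ray_dir,
                     center=np.array([0.0, 0.0, 5.0]), radius=1.0)
print("nearest hit at t =", t)  # 4.0
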
OPTICS
▪ Once the light from a scene reaches the camera, it must still pass through the lens before reaching the analog or digital sensor.

CHROMATIC ABERRATION
▪ Chromatic aberration is the tendency for light of different colors to focus at slightly different distances
▪ To reduce chromatic aberration, compound lenses made of different glass elements are used

VIGNETTING
▪ Vignetting is the tendency for the brightness of the image to fall off towards the edge of the image
▪ Natural vignetting – due to the foreshortening of the object surface, the projected pixel, and the lens aperture (see the note after this list)
▪ Mechanical vignetting – caused by the internal occlusion of rays near the periphery of lens elements in a compound lens
▪ Mechanical vignetting can be decreased by reducing the camera aperture
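As background (not given on the slide), for a simple thin-lens model natural vignetting follows the well-known cosine-fourth falloff:

E' \propto E \cos^4 \alpha

where \alpha is the angle between the ray through the lens center and the optical axis, so pixels far from the image center receive noticeably less light.
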
IMAGE SENSORS

IMAGE SENSING
▪ Incoming light reflected from a 3D scene is transformed into a voltage by an imaging sensor that is sensitive to a specific type of energy (wavelength)
▪ Usually, sensors are arranged in linear or 2D arrays

IMAGE SENSORS
▪ Light falling on an imaging sensor is picked up by an active sensing area and then passed to a set of sense amplifiers
▪ Two main kinds of sensors are used in digital still and video cameras:
▪ CCD – Charge-Coupled Device
▪ CMOS – Complementary Metal Oxide on Silicon

CCD
▪ Photons are accumulated in each active well during the exposure time
▪ In the transfer phase, the charges are transferred from well to well in a kind of "bucket brigade" until they are deposited at the sense amplifiers
▪ The sense amplifiers amplify the signal and pass it to an analog-to-digital converter

CMOS
▪ Photons hitting the sensor directly affect the conductivity of a photodetector, which can be selectively gated to control the exposure duration, and locally amplified before being read out using a multiplexing scheme.

▪ A/D conversion from voltage to a digital signal can happen in two ways: at the end of each row/column (CCD) or directly at each sensing cell (CMOS).

FACTORS AFFECTING THE PERFORMANCE OF A DIGITAL IMAGE SENSOR
▪ Shutter speed
▪ Sampling pitch
▪ Fill factor
▪ Chip size
▪ Analog gain
▪ Sensor noise
▪ Resolution of the A/D converter

FACTORS AFFECTING THE PERFORMANCE OF A DIGITAL IMAGE SENSOR
▪ Shutter speed
▪ Controls the amount of light reaching the sensor, which determines whether images are under- or over-exposed
▪ For bright scenes, a fast shutter speed permits a wide aperture, and hence a shallow depth of field
▪ For dynamic scenes, a higher shutter speed reduces motion blur and makes analysis easier

▪ Sampling pitch
▪ The physical spacing between adjacent sensor cells on the imaging chip
▪ A sensor with a smaller sampling pitch has higher sampling density and gives higher resolution
▪ But smaller cells accumulate fewer photons, making them more prone to noise

FACTORS AFFECTING THE PERFORMANCE OF A DIGITAL IMAGE SENSOR
▪ Fill factor
▪ The active sensing area as a fraction of the theoretically available sensing area
▪ Higher fill factor – more light captured & less aliasing

▪ Chip size
▪ With a larger chip, each sensor cell can be more photo-sensitive
▪ But larger chips are more expensive to produce

▪ Analog gain
▪ Higher gain allows the camera to perform better under low-light conditions

FACTORS AFFECTING THE PERFORMANCE OF A DIGITAL IMAGE SENSOR
▪ Sensor noise
▪ Noise is added from various sources throughout the sensing process
▪ It can be reduced by pre-calibrating the camera
▪ An estimate of the noise level is useful for computer vision algorithms such as image denoising, edge detection, and stereo matching

▪ ADC resolution
▪ Calibrating the sensor's noise reveals the effective (usable) resolution of the A/D converter

IMAGING PROCESS

▪ The sensor array, coincident with the focal plane of the lens, produces outputs proportional to the integral of the light received at each sensor cell over a certain amount of time
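Written as an equation (a standard form, assumed rather than quoted from the slide), each sensor output integrates the incident irradiance over the exposure:

DN(x, y) \propto \int_{t_0}^{t_0 + \Delta t} E(x, y, t)\, dt

where E(x, y, t) is the irradiance falling on the cell at pixel (x, y) and \Delta t is the exposure time.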
