
Photogrammetry & Robotics Lab

Camera Basics and


Propagation of Light

Cyrill Stachniss

The slides have been created by Cyrill Stachniss.


1
5 Minute Preparation for Today

https://www.ipb.uni-bonn.de/5min/
2
How to Obtain an Image?

Image Courtesy: Leonardo Garcia 3


What Does a Camera Measure?

Image Courtesy: Jacek 4


What Does a Camera Measure?

Image Courtesy: D. Yun 5


What do Cameras Measure?
§  Cameras provide 2D images consisting
of pixels (“picture elements”)
§  Cameras measure the light intensity
for each pixel
§  Each position in an image (=pixel)
corresponds to a specific direction
in the 3D world

Each pixel measures the amount of


light coming from a certain direction.
6
Elements of a Digital Camera
lens and camera body: Lens → Aperture → Shutter

sensor chip: Sensor → A/D converter → Post-processing

7
Sensor

Image Courtesy: TechHive 8


Sensor
§  The image sensor converts the
incoming light to intensity values
§  Array of light-sensitive cells
§  Larger sensor cells can collect more
light per time interval
§  Larger chips are more expensive to
produce
§  Larger chips require larger (and thus
more expensive) lenses
9
Typical Sensor Sizes

Image Courtesy: MarcusGR 10


How to obtain
color information?

11
Three-Chip Camera
§  Three chips with separate filters for
red, green, and blue
§  Light is separated with a beam splitter

Image Courtesy: Förstner 12


Single-Chip Camera
§  A single chip is used to obtain
the RGB values
§  Uses small, pixel-dependent
color filters

Compared to a three-chip design


§  Cheaper
§  1 vs. 3 chips: fewer measurements
§  Interpolation leads to lower quality
13
Color Filter Array (CFA)
§  Single chip, alternating sensors are
covered by different colored filters
§  Bayer pattern

Figure: CFA layout and the generated pixels
(lower-case letters mean interpolated values)
14
Bayer Pattern
§  50% green
§  25% red and blue

§  Luminance (perceived relative


brightness) is strongly influenced by
green values
§  Human visual system is very sensitive
to high-frequency details in luminance

15
Other Patterns

Image Courtesy: Frank Klemm 16


Demosaicing
Interpolating the missing color values to
obtain RGB values for all the pixels is
called demosaicing
Figure: (a) original full-resolution image,
(b) bilinear interpolation, (c) high-quality
linear interpolation, (d) local two-color prior

Image Courtesy: Szeliski 17
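To make the idea of bilinear demosaicing concrete, here is a minimal Python sketch. It is not the algorithm of any particular camera; the RGGB layout and the helper name demosaic_bilinear are assumptions for illustration.

import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Bilinear demosaicing of an RGGB Bayer mosaic (H x W float array)."""
    H, W = raw.shape
    r_mask = np.zeros((H, W))
    r_mask[0::2, 0::2] = 1                 # R samples on even rows/cols
    b_mask = np.zeros((H, W))
    b_mask[1::2, 1::2] = 1                 # B samples on odd rows/cols
    g_mask = 1.0 - r_mask - b_mask         # G everywhere else (checkerboard)
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
    out = np.zeros((H, W, 3))
    for c, mask in enumerate([r_mask, g_mask, b_mask]):
        # normalized convolution: average the known samples of this channel
        num = convolve(raw * mask, k, mode='mirror')
        den = convolve(mask, k, mode='mirror')
        out[..., c] = num / np.maximum(den, 1e-9)
    return out

Each missing color value is simply a weighted average of the nearest samples of that color, which is exactly the kind of interpolation that produces the edge artefacts shown on the following slides.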
Errors from Demosaicing
§  Interpolating color values obviously
leads to errors
§  Errors typically occur around edges


Image Courtesy: Dubois 18


Comparison

Image Courtesy: Dubois 19


Comparison (Zoomed-in view)

Image Courtesy: Dubois 20


Shutter

21
Shutter Speed / Exposure Time
§  Controls the amount of light reaching
the sensor
§  Longer exposure time = more light =
brighter images
§  Long exposure time leads to motion
blur

22
Rolling Shutter
§  The shutter rolls (moves) across the
exposable image area
§  The pixels in the same line of the
image are recorded at the same time
§  Produces distortions for fast-moving
objects or cameras (see the sketch below)
§  Often found in CMOS cameras

23
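A rough Python sketch of why this skews fast-moving objects: each row is sampled a little later than the one above, so a vertical edge moving sideways lands in a different column in every row. The resolution, readout time and speed below are made-up values for illustration.

def rolling_shutter_column(row, n_rows, readout_time, x0, speed):
    """Column at which a vertical edge moving at `speed` (pixels/s) appears in
    `row`, when the rows are read out one after another over `readout_time` s."""
    t_row = row / n_rows * readout_time      # this row is sampled a bit later
    return x0 + speed * t_row

# an edge at column 100 moving at 2000 px/s, 480 rows read out in 10 ms:
cols = [rolling_shutter_column(r, 480, 0.01, 100.0, 2000.0) for r in (0, 240, 479)]
print(cols)   # [100.0, 110.0, ~120.0] -> the edge appears slanted by ~20 px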
Rolling Shutter

Image Courtesy:
Red.com, Inc. 24
Rolling Shutter Effects

Image Courtesy: Axel1963 (Wikipedia), Richmilliron (Wikipedia)

25
Global Shutter
§  The whole image is recorded at
exactly the same time
§  No rolling shutter distortions
§  Preferable for geometric
reconstruction tasks
§  More expensive to produce

26
Global Shutter

Image Courtesy: Red.com, Inc. 27


Rolling vs. Global Shutter

Image Courtesy: Red.com, Inc. 28


Lens & Aperture

Image Courtesy: A. Chizhov 29


How does light
propagation work?

30
Models for Light Propagation
There are three models to describe light
propagation in physics:
§  Geometric or ray optics
(DE: Geometrische Optik)
§  Wave optics based on Maxwell's
equations (DE: Wellenoptik)
§  Particle/quantum optics based on the
wave–particle duality
(DE: Quantenoptik)
31
Geometric/Ray Optics

32
Four Axioms of Geometric Optics
1.  A light ray is a straight line in
homogeneous material
2.  At the border between two
homogeneous materials, the light is
reflected (Fresnel reflection) or
refracted (Snell's law; DE: Brechung),
as in the sketch below
3.  The optical path is reversible
4.  Intersecting light rays do not
influence each other

33
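As a small illustration of axiom 2, here is a Python sketch of Snell's law. The function name and the example indices for air and glass are assumptions for illustration.

import math

def refraction_angle(theta_incident_deg, n1, n2):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    Returns the refraction angle in degrees, or None for total internal reflection."""
    s = n1 / n2 * math.sin(math.radians(theta_incident_deg))
    if abs(s) > 1.0:
        return None          # total internal reflection: the ray is only reflected
    return math.degrees(math.asin(s))

# air (n = 1.0) into glass (n ~ 1.5): the ray bends towards the normal
print(refraction_angle(45.0, 1.0, 1.5))      # ~28.1 degrees
# in the denser material the light also travels more slowly: v = c / n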
Geometric Optics
§  Light propagation is described by rays
from the light sources
§  Light travels with c ≈ 3·10^8 m/s in
vacuum
§  Different speeds in different materials
§  Each material has an index of
refraction n (DE: Brechungsindex)
§  Speed in the material: v = c/n
§  Light travels along the fastest path
34
Image Formation

Let’s design a camera


§  Put a piece of film in front of an object
§  Do we get a reasonable image?

Slide Courtesy: Seitz 35


Pinhole Camera

§  Add a barrier to block off most of the rays


§  This reduces blurring
§  The opening is known as the aperture
§  How does this transform the image?
Slide Courtesy: Seitz 36
Pinhole Camera
§  Pinhole camera is a simple model to
approximate the imaging process
§  If we treat pinhole as a point, only one ray
from any given point can enter the camera

(figure: image plane, pinhole, virtual image)

Image Courtesy: Forsyth and Ponce 37


Camera Obscura (1544)

Latin for
“dark room”

"Reinerus Gemma-Frisius, observed an eclipse of the sun at Louvain on January


24, 1544, and later he used this illustration of the event in his book De Radio
Astronomica et Geometrica, 1545. It is thought to be the first published illustration
of a camera obscura..."
Hammond, John H., The Camera Obscura, A Chronicle

Image Courtesy: http://www.acmi.net.au/AIC/CAMERA_OBSCURA.html 38


Camera Obscura at Home

Sketch from: http://www.funsci.com/fun3_en/sky/sky.htm
Image Courtesy: http://blog.makezine.com/archive/2006/02/how_to_room_sized_camera_obscu.html

39
Pinhole Camera Model

§  Similarity of the gray triangles:
x' : c = X : Z
§  Image scale: m = c / Z
§  Mapping: x' = c · X / Z, y' = c · Y / Z

Image courtesy: Förstner 40
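A minimal Python sketch of this mapping, assuming the standard pinhole model with camera constant c and the projection centre at the origin; the function name and example numbers are illustrative.

def pinhole_project(X, Y, Z, c):
    """Ideal pinhole mapping: image coordinates of a point given in the camera
    frame (projection centre at the origin, camera constant c)."""
    if Z <= 0:
        raise ValueError("point must lie in front of the camera")
    scale = c / Z                      # image scale m = c / Z
    return scale * X, scale * Y

# a point 10 m in front of a camera with c = 50 mm:
print(pinhole_project(2.0, 1.0, 10.0, 0.05))   # (0.01, 0.005) m on the image plane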


Pinhole Camera Model
§  Small hole: sharp image but requires large
exposure times
§  Large hole: short exposure times but blurry
images
§  Solution: replace pinhole by lenses

Image courtesy: Förstner 41


Camera with a Thin Lens
§  Hole is replaced by a (thin) lens

Law for thin lenses
(DE: Linsengleichung):
1/f = 1/a + 1/b
(f: focal length, a: object distance, b: image distance)

Image courtesy: Förstner 42
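A small Python sketch of the thin-lens law, solved for the image distance (assuming the standard form 1/f = 1/a + 1/b; names and example values are illustrative).

def image_distance(f, a):
    """Thin-lens law 1/f = 1/a + 1/b, solved for the image distance b."""
    if a <= f:
        raise ValueError("object at or inside the focal length: no real image")
    return 1.0 / (1.0 / f - 1.0 / a)

# f = 50 mm, object at 5 m: the image forms ~50.5 mm behind the lens,
# so focusing on closer objects means moving the lens away from the sensor
print(image_distance(0.05, 5.0))   # ~0.0505 m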


Lens Approximates the Pinhole
§  A lens is only an approximation of the
pinhole camera model
§  The corresponding point on the object
and in the image and the center of the
lens should lie on one line
§  The further away from the center of the
lens a ray passes, the larger the error
§  Use of an aperture to limit the error
(trade-off between the usable light and
the price of the lens)
43
Pinhole Model
§  The pinhole camera model is the most
commonly used model for cameras
§  Simplicity makes it popular
§  But unsuitable in some cases,
e.g., for large fields of view

44
Three Assumptions Made in the
Pinhole Camera/Thin Lens
1.  All rays from the object point
intersect in a single point
2.  All image points lie on a plane
3.  The ray from the object point to the
image point is a straight line

45
Aperture

Image Courtesy: F. Krejci 46


Aperture is the “Pinhole Size”

Image Courtesy: VERSATILE SCHOOL OF PHOTOGRAPHY 47


Aperture is the “Pinhole Size”

Figure: relative amount of light for different
aperture sizes (2× light, ½× light, ¼× light)
Image Courtesy: VERSATILE SCHOOL OF PHOTOGRAPHY 48


Aperture Reduces Lens Errors
§  The error of a lens increases with the
distance from the optical axis
§  Aperture limits this maximum distance

Image courtesy: Förstner 49


Aperture and Depth-of-Field
§  The aperture controls the amount of light on
the sensor chip and the depth-of-field
§  Depth-of-field refers to the range of
distance that appears acceptably sharp.

Image Courtesy: http://www.cambridgeincolour.com/tutorials/depth-of-field.htm 50


Depth-of-Field Example

f/8.0 f/5.6 f/2.8

Image Courtesy: http://www.cambridgeincolour.com/tutorials/depth-of-field.htm


Try yourself: http://www.cambridgeincolour.com/tutorials/dof-calculator.htm 51
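A hedged Python sketch of how the aperture affects depth-of-field, as in the f/8.0, f/5.6, f/2.8 example above. It uses the common hyperfocal-distance approximation; the circle-of-confusion value of 30 µm and the example numbers are assumptions, not taken from the slides.

def depth_of_field(f, N, s, coc=30e-6):
    """Approximate near/far limits of acceptable sharpness for focal length f,
    f-number N and focus distance s (all in metres), circle of confusion coc."""
    H = f * f / (N * coc) + f                    # hyperfocal distance
    near = H * s / (H + (s - f))
    far = H * s / (H - (s - f)) if s < H else float("inf")
    return near, far

# a 50 mm lens focused at 3 m: stopping down from f/2.8 to f/8 widens the DoF
print(depth_of_field(0.05, 2.8, 3.0))   # roughly (2.73, 3.33) m
print(depth_of_field(0.05, 8.0, 3.0))   # roughly (2.34, 4.18) m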
Lens
Goal is to obtain images that are
§  not distorted
§  sharp
§  high in contrast
The choice of the lens depends on
§  field of view
§  distance to the object
§  amount of available light
§  price
52
Typical Lenses
Telephoto lens, normal lens, wide-angle
lens, fisheye lens, …

telephoto normal wide-angle fisheye

Image courtesy: Canon 53


Moderate Tele Lens
§  Narrow field of view
§  Minimal perspective distortions
§  Parallel lines remain parallel

Image courtesy: Förstner 54


Wide Angle Lens
§  Useful for applications that require a
large field of view (roughly 70 to 120 deg)
§  Straight lines in the world are mapped
to roughly straight lines in the image
§  Perspective
distortions
§  Proportions
are not correct
anymore

Image courtesy: Förstner 55


Fisheye Lens
§  Field of view of 130+ deg
§  Straight lines in the world are not
straight anymore in the image

Image Courtesy:
Ashley Ringrose 56
Three Assumptions Made in the
Pinhole Camera/Thin Lens
1.  All rays from the object point
intersect in a single point
2.  All image points lie on a plane
3.  The ray from the object point to the
image point is a straight line

Often these assumptions do not hold,
which leads to imperfect images

57
Aberrations
§  A deviation from the ideal mapping
with a thin lens is called aberration
§  Main types of aberrations:
§  Distortion
§  Spherical aberrations
§  Chromatic aberrations
§  Astigmatism
§  Comatic aberrations
§  Vignetting
§  …
58
Distortion
Deviation from rectilinear projection,
a projection in which straight lines in
a scene remain straight in an image

barrel pincushion mustache


distortion distortion distortion
Image courtesy: Wikipedia 59
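Barrel and pincushion distortion are often described with a polynomial radial model. The sketch below shows that common model; it is not taken from these slides, and the coefficient values are made up.

def radial_distort(x, y, k1, k2=0.0):
    """Apply a polynomial radial distortion model to ideal image coordinates
    x, y given relative to the principal point."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return factor * x, factor * y

# k1 < 0 pulls points towards the centre (barrel),
# k1 > 0 pushes them outwards (pincushion)
print(radial_distort(0.5, 0.5, k1=-0.2))   # (0.45, 0.45)
print(radial_distort(0.5, 0.5, k1=+0.2))   # (0.55, 0.55)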
Spherical Aberration
Effect in a lens caused by rays that strike
the lens near its edge being refracted more
strongly than rays close to the optical axis

ideal
spherical aberration
(DE: Sphärische Aberration)
Image courtesy: Wikipedia 60
Chromatic Aberration
§  Index of refraction for glass varies
slightly as a function of wavelength
§  Light at different wavelengths is not
projected to the same point (it is
focused with a different focal length)

chromatic aberration
(DE: Chromatische
Aberration)

Image courtesy: Wikipedia 61


Astigmatism
Different focal points in the vertical and
the horizontal direction

Image courtesy: Wikipedia 62


Comatic Aberration / Coma
Combination of spherical aberration and
astigmatism in case of incoming rays striking
the lens at an angle to the optical axis

Image courtesy: Wikipedia 63


Vignetting
§  The brightness of the image falls off
towards the edge of the image
§  Often compensated by the camera

Image courtesy: Wikipedia 64


Wave Optics

65
Wave Optics
§  Considers light as an electromagnetic
wave described by the Maxwell
equations
§  Describes interference and diffraction
(DE: Interferenz und Beugung)
§  Visible light ranges from 400 nm to 700 nm
§  Electromagnetic waves cover a large
spectrum of wavelengths

66
Spectrum

Image courtesy: Wikipedia 67


Frequency
§  The frequency is defined as
ν = c / λ
(c: speed of light in vacuum, λ: wavelength)

68
Frequency
§  The frequency is defined as ν = v / λ
§  and the speed v depends on the material:
v = c / n, with refraction index n
69
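In Python, the two relations above look like this (the constant and function names are illustrative):

C_VACUUM = 299_792_458.0          # speed of light in vacuum, m/s

def frequency(wavelength):
    """Frequency from the vacuum wavelength: nu = c / lambda."""
    return C_VACUUM / wavelength

def speed_in_material(n):
    """Propagation speed in a material with refraction index n: v = c / n."""
    return C_VACUUM / n

print(frequency(550e-9))          # green light: ~5.5e14 Hz
print(speed_in_material(1.5))     # ~2.0e8 m/s in glass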
We Are Mainly Using 3 Bands

Figure: red + green + blue = full color image

70
Near the Visible Spectrum
Near-infrared light (λ ≈ 1 µm) is strongly
reflected by chlorophyll and thus often
used for monitoring vegetation

Image courtesy: Wikipedia (left), Förstner (right) 71


Hyperspectral Images
Hyperspectral images are three-
dimensional data cubes

72
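In code, such a cube is simply a 3D array with two spatial axes and one spectral axis; the sizes below are made up for the sketch.

import numpy as np

# a hyperspectral image: two spatial axes plus one spectral axis
rows, cols, bands = 512, 512, 120
cube = np.zeros((rows, cols, bands), dtype=np.float32)

spectrum = cube[100, 200, :]      # full spectrum of a single pixel
band_image = cube[:, :, 42]       # one spectral band as a 2D image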
Particle/Quantum Optics

73
Light as Particles
§  Quantum mechanics/optics introduces
the wave-particle duality
§  Certain properties of light can be
described by particles
§  Alternative description that tries to
explain phenomena that cannot be
explained using wave optics
§  Useful for describing the interactions
between light and matter

74
Photon
§  A photon is an elementary particle
§  It is the “quantum of light”
§  Energy of a photon: E = h · ν
§  where h is the Planck constant and ν the frequency

75
Photons and Intensity
cell
(pixel)

§  Quantum optics can model the


interaction of light and matter
§  Every sensor element of a camera chip
turns photons into electric charge
§  Intensity is proportional to the number
of photons reaching the sensor (pixel)
76
Pixels are Photon Counters
§  Each pixel is a photon counter
§  How many quanta of light reach the
pixel through the pinhole within the
exposure time
§  Larger values = more photons

77
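A back-of-the-envelope Python sketch combining the two slides above: the photon energy E = h·ν and the number of photons a pixel collects during the exposure time. The light power per pixel used in the example is a made-up value.

H_PLANCK = 6.626e-34              # Planck constant, J*s
C_VACUUM = 299_792_458.0          # speed of light in vacuum, m/s

def photon_energy(wavelength):
    """Energy of a single photon: E = h * nu = h * c / lambda."""
    return H_PLANCK * C_VACUUM / wavelength

def photons_collected(power_on_pixel, wavelength, exposure_time):
    """Rough photon count for a pixel receiving `power_on_pixel` watts."""
    return power_on_pixel * exposure_time / photon_energy(wavelength)

print(photon_energy(550e-9))                    # ~3.6e-19 J per green photon
print(photons_collected(1e-12, 550e-9, 0.01))   # ~28000 photons in 10 ms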
Intensity Values
External
§  Amount of light reflected from a scene
to the camera

Camera
§  Exposure time (“Tv”)
§  Aperture/pinhole size (“Av”)
§  Sensitivity of the chip (“ISO”)

78
Exposure Triangle

Figure: exposure triangle relating exposure time
(motion blur), aperture (sharp/blurry, depth of
field), and ISO (sensor noise)

See: https://actioncamera.blog/2017/02/22/the-exposure-triangle/
Image courtesy: M. Walsh 79
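One common way to express this trade-off is the exposure value EV = log2(N²/t): combinations with the same EV let the same amount of light through, and raising the ISO shifts which EV still yields a bright image, at the cost of sensor noise. A small sketch (the EV definition is the usual one, the example settings are arbitrary):

import math

def exposure_value(f_number, exposure_time):
    """EV = log2(N^2 / t): larger EV means less light reaches the sensor."""
    return math.log2(f_number ** 2 / exposure_time)

# (nearly) the same EV, i.e. the same amount of light, but different trade-offs
# between depth of field (aperture) and motion blur (exposure time):
print(exposure_value(8.0, 1 / 125))   # ~13.0
print(exposure_value(5.6, 1 / 250))   # ~12.9
print(exposure_value(4.0, 1 / 500))   # ~13.0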
Lighting and Reflectivity

80
Lighting and Reflectivity
§  Lighting is essential
§  Light intensity depends on the light
source, the reflection properties of the
material, and relative locations

Image courtesy: Szeliski 81


Albedo
§  Measure of the diffuse reflection of
solar radiation
§  Value in [0,1]
§  1 = material reflects
all radiation
§  0 = black body

Image courtesy: Wereon 82


Reflectivity
§  BRDF: Bidirectional Reflectance
Distribution Function
§  General model of light scattering:
f_r(θ_i, φ_i ; θ_r, φ_r ; λ)
(θ, φ: geometry of the incoming and the
reflected ray, λ: wavelength)
§  Describes how much light of each
wavelength arriving at an incident
direction is emitted in a reflected
direction
83
BRDF
Describes how much of each wavelength
arriving at an incident direction is
emitted in a reflected direction

Image courtesy: Szeliski 84


Reflected Light
Amount of light exiting a surface point
in a direction v̂ is

L_r(v̂; λ) = ∫ L_i(ŝ; λ) f_r(ŝ, v̂; λ) cos⁺θ_i dŝ
(L_r: reflected light, L_i: incoming light)

with cos⁺θ_i = max(0, cos θ_i)

85
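For the simplest special case, a perfectly diffuse (Lambertian) surface, the BRDF is just albedo/π and the integral collapses to a single cosine term per light source. The Python sketch below shows only that special case; it is not the general model from the slides, and the example values are arbitrary.

import math

def lambertian_brdf(albedo):
    """A perfectly diffuse surface scatters light equally in all directions:
    f_r = albedo / pi, independent of the incoming and outgoing geometry."""
    return albedo / math.pi

def reflected_radiance(incoming, albedo, cos_incident):
    """Reflected light for one light source on a Lambertian surface:
    L_r = L_i * f_r * max(0, cos(theta_i))."""
    return incoming * lambertian_brdf(albedo) * max(0.0, cos_incident)

# a grey surface (albedo 0.5) lit at 60 degrees reflects half as much light
# as the same surface lit head-on:
print(reflected_radiance(1.0, 0.5, math.cos(math.radians(0))))    # ~0.159
print(reflected_radiance(1.0, 0.5, math.cos(math.radians(60))))   # ~0.080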
Example: BRDF Estimation

Video courtesy: Proesmans and Van Gool 86


Example: Rendering with BRDFs

Video courtesy:
Proesmans and Van Gool 87
Summary
§  Basic elements of a camera
§  What a camera measures
§  What impacts the measurements
§  Different physical models to describe
light (ray, wave, particle)
§  Pinhole camera model
§  Aberrations
§  Reflectivity of objects

88
Literature
§  Förstner, Scriptum Photogrammetrie I,
Chapters 2 & 3
§  Szeliski, Computer Vision: Algorithms
and Applications, Chapters 2.2 & 2.3

89
Slide Information
§  The slides have been created by Cyrill Stachniss as part of the
photogrammetry and robotics courses.
§  I tried to acknowledge all people from whom I used
images or videos. In case I made a mistake or missed
someone, please let me know.
§  The photogrammetry material heavily relies on the very well
written lecture notes by Wolfgang Förstner and the
Photogrammetric Computer Vision book by Förstner & Wrobel.
§  Parts of the robotics material stems from the great
Probabilistic Robotics book by Thrun, Burgard and Fox.
§  If you are a university lecturer, feel free to use the course
material. If you adapt the course material, please make sure
that you keep the acknowledgements to others and please
acknowledge me as well. To satisfy my own curiosity, please
send me email notice if you use my slides.

Cyrill Stachniss, [email protected] 90
