Lecture02 Optics Image Formation

Lecture 2 covers the principles of optics and image formation, including the concept of image mapping, the Helmholtz reciprocity theorem, and the properties of thin lenses. It discusses the relationship between object and image distances, magnification, field of view, and depth of field, as well as the effects of exposure and diffraction on image quality. Key equations and examples illustrate these concepts, highlighting the importance of aperture size and lens properties in imaging systems.


Lecture 2

Optics and Image Formation

1. General image properties

Figure 1 The concept of image formation.

The concept of image formation is shown in Figure 1. Light from a point passes
through some optical system and is measured at a point by a detector. An
image is:

A mapping such that a source point is focused back to a point at the detector.

At this general level, it is also important to note that there exists reciprocity
between the source and detector. That is, if the source and detector are
exchanged, the intensity at the detector is unchanged. This relationship is called
the Helmholtz reciprocity theorem and follows from the basic laws of
electromagnetic theory. Helmholtz reciprocity is illustrated in Figure 2 from the
paper by Zickler et al. [1]. As can be seen, the intensity on the object surface is
the same in the two images except where points are occluded by the object
itself. This invariant intensity property simplifies many computer vision analyses
such as stereo matching and the detection of specularities.

[1] T. Zickler, P. Belhumeur and D. Kriegman, Helmholtz stereopsis: exploiting
reciprocity for surface reconstruction, Proc. European Conference on Computer
Vision, 2002.
Figure 2 An illustration of the Helmholtz reciprocity theorem.

2. Simple lens theory


The simplest form of image formation is the thin lens. In this case rays from the
source (object) that are parallel to the optical axis pass through the focal point of
the lens. A second property of the thin lens is that rays through the lens center
are not deflected. These two properties permit a geometric analysis of the
image as shown in Figure 3.

Figure 3 Geometric properties of the thin lens.

The relationship between the object and image distances is expressed by the
thin-lens equation:

1/d_o + 1/d_i = 1/f

For example, when d_o → ∞, d_i → f, as required by the first thin-lens
property. The ratio of the image size to the object size, or magnification, is:

M = x_i / x_o = d_i / d_o
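A minimal numeric check of the two relations above, sketched in Python (the 20 mm focal length and 1 m object distance are illustrative values, chosen to match later examples):

```python
def image_distance(d_o, f):
    """Solve the thin-lens equation 1/d_o + 1/d_i = 1/f for d_i."""
    return 1.0 / (1.0 / f - 1.0 / d_o)

def magnification(d_o, f):
    """Ratio of image size to object size, M = d_i / d_o."""
    return image_distance(d_o, f) / d_o

f = 0.020   # 20 mm focal length, in meters
d_o = 1.0   # object 1 m away
print(image_distance(d_o, f))   # ~0.0204 m: just beyond the focal point
print(magnification(d_o, f))    # ~0.0204: the image is about 1/49 scale
```

Letting d_o grow very large drives d_i toward f, the first thin-lens property.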
The field of view of an image is constrained by the size of the imaging device
and the focal length of the lens as illustrated in Figure 4. As can be seen, the
shorter the focal length, the larger the field of view. For example, a 20 mm lens
with a 1 cm imaging chip has about a 30° field of view at large viewing distances,
while a 10 mm lens captures over 50°.
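These field-of-view figures follow from simple trigonometry: the full angle is 2·atan(chip size / 2f). A quick sketch (the 20 mm case works out to about 28°, which the notes round to 30°):

```python
import math

def field_of_view_deg(chip_size, f):
    """Full angular field of view for a chip of the given width behind
    a lens of focal length f, assuming distant objects."""
    return 2.0 * math.degrees(math.atan(chip_size / (2.0 * f)))

print(round(field_of_view_deg(0.010, 0.020)))  # 1 cm chip, 20 mm lens: ~28 deg
print(round(field_of_view_deg(0.010, 0.010)))  # 1 cm chip, 10 mm lens: ~53 deg
```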

Figure 4 The relation between field of view and focal length.

Another important consideration is depth of field. The thin-lens equation
predicts that an image will be in focus only at a single plane in the scene.
However, in real scenes the objects are arranged at multiple depths and it would
be desirable that they were all in focus at the same time. The depth of field is
controlled by the aperture as shown in Figure 5. The depth of field is defined by
the distance from the plane of focus at which the image becomes blurred by a
given diameter circle. This circle is called the circle of confusion, or blur circle.
The commonly accepted standard for the threshold of blur is 0.035 mm and is
based on the resolution of the human eye. Note that as the size of the aperture
decreases, the depth of field increases. The aperture of a lens is typically
quantified by f /number:

Figure 5 The depth of field for two different apertures.

f/number = f# = f / D
Applying the lens equations and the definition of f/number:

d_f = c·d_o·f#·(d_o − f) / (f² + c·f#·(d_o − f))     (front)

d_r = c·d_o·f#·(d_o − f) / (f² − c·f#·(d_o − f))     (rear)
where c is the diameter of the blur circle. A plot of the depth of field for several
f/number settings is shown in Figure 6.
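The two limits are easy to tabulate from the formulas above. A sketch, assuming the 0.035 mm blur-circle standard and a 20 mm lens focused at 2 m (the focus distance is an illustrative choice):

```python
import math

def depth_of_field(d_o, f, f_number, c=0.035e-3):
    """Front and rear depth-of-field distances from the plane of focus,
    computed from the formulas above.  All lengths are in meters."""
    k = c * f_number * (d_o - f)
    d_front = d_o * k / (f**2 + k)
    # Once f^2 <= k (the hyperfocal condition), the rear limit is infinite.
    d_rear = d_o * k / (f**2 - k) if f**2 > k else math.inf
    return d_front, d_rear

# 20 mm lens focused at 2 m, at three aperture settings
for f_number in (2.0, 4.0, 8.0):
    front, rear = depth_of_field(d_o=2.0, f=0.020, f_number=f_number)
    print(f"f/{f_number}: front {front:.3f} m, rear {rear:.3f} m")
```

The rear limit grows faster than the front one, and becomes unbounded once the hyperfocal condition is reached.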

It would seem that the depth of field could be made as large as needed simply by
increasing the f/number. However, two effects limit how small the aperture can
usefully be made: 1) the amount of light reaching the imaging chip surface is
reduced as the aperture becomes smaller; 2) the effects of diffraction become
significant for small apertures.
Figure 6 The front and rear depth of field limits for a 20 mm lens.

3. Exposure
The amount of light reaching an imager array detector depends on the radiant
intensity of the source as well as the aperture of the lens. The simplest situation
to analyze is when the emitter is a point source. It will be assumed that light
from a single point is imaged onto a single detector in a CCD array. This
arrangement is shown in Figure 7.

Figure 7 Focusing a point source onto a photo detector. Detector array insert © 1998-2003 by
Michael W. Davidson, Mortimer Abramowitz, Olympus America Inc., and The Florida State University
Assuming that the lens aperture is a circle with diameter D, the solid angle with
respect to the point source intercepted by the aperture is:

dΩ = π·D² / (4·d_o²)

Revisiting the example discussed in Lecture 1:

Assume a lens with f = 20 mm and D = 10 mm, so f/number = 2.0.

The solid angle intercepted by the lens at a viewing distance of 1 m is:

dΩ = (π/4)·(0.01/1)² ≈ 7.8×10⁻⁵ steradians

Thus the lens intercepts

7.8×10⁻⁵ / 2.5×10⁻¹¹ ≈ 3.1×10⁶

times the amount of light that falls on an imager array at 1 meter without a lens.
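The arithmetic above can be reproduced directly (the 2.5×10⁻¹¹ sr bare-pixel figure is quoted from Lecture 1):

```python
import math

def lens_solid_angle(D, d_o):
    """Solid angle intercepted by a circular aperture of diameter D at
    a point source a distance d_o away (assumes D << d_o)."""
    return math.pi * D**2 / (4.0 * d_o**2)

omega_lens = lens_solid_angle(D=0.010, d_o=1.0)  # 10 mm aperture, 1 m away
omega_pixel = 2.5e-11  # bare-pixel solid angle, from Lecture 1
print(omega_lens)                # ~7.85e-5 steradians
print(omega_lens / omega_pixel)  # ~3.1e6: the gain provided by the lens
```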

Typical lens f/number settings, called stops, are:

f/1 f/1.4 f/2 f/2.8 f/4 f/5.6 f/8 f/11 f/16 f/22 f/32

They are arranged so that each setting admits twice or one half the light of its
neighboring settings. That is, the light intensity is proportional to the square
of the aperture diameter, so f/1.4 reduces the intensity to one half the level
of f/1.
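The half-light spacing of the stops can be checked numerically, since the transmitted intensity scales as 1/f#². A quick sketch:

```python
stops = [1, 1.4, 2, 2.8, 4, 5.6, 8, 11, 16, 22, 32]

# Relative light reaching the sensor, normalized to f/1.  Intensity is
# proportional to aperture area, i.e. to (f / f#)^2, so each successive
# stop passes roughly half the light of the one before it.
for f_number in stops:
    print(f"f/{f_number}: {1.0 / f_number**2:.4f}")
```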

4. Diffraction
Another important effect that must be considered in image formation is
diffraction. This effect is due to the wave nature of light. Suppose we have a
lens focused on a distant point source. One might expect that the image
intensity would be a spike, or delta function. Due to diffraction the image
intensity has a finite extent called the point spread function as shown in the
following applet:
http://www.microscopy.fsu.edu/primer/java/imageformation/airydiskformation/

This finite response is similar to that of any linear system. For example, an
amplifier with finite bandwidth will respond to a spike with a finite pulse,
perhaps with some ringing. A simple estimate for the minimum discernible
spacing Δ of two points due to diffraction is given by:

Δ ≈ λ·f / D = λ·f#,

where λ is the wavelength of the illumination and D is the diameter of the lens
aperture. So for a wavelength of 550 nm and a resolution of 0.035 mm it follows
that

f# ≈ (3.5×10⁻⁵) / (5.5×10⁻⁷) ≈ 64,
so diffraction is not much of a limit in conventional viewing situations.

Diffraction does arise significantly in microscopy, where resolutions in microns
are required. So for a resolution of 1 micron,

f# ≈ (1.0×10⁻⁶) / (5.5×10⁻⁷) ≈ 1.8,
and now depth of field becomes a big issue.
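Both worked examples solve the same estimate, resolution ≈ λ·f#, for the f/number; a one-line sketch:

```python
def diffraction_f_number(resolution, wavelength=550e-9):
    """Largest f/number at which diffraction still resolves the given
    minimum spacing, from the estimate: resolution ~ wavelength * f#."""
    return resolution / wavelength

print(round(diffraction_f_number(0.035e-3)))   # ~64: conventional imaging
print(round(diffraction_f_number(1.0e-6), 1))  # ~1.8: micron-scale microscopy
```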
