Linear Imaging Systems Example: The Pinhole Camera: Outline
Imaging Definitions
Object function - the real space description of the actual object.
Resolution - the collected image is only an approximation of the actual object; the resolution describes how accurate the spatial mapping is.
Distortions - describe any important nonlinearities in the image. If there are no distortions, then the resolution is the same everywhere.
Fuzziness - describes how well we have described the object we wish to image.
Contrast - describes how clearly we can differentiate various parts of the object in the image.
Signal to Noise ratio - the ratio of the detected signal to the noise level; it sets a limit on how much of the blurring can later be undone.
There are two very basic problems in image analysis:
1. Given the data, an estimation of the instrument function, and a (statistical) model of the noise, recover the information (an estimation of the object function).
2. Employing a suitable model, interpret this information.
So here our goals are:
1. Design an algorithm to compute the object function given certain data on the scattering field, an estimation of the point spread function, and an estimation of the experimental noise.
2. Find a way of recovering an appropriate set of material parameters from the object function.
3. Set up the experiment such that the object function reflects the parameter of interest.
4. Design the experiment such that the imaging experiment closely approximates a linear system (or that the non-linearities are dealt with correctly) and such that the point spread function is narrow and of definite shape.
A model of the imaging process is needed to extract spatial information from a measured signal. For most of this course we will be concerned with a deceptively simple linear model:

Image = Object Function ⊗ Point Spread Function + Noise

This is the basic expression for linear imaging. The Point Spread Function depends on the type and properties of the imaging system, and the Object Function depends on the physical interactions of the object and the scattering wave. The noise is an important consideration since it limits the usefulness of deconvolution procedures aimed at reversing the blurring effects of the image measurement.

If the blurring of the object function that is introduced by the imaging process is spatially uniform, then the image may be described as a linear mapping of the object function. This mapping is of course at lower resolution, and the blurring is readily described as a convolution of the object function with a Point Spread Function.
O ⊗ P = ∫ O(x) P(x′ − x) dx
A convolution is a linear blurring: the shape P is shifted to every point of O, weighted by the value of O at that point, and added to the output.
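As a minimal numerical sketch of this shift, weight, and add picture (the object, kernel, and noise level below are invented for illustration and are not taken from the notes), the linear model Image = Object Function ⊗ Point Spread Function + Noise can be mimicked with a discrete convolution:

    obj = {0, 0, 1, 0, 0, 0, 2, 0, 0, 0};   (* illustrative object: two point sources, weights 1 and 2 *)
    psf = {0.25, 0.5, 0.25};                (* illustrative blurring kernel, normalized to 1 *)
    img = ListConvolve[psf, obj];           (* each source point becomes a weighted, shifted copy of psf *)
    noisy = img + RandomVariate[NormalDistribution[0, 0.05], Length[img]];

Each point of obj contributes a shifted copy of psf scaled by its weight, and the added noise term is what limits any later attempt to deconvolve.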
Consider the simple model: a plane of sources I(x,y) mapped onto a plane of detectors E(x,y). The detectors measure photon intensity (energy) and do so in a linear fashion (if twice the photon intensity impinges on the detector, it returns a signal twice as large).
Question: what would happen if the detectors saturated?
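As a small sketch of the answer (the clipping model and the saturation level of 1 are my own choices, not part of the notes), a detector that saturates is no longer linear, since doubling the input no longer doubles the output:

    detect[i_] := Clip[i, {0, 1}];    (* hypothetical detector response that saturates at 1 *)
    {detect[2*0.7], 2*detect[0.7]}    (* -> {1, 1.4}: S{2 I} is not equal to 2 S{I} *)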
E(x, y) = S{I(x, y)}

and the mapping S is a linear functional, so S{c1 I1 + c2 I2} = c1 S{I1} + c2 S{I2}.
The delta function allows a simple formal approach to the decomposition of the image. In one dimension,

δ(x) = 0 for x ≠ 0, and δ(x) = ∞ for x = 0.
The delta function is thus a singularity and has the following convenient properties:
∫ δ(x) dx = 1

∫ f(x) δ(x) dx = f(0)

δ(x − x0) = 0 for x ≠ x0, and δ(x − x0) = ∞ for x = x0

∫ f(x) δ(x − x0) dx = f(x0)
So we can use the delta function to sample the object function anywhere in space.
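The sampling (sifting) property can be checked directly with the built-in DiracDelta; the Gaussian test function here is just an arbitrary choice:

    Integrate[Exp[-x^2] DiracDelta[x - x0], {x, -Infinity, Infinity},
      Assumptions -> Element[x0, Reals]]
    (* returns E^(-x0^2), the test function evaluated at x0 *)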
Two convenient limiting forms are

δ(x) = lim(a→∞) a exp(−π a² x²)

δ(x) = lim(a→∞) a sinc(a π x), where sinc(x) = sin(x)/x
Using these definitions you can show the various properties of the delta function.
In[1]:= g[a_] := Plot[a Exp[-Pi a^2 x^2], {x, -2, 2},
          PlotRange -> All, PlotStyle -> Thickness[0.01],
          DisplayFunction -> Identity];
        Show[Table[g[(2^n)/4], {n, 1, 5}], DisplayFunction -> $DisplayFunction]

[Output: a family of Gaussians a Exp[-Pi a^2 x^2] that become taller and narrower as a increases, approaching δ(x).]

In[3]:= sinc[x_] := Sin[x]/x;
        s[a_] := Plot[a sinc[a Pi x], {x, -2, 2},
          PlotRange -> All, PlotStyle -> Thickness[0.01],
          DisplayFunction -> Identity];
        Show[Table[s[(2^n)/32], {n, 1, 5}], DisplayFunction -> $DisplayFunction]

[Output: the corresponding family of sinc functions, also approaching δ(x).]
The delta function extends to two dimensions,

δ²(r − r0) = 0 for r ≠ r0, and δ²(r − r0) = ∞ for r = r0

where

r0 = x0 x̂ + y0 ŷ

and in Cartesian coordinates

δ²(r − r0) = δ(x − x0) δ(y − y0)
We can also define the 2D delta function in cylindrical coordinates, and this will be important for projection reconstruction (vide infra).
Now that we have this useful construct of a delta function, let us return to our imaging system and decompose the plane of sources I(x,y). So any point in the source can be extracted as:
I(x0, y0) = ∫∫ I(x, y) δ(x − x0) δ(y − y0) dx dy
Now notice that E is a continuous function over the detector plane, and so I have labeled the response to a single source point by its position (x0, y0) in the source plane.
E0(x, y) = S{ ∫∫ I(x, y) δ(x − x0) δ(y − y0) dx dy }
Since S and the integrations are all linear we can change their order.
E0(x, y) = ∫∫ I(x, y) S{ δ(x − x0) δ(y − y0) } dx dy
Now we see that the mapping is described for each point in the object function, and that the object function itself simply provides a weight for that point. Of course it is essential that S be linear.
Now we picture every point as being mapped onto the detector independently. The mapping is called the instrument response function (IRF).
E0(x, y) = I(x0, y0) S{ δ(x − x0) δ(y − y0) }

E0(x, y) = I(x0, y0) IRF(x, y | x0, y0)

where

IRF(x, y | x0, y0) = S{ δ(x − x0) δ(y − y0) }
This is often given the symbol h(r | r0). Of course we want the entire output from the whole object function,
E(x, y) = ∫∫ E0(x, y) dx0 dy0

E(x, y) = ∫∫ I(x0, y0) IRF(x, y | x0, y0) dx0 dy0
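A one-dimensional numerical sketch of this superposition (the source distribution and the instrument response below are invented for illustration; the IRF is deliberately made shift variant by letting its width grow with |x0|):

    irf[x_, x0_] := Exp[-(x - x0)^2/(2 (0.3 + 0.1 Abs[x0])^2)];    (* assumed response to a point source at x0 *)
    source[x0_] := Exp[-(x0 - 2)^2] + 0.5 Exp[-(x0 + 3)^2];        (* assumed object: two bright points *)
    e[x_?NumericQ] := NIntegrate[source[x0] irf[x, x0], {x0, -10, 10}];
    Plot[e[x], {x, -8, 8}]

The two peaks in the output have different widths because this mapping is not space invariant, which is exactly the restriction addressed next.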
Space Invariance
Now, in addition to every point being mapped independently onto the detector, imagine that the form of the mapping does not vary over space (it is independent of r0). Such a mapping is called isoplanatic. For this case the instrument response function is not conditional.
IRF(x, y | x0, y0) = PSF(x − x0, y − y0)
The Point Spread Function (PSF) is a spatially invariant approximation of the IRF.
So now in terms of the Point Spread Function we see that the image is a convolution of the object function and the Point Spread Function.
E(x, y) = ∫∫ I(x0, y0) PSF(x − x0, y − y0) dx0 dy0
Here I have neglected noise, but in real systems we cannot do that.
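A two-dimensional sketch of this convolution (the object and the disc-shaped PSF below are invented; a uniform disc is a reasonable stand-in for the blurring produced by a finite pin-hole):

    object = Table[0., {64}, {64}];
    object[[20, 20]] = 1.; object[[40, 45]] = 0.5;     (* illustrative object: two point sources *)
    psf2 = DiskMatrix[4]; psf2 = psf2/Total[psf2, 2];  (* uniform disc PSF, normalized to unit total *)
    image = ListConvolve[psf2, object];                (* blurred image *)
    ArrayPlot[image]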
Magnification
A system that magnifies initially looks like it should not be space invariant, but the mapping can also be written as a convolution. The IRF of course must include the magnification (M),

IRF(x, y | x0, y0) = PSF(x − Mx0, y − My0)

and the image is,
E(x, y) = ∫∫ I(x0, y0) PSF(x − Mx0, y − My0) dx0 dy0
Changing variables to x0′ = Mx0 and y0′ = My0, so that dx0 dy0 = dx0′ dy0′ / M²,

E(x, y) = (1/M²) ∫∫ I(x0′/M, y0′/M) PSF(x − x0′, y − y0′) dx0′ dy0′
And the imaging system is again correctly described as a convolution. We also directly see that as the image is magnified its intensity decreases.
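A quick numerical check of the change of variables in one dimension (the Gaussian object, Gaussian PSF, magnification M = 3, and evaluation point are arbitrary choices; in 1D the Jacobian gives 1/M, and in 2D it gives 1/M²):

    objFn[s_] := Exp[-s^2]; psfFn[s_] := Exp[-4 s^2];
    m = 3; xDet = 1.5;
    lhs = NIntegrate[objFn[x0] psfFn[xDet - m x0], {x0, -Infinity, Infinity}];
    rhs = (1/m) NIntegrate[objFn[u/m] psfFn[xDet - u], {u, -Infinity, Infinity}];
    {lhs, rhs}    (* the two values agree *)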
An Example, the Pin-hole Camera

One of the most familiar imaging devices is a pin-hole camera.
[Diagram: an object at distance a in front of the pin-hole is imaged onto a screen at distance b behind it.]
The image is magnified and inverted, with Magnification = −b/a. This was known prior to della Porta, ca. 1600.
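For example, with a = 10 cm and b = 30 cm (numbers chosen purely for illustration), the magnification is −30/10 = −3: the image is three times the size of the object and inverted.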
DensityPlot[arrow[x, y], {x, -128, 128}, {y, -128, 128},
  PlotPoints -> {128, 128}, Mesh -> False]

[Output: density plots of the arrow-shaped object function and of its image through the pin-hole, inverted and magnified.]
An Example, the Pin-hole Camera 2

Notice, however, that the object function is also blurred due to the finite width of the pin-hole.
[Diagram: rays from a single source point at distance a pass through the finite-width pin-hole and spread over a spot on the image plane at distance b.]
The extent of the blurring: each point of the source is spread over a spot whose diameter is the magnification factor (a + b)/a times the diameter of the pin-hole.
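A quick estimate of the blur spot (all numbers are invented for illustration):

    a = 100.; b = 300.; d = 1.;    (* assumed object distance, image distance, pin-hole diameter, in mm *)
    blur = (a + b)/a*d             (* -> 4. mm: each source point is spread over a 4 mm spot *)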
[Output: Plot3D of the pin-hole blurring function, with 1D profiles across the image plane.]
Distortions of a Pin-hole Camera

Even as simple a device as the pin-hole camera has distortions:

1. Limited field of view due to the finite thickness of the screen.
[Diagram: pin-hole geometry with object distance a and image distance b; an oblique ray from the edge of the object is blocked by the thickness of the screen.]
As the object becomes too large, the ray approaches the pin-hole too steeply to make it through.
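A rough geometric sketch of this cutoff (the construction and the numbers are mine, not from the notes): a ray tilted by angle θ from the axis can pass through a pin-hole of diameter d drilled in a screen of thickness t only if, roughly, tan θ < d/t.

    d = 1.; t = 0.5; aObj = 100.;     (* assumed pin-hole diameter, screen thickness, object distance, in mm *)
    thetaMax = ArcTan[d/t];           (* steepest ray that still makes it through *)
    fovRadius = aObj Tan[thetaMax]    (* -> 200. mm: radius of the visible field at the object plane *)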
Distortions of a Pin-hole Camera 2

Also, as the object moves off the center line, the shadow on the detector grows in area (and the solid angle is decreased), so the image intensity is reduced.
[Diagram: pin-hole geometry with object distance a and image distance b for an off-axis source point.]
There are three effects: the radial distance (cos²θ), the oblique angle of the detector (cos θ), and the effective size of the pin-hole (cos θ). Together these give a cos⁴θ falloff in image intensity. The general oblique-angle effect alone goes as cos³θ.
Distortions of a Pin-hole Camera

Reduction in field of view due to the oblique-angle effect.

[Plots: Cos^3 and Cos^4 intensity falloff across the image plane, for x from −100 to 100.]
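A sketch that reproduces curves of this form (the pin-hole to detector distance below is an arbitrary choice): for a detector point a distance x off axis, θ = arctan(x/b), so the intensity falls as cos⁴θ, or as cos³θ for the general oblique-angle effect alone.

    bDet = 100.;    (* assumed pin-hole to detector distance *)
    Plot[{Cos[ArcTan[x/bDet]]^3, Cos[ArcTan[x/bDet]]^4}, {x, -100, 100},
      PlotLegends -> {"Cos^3", "Cos^4"}]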
Contrast and Noise in a Pin-hole Camera

For a screen of finite thickness some light will penetrate.
[Plots: the object function and the full 2D analysis of the pin-hole image, showing the loss of contrast when light penetrates the screen.]
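A minimal sketch of how leakage through the screen reduces contrast (the image profile and the leakage fraction are assumptions of mine, not taken from the analysis above): a uniform background raises the minimum of the image and lowers the contrast (max − min)/(max + min).

    profile[x_] := Exp[-x^2/200.];    (* idealized image profile across the detector *)
    leak = 0.6;                       (* assumed uniform background transmitted by the screen *)
    contrast[vals_] := (Max[vals] - Min[vals])/(Max[vals] + Min[vals]);
    vals = Table[profile[x], {x, -100, 100, 1.}];
    {contrast[vals], contrast[vals + leak]}    (* contrast drops from about 1 to about 0.45 *)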