Computational Optics

This document discusses computational photography and coded apertures. It describes how an imaging system maps a 4D light field to a 2D image, and how changing camera parameters yields different mappings. Coded apertures are introduced as a way to control how wavefronts from a scene interfere at the sensor, enabling a range of computational photography applications. Examples are shown of coded apertures that vary the modulation transfer function with respect to defocus, which can provide both an all-focus image and depth cues for generating a depth map.


Computational

Photography
Computational Optics

Jongmin Baek
CS 478 Lecture
Feb 29, 2012



Camera as a Black Box

[Figure: a 4D light field L(u, v, s, t) from the world enters the imaging system, which produces a 2D image on the sensor.]

An imaging system is a function that maps 4D input to 2D output.
Camera as a Black Box

[Figure: the same 4D light field passed through imaging systems with different parameters, producing different 2D images.]

By changing parameters (e.g. focus), we can obtain a different mapping.
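To make the black-box view concrete, here is a minimal numpy sketch (the toy light field, the `render` function, and the integer-shift refocusing are all hypothetical, not from the lecture): each pixel integrates the light field over the aperture coordinates (u, v), and shearing before integration plays the role of changing focus.

```python
import numpy as np

# Toy 4D light field L[u, v, s, t]: aperture coordinates (u, v),
# sensor coordinates (s, t). Values are made-up radiance samples.
rng = np.random.default_rng(0)
L = rng.random((9, 9, 64, 64))

def render(light_field, shear=0.0):
    """Map the 4D light field to a 2D image by integrating over the
    aperture. `shear` refocuses: each aperture sample (u, v) is shifted
    on the sensor before summing (integer shifts, for simplicity)."""
    U, V, S, T = light_field.shape
    image = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            du = int(round(shear * (u - U // 2)))
            dv = int(round(shear * (v - V // 2)))
            image += np.roll(light_field[u, v], (du, dv), axis=(0, 1))
    return image / (U * V)

nominal_focus = render(L, shear=0.0)  # one mapping from 4D to 2D ...
refocused = render(L, shear=1.0)      # ... another parameter, another mapping
```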
Camera as a Black Box
• What is the space of all mappings we can
reasonably obtain?

• Clearly not all f: (ℝ⁴→ℝ) → (ℝ²→ℝ)

• Are all mappings useful?


• Consider f: x ↦ (g: y ↦ 0), ∀x∈(ℝ⁴→ℝ).

• Do all mappings yield “images”?


Overview
• Coded Aperture
• Spatial coding
• Amplitude
• Phase
• Temporal coding
• Wavelength coding
• Other stuff
“Hand-wavy”
Wave Optics Tutorial

[Figure: an isotropic emitter, a thin lens, and a pixel on the sensor.]

We want all these waves to interfere constructively at the pixel.
“Hand-wavy”
Wave Optics Tutorial

[Figure: an isotropic emitter, a thin lens, and a pixel on the sensor.]

We want all these waves to interfere destructively at the pixel.
“Hand-wavy”
Wave Optics Tutorial
• Lens
• Controls how wavefronts from the scene
interfere at the sensor.
• Ideally, all wavefronts from a single point source
interfere constructively at a pixel, and other
wavefronts interfere destructively at that pixel.

• “Perfect” imaging system.



“Hand-wavy”
Wave Optics Tutorial
• A perfect imaging system is impossible.
• Defocus blur: It’s hard to make all the waves
interfere 100% constructively, for objects at
arbitrary depth.

• Diffraction: It’s hard to make something interfere 100% constructively, and something 𝜀-away interfere 100% destructively.

• But...



“Hand-wavy”
Wave Optics Tutorial

... after some math ...


(Refer to any optics textbook)



“Hand-wavy”
Wave Optics Tutorial
• Sinusoidal patterns are perfectly imaged.
• Same frequency, potentially lower magnitude



Imaging in Fourier
Domain
• Any signal can be written as a sum of
sinusoids.
• We know how each sinusoid is imaged.
• Imaging is linear.
• Figure out what the imaging system
does to each signal, and add up results!



Imaging in Fourier
Domain

[Figure: a signal is decomposed into sinusoids with weights α1, α2, α3 (Fourier transform); each sinusoid is mapped through the imaging system; the results are recomposed (inverse Fourier transform).]


Imaging in Fourier
Domain
• (Traditional) Imaging system
• A multiplicative filter in Fourier domain.
• This filter is called the
Optical Transfer Function (OTF).

• The magnitude of the filter is called the Modulation Transfer Function (MTF).

• A convolution in the spatial domain.

• This kernel is called the Point Spread Function (PSF).
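As a concrete sketch of these definitions (a hypothetical pillbox PSF; none of this code is from the lecture): build a PSF, take its FFT to get the OTF, take magnitudes to get the MTF, and image a scene by multiplying its spectrum by the OTF.

```python
import numpy as np

n = 128
y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
psf = (x**2 + y**2 <= 6**2).astype(float)   # pillbox PSF, radius 6 pixels
psf /= psf.sum()                            # conserve energy

otf = np.fft.fft2(np.fft.ifftshift(psf))    # Optical Transfer Function
mtf = np.abs(otf)                           # Modulation Transfer Function
print(mtf[0, 0])                            # 1.0: the DC component is preserved

# Imaging = convolution with the PSF = multiplication by the OTF:
scene = np.random.default_rng(1).random((n, n))
image = np.real(np.fft.ifft2(np.fft.fft2(scene) * otf))
```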
Aperture Coding
• Why insist on a circular aperture?

(Levin 2007)

• What kind of aperture should we use?


Circular Aperture
• Let’s consider the circular aperture.
• Imagine a 2D world.
• The aperture is now a 1D slit.
[Figure: transmittance profile of the 1D slit aperture.]



Circular Aperture

[Figure: point spread function and modulation transfer function of a focused circular aperture.]



Circular Aperture
[Figure: MTF for a 1D slit at various misfocus values ψ. (Figures stolen from self.)]



Circular Aperture
[Figure: MTF as a function of misfocus, at various frequencies.]



Stopping Down
[Figure: MTF as a function of misfocus, at various frequencies. Aperture size at 100%, 80%, 60%, 40% with equal exposure.]


Desiderata
• Given an aperture, we can generate these
plots mathematically*.
• What kind of aperture do we want?
• What kind of plot is ideal?

*Are you sure you want to know?


OTF(fₓ, f_y, ψ) = ∬ p(t₁ − fₓ/2, t₂ − f_y/2) p*(t₁ + fₓ/2, t₂ + f_y/2) e^{2i(t₁fₓ + t₂f_y)ψ} dt₁ dt₂, where p is the pupil (aperture) function.
For an aperture with large features, one can estimate the PSF by the aperture shape, scaled by the misfocus.
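For the curious, a brute-force 1D sketch of that integral (my own discretization and parameter values): a slit pupil evaluated at one frequency for several misfocus values, reproducing the kind of falloff shown in the earlier MTF plots.

```python
import numpy as np

def otf_1d(fx, psi, n=4096):
    """Defocused 1D OTF of a slit pupil p(t) = 1 on [-1/2, 1/2]:
    OTF(fx, psi) = integral p(t - fx/2) p(t + fx/2) exp(2i t fx psi) dt."""
    t = np.linspace(-1.0, 1.0, n)
    dt = t[1] - t[0]
    p = lambda u: (np.abs(u) <= 0.5).astype(float)
    return (p(t - fx / 2) * p(t + fx / 2) * np.exp(2j * t * fx * psi)).sum() * dt

fx = 0.8                                # one spatial frequency
ref = abs(otf_1d(fx, 0.0))              # in-focus response at that frequency
for psi in [0.0, 5.0, 10.0, 20.0]:      # increasing misfocus
    print(f"psi = {psi:4.1f}   MTF relative to focus: {abs(otf_1d(fx, psi)) / ref:.3f}")
```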



Depth Invariance?

• Do we want the frequency response to be constant w.r.t. misfocus (equivalently, depth)?

• Would be useful for an all-focus image.

• Do we want the frequency response to vary wildly w.r.t. misfocus (equivalently, depth)?

• Would be useful as depth cues, for depthmap generation.




Image and Depth from a Conventional
Camera with a Coded Aperture
Levin et al., SIGGRAPH 2007

• Pick an aperture whose OTF varies much with depth.

• Random search.

• Restrict the search to binary 11×11 patterns.

• Maximize the K-L divergence among the OTFs.

• Calculate the PSF for each depth.


Image and Depth from a Conventional
Camera with a Coded Aperture
Levin et al., SIGGRAPH 2007

• Steps

• Take a picture.



Image and Depth from a Conventional
Camera with a Coded Aperture
Levin et al., SIGGRAPH 2007

• Steps

• Try deconvolving with each candidate PSF.

• Convolve again with the PSF, and subtract from the picture to compute the error.

• For each region, pick the PSF (and hence depth) that gives the minimal error. (A toy sketch follows below.)

• Regularize the depthmap.
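A toy sketch of this loop, with simple Wiener deconvolution and disk PSFs standing in for the sparse-prior deconvolution and coded PSFs that Levin et al. actually use:

```python
import numpy as np

def blur(img, psf):
    return np.real(np.fft.ifft2(np.fft.fft2(img) *
                                np.fft.fft2(np.fft.ifftshift(psf))))

def wiener(img, psf, k=1e-3):          # stand-in for their sparse-prior deconvolution
    H = np.fft.fft2(np.fft.ifftshift(psf))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(H) / (np.abs(H)**2 + k)))

def disk_psf(n, r):                    # stand-in for the coded PSF at one depth
    y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
    p = (x**2 + y**2 <= r**2).astype(float)
    return p / p.sum()

n = 128
scene = np.random.default_rng(2).random((n, n))
picture = blur(scene, disk_psf(n, 5))  # ground truth: depth with blur radius 5

errors = {}
for r in [2, 5, 9]:                    # one candidate PSF per candidate depth
    psf = disk_psf(n, r)
    estimate = wiener(picture, psf)    # deconvolve with this candidate
    errors[r] = np.mean((blur(estimate, psf) - picture)**2)  # re-blur, compare
print(min(errors, key=errors.get))     # the true radius (5) should win in this toy
```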



Image and Depth from a Conventional
Camera with a Coded Aperture
Levin et al., SIGGRAPH 2007

[Figure: resulting depthmap.]



Image and Depth from a Conventional
Camera with a Coded Aperture
Levin et al., SIGGRAPH 2007

• You can do all this with a circular aperture.

• The result won’t be as good, though.



Next Step

• Instead of modulating the aperture amplitude (transmittance), we could modulate the phase as well.
• Upside: No light lost.
• Downside: Larger space of unknowns.



Phase Coding

• A (parabolic) lens already modulates the phase.


• Add an additional refractive element.
[Figure: a lens with a phase plate in front.]
Depth Invariance?

• Do we want the frequency response to be constant w.r.t. misfocus?

• Would be useful for an all-focus image.

• Do we want the frequency response to vary wildly w.r.t. misfocus?

• Would be useful as depth cues, for depthmap generation.


Extended Depth of Field through
Wavefront Coding
Dowski et al., Applied Optics 1995

• Design a phase plate such that the MTF is the same across depth.

• A regular lens is parabolic, or quadratic.

• Instead, use a lens whose profile is cubic.

[Figure: profiles of a regular lens and a cubic phase plate.]


Extended Depth of Field through
Wavefront Coding
Dowski et al., Applied Optics 1995

• How does it work?

• A regular lens is parabolic, or quadratic.

• The 2nd derivative determines the plane of focus.



Extended Depth of Field through
Wavefront Coding
Dowski et al., Applied Optics 1995

• How does it work?

• A regular lens is parabolic, or quadratic.

• The 2nd derivative determines the plane of focus.

[Figure: parabolic lens profiles with varying 2nd derivative.]


Extended Depth of Field through
Wavefront Coding
Dowski et al., Applied Optics 1995

• How does it work?

• A regular lens is parabolic, or quadratic.

• The 2nd derivative determines the plane of focus.

• A cubic lens is locally quadratic with varying 2nd derivative.

• Different parts of the lens “focus” at different depths!



Extended Depth of Field through
Wavefront Coding
Dowski et al., Applied Optics 1995

• How does it work?

• Therefore, regardless of depth, the object will be:

• in focus (small PSF) for some parts of the lens

• blurry (large PSF) for other parts of the lens

• The overall PSF will be the sum.

• More or less depth-invariant.

• Deconvolve with a single PSF to recover the scene. (A toy simulation follows below.)
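The depth invariance can be checked with a toy 1D Fourier-optics sketch (my own discretization and magnitudes): put defocus ψt² plus an optional cubic term αt³ on the pupil phase, and measure how much the PSF drifts as ψ changes.

```python
import numpy as np

def psf_1d(psi, alpha=0.0, n=4096):
    """1D PSF for pupil phase psi*t^2 (defocus) + alpha*t^3 (cubic plate)."""
    t = np.linspace(-0.5, 0.5, n)                 # normalized pupil coordinate
    pupil = np.exp(1j * (psi * t**2 + alpha * t**3))
    field = np.fft.fftshift(np.fft.fft(pupil))    # far-field amplitude
    psf = np.abs(field)**2
    return psf / psf.sum()

for alpha in [0.0, 300.0]:                        # without / with the cubic plate
    psfs = [psf_1d(psi, alpha) for psi in (0.0, 10.0, 20.0)]
    drift = max(np.abs(p - psfs[0]).sum() for p in psfs[1:])
    print(f"alpha = {alpha:5.1f}   PSF drift across misfocus: {drift:.3f}")
    # The drift should be much smaller with the cubic term present.
```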
Extended Depth of Field through
Wavefront Coding
Dowski et al., Applied Optics 1995

[Figure: regular lens vs. cubic phase plate (deblurred).]


Depth from Diffracted Rotation
Greengard et al., Optics Letters 2006

Aside: a phase plate can also be designed to be depth-variant.



4D Frequency Analysis of Computational
Cameras for Depth of Field Extension
Levin et al., SIGGRAPH 2009

• Similar idea

• Have parts of the lens focus at different depths.

“Lattice Focal Lens”


4D Frequency Analysis of Computational
Cameras for Depth of Field Extension
Levin et al., SIGGRAPH 2009

• Similar idea

• Have parts of the lens focus at different depths.

[Figure: regular lens vs. lattice focal lens, with deconvolved result.]



Diffusion Coded Photography for
Extended Depth of Field
Cossairt et al., SIGGRAPH 2010

• Put a radial diffuser in front of the lens.



Diffusion Coded Photography for
Extended Depth of Field
Cossairt et al., SIGGRAPH 2010
• Idea

• Add a random diffuser (surface gradient is sampled randomly from a probability distribution).

• This makes the PSF stochastic, and ultimately less dependent on ray angles, leading to depth invariance. (A toy sketch follows below.)
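A crude geometric-optics sketch of that intuition (all parameters made up; the paper’s light-field analysis is far more careful): jitter each ray by a random surface gradient and watch the PSF width become nearly independent of misfocus.

```python
import numpy as np

rng = np.random.default_rng(3)

def psf_width(misfocus, diffuser_sigma, n_rays=200_000):
    """1D geometric-optics Monte Carlo: a ray through aperture position a
    lands at x = misfocus * a, plus a random diffuser deflection."""
    a = rng.uniform(-1.0, 1.0, n_rays)                  # aperture samples
    x = misfocus * a
    x += diffuser_sigma * rng.standard_normal(n_rays)   # random surface gradients
    return x.std()

for sigma in [0.0, 2.0]:                                # without / with the diffuser
    widths = [psf_width(m, sigma) for m in (0.0, 0.5, 1.0)]
    print(f"sigma = {sigma}: widths =", [f"{w:.2f}" for w in widths])
    # Without the diffuser, width grows with misfocus; with it, nearly constant.
```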



Diffusion Coded Photography for
Extended Depth of Field
Cossairt et al., SIGGRAPH 2010

PSF is indeed depth-invariant.




Diffusion Coded Photography for
Extended Depth of Field
Cossairt et al., SIGGRAPH 2010
[Figure: regular photos and deblurred output.]


Next Step

• We’ve looked at techniques that modulate the aperture spatially.
• Why not try temporally?
• Change modulation over time.



Flexible Depth of Field Photography
Nagahara et al., ECCV 2008

• Translate the sensor over the exposure time.

• Equivalent to simulating lenses of different focal lengths over time. (A toy sketch follows below.)
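A toy sketch of why sweeping helps (my own model, with pillbox defocus PSFs): integrating the instantaneous PSF over the sweep yields nearly the same kernel for objects at different depths.

```python
import numpy as np

def pillbox(n, r):
    x = np.arange(n) - n // 2
    p = (np.abs(x) <= max(r, 0.5)).astype(float)   # at least one pixel wide
    return p / p.sum()

def swept_psf(depth, n=257, sweep=np.linspace(0.0, 20.0, 81)):
    """Integrate the instantaneous defocus PSF while the sensor sweeps:
    the blur radius at sweep position s is |depth - s|."""
    return sum(pillbox(n, abs(depth - s)) for s in sweep) / len(sweep)

psfs = {d: swept_psf(d) for d in (3.0, 10.0, 17.0)}    # three object depths
for d, p in psfs.items():
    print(f"depth {d:4.1f}: L1 distance to middle-depth PSF = "
          f"{np.abs(p - psfs[10.0]).sum():.3f}")
# The swept PSFs are far more alike than raw defocus PSFs at these depths:
# approximately depth-invariant, degrading only near the ends of the sweep.
```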





Flexible Depth of Field Photography
Nagahara et al., ECCV 2008

• Other applications

• Could move the sensor non-linearly

• Discontinuous depth of field?

• Combine with rolling shutter

• Tilt-shift



Flexible Depth of Field Photography
Nagahara et al., ECCV 2008



More Ways of
Temporal Coding
• One can also temporally code the aperture by engaging the shutter over time.

• Could even use an electronic shutter.

[Figure: shutter state over time.]
Coded Exposure Photography: Motion
Deblurring using Fluttered Shutter
Raskar et al., SIGGRAPH 2006
• LCD shutter flutters in order to block/unblock light
during exposure.



Coded Exposure Photography: Motion
Deblurring using Fluttered Shutter
Raskar et al., SIGGRAPH 2006
(slide stolen from Ramesh Raskar)


Coded Exposure Photography: Motion
Deblurring using Fluttered Shutter
Raskar et al., SIGGRAPH 2006
(slide stolen from Ramesh Raskar)

Creates a better-conditioned motion blur!
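The conditioning claim is easy to check numerically. Below is a sketch with an arbitrary pseudo-random binary code, not the optimized sequence from the paper:

```python
import numpy as np

def motion_mtf(shutter, pad=512):
    """MTF of the motion-blur kernel induced by a binary shutter sequence
    (the object moves one pixel per time step)."""
    kernel = np.asarray(shutter, float)
    kernel /= kernel.sum()
    return np.abs(np.fft.rfft(kernel, pad))

box = [1] * 26                          # ordinary shutter: open throughout
flutter = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0,
           1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1]   # arbitrary binary code
print("min MTF, box:      ", motion_mtf(box).min())      # essentially zero
print("min MTF, fluttered:", motion_mtf(flutter).min())  # typically much larger
# The box kernel's spectrum has true zeros, so deblurring is ill-posed;
# a good flutter code keeps the MTF bounded away from zero everywhere.
```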


Coded Exposure Photography: Motion
Deblurring using Fluttered Shutter
Raskar et al., SIGGRAPH 2006

Motion blur can be inverted easily!


Next Step

• We’ve tried modulating capture based on


• where the ray passes through the aperture
• when the ray passes through the aperture
• Instead, let’s move the entire camera.



Motion Invariant Photography
Levin et al., SIGGRAPH 2008

• Motivation

• If there is an object that travels at a constant speed, you can image it sharply by moving the camera linearly at some velocity.

[Figure: space-time diagram (sensor position vs. time).]


Motion Invariant Photography
Levin et al., SIGGRAPH 2008

• The entire camera moves in a parabola during exposure.



Motion Invariant Photography
Levin et al., SIGGRAPH 2008

• If there is an object that travels parallel to the image plane, at some point in time the camera motion will mirror the object exactly.

• The object is momentarily imaged sharply.

• At other times, it will be somewhat blurry, blurry, very blurry, etc.

• The above happens independent of object speed!

• Motion-invariant motion blur! (A toy sketch follows below.)
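A toy sketch of the invariance (my own discretization): with the camera on the parabola a·t², the blur kernel traced by an object at speed v is, up to a speed-dependent shift, nearly the kernel of a static object.

```python
import numpy as np

def motion_psf(v, a=1.0, T=1.0, n=200_001):
    """Blur kernel traced on the sensor by an object moving at speed v
    while the camera follows the parabola a*t**2 over t in [-T, T]."""
    t = np.linspace(-T, T, n)
    x = v * t - a * t**2                     # object position relative to camera
    hist, _ = np.histogram(x, bins=np.linspace(-2.5, 2.5, 101))
    return hist / hist.sum()

p0 = motion_psf(0.0)
for v in (0.5, 1.0):
    shift = int(round((v**2 / 4.0) / 0.05))  # speed-dependent shift; bins are 0.05 wide
    diff = np.abs(np.roll(motion_psf(v), -shift) - p0).sum()
    print(f"v = {v}: L1 difference to the static-object kernel = {diff:.3f}")
# Up to the shift, the kernels nearly coincide; the mismatch comes from the
# ends of the exposure and shrinks for slower objects.
```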



Motion Invariant Photography
Levin et al., SIGGRAPH 2008

Alt-tab to video?



Motion Invariant Photography
Levin et al., SIGGRAPH 2008

[Figure: scene, captured image, and deblurred result.]
Next Step

• While we are at it, let’s move both the lens and the sensor, independently.


Image Destabilization: Programmable
Defocus using Lens and Sensor Motion
Mohan et al., ICCP 2009



Image Destabilization: Programmable
Defocus using Lens and Sensor Motion
Mohan et al., ICCP 2009

• Translate both the lens and the sensor laterally.

• Depending on their relative speed, there exists a 3D point in the scene that is imaged by the same pixel throughout the exposure.

• That point remains sharp.

• Other points are effectively motion-blurred.


Image Destabilization: Programmable
Defocus using Lens and Sensor Motion
Mohan et al., ICCP 2009

[Figure: regular camera vs. result.]


Next Step

• Can we modulate capture based on


something entirely different?
• Wavelength?



Spectral Focal Sweep: Extended Depth of
Field from Chromatic Aberration
Cossairt et al., ICCP 2010

• Have a lens that maximizes axial chromatic aberration.

• Different wavelengths focus at different depths!

• If the scene spectrum is broadband,

• We’re effectively doing a focal sweep!



Spectral Focal Sweep: Extended Depth of
Field from Chromatic Aberration
Cossairt et al., ICCP 2010



Spectral Focal Sweep: Extended Depth of
Field from Chromatic Aberration
Cossairt et al., ICCP 2010
[Figure: conventional camera vs. SFS lens, with deblurred output.]


Spectral Focal Sweep: Extended Depth of
Field from Chromatic Aberration
Cossairt et al., ICCP 2010
[Figure: conventional camera vs. deblurred output.]
For color, transform to YUV and deblur Y only.
Other Cool Stuff

• Coded aperture projection

• Periodic motion
• http://www.umiacs.umd.edu/~dikpal/Projects/codedstrobing.html
• Interaction with rolling shutter



Coded Aperture Projection
Grosse et al., SIGGRAPH 2010

• Pick a coded aperture that creates depth-invariant blur.

• Could be adaptive.

• Before projecting, convolve the image with the inverse of that aperture. (Ensures that the image looks fine; a sketch follows below.)
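A sketch of the prefiltering step, using a Wiener-style regularized inverse and a stand-in Gaussian blur (my own choices; the paper designs the coded aperture precisely so its OTF has no hard zeros to invert through):

```python
import numpy as np

def prefilter(image, psf, k=1e-3):
    """Convolve the image with a regularized (Wiener-style) inverse of the
    projector's blur, so the optics approximately undo the filtering."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(H) /
                                (np.abs(H)**2 + k)))

n = 128
y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
psf = np.exp(-(x**2 + y**2) / (2 * 4.0**2))    # stand-in depth-invariant blur
psf /= psf.sum()

# A smooth test pattern (projected content is usually band-limited anyway).
image = 0.5 + 0.25 * np.cos(2*np.pi*6*x/n) * np.cos(2*np.pi*4*y/n)

compensated = prefilter(image, psf)            # may exceed the displayable range;
                                               # a real system must clip or rescale
H = np.fft.fft2(np.fft.ifftshift(psf))
projected = np.real(np.fft.ifft2(np.fft.fft2(compensated) * H))
print("mean |projected - intended|:", np.abs(projected - image).mean())
```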



Coded Aperture Projection
Grosse et al., SIGGRAPH 2010

Depth of field increased!
Questions?



Cited Papers
• Levin et al., “Image and Depth from a Conventional Camera with a Coded Aperture.” SIGGRAPH, 2007.

• Dowski et al., “Extended Depth of Field through Wavefront Coding.” Applied Optics, 1995.

• Greengard et al., “Depth from Diffracted Rotation.” Optics Letters, 2006.

• Levin et al., “4D Frequency Analysis of Computational Cameras for Depth of Field Extension.” SIGGRAPH, 2009.

• Cossairt et al., “Diffusion Coded Photography for Extended Depth of Field.” SIGGRAPH, 2010.

• Nagahara et al., “Flexible Depth of Field Photography.” ECCV, 2008.

• Raskar et al., “Coded Exposure Photography: Motion Deblurring using Fluttered Shutter.” SIGGRAPH, 2006.

• Levin et al., “Motion Invariant Photography.” SIGGRAPH, 2008.

• Mohan et al., “Image Destabilization: Programmable Defocus using Lens and Sensor Motion.” ICCP, 2009.

• Cossairt and Nayar. “Spectral Focal Sweep: Extended Depth of Field from Chromatic Aberration.” ICCP, 2010.

• Grosse et al. “Coded Aperture Projection.” SIGGRAPH, 2010.

