
A PRACTICAL METHOD FOR MEASURING THE SPATIAL FREQUENCY RESPONSE OF LIGHT FIELD CAMERAS

Damien Firmenich, Loïc Baboulaz, Sabine Süsstrunk

School of Computer and Communication Sciences,
École Polytechnique Fédérale de Lausanne
{damien.firmenich, loic.baboulaz, sabine.susstrunk}@epfl.ch

ABSTRACT

The spatial frequency response (SFR) is one of the most important and unbiased image quality measures of a digital camera. It evaluates to which extent a lens/sensor combination can resolve scene details. In this paper, we propose a simple and practical method to measure the SFR of microlens-based light field cameras. The particularity of such cameras resides in their ability to capture both spatial and angular information of the incoming light field thanks to an array of microlenses located in front of the sensor. Existing methods for measuring the SFR of conventional cameras are thus no longer applicable, as the interaction between the main lens and the microlenses results in different resolving powers over the image plane that depend on the scene depths. By using a 3-dimensional target made of vertical lines printed on an inclined planar surface, we are able to measure the SFR across multiple depths in a single exposure. Our method allows SFR measurements from the raw light field itself as captured by the camera, and is thus independent of subsequent post-processing algorithms such as image reconstruction, digital refocusing or depth estimation. Our experimental results are consistent with theoretical bounds and reproducible.

Fig. 1: The spatial frequency responses (SFRs) of a microlens-based light field camera, measured at the focal plane, behind it, and in front of it (modulation versus spatial frequency in cycles/px, up to the Nyquist limit at 0.5). The SFR varies according to scene depth, and is lowest at the focal plane.

Index Terms— Spatial frequency response, Modulation transfer function, Plenoptic cameras, Light field, Computational photography

1. INTRODUCTION

Light field cameras, such as the ones developed by Lytro [1], are able to capture a 4D light field from a single exposure. This is achieved by inserting a microlens array between the sensor and the main lens. Each microlens projects a low resolution microimage on the sensor, which contains directional samples for a single spatial location. From this information, it is possible to retrace the light rays in space and develop new post-processing applications such as single exposure digital refocusing or depth estimation.

The spatial sampling is determined by the microlens array whereas the angular sampling is determined by the pixels on the sensor. This implies a trade-off between spatial and angular resolution, as the total number of captured light rays is limited by the sensor resolution. In the case of the first generation Lytro camera, the sensor has a resolution of 3280 × 3280 pixels, with 330 × 330 microlenses arranged in a hexagonal lattice. The microimages thus have a resolution of approximately 10 × 10 pixels.

Given the early stage of development of light field cameras, not many methods have yet been proposed to assess their intrinsic quality using an objective criterion such as their spatial frequency response (SFR). The SFR, analogous to the modulation transfer function of an optical system, is a resolution measure that reports how well a lens/sensor combination is able to resolve scene details. The SFR describes the modulation at different spatial frequencies, usually expressed in cycles/pixel. The modulation expresses how well the original contrast of the target is reproduced. When the spatial frequencies are normalized, as illustrated in Fig. 1, the Nyquist limit is at 0.5 cycles/pixel.

For conventional cameras, SFR measurement standards exist, such as ISO 12233 [2], which define a planar reference target and an evaluation method. These methods are not only used to assess the quality of a given camera, but also to objectively compare different lens/sensor combinations. They all evaluate the SFR from the acquired raw image, before any post-processing is applied, in order to avoid any modification of the modulation that would be induced by processing the data.

Light field cameras, however, have two main optical parts

978-1-4799-5751-4/14/$31.00 ©2014 IEEE 5776 ICIP 2014

Fig. 2: SFR measurement workflow. Each row of the acquired light field corresponds to a different depth on the observed
target. For each row, our method starts by selecting which microimages are valid for line spread function (LSF) estimation. The
SFR is then computed by taking the modulus of the Fourier transform of the average LSF.

in front of the sensor: the main lens and the microlens array. The SFR of such cameras is subject to the interaction between those two and varies according to the depth of the scene. Consequently, the SFR also varies across the acquired image. This particular characteristic makes traditional methods inapplicable. In this paper, we thus propose a method to measure the SFR of a microlens-based light field camera across multiple depths using a 3-dimensional target and a single image acquisition. As seen in Fig. 1, the resolving power varies across the image and is lowest at the focal point [3, 4]. We conducted experiments with the Lytro and obtained results that are consistent with the theoretical bounds. Our method is straightforward to implement and reproducible for different camera parameters.

2. RELATED WORK

In recent literature, researchers have proposed theoretical SFR measures for light field cameras by analyzing the geometry of the captured light field. Ng [1] derived the output resolution of photographs from the theory behind his refocusing algorithm, while Lumsdaine et al. [5] studied the sampling pattern of different microlens-based designs by comparing their theoretical resolution floor. Our method is independent of image rendering algorithms and provides a practical approach to measure the resolving power of the optics/sensor combination.

SFR measurement methods for conventional cameras use different targets, such as the slanted edge [2], the Siemens star [6], or the dead leaves target [7]. The slanted edge measures the edge response of the camera with an oversampled step function. This target is appealing for our application because it can be used even when only a limited number of pixels is available, as is the case for each microimage in light field cameras. The other targets provide more comprehensive measurements but are not suitable for low resolution images, as they require a larger sensor area. Using those targets with low resolution images can produce results affected by aliasing.

3. SFR MEASUREMENT

We describe here the proposed workflow to measure the SFR of a microlens-based light field camera from a captured raw light field. The workflow is illustrated in Figure 2 and summarized in Algorithm 1.

Input: Raw light field captured from target
Output: SFR for each depth of the target
Pre-processing (linearization, devignetting);
Derive slanted edge angle using Hough transform;
for all microimages do
    Fit Gaussian function to line spread function (LSF);
end
for all rows of microimages do
    Classify microimages by Gaussian fit;
    Combine LSFs and compute SFR;
end
Algorithm 1: High-level description of the proposed SFR measurement method.

3.1. Target setup and pre-processing

The camera setup described in Figure 3 shows how the target is positioned in space with respect to the camera. The target itself is made of a set of black lines printed over a white background on a planar surface. The camera is rotated and tilted with respect to the target so that slanted edges are captured at different depths. Those slanted edges are observed by the microlenses and projected on the sensor as microimages, forming the acquired raw light field. Each row of the light field corresponds to a single depth on the target. The light field is first devignetted by dividing the input image by a normalized calibration image that represents the light attenuation at each pixel due to vignetting. The light field is then preprocessed to linearize the digital output level according to the input luminance using the sensor's opto-electronic conversion function (OECF) [8], similarly to traditional SFR measurement.


Fig. 3: Target capture setup. (a) Top view. (b) Side view. The camera is tilted vertically and rotated horizontally with respect to the target in order to capture slanted edges at different depths.

Fig. 4: Slanted edges in microimages. (a) At the focal point, the blur is maximal and no edge is present in the microimages. (b) Away from the focal point, multiple microlenses capture a slanted edge for which the SFR is computable.

\[
\mathrm{SFR}(\omega, r) = \left| \mathcal{F}\left\{ \frac{1}{|V_r|} \sum_{i \in V_r} \mathrm{LSF}_i(x) \right\} \right| \tag{1}
\]
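Equation (1), the modulus of the Fourier transform of the average LSF, translates almost line for line into code. A minimal sketch, with NumPy's FFT standing in for whichever Fourier implementation is used, and assuming the valid LSFs of one row are already resampled onto a common grid:

```python
import numpy as np

def sfr_from_lsfs(lsfs):
    """Compute the SFR of one light-field row from its valid LSFs.

    lsfs : 2D array, one line spread function per row, all sampled
           on the same grid -- the set V_r in Eq. (1).
    Returns (frequencies in cycles/px, SFR), with the SFR
    normalized to 1 at zero frequency.
    """
    mean_lsf = lsfs.mean(axis=0)              # (1/|V_r|) * sum of LSF_i
    spectrum = np.abs(np.fft.rfft(mean_lsf))  # modulus of Fourier transform
    sfr = spectrum / spectrum[0]              # modulation relative to DC
    freqs = np.fft.rfftfreq(mean_lsf.size)    # 0 .. 0.5 cycles/px (Nyquist)
    return freqs, sfr
```

Since the Fourier transform of a Gaussian is again a Gaussian, a Gaussian-shaped LSF yields a smooth, monotonically decaying SFR, which is why the 10% and 50% modulation frequencies used later summarize the curve well.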

3.2. Microimages analysis and classification

The angle of the slanted edges changes throughout the image because of perspective distortion. It is therefore unknown and needs to be retrieved for each microimage. The angles are found using the Hough transform of the rendered image and assigned, respectively, to each microimage based on its spatial position. Then, the edge spread function (ESF) of the microimage is computed by integrating the pixel rows along this angle, and the line spread function (LSF) is obtained as the discrete derivative of the ESF, as suggested in [2, 9].

For a given row of the acquired light field, not all microimages possess a well-defined edge that can be used for SFR measurements. Consequently, to determine which microimages are usable, we need to classify them into two groups based on whether they represent a valid LSF or not. For this purpose, a Gaussian function is fitted to the LSF obtained for each microimage. Based on the spread σ and the fitting error ε of the Gaussian fit, the microimages are clustered within each row of the light field and labelled as valid when they belong to the cluster with the smallest ε. Clustering is performed using an Expectation-Maximization (EM) algorithm robust to outliers to find a Gaussian mixture model fitting the data. When the two clusters overlap, the EM algorithm does not converge, as the camera has reached its resolution floor and the blur radius is maximal.

In the next step, only the valid microimages are taken into account for SFR estimation. The classification is done row-wise as it matches the depth change of our target during the acquisition.

3.3. SFR computation

For each row of the light field, the LSFs of the valid microimages are averaged and the SFR is computed as the modulus of the average LSF in the Fourier domain. A mathematical representation of the operation is shown in Equation 1, where Vr refers to the set of valid microimages at row r, F is the Fourier transform operator and ω is the frequency in cycles/pixel.

For the rows corresponding to the depth of the focal plane of the main lens, the blur radius is maximal and no edge is present in the microimages (see Figure 4). In this special case, the SFR cannot be estimated using microimages. This case represents the resolution floor of the camera, for which the SFR can be measured from the spatial sampling of the microlens array (Figure 4a).

4. EXPERIMENTS & ANALYSIS

We conducted experiments with the first generation Lytro camera [10]. The results in Figure 6a represent the SFR for a range of depths. The blue curve shows the average resolving power at 50% modulation, and the green curve shows the normalized frequency at 10% modulation, representing the limiting resolution value consistent with the Rayleigh criterion [11, 12]. The image row number on the horizontal axis is inversely proportional to the relative depth of the target with respect to the camera. The shaded area around the curves represents the standard deviation resulting from averaging the LSFs. Figure 6b shows the number of slanted edges used for each row. The values are proportional to the radius of the blur projected by the main lens onto the microlens array.

The frequency values for the limiting resolving power (10% modulation) show that the camera does not achieve the Nyquist frequency (shown in red at 0.5 cycles/px). Thus, the camera does not take advantage of the full resolution of the sensor, and the highest frequencies of the observed scene will not be represented in the light field.

The shape of the SFR across the scene depth is consistent with the resolution gap identified in [3, 4], where the resolving power drops exactly at the focal point. The ratio between the maximum and minimum limiting frequencies is approximately 10, which is consistent with the different sampling periods between the microlenses and the pixels in the Lytro camera design.

Figure 5 shows three measurements where the focal plane is shifted away from the camera position, all other camera parameters remaining constant. The results are coherent, as the shape of the curves is stable across the three plots, thus confirming the reproducibility of the proposed measurement method.

Fig. 5: SFR measurement for different focal points. (a) Focal point at row 150. (b) Focal point at row 100. (c) Focal point at row 65. As the focal point is shifted further from the camera (i.e. towards the top of the target), the depth of the resolution floor follows its location.
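The validity classification of Section 3.2 can be approximated with a short sketch. Two simplifications relative to the paper are worth flagging: the Gaussian is fitted by moments rather than by least squares, and a simple midpoint threshold on the fitting error stands in for the outlier-robust EM fit of a Gaussian mixture.

```python
import numpy as np

def gaussian_fit_error(lsf):
    """Fit a Gaussian to an LSF by moments; return (sigma, RMS error)."""
    lsf = np.asarray(lsf, dtype=float)
    total = lsf.sum()
    if total <= 0:
        return np.inf, np.inf
    p = lsf / total                       # treat the LSF as a distribution
    x = np.arange(lsf.size)
    mu = (x * p).sum()
    sigma = np.sqrt(((x - mu) ** 2 * p).sum()) + 1e-9
    model = lsf.max() * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return sigma, np.sqrt(((lsf - model) ** 2).mean())

def select_valid(lsfs):
    """Return indices of microimages whose LSF resembles a Gaussian."""
    errors = np.array([gaussian_fit_error(l)[1] for l in lsfs])
    # Two-group split on the fitting error; a Gaussian-like LSF
    # (a well-defined edge) yields a small error, a blurred or
    # edge-free microimage a large one.
    thresh = 0.5 * (errors.min() + errors.max())
    return np.where(errors <= thresh)[0]
```

When every microimage in a row is equally blurred, the two error groups collapse into one, which mirrors the non-convergence of the EM step at the resolution floor.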
5. DISCUSSION

The algorithm presented does not depend on parameters; however, modifying the target-camera setup can affect the results to a certain extent. The vertical tilt angle of the camera delimits the depth range that will be sampled, and the horizontal rotation angle of the target defines the amount of slant of the edges. If the edges are not slanted enough, some microimages may contain straight edges that are unusable for SFR measurement, as the edge is not oversampled by the sensor and the measurement is subject to aliasing. Vignetting of the light field (Section 3.1) may produce inexact SFR measurements as well, as it causes an unequal light distribution over the microimage and reduces the sharpness of the edge.

Our motivation for using the slanted edge method is the low resolution of the microimages. Its simplicity comes at the cost of retrieving only a 1D slice of the PSF.

The accuracy of the results can be improved with an increased number of usable microimages, for example by capturing the target multiple times and shifting the camera horizontally for each exposure. Also, the lines on the target could be designed in such a way that they anamorphically compensate for perspective distortion, thus eliminating the angle detection step. In [13], Williams et al. analyze the reliability of modifications of the slanted edge method, some of which could be taken into consideration to increase accuracy when applied to our method.

By considering the resolution of each microlens independently, this method cannot estimate the periodical dip in resolution where information is lost due to the overlapping of light field samples [14]. To overcome this issue, the disparity between the slanted edge locations in neighboring microimages should be taken into account.

The method presented here was designed specifically for microlens-based light field cameras, such as the Lytro [10] or the Raytrix [15]. However, there are different designs, such as camera arrays [16], for which a comparable method can be developed to compare the performance of the different designs.

The MATLAB code used for these experiments is provided at http://ivrg.epfl.ch/research/sfr.

Fig. 6: SFR measurement results. (a) The frequencies at 10% and 50% SFR over a range of depths (focal point at row 100); the shaded area is the standard deviation from the LSF averaging, and the red line indicates the Nyquist frequency. (b) The number of usable slanted edges per row, which is proportional to the main lens blur radius.

6. CONCLUSION

In this paper, we developed a method for measuring the SFR of a microlens-based light field camera. We used a 3-dimensional target to capture the evolution of the SFR across different depths and through the resolution floor of the camera. We found that the experimental results obtained with our technique were qualitatively consistent with the theoretical results derived in [4]. Additionally, the method is not dependent on image rendering algorithms, as we measure the camera's performance directly from the acquired raw light field.

7. REFERENCES

[1] Ren Ng, Digital light field photography, Ph.D. thesis, Stanford University, 2006.

[2] ISO 12233, "ISO 12233:2000: Photography – Electronic still-picture cameras – Resolution measurements," 2000.

[3] Todor Georgiev and Andrew Lumsdaine, "Depth of field in plenoptic cameras," Eurographics, pp. 5–8, 2009.

[4] Tom E. Bishop, Sara Zanetti, and Paolo Favaro, "Light field superresolution," IEEE International Conference on Computational Photography (ICCP), pp. 1–9, 2009.

[5] Andrew Lumsdaine, Todor Georgiev, and Georgi Chunev, "Spatial analysis of discrete plenoptic sampling," IS&T/SPIE Electronic Imaging, vol. 8299, p. 829909, 2012.

[6] Christian Loebich and Dietmar Wueller, "Digital camera resolution measurement using sinusoidal Siemens stars," IS&T/SPIE Electronic Imaging, vol. 6502, pp. 65020N-1–65020N-11, 2007.

[7] Georges Matheron, Random Sets and Integral Geometry, Wiley, New York, 1975.

[8] ISO 14524, "ISO 14524:2009: Photography – Electronic still-picture cameras – Methods for measuring opto-electronic conversion functions (OECFs)," 2009.

[9] Peter D. Burns, "Slanted-edge MTF for digital camera and scanner analysis," IS&T PICS Conference, pp. 135–138, 2000.

[10] Lytro, www.lytro.com, 2011.

[11] Peter D. Burns and Don Williams, "Sampling efficiency in digital camera performance standards," IS&T/SPIE Electronic Imaging, vol. 6808, p. 680805, 2008.

[12] Max Born and Emil Wolf, Principles of Optics, pp. 333–335, Pergamon, Oxford, UK, 6th edition, 1980.

[13] Don Williams and Peter D. Burns, "Evolution of slanted edge gradient SFR measurement," IS&T/SPIE Electronic Imaging, vol. 9016, p. 901605, 2014.

[14] Tom E. Bishop and Paolo Favaro, "The light field camera: Extended depth of field, aliasing, and superresolution," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 5, pp. 972–986, 2012.

[15] Raytrix, www.raytrix.de, 2010.

[16] Kartik Venkataraman, Dan Lelescu, Jacques Duparré, Andrew McMahon, Gabriel Molina, Priyam Chatterjee, and Robert Mullis, "PiCam: An ultra-thin high performance monolithic camera array," ACM Transactions on Graphics, vol. 32, no. 6, p. 166, 2013.
