Technical Concepts of Automotive Lidar Sensors: A Review
Abstract. Automotive LiDAR sensors are seen by many as the enabling technology for higher-
level autonomous driving functionalities. Different concepts to design such a sensor can be
found in the industry. Some have already been integrated into consumer cars while many others
promise to be in mass production soon to become cost-effective enough for broad deployment.
However, automotive LiDAR sensors are still evolving and a variety of sensor designs are pur-
sued by different companies. Here, we construct the automotive LiDAR design space to visually
depict system design options for these sensors. Subsequently, we exemplify the concepts with
drawings that can be found in published patent applications (focusing on scanning mechanisms
and scan patterns) before discussing their advantages and challenges. © 2023 Society of Photo-
Optical Instrumentation Engineers (SPIE) [DOI: 10.1117/1.OE.62.3.031213]
Keywords: automotive LiDAR design space; technical concepts; patent application drawings;
scanning mechanisms.
Paper 20221060SSV received Sep. 20, 2022; accepted for publication Dec. 8, 2022; published
online Jan. 10, 2023.
1 Introduction
Over the past years, a variety of different automotive light detection and ranging (LiDAR) con-
cepts and sensors have emerged. All aim at fulfilling the challenging requirements of car makers
outlined in Table 1. This work concentrates on system design choices with the intention to give
an overview of available options when building an automotive LiDAR sensor. The “automotive
LiDAR design space” of Sec. 2 depicts these options visually. In subsequent Secs. 3–5, we
examine concepts from drawings of published patent applications. These drawings are some-
times difficult to grasp. We therefore illustrate our understanding with a number of accompany-
ing figures. The procedure is shown in Fig. 1 with the help of a notional LiDAR concept. For
demonstration purposes, the drawing in Fig. 1(a) from published patent application Ref. 2 was
taken out of context. We pretend it shows a LiDAR sensor mounted in the grill of a car. The cone
indicates its field of view (FOV).
To visualize the overlap O between emitter (red) and receiver (turquoise) beams, we present
an overlap plot, as in Fig. 1(b), where useful (mostly for systems with separated emitters and
receivers as introduced later in Sec. 2.4). We also provide a basic principles illustration to high-
light the aspects of a sensor concept that are covered in this work [see Fig. 1(c)]. The setup to
visualize the temporal data acquisition/scan sequence is shown in Fig. 1(d). It shows a sensor
pointing toward a perpendicular wall to illustrate how we derive the scan pattern plotted in
Fig. 1(e) (here, the scanning direction is from left to right).
Excluded from this work are extensive discussions about individual components of LiDAR
sensors, such as emitters, receivers or optics and details on achievable distance resolution, signal-
to-noise ratio (SNR) derivations, and so on. For more on these topics, see Refs. 3–6. For general
introductions to the field of automotive LiDAR sensors, see Refs. 7–14.
Table 1 (excerpt): Requirements for automotive LiDAR sensors.
FOV (FOVhor × FOVvert): 120 deg × 30 deg | 40 deg × 20 deg
Hor. and vert. resolution (Δφ, Δϑ): ∼1 deg | 0.1 deg to 0.15 deg
Frame rate f: 25 Hz
Reliability: AEC-Q100
Fig. 1 Illustration of an imaginary LiDAR concept. (a) Drawing of a sensor concept from published
patent application Ref. 2. Labels (numbers 2, 4, 6, ...) will be annotated when helpful for the
explanations but otherwise left blank. (b) Overlap plot in the far field with receiver spot in turquoise
and emitter spot in red. (c) Image of the concept basics showing only a minimal number of
components to highlight important aspects of patent application drawings. Black arrows indicate how
and which parts are moving. (d) Visualization of the setup to explain how data acquisition patterns
as in Fig. 1(e) are derived. (e) Depiction of a data acquisition pattern on a perpendicular wall.
The sequence in time is illustrated by shades of red. t_i indicates the current data acquisition point,
t_{i−1} the previous, and t_{i+1} the next point. Here, a scan from left to right is shown. The entire FOV
is conceptually represented by all black circles.
Fig. 2 The automotive LiDAR design space, spanning the scanning mechanism (classical/mechanical, semi solid-state MEMS, and solid-state flash, spectral deflection, and OPA), the measurement principle (analog and digital TOF, FMCW), and the operating wavelength (NIR, SWIR).
detection of light from different angles of their FOV (Sec. 5). Representatives of solid-state
scanning mechanisms in Fig. 2 are flash (no scanning), spectral deflection, as well as optical
phased array (OPA).
Fig. 3 Artist illustration of the TOF measurement principle. (a) Pulse emission and start of time
measurement. (b) Pulse before reflection. (c) Reflected pulse on its way back to sensor.
(d) Detection of pulse, determination of Δt .
Fig. 4 Overlap of emitter and receiver beams on a Lambertian target [quantities entering Eq. (1) for the received power Prec, including the receiver aperture Ade]. Fig. 5 Photocurrent I with detection threshold and the resulting low/high voltage signal U over time.
bold line shown in Fig. 5. It is set from low to high whenever the photocurrent rises above the
detection threshold and vice versa. TDCs24 are used to determine when the threshold crossing
happened. Distance resolution of Δd = 5 cm (Table 1) requires TDC clock frequencies of

f_TDC = c / (2 Δd) = 1 / (333 ps) ≈ 3 GHz.
Unfortunately, abstracting photocurrent with a box-like voltage signal leads to loss of infor-
mation. Using analog-to-digital converters (ADCs) to digitize the photocurrent requires them to
have a sampling rate of at least 1 to 10 GHz to be able to sample nanosecond pulses and provide
the required distance resolution. Such fast ADCs tend to be too expensive for automotive LiDAR
applications, cf., system cost from Table 1. However, there is an alternative way to obtain a
digital signal from TOF measurements.
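As a quick numerical cross-check of the relation above, the following Python sketch computes the TDC clock needed for a given distance bin and the corresponding bin width; the 5 cm value is the Table 1 requirement, everything else is illustrative.

```python
# Minimal sketch (not from the paper): required TDC clock for a target
# single-shot distance resolution, using f_TDC = c / (2 * delta_d).
C = 299_792_458.0  # speed of light in m/s

def required_tdc_clock(delta_d_m: float) -> float:
    """TDC clock frequency (Hz) needed for a distance bin of delta_d_m meters."""
    return C / (2.0 * delta_d_m)

f_tdc = required_tdc_clock(0.05)                 # 5 cm bins as in Table 1
print(f"f_TDC ~ {f_tdc / 1e9:.2f} GHz")          # ~3 GHz
print(f"bin width ~ {1e12 / f_tdc:.0f} ps")      # ~333 ps
```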
Fig. 6 Graphical comparison of TOF analog and digital signals. (a) Processed photocurrent in
analog TOF. (b) Processed photocurrent in digital TOF.
2.3 Wavelength
The operating wavelength of a LiDAR sensor has direct implications not only on the sensor’s
emitter but also on its receiver. Automotive LiDARs are generally operated with a wavelength
that is above the spectral sensitivity range of the human eye.
Near infrared (NIR) operating wavelengths typically range from 850 to 940 nm. It is silicon
as detector material that makes this wavelength range attractive. Silicon can be processed with
highly optimized and cost-efficient semiconductor manufacturing techniques. Its peak sensitivity is around 900 to 1000 nm with a steep drop-off at 1100 nm36 where silicon becomes transparent. The major downside of NIR wavelengths is the eye safety limitation that prevails in this regime (close proximity to the visible range of the human eye) and restricts the optical energy output.
Fig. 7 Plots of FMCW signals with received signal shifted in time and frequency. (a) Visualization
of FMCW signals in the time domain: outgoing (red) and incoming (turquoise) light, both mixed in
gray with envelope in black. (b) Signal plots in the frequency domain: outgoing f out (red) and
incoming f in (turquoise) as well as envelope frequency f beat (black).
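Fig. 7 introduces the beat frequency f_beat between outgoing and incoming light. For a linear (sawtooth) chirp, the textbook relation d = c·f_beat·T_chirp/(2B) links f_beat to the target distance; this relation and the parameter values in the sketch below are standard assumptions for illustration, not values taken from the figure.

```python
# Hedged sketch of the textbook FMCW range relation for a linear (sawtooth) chirp;
# the chirp parameters below are illustrative assumptions, not values from the paper.
C = 299_792_458.0  # m/s

def fmcw_range(f_beat_hz: float, bandwidth_hz: float, chirp_duration_s: float) -> float:
    """Range from the beat frequency of the mixed outgoing/incoming light:
    d = c * f_beat * T_chirp / (2 * B)."""
    return C * f_beat_hz * chirp_duration_s / (2.0 * bandwidth_hz)

# Example: a 1 GHz chirp over 10 us; a 20 MHz beat then corresponds to roughly 30 m.
print(f"{fmcw_range(20e6, 1e9, 10e-6):.1f} m")
```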
Fig. 8 Visualization of coaxial and biaxial measurement channel configurations. (a) Illustration of a
coaxial measurement channel. (b) Overlap plot coaxial channel. (c) Illustration of a biaxial meas-
urement channel. (d) Overlap plot biaxial channel.
Fig. 9 Concept view on FMCW measurement channel with heterodyne optical mixing.
Fig. 10 Illustration of a spinning sensor concept with drawings from Ref. 46. (a) Front side view on
opened sensor. (b) Back side view. (c) Display of a conceptual scan pattern for a rotating sensor or
mirror. All vertical channels measure simultaneously. The opening angle between channels
causes a characteristic cushion shape with lines that expand with increasing distance to the sur-
face normal. (d) Concept basics drawing of a spinning sensor.
Table 1) requires stacking and aligning 200 individual measurement channels. A horizontal FOV
of 360 deg can be covered by mounting the sensor, e.g., on the car's roof.
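To make the cushion-shaped pattern of Fig. 10(c) concrete, the short sketch below projects the fixed vertical channel angles of a spinning sensor onto a perpendicular wall; the wall distance and the angles are assumed example values, not parameters of the patented sensor.

```python
# Illustrative sketch (assumed geometry, not from Ref. 46): project the rays of a
# spinning sensor onto a perpendicular wall at x = d to reproduce the cushion shape.
import numpy as np

d = 10.0                                       # wall distance in m (assumption)
phi = np.deg2rad(np.arange(-60, 61, 5))        # horizontal scan angles
theta = np.deg2rad(np.linspace(-10, 10, 16))   # fixed vertical channel angles

pp, tt = np.meshgrid(phi, theta)               # all channel/scan-angle combinations
wall_y = d * np.tan(pp)                        # horizontal hit position on the wall
wall_z = d * np.tan(tt) / np.cos(pp)           # vertical hit position, grows with |phi|

# The vertical spread of one scan line is larger at the FOV edge than at the center,
# which produces the characteristic cushion shape.
print(np.ptp(wall_z[:, 0]), np.ptp(wall_z[:, len(phi) // 2]))
```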
A similar concept (analog TOF and NIR wavelength) with a mirror instead of a rotating
sensor is shown in Fig. 11 with drawings from published patent application Ref. 47. The beams
of both receiver and emitter stacks (here, positioned above each other) are simultaneously
scanned by the rotating mirror. The plate separator visible in Fig. 11(a) and also shown in the
sensor's concept view of Fig. 11(b) minimizes crosstalk (optical shorts, cf., Sec. 2.4). Reaching
N points;vert on the order of hundreds is challenging due to hardware/assembly limitations, but the
compact design shown in Fig. 11(c) allows for a seamless integration into the body of a car.
However, covering a wider FOVhor > 120 deg to 140 deg is basically impossible for sensors that
shall not stick out of the car. The scan pattern of a sensor with rotating mirror is similar to the
one of a rotating sensor [cf., Fig. 10(c)].
Figure 12 shows a LiDAR concept from published patent application Ref. 48 that utilizes a
polygon mirror to scan in horizontal direction. Vertical scanning is realized with a motor that
controls a mirror via pulleys and a belt. This is a point-wise scanning concept that scans a single,
coaxial measurement channel (indicated by the “emitter,” “receiver,” and “splitter” annotations).
High frame rates of f ≈ 25 Hz for FOV, Δφ, Δϑ from Table 1 and a maximum measurement
range of dmax = 300 m are challenging to achieve, as the calculation of the acquisition time for a
single frame, t_frame,ac, in Eq. (2) shows.
Fig. 11 Illustration of a rotating mirror implementation that scans biaxial stacks of emitters and
receivers (Ref. 47). (a) Drawing of a sensor with scanning mirror. (b) Illustration of sensor concept
basics. (c) Sensor sketch with closed cover.
Fig. 12 Sketch of a sensor concept with polygon mirror and coaxial measurement channel from
published patent application Ref. 48.
For f = 25 Hz, the acquisition time is required to be four times shorter. The concept realizes a
large receiver aperture Ade which is beneficial for maximum measurement range since it
increases Prec in Eq. (1). To further improve d10% (cf., Table 1), this sensor is operated with
a SWIR wavelength that allows for higher optical output power as outlined in Sec. 2.3.
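The quantitative argument behind Eq. (2) can be sketched as follows: a single point-wise channel needs at least the round-trip time at dmax per point, and one frame needs (FOVhor/Δφ)·(FOVvert/Δϑ) points. The numbers below are our reading of Table 1 and give an illustrative estimate rather than reproducing the equation itself.

```python
# Hedged reconstruction of the Eq. (2) reasoning for a single point-wise channel.
c = 299_792_458.0                    # m/s
d_max = 300.0                        # m, maximum measurement range
t_point = 2.0 * d_max / c            # ~2 us minimum dwell time per point

n_points = (40 / 0.1) * (20 / 0.1)   # 40 deg x 20 deg FOV at 0.1 deg resolution
t_frame_ac = n_points * t_point      # ~160 ms for one frame

print(f"t_frame,ac ~ {t_frame_ac * 1e3:.0f} ms vs. {1e3 / 25:.0f} ms required at f = 25 Hz")
# ~160 ms, i.e., four times longer than the 40 ms frame period demanded by f = 25 Hz.
```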
Fig. 13 Illustration of a sensor concept with double galvano scanner and coaxial measurement
channel. (a) Concept view and illustration of the galvano mirror movements. (b) Display of emitter
and receiver beams in galvano scanner concept. (c) Plot of a possible scan pattern realized with a
double galvano scanner.
Fig. 14 Representation of a sensor concept with biaxial galvano scanner. (a) Drawing of a LiDAR
sensor with biaxial galvano mirrors scanning an emitter and receiver line (from Ref. 49). (b) Far field
overlap of emitter and receiver.
mirrors are dictated by rotation axes and number of channels. A check pattern, for example, is
possible [shown in Fig. 13(c)] to ease data analysis with perception algorithms originating from
image processing. A single coaxial measurement channel with a double galvano scanner has the
same limitations that were previously derived in Eq. (2).
A biaxial, double galvano scanner sensor concept is visualized in Fig. 14. The drawing from
published patent application Ref. 49 in Fig. 14(a) shows one scanner on the emitter and the other
one on the receiver side. Separated emitter and receiver paths are beneficial when it comes to
avoiding optical shorts but signal can only be generated where emitter and receiver lines cross
each other, see Fig. 14(b). Hence, this concept scans point-wise with the challenge of achieving
f = 25 Hz while satisfying the FOV and Δφ, Δϑ requirements from Table 1. A line-wise detector does not
only receive signal photons from the crossing point of the emitter and receiver lines, but also
additional background light outside the crossing point. Likewise, a line emitter consumes more
(optical) power to emit photons along a whole line instead of a point. Nonetheless, two galvano
scanners allow for flexible scan patterns that can be tailored to specific use-cases.
Fig. 15 Depiction of a sensor concept with Risley prism scanner and coaxial measurement chan-
nel. (a) Coaxial measurement channel with Risley prism scanner taken from published patent
application Ref. 51. (b) Risley prism scanner with bearings and middle shaft from Ref. 52.
(c) Rotation visualization of a scanner with Risley prisms. (d) Coaxial emitter and receiver path
of a Risley prism scanner.
Fig. 16 Plots of a data acquisition pattern for a sensor with a Risley prisms scanner. (a) Hypo-
trochoid scan pattern from a Risley prism scanner. (b) Illustration of the construction of
hypotrochoids.
The pattern of a Risley prism scanner can be plotted with hypotrochoids.53 Figure 16(a)
shows such a rosette-like scan pattern. They are derived from the rotation of a circle [gray
in Fig. 16(b)] on the inside of a bigger circle (black). The drawing point for hypotrochoids
is located at the end of a handle (red) that is fixed to the inner, rotating circle. A classical
frame acquisition with a Risley prism scanner is not possible. A constant f_rep leads to a scan pattern
as shown in Fig. 16(a). Point density is high in the center as well as in the outer regions but
sparser in between. Frames also do not have a given number of horizontal and vertical points
but rather build up over time.
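For illustration, the following sketch generates such a hypotrochoid; R, r, and h are arbitrary example values (integer radii assumed so the curve closes), not parameters of a specific Risley prism scanner.

```python
# Minimal hypotrochoid sketch for a rosette-like Risley scan pattern as in Fig. 16(a).
import numpy as np

def hypotrochoid(R: int, r: int, h: float, n: int = 20_000) -> np.ndarray:
    """Trace of a point at distance h from the center of a circle of radius r
    rolling inside a fixed circle of radius R (integer radii assumed)."""
    t_max = 2.0 * np.pi * r / np.gcd(R, r)     # parameter range until the curve closes
    t = np.linspace(0.0, t_max, n)
    x = (R - r) * np.cos(t) + h * np.cos((R - r) / r * t)
    y = (R - r) * np.sin(t) - h * np.sin((R - r) / r * t)
    return np.stack([x, y], axis=1)

pts = hypotrochoid(R=5, r=3, h=5.0)            # example values only
print(pts.shape)
```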
504: emitter
505: housing
508: detector
509: optical element
511: aperture
515: analyzer
518: axle
521: power unit
522: memory
523: motor
Fig. 17 Drawings of spinning LiDAR concept with biaxial, digital TOF measurement channels
operating in the NIR and SWIR wavelength regimes. (a) Transmitter unit drawing of a digital
TOF sensor concept from Ref. 55. (b) Schematic drawing of the mechanical scanning mechanism
(Ref. 55). (c) Concept view of digital counter rotation. (d) Sketch of a digital TOF sensor
operating with SWIR wavelength from Ref. 56.
Fig. 18 Illustration of a sensor concept that uses a polygon mirror to scan multiple FMCW mea-
surement channels in parallel. (a) Sensor concept diagram of an FMCW LiDAR with mechanical
scanning mechanism from published patent application Ref. 57. (b) Visualization of the basic
movements of a scanner consisting of a polygon and galvano mirror. (c) Beam path of an
FMCW measurement channel with scanner.
Fig. 19 Illustration of a sensor concept with an FMCW measurement channel that is scanned by a
rotating polygon lens. (a) Sketch of an FMCW LiDAR concept using a polygon lens with tilted
facets from Ref. 58. (b) Scan pattern visualization of a polygon mirror either with tilted facets
or an additional galvano scanner.
Another FMCW concept is shown in Fig. 19. It does not have an additional galvano scanner
but instead uses tilted facets of a polygon lens to scan vertically as depicted in the drawing of
Fig. 19(a) from published patent application Ref. 58. Here, the light is deflected while passing
through a rotating polygon lens instead of being reflected from a polygon mirror. An exemplary
scan pattern for both polygon concepts is shown in Fig. 19(b). Ideas on how to achieve a polygon
scan pattern that is comparable to that of rotating sensors [cf., Fig. 10(c)] are provided
in Ref. 59.
a_t = −r θ_max ω² sin(ωt) = −(D_mems/2) θ_max (2π f_mems)² sin(2π f_mems t).

The above equation reaches its maxima, a_t,max, at sin(2π f_mems t) = 1. Hence, we can derive the
maximum g-force by dividing a_t,max by the gravitational acceleration g:

a_t,max / g = (D_mems/2) θ_max (2π f_mems)² / g ≈ 6000.
Such g-forces at the edges of the mirror make it practically immune to vibrations in an automotive
context, where the forces of most shocks do not exceed a few hundred g. The natural frequency of
resonating MEMS mirrors depends on their mass and spring constant. The bigger the mirror
(more mass) the smaller its natural frequency which results in less resistance against shocks
(Sec. 4.1.2). Hence, for systems that scan a coaxial measurement channel, there is a trade-off
between the size of the mirror [equal to Ade from Eq. (1) and directly linked to Prec] and robustness
(which increases with smaller Dmems, i.e., higher f_mems).
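To put the estimate above into numbers, the sketch below evaluates the edge acceleration for one assumed parameter set; mirror diameter, mechanical amplitude, and resonance frequency are example values, not taken from the text.

```python
# Hedged numerical check of the g-force estimate; all parameters are assumptions.
import math

g = 9.81                      # m/s^2
D_mems = 2.5e-3               # mirror diameter in m (assumption)
theta_max = math.radians(15)  # mechanical amplitude in rad (assumption)
f_mems = 2_000.0              # resonance frequency in Hz (assumption)

a_max = (D_mems / 2.0) * theta_max * (2.0 * math.pi * f_mems) ** 2
print(f"a_t,max ~ {a_max / g:.0f} g")   # on the order of several thousand g
```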
A sensor concept with two resonant mirrors is shown in Fig. 20 with drawings from pub-
lished patent applications Refs. 66 and 67. Figure 20(a) shows a scanning emitter that incor-
porates the mirrors shown in Fig. 20(b) to form a biaxial sensor with a non-scanning
detector as shown in Figs. 20(c) and 20(d). Since the FOV of the detector is significantly larger
than the emitter spot [see Fig. 20(e)], there is a need to optimize SNR by operating at a SWIR
wavelength allowing for more optical output power.
Scan patterns generated by such a scanning mechanism can be described with Lissajous68
figures. Examples of these figures are shown in Fig. 21. If pulses are emitted with a constant f rep ,
point density is high at the turning points of one or the other MEMS mirror and low where
maximum oscillation speed is reached [Fig. 21(a)]. However, one can synchronize pulse emis-
sions with one of the two mirrors to have evenly spaced points in one of the scanning directions
as shown in Fig. 21(b). Ideas on how to derive an application-oriented scan pattern with two
oscillating mirrors are described in Ref. 69.
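The two sampling schemes of Fig. 21 can be sketched as follows; the mirror frequencies, amplitudes, and pulse rate are illustrative assumptions rather than values of an actual sensor.

```python
# Sketch of Lissajous sampling: (a) constant f_rep, (b) pulses synchronized to the
# horizontal mirror so that samples are evenly spaced in the horizontal direction.
import numpy as np

f_x, f_y = 2000.0, 1900.0        # horizontal/vertical mirror frequencies in Hz (assumed)
A_x, A_y = 60.0, 15.0            # optical half angles in deg (assumed)

def lissajous(t: np.ndarray):
    return A_x * np.sin(2 * np.pi * f_x * t), A_y * np.sin(2 * np.pi * f_y * t)

# (a) constant pulse repetition rate: points bunch at the mirror turning points
t_const = np.arange(0.0, 0.05, 1.0 / 200_000.0)   # f_rep = 200 kHz
xa, ya = lissajous(t_const)

# (b) fire at equidistant horizontal angles within one half swing of the fast mirror
x_grid = np.linspace(-A_x, A_x, 100)
t_sync = np.arcsin(x_grid / A_x) / (2 * np.pi * f_x)
xb, yb = lissajous(t_sync)
print(len(xa), len(xb))
```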
Since a point-wise scanning mechanism has its limitations when optimizing for a higher
frame rate f, there are other sensor concepts that parallelize multiple measurement channels.
Some make use of bigger mirrors with one or two slower scanning axes and, consequently,
smaller natural frequencies f mems . However, smaller f mems shift the scanning mechanisms away
from being considered solid-state since they become more prone to vibrations in this mode of
operation.
Fig. 20 Depiction of a sensor concept utilizing two resonant MEMS mirrors to scan the emitter.
(a) Schematic drawing of a scanning emitter with the help of two resonating MEMS mirrors (pub-
lished patent application Ref. 66). (b) Design of two MEMS mirrors for scanning horizontally and
vertically (published patent application Ref. 67). (c) Visualization of the mirror movements.
(d) Illustration of emitter and receiver path for biaxial sensor concept with two MEMS mirrors.
(e) Overlap in far field for this biaxial sensor concept.
Fig. 21 Plot of a possible Lissajous pattern from a scanner that utilizes two oscillating MEMS
mirrors. (a) Lissajous scan pattern with constant f rep . (b) Evenly spaced (in the horizontal direction)
scan pattern on a Lissajous curve.
Fig. 22 Display of a sensor concept with 2D MEMS mirror scanner. (a) Sketch of a LiDAR concept
with a single MEMS mirror scanner and four parallel coaxial, analog TOF channels (from published
patent application Ref. 70). (b) Scan pattern illustration of a 2D MEMS mirror with four measure-
ment channels. (c) Visualization of the MEMS mirror movement. (d) Illustration of four coaxial
emitter and receiver paths scanned by a single 2D MEMS.
Figure 22, with drawings from published patent application Ref. 70, illustrates a sensor
concept incorporating four coaxial measurement channels all scanned by a single 2D
MEMS mirror [see Fig. 22(a)]. The mirror has a slow vertical and fast horizontal axis. A cor-
responding scan pattern can be seen in Fig. 22(b). The rows of this scan pattern are curved since
the optical paths of each coaxial measurement channel are off axis with respect to the mirror’s
scanning axis. For demonstration purposes we assumed instant switching from one row to the
next which, in reality, is dependent on the slow scanning axis. Hence, the rows are bent up and
down toward the horizontal turning points. Unlike in the concept of Fig. 20, here Ade [from
Eq. (1)] is equal to the mirror size, as depicted in the concept basics illustrations of Figs. 22(c)
and 22(d).
Fig. 23 Visualization of a sensor concept with a scanner utilizing centimeter-scale MEMS mirrors.
(a) Illustration of a coaxial measurement channel from Ref. 71. (b) Drawing of scanner with two
centimeter-sized MEMS mirrors from published patent application Ref. 72. (c) Scan pattern of
two independent MEMS mirrors. (d) Conceptual display of a sensor with coaxial measurement
channel.
channel, as shown in Fig. 23(a). The scanner design differs from other MEMS mirror concepts
in the sense that the mirror itself is not embedded in its peripherals [see, e.g., Fig. 20(b)] but
rather mounted on a spring-like arm as can be seen in Fig. 23(b). Since both mirrors can have
different scanning frequencies, a combination of slow and fast scanning axes is possible to
generate a scan pattern as displayed in Fig. 23(c). A conceptual representation of the sensor
is visualized in Fig. 23(d).
Another large MEMS mirror design is depicted in Fig. 24 with drawings from published
patent application Ref. 73. The size of the mirrors allows for multiple (e.g., four) measurement
channels to be scanned at once, effectively increasing frame rate. An illustration of the emitter
side is shown in Fig. 24(a) with a close-up of one of the mirrors in Fig. 24(b). Note how this
concept utilizes biaxial measurement channels where both emitter and receiver beams are
scanned by separated mirrors as shown in Figs. 24(c) and 24(d). Such a scanner requires accurate
synchronization of the mirror oscillations to guarantee the overlap of emitter and receiver FOVs
at every scan angle. Due to individually controllable horizontal and vertical oscillations, an oval
scan pattern as shown in Fig. 23(c) can be realized.
Similar types of mirrors can also be seen in conjunction with FMCW measurement channels.
An example is shown in Fig. 25 with drawings from published patent application Ref. 74.
Figure 25(a) shows a conceptual view on a system combining large MEMS mirrors with
FMCW measurement channels. Higher tmc;fmcw and the ability to realize precise controlling favor
a implementation of larger, 2D quasi-static mirrors [see Fig. 25(b)] over smaller MEMS mirrors
with high scanning frequency f mems.
810: mirror
820: MEMS device
840: conductive coil
Fig. 24 Visualization of a sensor concept with two synchronized mirrors. (a) Emitter side sketch of
a biaxial MEMS mirror LiDAR concept with four measurement channels (receiver paths are
scanned similarly), from Ref. 73. (b) Drawing of a MEMS mirror from Ref. 73. (c) Illustration of
the MEMS mirror movements. (d) Depiction of biaxial emitter and receiver paths scanned by four
1D MEMS mirrors.
optics and their emitter/receivers as shown in Fig. 26. The drawings of published patent appli-
cation Ref. 75 in Figs. 26(a) and 26(b) reveal mounts with springs that are operated in the elastic
regime of their stress-strain curve and can be pulled toward or pushed away from each other. The
movement is indicated by the arrows in Fig. 26(a) as well as Fig. 26(c). The concept utilizes
biaxial, analog TOF measurement channels. Unlike in the concept of Fig. 24, where the synchronization
of emitter and receiver FOVs was realized by controlling both MEMS mirrors, here
the synchronization is achieved mechanically by mounting emitter and receiver on a rigid plate
that moves both of them together. The resulting scan pattern consists of parallelized Lissajous
curves76 drawn (exemplary for four measurement channels) in Fig. 27. A constant pulse repetition
Fig. 26 Sensor concept based on oscillating optics and transceiver plate. (a) Display of spring-like
mounts of optics and emitter plus receiver carrier board from Ref. 75. (b) Illustration of carrier board
with array of emitters and receivers (Ref. 75). (c) Visualization of the basic movements.
(d) Conceptual display of four emitter and receiver channels.
Fig. 27 Scan pattern depiction of a sensor concept with an oscillating carrier. (a) Scan pattern
plotted with constant f rep . (b) Evenly spaced scan pattern (horizontal).
rate yields a scan pattern shown in Fig. 27(a) whereas a synchronized emission of pulses can lead
to evenly spaced measurements, e.g., in horizontal direction as indicated in Fig. 27(b).
14: prism
15: MEMS device
6: deflection device
7: light deflection elements
Fig. 28 Display of a biaxial sensor concept with 2D MEMS mirror and binary MEMS mirror
array. (a) Illustration of a sensor concept with binary MEMS mirror array in the receiver path
(from Ref. 77). (b) A visualization of the basics from Fig. 28(a).
On the emitter side, a 2D MEMS mirror is used to scan the outgoing pulses into a selected angle
as outlined in Fig. 28(b). The scan pattern is linked to the control of the MEMS mirror array in
combination with the 2D MEMS mirror on the emitter side. It can, in principle, be chosen arbi-
trarily, e.g., as shown in Fig. 13(c). A different implementation of a MEMS mirror array in a TOF
LiDAR is described in Ref. 78.
5 Solid-State Sensors
Solid-state sensors do not have any moving parts. This results in more robustness and the potential
to highly integrate components, but it also imposes the challenge of covering the FOV without rotation,
actuation, or oscillation. We divide the concepts into flashing (Sec. 5.1) and solid-state scanning
LiDAR concepts (Sec. 5.2).
5.1.1 Full
Full flash LiDARs emit a single flash of light to illuminate their entire FOV at once. These solid-
state concepts require highly energetic pulses to not suffer from photon starvation, i.e., short
measurement ranges. Therefore, the concept shown in Fig. 29 is operated in the SWIR wave-
length regime (allowing for more optical output power cf., Sec. 2.3). The drawing of published
patent application Ref. 79 in Fig. 29(a) shows a flash emitter, namely the solid-state laser,
which is pumped by multiple pump lasers. The corresponding concept basics are illustrated in
Fig. 29(b) with the flash emitter and an FPA detector. Figure 29(c) displays a close-up of the FPA
detector with its mount, whereas Fig. 29(d) shows the overlap in the far field.
An FPA is located in the focal point of its lens with individual receivers that are offset from
the optical axis. There are physical limits to the FOV an FPA can cover, which we want to outline
here (following Ref. 80). Figure 30 shows a schematic view of an FPA with size Sdetector, its
lens of diameter Dlens, and focal length f. The FOV that can be covered is equal to two times the
angle α, i.e., FOV = 2α. The angle is given as

tan(α) = Sdetector / (2f),   (3)
64: integrated circuit
66: detector array
268: lens elements
Fig. 29 Illustration of a flash LiDAR concept from Ref. 79 with SWIR operating wavelength and
analog TOF measurement channels. (a) Drawings of a flash LiDAR concept. Top and side view
from Ref. 79. (b) Illustration of the concept basics from a full flash LiDAR (only four detector paths
drawn). (c) Drawing of a detector FPA (Ref. 79). (d) Display of the overlap for a flash LiDAR in the far
field.
Fig. 30 Sketch of an FPA with focal length f and lens diameter D lens .
and poses a constraint on the ratio between Sdetector and f. Another constraint, the f-number F,
originates from lens design and is defined as f divided by Dlens. There is a theoretical lower limit
of F = f/Dlens ≥ 0.5. Manufacturable lenses typically have an f-number not smaller than F ≈ 1.
This implies that Dlens can at most be equal to f, i.e., f/Dlens ≈ 1 (in many cases, f is longer than Dlens
and F > 1). If we insert f ≈ Dlens into Eq. (3), we find that FOV = 2α and Dlens are inversely
proportional to each other, tan(α) = Sdetector / (2 Dlens). Hence, for a given FPA detector array of size Sdetector
one can only increase the FOV by reducing Dlens . However, Dlens is equal to Ade from Eq. (1),
which in turn is directly proportional to the received power Prec. The result is a pair of conflicting
requirements. On the one hand, one would like to enlarge Dlens (Ade) for long measurement ranges
(which comes at the cost of reducing FOV = 2α); on the other hand, the FOV is desired to be
as large as possible (requiring a smaller Dlens). For a sensor utilizing a detector FPA, a trade-off
between the size of Dlens and FOV coverage has to be made.
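A short numerical illustration of this trade-off is given below; the detector size is an assumed example value and F ≈ 1 is taken as the practical lower bound discussed above. Enlarging the aperture Dlens visibly shrinks the coverable FOV.

```python
# Numerical illustration of the Eq. (3) trade-off assuming F = f / D_lens = 1:
# a larger aperture D_lens (larger A_de, more received power) shrinks FOV = 2*alpha.
import math

S_detector = 10e-3                              # FPA edge length in m (assumption)

def fov_deg(d_lens_m: float, f_number: float = 1.0) -> float:
    f = f_number * d_lens_m                     # focal length from F = f / D_lens
    return 2.0 * math.degrees(math.atan(S_detector / (2.0 * f)))

for d_lens in (5e-3, 10e-3, 20e-3):             # aperture diameters in m
    print(f"D_lens = {d_lens * 1e3:.0f} mm -> FOV ~ {fov_deg(d_lens):.0f} deg")
```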
A flash LiDAR can achieve a high frame rate f since all measurement channels are active in
parallel. However, data processing can become challenging for the same reason.
5.1.2 Sequential
The sequential flash LiDAR concept (shown in Fig. 31) has a one-to-one correlation of emitters
and receivers. Both emitter and receiver FPAs have identical physical dimensions and are opti-
cally aligned to each other to form multiple digital TOF measurement channels. The parallel
measurement channels are shown in Fig. 31(a) with a drawing from published patent application
Ref. 81 and additionally visualized in the conceptual view of Fig. 31(c). Figure 31(b) shows a
compact integration of all components into a LiDAR module. Comparing the overlap indication
from Fig. 31(d) to the overlap of a full flash LiDAR in Fig. 29(d), one can see that a one-to-one
correlation between emitters and receivers has the potential to reduce the number of photons lost
in the gaps between receiver FOVs. This concept is not called a full flash but a “sequential flash”
LiDAR since scanning is achieved by electronically activating lines in both arrays one after the
other, see Fig. 31(e). The use of digital TOF (cf., Sec. 2.2.2) measurement channels imposes
challenging time constraints but also allows for the implementation of multiple thousands
of them.
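As a rough feel for these time constraints, the sketch below budgets the frame time of a line-wise sequential flash under assumed values for the number of rows and the number of pulses accumulated per TCSPC histogram; neither number is given in the text, so the result is only indicative.

```python
# Hedged time-budget sketch for line-wise sequential flash; all parameters assumed.
n_rows = 200                  # addressable emitter/receiver rows (assumption)
pulses_per_histogram = 500    # shots accumulated per row (assumption)
d_max = 300.0                 # m, maximum range as in Table 1
c = 299_792_458.0

t_shot = 2.0 * d_max / c                  # unambiguous round-trip time per pulse
t_row = pulses_per_histogram * t_shot     # dwell time per activated row
t_frame = n_rows * t_row
print(f"row dwell ~ {t_row * 1e6:.0f} us, frame time ~ {t_frame * 1e3:.0f} ms "
      f"(~{1.0 / t_frame:.0f} Hz)")
```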
Fig. 31 Display of a sequential flash LiDAR concept. Drawings taken from Refs. 81 and 82. (a) Top
view of sequential flash LiDAR from Ref. 81. (b) Design of a sequential flash LiDAR module
(Ref. 82). (c) Visualization of LiDAR concept with emitter and receiver FPAs (only four measure-
ment paths drawn). (d) Plot of overlap for a sequential flash LiDAR in the far field. (e) Illustration of
a line-wise scan pattern.
all ideas is to manipulate light (its wavelength, phase, intensity, polarization, etc.) in such a way
that the interfering waves can be steered into different angles. Many focus on the emitter side of
measurement channels since the light wave as well as the properties of the photons can be controlled
more easily during emission. An implementation on the receiver side, where the collection of
back-scattered light with undefined polarization states requires large apertures, is much more
demanding.
Fig. 32 Illustration of how to steer a beam by manipulating the phase in waveguides. There is a
constant phase shift from one waveguide to the next, which results in a beam deflection toward the
bottom.
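The underlying relation is the textbook phased-array steering condition, which is not spelled out here; the wavelength and waveguide pitch in the sketch below are illustrative assumptions.

```python
# Hedged sketch of uniform-linear-array beam steering: a constant phase increment
# dphi between neighboring waveguides with pitch p steers the beam to
# sin(theta) = lambda * dphi / (2 * pi * p).
import math

wavelength = 905e-9      # m, NIR example (assumption)
pitch = 2e-6             # waveguide pitch in m (assumption)

def steering_angle_deg(delta_phi_rad: float) -> float:
    return math.degrees(math.asin(wavelength * delta_phi_rad / (2.0 * math.pi * pitch)))

print(f"{steering_angle_deg(math.pi / 2):.1f} deg")   # quarter-wave phase increment
```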
10: laser
20: Y-branch tree
30: interference couplers
40: ohmic heating electrodes
120: out-of-plane optical couplers
Fig. 33 Illustration of a solid-state sensor concept with an emitter scanned by an OPA and an FPA
detector. (a) Drawing of a sensor design incorporating an OPA (from Ref. 86). (b) Concept view
explaining the sensor drawing of Fig. 33(a). (c) Depiction of an OPA setup.85 (d) Schematic
illustration of an OPA structure.85
LCs can also be used in combination with metasurfaces103–105 that consist of arrays of
two-dimensional, quasi-periodic, sub-wavelength-scale unit elements, so-called meta-atoms (metallic
or dielectric). By changing the meta-atoms geometrically in size, shape, or orientation across the
surface, one can locally modify the phase of the incoming light to shape the wavefront. There are
many different options to steer the light with metamaterials, as outlined in Ref. 106. An example
concept with scanning emitter and receiver that combines copper rails with LC layers in between
can be found in published patent application Ref. 107 and is shown in Fig. 35. Here, two in-
dependent scanners are foreseen for a biaxial emitter and receiver configuration. It is a point-wise
scanning sensor, as indicated in Figs. 35(a) and 35(b), allowing for basically arbitrary scan patterns.
Close-ups of the so-called metasurface scanner can be found in Fig. 35(c) (the entire scanner)
as well as of its structure in Fig. 35(d).
201: polarized light
: steering stages
501: view : heating layers
504: associated optics
505: 1D resonant MEMS mirror
506: LCPG beam steering element
508: optical receiver
Fig. 34 Illustration of a sensor concept utilizing LCs in combination with polarized light. (a) Block
diagram of a sensor concept from Ref. 102 utilizing LCs. (b) Sketch of a sensor concept that
extends its FOV by utilizing an LCPG beam steering element (Ref. 102). (c) Model of a scanner
combining LCs and PGs.102
6 Discussion
In this section, we want to revisit the introduced concepts (grouped by section and subsection
headlines) and discuss their advantages and challenges.
The FOV coverage axis of the automotive LiDAR design space from Fig. 2 served as a guide-
line for the introduction of sensor concepts. The chosen order from classical/mechanical scan-
ning systems over MEMS-based solutions to solid-state approaches also represents (in first
approximation) their technology readiness level (TRL). Scanning solutions based on well-established
motors generally have the highest maturity. They are relatively simple to use and enable a
wide FOV coverage. However, car maker requirements regarding durability and cost efficiency
(mass-producibility) raise the need for alternative scanning approaches. One of the biggest chal-
lenges for these new approaches is providing enough FOV coverage while being (semi) solid-
state. MEMS-based concepts described in Sec. 4 try to avoid the negative aspects of mechanical
scanning systems, such as friction and abrasion while keeping the benefit of moving parts to
scan. Aspects like their long term durability and performance stability over the automotive tem-
perature range (cf., Table 1) are fields of ongoing work. Although solid-state sensors have the
potential to be highly integrated, with fewer components in the bill of materials and automated
assembly lines, they, as of today, often lack performance when compared to classical
(i.e., mechanical) or MEMS scanning LiDARs.
The spinning LiDAR concepts from Figs. 10 and 17 are probably the best-known representatives
of automotive LiDAR sensors. Their unique capability of being able to cover
FOVhor = 360 deg makes them especially interesting for applications where complete surround
perception is of predominant importance. However, they cannot be as seamlessly integrated into
204: optics
206: laser diode
207: receive sensor
210: trans. metasurface
215: rec. metasurface
100: metasurface
125: optical radiation
126: “reflected optical radiation”
420: dielectric material
150: optical resonant antennas
430: copper antenna rails
440: insulator
Fig. 35 Display of a sensor concept that includes a metasurface scanning mechanism.
(a) Depiction of the sensor's schematics (Ref. 107). (b) Conceptual view of a sensor design with
metasurfaces. (c) Illustration of the metasurface working principle from Ref. 107. (d) Close-up of the
metasurface structure.107
e.g., the bumper of a car as the rotating mirror concept from Fig. 11. We divided TOF measurement
channels in Sec. 3 into mechanical scanning concepts with analog TOF (Sec. 3.1) and
digital TOF (Sec. 3.2), utilizing either NIR or SWIR operating wavelengths. An analog TOF
channel in the NIR typically consists of an edge-emitting laser (EEL) in combination with an
APD (for explanations of the working principle of these components see, e.g., Refs. 3 and
31). Both are mature, cost-efficient components that have been in extensive use for decades. Higher
optical output power within eye safety limits (cf., Sec. 2.3) motivates a switch to SWIR wavelengths
(concept from Fig. 12). Emitters that generate such high optical output power typically utilize
fiber lasers. The choice between NIR EEL and SWIR fiber lasers can be summarized in a sim-
plified way as a trade-off between cost efficiency and performance. Moving away from silicon as
detector material to, e.g., InGaAs (for SWIR wavelength) comes at the cost of higher component
prices as well as noisier detectors (cf., Sec. 2.3). For analog TOF with APD detectors this results
in a tolerable increase in dark current, but in digital TOF noisier detectors usually necessitate a
change in the TCSPC measurement procedure. InGaAs SPADs tend to have higher dark count
rates and afterpulsing, which cause unwanted triggers during the acquisition of histograms. A
possible countermeasure to mitigate these negative effects is the use of gating schemes, as presented in
Ref. 109 for the concept from Fig. 17(d).
FMCW measurement channels with their ability to measure relative velocity provide an
additional feature that can be of important help when it comes to the segmentation of point
clouds with perception algorithms. Another benefit is the possible optical signal amplification
by enhancing the power of the portion of the outgoing light that is optically mixed with the incoming
Fig. 36 Spectral deflection scanning with lens arrangement and scan pattern. (a) Conceptual dia-
gram for a spectral deflection LiDAR (Ref. 108). (b) Drawing of lens arrangement for 2D spectral
deflection scanning.108 (c) Sensor concept view utilizing prism scanning. (d) Achievable scan pat-
tern sketch with lens arrangement from Ref. 108.
light (cf., Fig. 9). It is, however, challenging to parallelize multiple FMCW measurement chan-
nels, which makes wide FOV coverage at a frame rate of f = 25 Hz and a high angular resolution
difficult. Many of the sensor concepts currently available utilize polygon scanners as illustrated
in Figs. 18 and 19. FMCW channels are also more complex (e.g., the number of components in Fig. 9
compared with Fig. 8) and prone to misalignment or phase noise as well as shot noise on the emitter
side. The use of photonic integrated circuits (PIC) has the potential to overcome these
challenges110,111 but additional work on, e.g., compact integration is required. We, therefore,
see many research activities in this field.
The summary table displayed in Table 2 rates aspects like cost, FOV coverage, size/power,
TRL and durability of concept groups against the requirements listed in Table 1. Qualitative
ratings range from “+” through “○” to “−” and indicate how we see the agreement between the merits
of a concept group and the desired specifications. In short, classical/mechanical scanners provide
mature options with flexible scan patterns (adaptive scan angles, e.g., Fig. 13, or rotation
speeds, e.g., Fig. 15) and large apertures [Ade in Eq. (1)]. Durability, size and cost efficiency
Table 2 (excerpt): Qualitative rating of concept groups against the requirements of Table 1.
Concept group | Cost | FOV coverage | Size/power | TRL | Durability
Classical scanning | − | + | − | + | ○
Flash illumination | + | − | + | ○ | +
Solid-state scanning | + | ○ | ○ | − | +
7 Conclusion
We presented a visual depiction of the automotive LiDAR design space followed by an intro-
duction to each of the possible options. Different LiDAR concepts were outlined with drawings
from published patent applications, which we explained in accompanying figures. We covered
many mechanical scanning techniques that became flagships during the first deployments of automotive
LiDAR sensors. With the push toward higher-level autonomous driving functionalities, new
alternative scanning methods emerged. Concepts for automotive solid-state LiDAR promise to fulfill
carmakers' requirements while being cost-efficient, robust, and compact enough to be integrated
into consumer cars; this remains a field of many ongoing research activities. We presented an overview of
existing and future automotive LiDAR scanning concepts and concluded with a discussion of
their merits and disadvantages. This work provides orientation to the reader and serves as a
starting point for further research in the field of automotive LiDAR sensors.
References
1. M. E. Warren, “Automotive LiDAR technology,” in Symp. VLSI Circ., pp. 254–255 (2019).
2. R. Stettner et al., “Ladar enabled impact mitigation system,” Published patent application
EP3663793A1, Advanced Scientific Concepts, Inc. (2020).
3. P. F. McManamon, LiDAR Technologies and Systems, SPIE, Bellingham, Washington
(2019).
4. M.-C. Amann et al., “Laser ranging: a critical review of unusual techniques for distance
measurement,” Opt. Eng. 40, 10–19 (2001).
5. S. Royo and M. Ballesta-Garcia, “An overview of lidar imaging systems for autonomous
vehicles,” Appl. Sci. 9(19), 4093 (2019).
6. Z. Dai et al., “LiDARs for vehicles: from the requirements to the technical evaluation,”
2021, https://www.repo.uni-hannover.de/handle/123456789/11439 (accessed 2022-05-09).
7. C. Rablau, “LIDAR: a new self-driving vehicle for introducing optics to broader engineer-
ing and non-engineering audiences,” Proc. SPIE 11143, 111430C (2019).
8. Y. Li and J. Ibanez-Guzman, “LiDAR for autonomous driving: the principles, challenges,
and trends for automotive lidar and perception systems,” IEEE Signal Process. Mag. 37(4),
50–61 (2020).
9. R. Thakur, “Scanning lidar in advanced driver assistance systems and beyond: building a
road map for next-generation LiDAR technology,” IEEE Consum. Electron. Mag. 5, 48–54
(2016).
10. J. Hecht, “LiDAR for self-driving cars,” Opt. Photonics News 29, 26–33 (2018).
11. H. Gotzig and G. O. Geduld, LIDAR-Sensorik: Grundlagen, Komponenten und Systeme
für aktive Sicherheit und Komfort, pp. 317–334, Springer Fachmedien Wiesbaden,
Wiesbaden (2015).
12. D. Bastos et al., “An overview of LiDAR requirements and techniques for autonomous
driving,” in Telecoms Conf. (ConfTELE), pp. 1–6 (2021).
13. J. Lambert et al., “Performance analysis of 10 models of 3D LiDARs for automated
driving,” IEEE Access 8, 131699–131722 (2020).
14. J. Liu et al., “TOF LiDAR development in autonomous vehicle,” in IEEE 3rd
Optoelectron. Global Conf. (OGC), pp. 185–190 (2018).
15. B. Behroozpour et al., “LiDAR system architectures and circuits,” IEEE Commun. Mag.
55, 135–142 (2017).
16. M. Hansard et al., Characterization of Time-of-Flight Data: Principles, Methods and
Applications, pp. 1–28, Springer, London (2013).
17. D. L. Shumaker and J. S. Accetta, The Infrared and Electro-Optical Systems Handbook,
J. S. Accetta and D. L. Shumaker, Eds., Vol. 6, Infrared Information Analysis Center;
SPIE Optical Engineering Press, Ann Arbor, MI; Bellingham, WA (1993).
18. R. D. Richmond and S. C. Cain, Direct-Detection LADAR Systems, SPIE Press,
Bellingham, WA (2010).
19. P. McManamon, “Review of ladar: a historic, yet emerging, sensor technology with rich
phenomenology,” Opt. Eng. 51, 060901 (2012).
20. W. Wagner et al., “Gaussian decomposition and calibration of a novel small-footprint
full-waveform digitising airborne laser scanner,” ISPRS J. Photogramm. Remote Sens. 60,
100–112 (2006).
21. T. Halldórsson and J. Langerholc, “Geometrical form factors for the LiDAR function,”
Appl. Opt. 17, 240–244 (1978).
22. J. Harms, “LiDAR return signals for coaxial and noncoaxial systems with central obstruc-
tion,” Appl. Opt. 18, 1559–1566 (1979).
23. T. Fersch, R. Weigel, and A. Koelpin, “Challenges in miniaturized automotive long-range
LiDAR system design,” Proc. SPIE 10219, 102190T (2017).
24. R. Machado, J. Cabral, and F. S. Alves, “Recent developments and challenges in FPGA-
based time-to-digital converters,” IEEE Trans. Instrum. Meas. 68, 4205–4221 (2019).
25. M.-J. Lee, “Single-photon avalanche diodes in CMOS technology: towards low-cost and
compact solid-state lidar sensors,” in Opt. Sens. and Sens. Congr., Optica Publishing
Group, p. ETu3E.2 (2020).
26. E. Charbon, “Introduction to time-of-flight imaging,” in Sensors, 2014 IEEE, pp. 610–613
(2014).
27. W. Becker, Advanced Time-Correlated Single Photon Counting Techniques, Springer,
Berlin, Heidelberg (2005).
28. K. A. Zachariasse, “Einzelphotonenzählung: time-correlated single photon counting. von
d. v. O’Connor und d. Phillips. Academic Press, London—New York 1984. VIII, 288 s.,”
Nachrich. Chem. Tech. Lab. 33(10), 896–896 (1985).
29. S. W. Hutchings et al., “A reconfigurable 3-D-stacked SPAD imager with in-pixel histo-
gramming for flash LiDAR or high-speed time-of-flight imaging,” IEEE J. Solid-State
Circ. 54, 2947–2956 (2019).
30. J. Massa et al., “Optical design and evaluation of a three-dimensional imaging and ranging
system-based on time-correlated single-photon counting,” Appl. Opt. 41, 1063–1070
(2002).
31. G. Rieke, Detection of Light: From the Ultraviolet to the Submillimeter, 2nd ed.,
Cambridge University Press, Cambridge (2002).
32. D. Pierrottet et al., “Linear FMCW laser radar for precision range and vector velocity
measurements,” MRS Proc. 1076, 10760406 (2008).
33. D. Uttam and B. Culshaw, “Precision time domain reflectometry in optical fiber systems
using a frequency modulated continuous wave ranging technique,” J. Lightwave Technol.
3, 971–977 (1985).
34. A. J. Hymans and J. Lait, “Analysis of a frequency-modulated continuous-wave ranging
system,” Proc. IEE - Part B: Electron. Commun. Eng. 107, 365–372 (1960).
35. M. Kronauge and H. Rohling, “New chirp sequence radar waveform,” IEEE Trans.
Aerosp. Electron. Syst. 50, 2870–2877 (2014).
36. M. Vollmer, K.-P. Möllmann, and J. A. Shaw, “The optics and physics of near infrared
imaging,” Proc. SPIE 9793, 97930Z (2015).
37. DIN Deutsches Institut für Normung e.V., “Sicherheit von Lasereinrichtungen—Teil 1:
Klassifizierung von Anlagen und Anforderungen (IEC 60825-1:2014),” Beuth Verlag
(2014).
38. I. S. Amiri et al., “Temperature effects on characteristics and performance of near-infrared wide
bandwidth for different avalanche photodiodes structures,” Results Phys. 14, 102399 (2019).
39. G. M. Williams, “Optimization of eyesafe avalanche photodiode lidar for automobile
safety and autonomous navigation systems,” Opt. Eng. 56, 031224 (2017).
40. R. H. Rasshofer, M. Spies, and H. Spies, “Influences of weather phenomena on automotive
laser radar systems,” Adv. Radio Sci. 9, 49–60 (2011).
41. M. Kutila et al., “Automotive LiDAR performance verification in fog and rain,” in 21st
Int. Conf. Intell. Transp. Syst. (ITSC), pp. 1695–1701 (2018).
42. C. F. Bohren and D. R. Huffman, Absorption and Scattering of Light by Small Particles,
Wiley Science Paperback Series, Wiley, New York (1998).
43. I. I. Kim, B. McArthur, and E. J. Korevaar, “Comparison of laser beam propagation at 785 nm
and 1550 nm in fog and haze for optical wireless communications,” Proc. SPIE 4214,
26–37 (2001).
44. J. Wojtanowski et al., “Comparison of 905 nm and 1550 nm semiconductor laser range-
finders’ performance deterioration due to adverse environmental conditions,” Opto-
Electron. Rev. 22, 183–190 (2014).
45. K. Sassen and G. C. Dodd, “LiDAR crossover function and misalignment effects,” Appl.
Opt. 21, 3162–3165 (1982).
46. D. Hall, “High definition LiDAR system,” Published patent application EP2388615A1,
Velodyne Acoustics, Inc. (2011).
47. S. Suzuki, “Lidarvorrichtung, fahrassistenzsystem und fahrzeug,” Published patent appli-
cation DE112019000517T5, Denso Corporation (2020).
48. J. E. McWhirter, “Manufacturing a balanced polygon mirror,” Published patent application
US10663585B2, Luminar Technologies, Inc. (2020).
49. H. Kikuchi, “Optical scanning radar system,” Published patent application EP1416292B1,
Honda Giken Kogyo Kabushiki Kaisha (2005).
50. W. L. Wolfe, Introduction to Infrared System Design, SPIE Tutorial Texts, Vol. TT24,
SPIE Optical Engineering Press, Bellingham, WA (1996).
51. X. Hong et al., “System and method for supporting LiDAR applications,” Published patent
application WO2018176275A1, SZ DJI Technology Co., Ltd. (2018).
52. J. Wu et al., “Small bearings for multi-element optical scanning devices, and associated
systems and methods,” Published patent application WO2021031206A1, SZ DJI Tech-
nology Co., Ltd. (2021).
53. G. F. Marshall, “Risley prism scan patterns,” Proc. SPIE 3787, 74–86 (1999).
54. A. Pacala et al., “Optical system for collecting distance information within a field,”
Published patent application US2018217236A1, Ouster, Inc. (2018).
55. A. Pacala and M. Frichtl, “Multispectral ranging/imaging sensor arrays and systems,”
Published patent application EP3834003A1, Ouster, Inc. (2021).
56. G. Kamerman, C. Trowbridge, and V. Negoita, “Polarization filtering in LiDAR system,”
Published patent application WO2021007023A1, ARGO AI, LLC (2021).
57. B. J. Roxworthy, P. Srinivasan, and A. Samarao, “FMCW lidar using array waveguide
receivers and optical frequency shifting,” Published patent application US11372105B1,
Aeva, Inc. (2022).
58. E. J. Angus and R. M. Galloway, “LiDAR apparatus with rotatable polygon deflector
having refractive facets,” Published patent application WO2020142316A1, Blackmore
Sensors & Analytics LLC (2020).
59. R. M. Galloway, E. Angus, and Z. W. Barber, “LiDAR system including multifaceted
deflector,” Published patent application US10754012B2, Blackmore Sensors & Analytics,
LLC (2020).
60. S. P. Timoshenko and J. N. Goodier, Theory of Elasticity, McGraw-Hill, New York (1982).
61. P. R. Patterson et al., “Scanning micromirrors: an overview,” Proc. SPIE 5604, 195–207
(2004).
62. H. W. Yoo et al., “MEMS-based lidar for autonomous driving,” E&I Elektrotech.
Informationstech. 135, 408–415 (2018).
63. D. Wang, C. Watkins, and H. Xie, “MEMS mirrors for LiDAR: a review,” Micromachines
11(5), 456 (2020).
64. U. Hofmann et al., “Biaxial resonant 7 mm-MEMS mirror for automotive lidar applica-
tion,” in Int. Conf. Opt. MEMS and Nanophotonics, pp. 150–151 (2012).
65. L. Ye et al., “A 2D resonant MEMS scanner with an ultra-compact wedge-like multiplied
angle amplification for miniature lidar application,” in IEEE Sens., pp. 1–3 (2016).
66. L. C. Dussan, “Methods and systems for ladar transmission,” Published patent application
EP3195010A4, Aeye, Inc. (2018).
67. L. C. Dussan, “Ladar transmitter with feedback control of dynamic scan patterns,”
Published patent application US2019025407A1, Aeye, Inc. (2019).
68. T. B. Greenslade, “All about Lissajous figures,” Phys. Teach. 31(6), 364–370 (1993).
69. D. Cook et al., “Ladar transmitter with ellipsoidal reimager,” Published patent application
US2020333587A1, Aeye, Inc. (2020).
70. M. Shani et al., “LiDAR systems and methods,” Published patent application
WO2019106429A2, Innoviz Technologies Ltd. (2019).
71. M. Müller and M. Schardt, “Coaxial optical system of a frictionless scan system for
light detection and ranging, LiDAR, measurements,” Published patent application
WO2019234123A1, Blickfeld GMBH (2019).
72. M. Müller, “Aligning a resonant scanning system,” Published patent application
WO2019154466A1, Blickfeld GMBH (2019).
73. E. Matthew, “Scanning mirror system with attached coil,” Published patent application
US2021033845A1, Microvision, Inc. (2021).
74. S. Singer, “Lichtführung in einem lidarsystem mit einer monozentrischen Linse,”
Published patent application DE102019107563A1, GM Global Technology Operations
LLC (2019).
75. J. Pei et al., “Scanning LiDAR system,” Published patent application EP3842831A1,
Cepton Technologies, Inc. (2021).
76. J. Pei et al., “Methods and apparatuses for scanning a LiDAR system in two dimensions,”
Published patent application WO2019079091A1, Cepton Technologies, Inc. (2019).
77. S. Royo et al., “A vision system and a vision method for a vehicle,” Published patent
application WO2019012081A1, Veoneer Sweden AB and Beamagine, S.L. (2019).
78. Y. Takashima et al., “MEMS-based imaging LiDAR,” in Light, Energy and the Environ.
2018 (E2, FTS, HISE, SOLAR, SSL), p. ET4A.1, Optica Publishing Group (2018).
79. R. Stettner, P. Gilliland, and A. Duerner, “Automotive auxiliary ladar sensor,” Published
patent application EP3173818B1, Advanced Scientific Concepts, Inc. (2020).
80. W. J. Smith, Modern Optical Engineering – The Design of Optical Systems, 3rd ed.,
McGraw-Hill, New York (2000).
81. R. Beuschel and M. Kiehn, “Lidar receiving unit,” Published patent application
WO2019115148A1, ZF Friedrichshafen AG; Ibeo Automotive Systems GmbH (2019).
82. S. Frick et al., “Lidar messsystem und verfahren zur montage eines lidar messsystems,”
Published patent application DE102018207297A1, Ibeo Automotive Systems GmbH;
ZF Friedrichshafen AG (2019).
83. P. F. McManamon et al., “Optical phased array technology,” Proc. IEEE 84, 268–298
(1996).
84. C.-P. Hsu et al., “A review and perspective on optical phased array for automotive lidar,”
IEEE J. Sel. Top. Quantum Electron. 27, 1–16 (2021).
85. L. Eldada, “Planar beam forming and steering optical phased array chip and method of
using same,” Published patent application EP3161533A4, Quanergy Systems, Inc. (2018).
86. L. Eldada, T. Yu, and A. Pacala, “Optical phased array LiDAR system and method of
using same,” Published patent application US2016161600A1, Quanergy Systems, Inc.
(2016).
87. S. Chung, H. Abediasl, and H. Hashemi, “A monolithically integrated large-scale optical
phased array in silicon-on-insulator CMOS,” IEEE J. Solid-State Circ. 53, 275–296
(2018).
88. S. A. Miller et al., “Large-scale optical phased array using a low-power multi-pass silicon
photonic platform,” Optica 7, 3–6 (2020).
89. R. Fatemi, A. Khachaturian, and A. Hajimiri, “A nonuniform sparse 2-D large-FOV optical
phased array with a low-power PWM drive,” IEEE J. Solid-State Circ. 54, 1200–1215
(2019).
90. K. V. Acoleyen et al., “Off-chip beam steering with a one-dimensional optical phased array
on silicon-on-insulator,” Opt. Lett. 34, 1477–1479 (2009).
91. D. N. Hutchison et al., “High-resolution aliasing-free optical beam steering,” Optica 3,
887–890 (2016).
92. C. V. Poulton et al., “Long-range lidar and free-space data communication with high-
performance optical phased arrays,” IEEE J. Sel. Top. Quantum Electron. 25, 1–8 (2019).
93. C. Errando-Herranz et al., “MEMS for photonic integrated circuits,” IEEE J. Sel. Top.
Quantum Electron. 26, 1–16 (2020).
94. Y. Wang et al., “2D broadband beamsteering with large-scale MEMS optical phased array,”
Optica 6, 557–562 (2019).
95. J. K. Doylend et al., “Two-dimensional free-space beam steering with an optical phased
array on silicon-on-insulator,” Opt. Express 19, 21595–21604 (2011).
96. D. Kwong et al., “On-chip silicon optical phased array for two-dimensional beam steer-
ing,” Opt. Lett. 39, 941–944 (2014).
97. T. Kim et al., “A single-chip optical phased array in a wafer-scale silicon photonics/CMOS
3D-integration platform,” IEEE J. Solid-State Circ. 54, 3061–3074 (2019).
98. K. Nakamura et al., “Liquid crystal-tunable optical phased array for lidar applications,”
Proc. SPIE 11690, 116900W (2021).
99. R. Jansen et al., “Integrated calibration-free scannable structured light for fast high-
resolution lidar,” in OSA Adv. Photonics Congr. (AP) 2020 (IPR, NP, NOMA, Networks,
PVLED, PSC, SPPCom, SOF), p. Th2H.5, Optica Publishing Group (2020).
100. P. A. Blanche et al., “Holographic three-dimensional telepresence using large-area photo-
refractive polymer,” Nature 468, 80–83 (2010).
101. G. Thalhammer et al., “Speeding up liquid crystal slms using overdrive with phase change
reduction,” Opt. Express 21, 1779–1797 (2013).
102. R. Baribault and P. Olivier, “Beam-steering devices and methods for lidar applications,”
Published patent application WO2022016274A1, Leddartech Inc. (2022).
103. M. Khorasaninejad and F. Capasso, “Metalenses: versatile multifunctional photonic com-
ponents,” Science 358(6367), eaam8100 (2017).
104. J. Engelberg and U. Levy, “The advantages of metalenses over diffractive lenses,”
Nat. Commun. 11, 1991 (2020).
105. L. Zhang et al., “Advances in full control of electromagnetic waves with metasurfaces,”
Adv. Opt. Mater. 4(6), 818–833 (2016).
106. J. Kim et al., “Tunable metasurfaces towards versatile metalenses and metaholograms:
a review,” Adv. Photonics 4(2), 024001 (2022).
107. G. M. Akselrod, P. Bowen, and Y. Yang, “Tunable liquid crystal metasurfaces,” Published
patent application WO2020190704A1, Lumotive, LLC (2020).
108. F. Collarte Bondy et al., “An optical beam director,” Published patent application
EP3542209B1, Baraja Pty Ltd. (2022).
Hanno Holzhüter works as a research project manager at Ibeo Automotive Systems and is
also a PhD student with focus on DSP in LiDAR sensors at the Institute for Microelectronic
Systems (IMS), Leibniz University Hannover and Ibeo AS. Before joining Ibeo in 2016, he
worked as a scientific assistant in the engineering education research group at Technical Uni-
versity of Hamburg after finishing his master's degree in physics in 2015 at the Georg-August-University
Göttingen.
Jörn Bödewadt has been working at Ibeo Automotive Systems since 2018 as an optical design
engineer. His tasks range from the investigation of optical effects in LiDAR sensors to simulations
and experiments for specifying, testing, and analyzing optical components. He has a background in
accelerator and free-electron laser physics where he received his PhD in 2011 at the University of
Hamburg. After his PhD, he joined Deutsches Elektronen-Synchrotron to work on fundamental
research on improving the coherence properties of free-electron lasers.
Shima Bayesteh received her BSc degree in physics in 2006 from the University of Isfahan
and her MSc degree in astrophysics in 2008 from the University of Zanjan in Iran. She completed
her PhD in accelerator physics at Deutsches Elektronen-Synchrotron (DESY) and received her
degree in accelerator physics in 2014 from the University of Hamburg. She joined Ibeo in 2016
as an optics development engineer. She is currently working as a LiDAR R&D engineer, dealing with
novel technological solutions for LiDAR systems.
Andreas Aschinger received his diploma in physics in 2008 and completed his PhD in plasma
physics in 2012 at the Ruhr-University of Bochum. The topic of the PhD thesis was Dynamic
Light Scattering on Complex Plasmas. After his PhD, he worked at Leopold Kostal GmbH & Co.
KG in the field of driver assistance cameras. Since 2019, he has been dedicated to the develop-
ment of future LiDAR sensors at Ibeo Automotive Systems GmbH.
Holger Blume received his Dipl-Ing and PhD degrees in electrical engineering from the
University of Dortmund in 1992 and 1997, respectively. Until 2008, he worked as a senior engi-
neer at RWTH Aachen University. There, he finished his habilitation in 2008. Since then, he has been a
professor for architectures and systems at Leibniz University Hannover. His research interests
are in design space exploration for algorithms and architectures for DSP with applications in
biomedical and automotive systems.