Article

Impact of Rainfall on the Detection Performance of Non-Contact Safety Sensors for UAVs/UGVs

1 National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba 305-8568, Japan
2 Altech Corporation, Yokohama 220-6218, Japan
3 National Research Institute for Earth Science and Disaster Resilience (NIED), Tsukuba 305-0006, Japan
* Author to whom correspondence should be addressed.
Submission received: 26 March 2024 / Revised: 22 April 2024 / Accepted: 22 April 2024 / Published: 24 April 2024
(This article belongs to the Special Issue Advanced UAV-Based Sensor Technologies)

Abstract: This study comprehensively investigates how rain and drizzle affect the object-detection performance of non-contact safety sensors, which are essential for the operation of unmanned aerial vehicles and ground vehicles in adverse weather conditions. In contrast to conventional sensor-performance evaluation based on the amount of precipitation, this paper proposes spatial transmittance and particle density as more appropriate metrics for rain environments. Through detailed experiments conducted under a variety of precipitation conditions, it is shown that sensor performance is significantly affected by the density of small raindrops rather than the total amount of precipitation. This finding challenges traditional sensor-evaluation metrics in rainfall environments and suggests a paradigm shift toward the use of spatial transmittance as a universal metric for evaluating sensor performance in rain, drizzle, and potentially other adverse weather scenarios.

1. Introduction

The aim of this paper is to explore a method for quantitatively assessing how rain and drizzle affect the detection capabilities of non-contact safety sensors in adverse weather conditions. In recent years, unmanned aerial vehicles (UAVs) and unmanned ground vehicles (UGVs), such as drones, delivery robots, autonomous vehicles, and agricultural machines, have reached the stage of practical application. These autonomous systems, which coexist with humans, must incorporate non-contact safety sensors to accurately perceive their surroundings, detect approaching people or hazardous objects, and perform safe stopping or avoiding maneuvers.
Outdoor operations present unique challenges, such as light interference from the sun and reduced visibility during adverse weather (rain, snow, fog, etc.), which can affect sensors’ detection capabilities, as illustrated in Figure 1. Consequently, sensors that maintain performance integrity in outdoor environments are essential. However, robust sensor technology alone is not enough. Establishing precise evaluation criteria and test methodologies is crucial to ensure consistent performance under all expected environmental conditions.
This study focuses on the effects of rain and drizzle, the most common severe weather events in warm, humid climates. For safety sensors designed to operate in rainy conditions, it is important to inform users of the operational limitations associated with rainfall. Specifically, it must be clearly specified under which conditions the sensor retains the ability to detect objects and measure distance and when these functions become compromised.
The precipitation rate is a metric that has been used to evaluate the performance of sensors in rainfall environments. However, this paper argues that precipitation rates may not accurately reflect sensor detection capabilities. For example, a sensor may work properly in heavy rain (80 mm/h) but fail in light drizzle (2 mm/h).
The main contributions of this paper include demonstrating that evaluating sensors based solely on precipitation rate can be misleading in certain scenarios and proposing a more effective method for performance evaluation based on spatial transmittance, or raindrop density. Using this novel metric, we quantitatively demonstrate the varying impacts of different rainfall types on sensor efficiency and explore the behavior of common sensing technologies in rainfall conditions.
This paper is organized as follows: After clarifying the scope of this paper in Section 1.1, we first outline the physical properties of rainfall based on meteorological studies in Section 2.1. We then review relevant international standards and core sensing technologies in rainfall environments and present our previous work on sensor evaluation techniques in adverse weather conditions in Section 2.2, Section 2.3 and Section 2.4.
In Section 3, we propose a method for evaluating sensor performance in rain and drizzle environments. First, in Section 3.1, we compare precipitation rates and spatial transmittance as evaluation metrics. The problems associated with evaluating sensors based on precipitation rate are described, and the superiority of spatial transmittance is demonstrated. Section 3.2 then defines the object-detection rate in a precipitation environment, and Section 3.3, Section 3.4 and Section 3.5 detail the predefined precipitation conditions reproduced in the experiment, the test piece and experimental setup, and the sensor device under test.
Section 4 presents the experimental results, including measurements of the physical properties of the precipitation conditions and the object-detection rates of the sensors under these conditions. It also discusses the impact of precipitation on detection performance. Finally, we discuss metrics for evaluating sensor performance in rain and drizzle environments in Section 5. Section 6 is the conclusion.
Our experimental results lead us to conclude that while small raindrops primarily affect visibility, large raindrops affect precipitation rates. Therefore, as a metric for evaluating sensor performance, we recommend using the spatial transmittance or the particle density of the rain environment.

1.1. Scope

Before considering the safety of machines, including UAVs and UGVs, it is important to clearly define the scope of the application [1]. In this paper, we will focus on the behavior of sensors in rain and drizzle as meteorological phenomena. We will not discuss fog or mist, which are caused by the same water particles.
Water droplets on the optical windows of sensors will not be discussed in this paper. Although such droplets have a significant impact on sensor performance, this issue relates to the “pollution” of the optical window and should be distinguished from the effects of raindrops or drizzle in the observation space. Optical window pollution is covered in detail in the IEC 61496 series, which specifies detailed design requirements and test methods [2].
The effect of wind on rain and drizzle, while potentially significant, is not treated here and is left for future study.
This paper focuses on the evaluation of non-contact safety sensors based on optical principles, similar to those discussed in our previous paper [3]. Specifically, we examine “depth cameras” (also known as 3D cameras, distance imaging cameras, or RGBD sensors). These cameras capture scenes at relatively short distances as depth images or 2D arrays, where each pixel contains distance information from the point of measurement. They are relatively inexpensive and can also capture RGB images with the same field of view as depth images, making them highly compatible with deep-learning-based object recognition and identification. As such, they are expected to find applications in both safety and non-safety systems across a variety of fields.
This paper covers the three main sensing principles used in depth camera products: Time of Flight, Stereo Vision, and Structured Light. Time-of-Flight (TOF) sensors calculate the distance to an object by measuring the time it takes for emitted periodic light to reflect back from the object. Both Stereo Vision and Structured Light sensors use triangulation, a principle of multiple-view geometry [4], to determine the distance to feature points on an object. We distinguish between systems that use two or more cameras for triangulation, referred to as Stereo Vision (SV), and those that combine a single camera with a pattern light projector, referred to as Structured Light (SL) [5]. Some SV-based sensors incorporate an auxiliary pattern light projector for enhanced performance.
This paper does not specifically address LiDAR technology, which measures distance over medium to long ranges by planar or spatial scanning using single or multiple laser beams. However, since LiDAR- and TOF-based depth cameras share similar sensing principles, the discussion in this paper is likely to be applicable to LiDAR in many scenarios. Radar and ultrasonic sensors are outside the scope of this paper due to their use of different sensing wavelengths, distance ranges, and the relatively low resolution of their measurements. Similarly, proximity and contact sensors are not discussed as they do not align with the paper’s focus on optical sensing technologies.

2. Background and Related Work

2.1. Meteorology

2.1.1. Precipitation

Precipitation as a meteorological observation value, also known as precipitation rate, intensity, or amount, is defined as “the volume of water, usually measured in millimeters or inches, that falls to the ground at a given location during a given period of time” [6]. This metric, widely accepted and used in daily weather forecasting, offers a straightforward measure of rainfall amount and intensity. Consequently, it has been adopted as a metric to evaluate the performance of outdoor safety sensors, typically to denote the operational limits of the sensor.
However, this paper argues against the use of precipitation rate as a metric for sensor detection capability. The rationale is that precipitation rate does not necessarily correlate accurately with sensor performance. In some cases, it can mislead by misrepresenting sensor performance. Sensor performance should be evaluated by the physical characteristics of the precipitation space from the sensor to the object, not by the rate of precipitation accumulated on the ground.

2.1.2. Particle Density and Drop Size Distribution

Precipitation, a meteorological phenomenon in which liquid water droplets fall from clouds to the ground, is conveniently divided into drizzle and rain based on particle size. Drizzle is precipitation consisting of water particles 0.1 to 0.5 mm in diameter [7]. Particles smaller than 0.1 mm in diameter do not reach the ground because they evaporate during their descent from the cloud. Rain, on the other hand, is precipitation composed mainly of water particles larger than 0.5 mm in diameter, with the largest particles being 5 to 8 mm in diameter. Larger water droplets are unstable, breaking up as they fall, and therefore do not occur naturally [8]. Note that normal rainfall also includes drizzle-sized particles, which are more numerous than large raindrops.
The particle sizes, along with their fall velocities, are measured by a disdrometer, a meteorological instrument. This instrument records the particle size and fall velocity of raindrops as they pass through its measurement area, which is defined by a line laser beam plane, and outputs these data as a particle size and velocity distribution (PSVD) matrix [9]. This matrix is a 2D frequency distribution that details the discrete information of the particle size and drop velocity distribution. It records the cumulative number of raindrops per unit time for raindrops of a given particle size class D_i and velocity class V_j. Figure 2a shows a heat map visualizing the PSVD matrix from a 300 mm/h heavy rain event observed in a subsequent experiment.
By analyzing the PSVD matrix, we can derive useful physical properties of the rainfall environment. The particle density (PD), defined as the total number of drops per volume of rainfall space [m^-3], and the drop size distribution (DSD), which represents the number of drops per unit diameter per volume in the size class D_i [m^-3 mm^-1], are calculated using Equations (1) and (2), respectively. The corresponding precipitation rate R [mm/h] is determined using Equation (3):
$PD = \sum_i PD(D_i) = \frac{1}{At} \sum_{i,j} \frac{n_{i,j}}{V_j}$, (1)
$DSD(D_i) = \frac{1}{At\,\Delta D_i} \sum_j \frac{n_{i,j}}{V_j}$, (2)
$R = \sum_i R(D_i) = \frac{\pi}{10^5\,At} \sum_{i,j} n_{i,j} D_i^3$, (3)
where A is the sampling area [m^2], t is the sampling time [s], n_{i,j} is the number of raindrops recorded in element (i, j) of the PSVD matrix, V_j is the drop velocity class of the matrix [m/s], and ΔD_i is the bin width of the drop size class D_i [mm].
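As a minimal sketch (assuming a NumPy environment; the function name and the array layout of the PSVD matrix are our own), Equations (1)–(3) can be evaluated from a PSVD matrix as follows; the constant in the precipitation-rate line simply mirrors Equation (3) as printed:

```python
import numpy as np

def psvd_metrics(n, D, dD, V, A, t):
    """Derive rainfall metrics from a PSVD matrix.

    n  : 2D array, n[i, j] = raindrop count in size class i, velocity class j
    D  : centers of the size classes D_i [mm]
    dD : bin widths Delta D_i of the size classes [mm]
    V  : centers of the velocity classes V_j [m/s]
    A  : sampling area [m^2]
    t  : sampling time [s]
    """
    # Per-size-class drop concentration: dividing each count by the fall
    # velocity of its class converts a flux through A into a volume density.
    conc = (n / V[np.newaxis, :]).sum(axis=1) / (A * t)   # [m^-3] per class
    PD = conc.sum()                                       # Eq. (1)
    DSD = conc / dD                                       # Eq. (2) [m^-3 mm^-1]
    # Precipitation rate, Eq. (3): total drop volume flux through area A.
    R = (np.pi / (1e5 * A * t)) * (n * (D[:, np.newaxis] ** 3)).sum()
    return PD, DSD, R
```
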
The graph in Figure 2b shows the DSD derived from the PSVD matrix. Here, the horizontal axis represents the raindrop size D [mm], and the vertical axis represents the number of raindrops per unit volume [m^-3 mm^-1] for particle sizes ranging from D_i to D_i + ΔD_i, plotted on a logarithmic scale.
The size distribution of raindrops in natural rainfall is known to follow a specific pattern; Marshall and Palmer [10] demonstrated in 1948 that the DSD can be effectively parameterized by an exponential function,
$DSD(D) = N_0 e^{-\Lambda D}$, (4)
where $\Lambda = 4.1 R^{-0.21}$ [mm^-1], $N_0 = 8000$ [m^-3 mm^-1], and R is the precipitation rate [mm/h].
The Marshall–Palmer distribution tends to overestimate the number of small raindrops. In addition, as raindrops fall from clouds to the ground, the number of small raindrops decreases due to coalescence or adsorption into larger raindrops, as well as evaporation [11]. Therefore, a more general formula involving the gamma function has been proposed [12].
The dotted line in the graph in Figure 2b represents the Marshall–Palmer distribution for R = 300 mm/h. In this observation, the size distribution closely matches the Marshall–Palmer distribution for particle sizes greater than 1 mm, with sizes greater than 6 mm considered outliers.
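For reference, the Marshall–Palmer parameterization of Equation (4) is compact enough to sketch directly (function and parameter names are ours):

```python
import math

def marshall_palmer_dsd(D_mm, R_mmh, N0=8000.0):
    """Marshall-Palmer DSD, Eq. (4): drops per volume per diameter [m^-3 mm^-1].

    D_mm is the drop diameter [mm]; R_mmh is the precipitation rate [mm/h].
    """
    lam = 4.1 * R_mmh ** -0.21          # slope parameter Lambda [mm^-1]
    return N0 * math.exp(-lam * D_mm)
```

As the text notes, this exponential form tends to overestimate small-drop counts, so it serves as a reference curve (the dotted line in Figure 2b) rather than ground truth.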

2.1.3. Visibility of Rainfall

Visibility is the maximum distance at which an object can be clearly seen with the naked eye, and it is an important meteorological observation for aircraft and ship operations. The mechanical measurement of visibility is defined by the World Meteorological Organization (WMO) as the Meteorological Optical Range (MOR), which is “the length of path in the atmosphere required to reduce the luminous flux in a collimated beam from an incandescent lamp, at a color temperature of 2700 K, to 5% of its original value” [13]. MOR [m] can be calculated according to Koschmieder’s law using
$MOR = d \times \frac{\ln c_0}{\ln T_d}$, (5)
where T_d is the spatial transmittance observed at a distance d [m] from the light source, and c_0 is a constant denoting the threshold of light attenuation. Empirically, c_0 is 0.02 (2%) in most cases but is set to 0.05 (5%) for rainfall, consistent with the MOR definition [14]. Throughout this paper, the term “visibility” will refer to the perceptual quality of a space, while “MOR” will refer to the precise physical measurement of visibility.
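Koschmieder's law in Equation (5) inverts directly from a transmittance measurement; a sketch (function name and defaults are ours):

```python
import math

def mor_from_transmittance(T_d, d, c0=0.05):
    """MOR [m] via Koschmieder's law, Eq. (5).

    T_d : spatial transmittance observed over a path of length d [m]
    c0  : attenuation threshold (0.05 for rainfall, matching the MOR definition)
    """
    return d * math.log(c0) / math.log(T_d)
```

For example, if the transmittance over a 1000 m path is exactly 0.05, the MOR is 1000 m by construction.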
MOR is commonly used to evaluate sensor performance in fog and similar environments. However, its suitability for technical evaluation of sensor performance is questionable because MOR is tailored to the characteristics of the human eye, as evidenced by the use of different c 0 values for rainfall and other conditions. Therefore, in our previous work, we demonstrated the advantages of using spatial transmittance over MOR for such evaluations [3].
The spatial transmittance of a rainfall environment can be formulated using the Lambert–Beer law as
$T_d = e^{-\gamma d}$, (6)
where γ [m−1] is the absorption coefficient of the rainfall environment, which quantifies the rate at which light is absorbed per unit length in space. This coefficient can be approximated from the PSVD matrix using
$\gamma = \sum_i \gamma(D_i) = 2\pi \sum_i PD(D_i) \left(\frac{D_i}{2}\right)^2 = \frac{2\pi}{At} \sum_{i,j} \frac{n_{i,j}}{V_j} \left(\frac{D_i}{2}\right)^2$, (7)
which indicates that the transmittance in a rainfall environment is governed by the total cross-sectional area of the raindrops rather than by the precipitation rate [15]. Observations of actual weather conditions show that drizzle, despite its low precipitation rate, markedly reduces visibility [16]. Conversely, as discussed below, even very high precipitation rates may not significantly reduce spatial transmittance.
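Equations (6) and (7) combine into a short sketch (NumPy assumed; the explicit mm^2-to-m^2 factor of 1e-6 is our addition so that gamma comes out in m^-1, and the function name is ours):

```python
import numpy as np

def rain_transmittance(n, D, V, A, t, d):
    """Spatial transmittance T_d over a path d [m], Eqs. (6)-(7).

    n[i, j] is the PSVD count for size class D[i] [mm] and velocity
    class V[j] [m/s]; A [m^2] and t [s] are the sampling area and time.
    """
    # Geometric cross-section of each size class, converted mm^2 -> m^2.
    cross = np.pi * (D[:, np.newaxis] / 2.0) ** 2 * 1e-6
    # Extinction coefficient: twice the total drop cross-section per unit volume.
    gamma = (2.0 / (A * t)) * (n / V[np.newaxis, :] * cross).sum()
    return np.exp(-gamma * d)
```

Doubling the drop counts halves nothing about the rate directly but doubles gamma, so the transmittance drops multiplicatively, which is the behavior the text exploits when comparing drizzle with heavy rain.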
In this paper, we demonstrate with experimental results that this formulation is practical. We then elucidate the correlation between raindrop size distribution and sensor detection performance and propose a new metric for sensor evaluation to replace precipitation rate.
In the discussion of visibility in this paper, it is not necessary to consider the wavelength of light. This is because sensor devices operating on optical principles primarily use wavelengths within the visible to near-infrared spectrum, and the spatial transmittance of rainfall does not depend on wavelength across the spectrum [17]. It should be noted, however, that the wavelength should be considered for sensor devices that use light beams beyond the infrared spectrum or in contexts involving aerosols other than precipitation, where the transmittance does exhibit a wavelength dependence.

2.2. Standardization

This section reviews international standards related to sensor-performance evaluation and test methods under rain conditions.
The IEC 61496 series is an important international standard for safety sensors, covering technologies such as light curtains, LiDARs, TOF cameras, monocular and Stereo-Vision-based cameras, and radars [2]. This group of standards is primarily intended for indoor factory environments and does not address outdoor conditions, including rain.
For outdoor safety sensors, the IEC 62998 series specifies design requirements; IEC TS 62998-1 refers to the IEC 60721 series as a quantitative reference for outdoor environments [18].
The IEC 60721 series categorizes the severity of environmental conditions to which electrical and electronic products are exposed, including an extreme rainfall intensity of 6 mm/min (360 mm/h) for short periods of time [19]. However, it lacks details on the physical characteristics of the rain, such as raindrop size distribution or visibility, because it focuses more on product design for environmental resistances than on sensor detection capabilities.
Efforts to standardize ground service robots, including transport and other types, continue under ISO 13482 and its revision projects [20]. While the sensing requirements for these robots include collision avoidance and road surface detection, the standards do not detail specific environmental conditions.
The growing demand for drones has spurred active international standardization, with various guidelines and test methods being developed. For example, ISO 5110 defines test methods for drone flight stability in wind and rain conditions [21]. However, standardization for sensor performance in rain conditions has not been initiated.
In summary, current international standardization efforts are primarily focused on hardware durability in a rainfall environment rather than the impact of rainfall on sensor detection performance. It is anticipated that future regulations will require a focus on the latter. This paper contributes to filling this gap by focusing on particle size distribution and spatial transmittance.

2.3. Computer Vision and UAV/UGV

In the fields of computer vision and robotics, a variety of sensing and image-processing techniques have been developed for use in rainy conditions.
Garg and Nayar proposed an efficient algorithm to mitigate the visual effects of rain in both computer vision and computer graphics, based on a detailed analysis [22].
Charette et al. developed a “smart headlight” system. It uses a camera to determine the location of precipitation, such as raindrops and snowflakes, in front of a vehicle. The system then segments the headlight beam with a beam splitter to bypass these particles. This method reduces light scattering and reflection from the particles, thereby improving driver visibility [23].
Lee et al. use sensor fusion techniques with cameras and LiDARs to achieve road detection for autonomous vehicles in various weather conditions, including rain [24]. Murase’s research team has presented several driver assistance technologies. Using cameras and multiple sensors, these technologies accurately detect pedestrians, obstacles, driver status, and weather conditions in rainy and foggy environments [25].
Xi et al. propose a collaborative learning network, CoDerainNet, to improve the accuracy of object detection by drones in rainy weather. The network effectively reduces image noise caused by rain, improving object-detection accuracy while lowering computational cost [26]. Li and Wu propose a framework for extracting highly accurate vehicle motion data from video captured by UAVs. They enhance YOLOv5, a representative deep-learning algorithm, to improve the accuracy of vehicle detection under various weather conditions, including rainfall [27].
These studies lay the groundwork for both theoretical and practical approaches to improving sensor performance in rainy conditions, which are critical for the deployment of automated machines in outdoor environments.
In addition to these advanced core sensing technologies, evaluation methodologies are also essential to verify the functionality of safety sensor products as intended. This paper addresses this issue by proposing criteria for evaluating sensor performance in rainfall environments.

2.4. Our Previous Work

Our research group has actively pursued the evaluation of human detection performance of safety sensors for service robots, leading to the development of a light interference test method under direct sunlight [28] and a simulated snowfall chamber that mimics the visibility reduction due to snow in a room temperature environment [29].
Regarding fog, our experiments with different sensor technologies have revealed significant differences in sensor behavior in foggy conditions. We have demonstrated that fog can drastically affect the detection capabilities of optical sensors to the extent that even minimal fog that is barely perceptible to the human eye can severely degrade the detection capabilities of some sensors [30]. These findings led us to question the suitability of the commonly used visibility metric, MOR, for evaluating sensor performance. As an alternative, we proposed MOT, Minimum Object-detectable Transmittance, a novel metric based on spatial transmittance [3].
In this paper, we extend our focus to include rain and drizzle, which are critical weather phenomena affecting sensor performance along with fog and snowfall. Based on experiments at a large rainfall test facility, we will show that sensor evaluation based on precipitation rates can be misleading in some cases, and we discuss more appropriate methods of performance evaluation.

3. Sensor Evaluation in Rain/Drizzle Environments

3.1. Precipitation Rate vs. Spatial Transmittance

As mentioned above, precipitation rate is commonly used as an indicator of sensor performance in rain conditions. It is widely recognized as intuitive weather information for rainfall intensity, which is important to sensor users, including designers and integrators working on outdoor drone and robotic applications. Therefore, the specifications for standards-compliant outdoor safety sensors should always include operating limits in precipitation conditions, such as “This sensor product is designed for use in precipitation rates up to 80 mm/h”.
However, relying solely on precipitation rate to measure sensor performance is fundamentally flawed. The precipitation rate measures the amount of water that falls to the ground and accumulates, losing information about the physical properties of the precipitation space between the sensor and the target. In fact, as previously discussed, a high precipitation rate does not always correlate with poor visibility.
Sensor performance should be evaluated based on the physical characteristics of the space, i.e., the “visibility”. To quantitatively evaluate the visibility, this paper adopts the spatial transmittance T_d at the distance d [m] intended for sensor use, as introduced in a previous study [3]. To evaluate the visibility of the rainfall space itself without relying on a specific sensor, we use T_5, which takes 5 m as a representative distance. For added compatibility, the MOR calculated by Equation (5) can also be used in conjunction with spatial transmittance.

3.2. Object-Detection Rate in Rainfall Environments

This section presents the methodology used to measure the detection capability of the sensor under rain conditions, focusing on the object-detection rate. A test piece is strategically placed at d_S meters within the field of view of the sensor S in a controlled precipitation condition, PC, and observations and detections are executed repeatedly over f frames. The detection rate, DR, is then simply defined as follows:
$DR(S, d_S, PC) = \frac{f_{det}}{f} = \frac{f_{det}}{f_{det} + f_{fail}}$, (8)
where f_det is the number of frames out of all f frames in which the test piece is successfully detected, while f_fail is the number of frames in which detection failed.
We then investigated how the physical properties of rain, such as precipitation rate and particle density, affect the performance of the sensor. This is carried out by comparing the detection rate DR(S, d_S, PC0) of the sensor S in a clear environment with no precipitation to the detection rate DR(S, d_S, PCx) under varying precipitation conditions PCx, which represent different combinations of precipitation rate and particle density.
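The bookkeeping of Equation (8) and the clear-versus-rain comparison can be sketched as follows (function names are ours):

```python
def detection_rate(det_flags):
    """DR, Eq. (8): fraction of frames in which the test piece is detected.

    det_flags : iterable of booleans, one per captured frame
    """
    flags = list(det_flags)
    return sum(flags) / len(flags)

def rain_impact(dr_clear, dr_rain):
    """Relative loss of detection rate between PC0 and a rain condition PCx."""
    return (dr_clear - dr_rain) / dr_clear
```
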

3.3. Simulated Precipitation Conditions

The experiments described in this paper were conducted at the Large-scale Rainfall Simulator, a test facility of the National Research Institute for Earth Science and Disaster Resilience (NIED) [31]. A photograph of the facility is shown in Figure 3a. This facility is one of the world’s largest and most powerful water spray facilities capable of reproducing nature-like rainfall and has the following specifications:
  • Nozzles with four different spray diameters installed 16 m above the ground (2176 nozzles in total);
  • Capable of spraying up to approximately 3000 m2 (44 m × 72 m);
  • Maximum precipitation rate of approximately 300 mm/h;
  • Raindrops up to approximately 6 mm in diameter.
Using a nozzle with a smaller spray diameter and increasing the water pressure (i.e., increasing the precipitation rate) increases the distribution of the number of small diameter raindrops. By changing the combination of nozzle and water pressure, it is possible to reproduce rainfall with different precipitation rates and drop size distributions.
In this experiment, three of the four nozzle types were used to reproduce four precipitation conditions chosen so that their combinations of precipitation rate and transmittance (particle density) differed markedly from one another, facilitating verification of the claims made in this paper. Measurements were made under the following five conditions, including the clear condition:
  • PC0: Clear condition with no rainfall.
  • PC1: Rainfall at a pre-set precipitation rate of 45 mm/h using Nozzle 1, which has the smallest spray diameter. Precipitation consisting mainly of drizzle-sized particles is reproduced.
  • PC2: Rainfall at 80 mm/h using Nozzle 2, which has a medium spray diameter. Reproduces heavy rainfall.
  • PC3: Rainfall at 200 mm/h with Nozzle 2. Reproduces a cloudburst with many drizzle-sized particles.
  • PC4: Rainfall at 300 mm/h using Nozzle 4, which has the largest spray diameter. Reproduces the largest cloudburst consisting mainly of large raindrops.
The particle size and velocity distribution, PSVD, of the simulated rainfall was measured using an OTT Parsivel2 laser precipitation disdrometer (OTT HydroMet, Kempten, Germany), an instrument widely used in both practical and research applications in meteorology.
In this paper, we describe two types of precipitation rates for rainfall simulated at the facility:
  • Values pre-set by the facility;
  • Values calculated from the observed PSVD matrix using Equation (3).
Due to the inherent non-uniformity of the simulated rainfall, these two values often do not coincide and can sometimes differ significantly.

3.4. Test Piece and Experimental Setup

In this experiment, the test piece for sensor detection was a matte black acrylic plate with dimensions of H: 400 mm × W: 200 mm. The width of 200 mm is indicative of the minimum human torso size considered in the design of safety sensors, as specified in IEC 61496-3 [32].
The diffuse surface reflectance of the test piece under dry conditions was approximately 5% across the visible to near-infrared spectrum. During the rainfall experiments, the test piece was exposed to rain, and water droplets were deposited on its surface. To ensure consistent surface conditions across all experiments, including those under PC0 without rain, the test piece surface was intentionally maintained with water droplets, as shown in the photograph in Figure 4a.
This black test piece with water droplets presents a challenging detection target for optical sensors. In fact, a preliminary evaluation revealed that some sensor products could not detect the test piece even under the clear condition PC0 (these products were subsequently excluded from the main experiment).
The positioning of the test piece and the sensor is shown in Figure 4b. To prevent water droplets from affecting the optical window of the sensor, as described in Section 1.1, and to prevent electrical damage from water, the sensor was placed under a tent that kept a distance of 500 mm between the optical window and the rainfall. Consequently, the portion of the distance d between the test piece and the sensor’s optical window that was exposed to the rain was (d − 500) mm. The test piece was oriented perpendicular to both the ground and the optical axis of the sensor. The height of the bottom edge of the test piece and that of the optical axis of the sensor were both set to 1000 mm above the ground reference plane.
The selection of the distance d_S from the sensor’s optical window to the test piece required careful consideration due to the varying basic performance of the sensors used in this experiment. We determined d_S to be the maximum distance at which the sensor could reliably detect the test piece in a clear environment. This was operationalized as finding a position where the detection rate DR(S, d_S, PC0) for the sensor S in the clear environment PC0 does not reach 100% but remains as high as possible. This implies finding a distance d_S that satisfies
$\sup_{d_S \in A} DR(S, d_S, PC_0) < 1.0$, (9)
where A represents the set of distances at which the sensor S can detect the test object in the clear environment PC0.
With this setup, the impact of rain on the sensor is sensitively reflected in the detection rate. That is, if the sensor S can almost always detect the target at PC0, i.e., DR(S, d_S, PC0) ≈ 1.0, and the detection rate decreases significantly under a precipitation condition PCx, i.e., 0.0 ≤ DR(S, d_S, PCx) ≪ 1.0, then the effect of the rainfall PCx on the sensor can be estimated from this difference. Conversely, if the DR is hardly reduced by PCx, then the sensor is hardly affected by the rainfall.
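The distance-selection rule of Equation (9) amounts to picking, among candidate distances, the one whose clear-weather DR is highest while still below 100%; a sketch (the names and the discrete candidate set are our simplification of the procedure):

```python
def select_test_distance(candidates, dr_at):
    """Choose d_S per Eq. (9).

    candidates : distances [m] at which sensor S can detect the test piece
                 (the set A in the text)
    dr_at      : callable mapping a distance to DR(S, d, PC0) in [0, 1]
    """
    eligible = [(dr_at(d), d) for d in candidates if dr_at(d) < 1.0]
    if not eligible:
        raise ValueError("DR reaches 1.0 at every candidate distance")
    # Highest DR wins; ties resolve to the larger (harder) distance.
    return max(eligible)[1]
```
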

3.5. Sensor Devices

In this experiment, we tested the following four depth cameras, each based on one of the three sensing technologies described in Section 1.1, to investigate how the impact of rainfall varies with the sensing technology:
  • S1: Stereo Vision (SV);
  • S2: Structured Light (SL);
  • S3: Time of Flight (TOF);
  • S4: Time of Flight (TOF).
It should be noted that all of the sensors are general-purpose products: they are not designed for safety applications and were not originally intended for use in rainy environments. Since none of them incorporate technical countermeasures against rainfall, we expect the effects of rainfall on device performance to be directly observable.
The detection rates described in Section 3.2 were calculated for the depth images produced by these sensors. In order to measure the detection rate under the same conditions for all sensors, we used software implementing our original object-detection algorithm instead of the proprietary software bundled with the sensors. The software is available online under an open-source license as a common platform for object detection [33].
In this experiment, the depth image frames continuously output by the sensor were processed using this common software to identify candidate object regions. Object detection was considered successful if the area of the candidate region reached 400 pixels or more in the image coordinate system; such frames were counted as successful detection frames f_det in Equation (8). However, for sensor S1, which has a wide viewing angle, the threshold was lowered to 100 pixels due to the relatively small object area compared to the depth image size. Regions detected away from the position of the test piece, such as the ground and the legs of the tripod supporting the test piece, were excluded using the masking function of the software. For more details, refer to [33].
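As a rough illustration of this frame-classification rule (not the actual implementation in [33]), the area test can be written as follows, assuming the candidate region is given as a boolean mask with the ground and tripod already masked out:

```python
import numpy as np

def is_successful_frame(candidate_mask, wide_angle=False):
    """Apply the area threshold used in the experiment: a frame counts as a
    successful detection frame f_det if the candidate object region covers
    at least 400 pixels (100 pixels for the wide-angle sensor S1, whose
    object regions are small relative to the depth image size)."""
    threshold = 100 if wide_angle else 400
    return int(np.count_nonzero(candidate_mask)) >= threshold

# A synthetic 480 x 640 depth-image mask with a 20 x 25 = 500-pixel region
mask = np.zeros((480, 640), dtype=bool)
mask[100:120, 200:225] = True
```

Here, `is_successful_frame(mask)` is True, while a region smaller than the threshold would be rejected.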

4. Experimental Results

4.1. Physical Properties of Precipitation

Table 1 shows the measurement results for the following physical properties of rainfall reproduced under the four predefined precipitation conditions:
  • Precipitation rate, R (pre-set and measured);
  • Particle density, PD (calculated by Equation (1));
  • Absorption coefficient, γ; the corresponding 5 m transmittance, T_5; and MOR (Equations (5)–(7)).
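Equations (5)–(7) are not reproduced in this excerpt, but the relationship between γ, transmittance, and MOR follows the Beer–Lambert law and the Koschmieder relation with the standard 5% contrast threshold; the sketch below, under that assumption, reproduces the Table 1 values for PC1 to within rounding:

```python
import math

def transmittance(gamma, d=5.0):
    # Beer-Lambert law: fraction of light surviving a path of d meters
    return math.exp(-gamma * d)

def mor(gamma):
    # Koschmieder relation with the standard 5% contrast threshold
    return -math.log(0.05) / gamma

# PC1 from Table 1: gamma = 0.0451 1/m
t5 = transmittance(0.0451)  # ~0.798, i.e., T_5 = 79.8%
visibility = mor(0.0451)    # ~66 m MOR
```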
Figure 5 shows plots of the drop size distribution (DSD) for each of the precipitation conditions and the distribution of the absorption coefficient γ, as listed in Table 1, for each raindrop particle size class.
These data show the following:
  • The 45 mm/h rainfall in PC1 was characterized mainly by drizzle mixed with rain, with the majority of drops being 1 mm or less in particle size. Despite having the lowest precipitation rate among the four types, its particle density, PD, was about 65,000, not significantly different from that of PC3 at 200 mm/h. Its absorption coefficient was also the second highest among the four types, with a 5 m transmittance, T_5, of 79.8% (corresponding MOR: 66.6 m), making the visibility in this condition the second worst after PC3.
  • The 80 mm/h rainfall in PC2 is characterized as typical heavy rainfall. However, its PD was the lowest among the four types, less than half of that observed in PC1. Its absorption coefficients were significantly lower than those of PC1 for all particle sizes. In addition, its T_5 was 92.4% (MOR: 189.6 m), making it almost indistinguishable at close range from clear conditions to the human eye.
  • The 200 mm/h rainfall in PC3 had a distribution of raindrops of 1 mm or less similar to that of PC1, but it reached cloudburst level due to a significantly higher number of 1 to 3 mm raindrops. Its T_5 was 71.2% (MOR: 44.2 m), the worst visibility among the four types, although the difference from PC1 was not substantial. This suggests that the contribution of light absorption by raindrops larger than 1 mm in particle size is relatively small.
  • The 300 mm/h rainfall in PC4 represented cloudburst-class conditions, with a noteworthy presence of raindrops of 2 to 5 mm in particle size. However, its PD was comparable to that of PC2, and its T_5 of 86.6% (MOR: 104.3 m) was not significantly different. As the absorption coefficient plot shows, the coefficients for small raindrops of 1 mm or less were markedly lower than those observed for PC1 and PC3, while larger raindrops of 2 mm or more in diameter had a non-negligible but limited impact. Consequently, the impact of this rainfall on visibility was relatively moderate.
In summary, these findings indicate that high precipitation rates do not necessarily result in reduced visibility within the precipitation space. The graphs in Figure 5 clearly show that raindrops with a particle size of 1 mm or less are the primary contributors to reduced visibility in a rainy environment. This is consistent with the observations of low visibility in natural drizzle discussed in Section 2.1 and suggests that the selection of the precipitation conditions described in Section 3.3 was appropriate. It also reinforces the argument made in this paper that the precipitation rate alone is not an appropriate metric for assessing sensor performance.

4.2. Evaluation of Sensor Detection Performance

4.2.1. Evaluation Procedures

This section details the results of the object-detection tests conducted with the sensors and the precipitation conditions mentioned above.
First, detection was performed under PC0, a clear condition with no precipitation, to determine the distance d as described in Section 3.2. Accurately determining a distance that satisfies Formula (9) proved to be difficult. Therefore, we incrementally adjusted the position of the test piece during detection. This approach led to an ad hoc determination of the maximum distance at which the detection rate dropped below 100%. The distances thus determined for each sensor and the corresponding detection rates DR(d, PC0) are listed in Table 2.
For sensor S4, we could not experimentally find a distance d at which a detection rate DR(S4, d, PC0) close to 100% could be obtained stably and at which the detection rate would decrease under rain conditions. This is because S4 was originally designed for indoor short-range sensing applications, and its operation was very unstable in this experiment. Therefore, we substituted the detection rate DR of 67.37% obtained at a distance d of 3596 mm. Since this does not strictly satisfy Formula (9), the results for sensor S4 should be regarded as supplementary information.
Figure 6 shows examples of depth images and RGB images from frames where each sensor achieved successful detections in the clear condition, PC0. In the depth images, pixels closer to the sensor appear in brighter gray or red, while pixels without distance information are shown in black. The red areas indicate object regions identified by the object-detection algorithm discussed earlier, and such frames are categorized as successful detection frames f_det.
Then, for each sensor S, the test piece was positioned at the specified distance d_S, and detection tests of more than 500 frames were executed repeatedly, both in the clear condition PC0 and in the precipitation conditions PC1 to PC4.
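The detection rate itself (Equation (8); its exact form is not reproduced in this excerpt, but it is described as the fraction of successful detection frames) reduces to a simple ratio:

```python
def detection_rate(frame_results):
    """Detection rate as a percentage: successful detection frames f_det
    over the total number of frames processed in a test run."""
    f_det = sum(1 for detected in frame_results if detected)
    return 100.0 * f_det / len(frame_results)

# e.g., 485 successful frames out of a 500-frame test run
dr = detection_rate([True] * 485 + [False] * 15)
```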

4.2.2. Detection Rates under the Precipitation Conditions

Figure 7 shows a graph summarizing the detection rates achieved in the experiment. Figure 8 and Figure 9 show examples of successful and unsuccessful detections by each of the sensors in precipitation conditions PC1 (45 mm/h) and PC4 (300 mm/h), respectively.
As shown in Figure 7, the PC1 (45 mm/h) rainfall, with the lowest precipitation rate of the four types, had the most significant impact on the sensors, causing all sensors to fail to detect the test piece. Conversely, the PC4 (300 mm/h) rainfall had a significantly lower impact than PC1, despite being one of the heaviest natural rainfalls. The RGB images in Figure 8 and Figure 9 clearly illustrate that PC1 has significantly poorer visibility.
Similarly, the PC3 rainfall (200 mm/h) resulted in near-zero detection rates for all sensors, demonstrating that its effect on sensor performance is comparable to that of PC1. However, since the precipitation rate of PC3 is much higher than that of PC1, this result further suggests that precipitation rate has minimal effect on sensor performance.
These results clearly demonstrate that rainfall intensity alone is not a reliable metric for sensor detection capability. Regardless of the sensing technology used, it is primarily the spatial transmittance, determined by the particle density PD of small raindrops, that affects a sensor’s detection performance.
Examples of detection failures shown in Figure 8 and Figure 9 include the following cases:
  • For sensors S1 and S3, when the area representing the test piece falls below the detection threshold;
  • For sensor S2, when it is unable to acquire distance data due to rain;
  • For sensor S4, when the test piece is obscured by noise and cannot be detected as a connected region.

4.2.3. Sensing Technologies under the Precipitation Conditions

This section analyzes whether there are differences in the effect of rainfall on different sensing techniques.
For precipitation conditions PC2 (80 mm/h) and PC4 (300 mm/h), which are characterized by a lower number of drizzle particles, a significant difference in the detection performance of the sensors was observed.
Interestingly, the detection rate of the TOF sensor S3 did not decrease significantly under the PC2 and PC4 precipitation conditions, although previous reports indicated that TOF-based sensors tend to be highly affected by scattering from particles in space [30,34]. As shown in Figure 5, PC1 (45 mm/h) and PC3 (200 mm/h) are significantly affected by scattering effects due to small raindrops of 1 mm or less in particle size, while PC2 and PC4, which contain significantly smaller numbers of particles, are less affected.
The marked difference in detection rates between the S3 and S4 sensors, both of which use the TOF method, could be attributed to the fact that S4 is designed for short-range indoor use, which makes it difficult for it to detect the low-reflectivity test piece even under the clear condition PC0 and thus more susceptible to rainfall.
For S2 (SL) and S4 (TOF), the detection rate was higher under PC4 than under PC2, even though the spatial transmittance of PC4 was slightly lower than that of PC2. Transmittance alone does not readily explain this result; the fact that both sensors are intended for indoor applications may contribute to the considerable variability in their performance near their detection limits.

5. Discussion

This section presents an analysis showing that small raindrops cause a more significant problem for sensors and discusses a metric for evaluating sensor performance in rainfall environments.

5.1. Raindrop Size and Spatial Transmittance

Figure 10 graphically illustrates the precipitation rate required to achieve the same spatial transmittance as the PC1 (45 mm/h) precipitation condition described in Section 3.3, in a hypothetical scenario where all raindrops are of uniform particle size. The number of raindrops PD(D_i) required to achieve the same 5 m transmittance T_5 = 0.798 (MOR: 66.6 m) as PC1 with rainfall consisting only of raindrops of particle size D_i [mm] was determined using Equations (6) and (7). The fall velocity V_i of raindrops of particle size D_i was estimated with the Foote and du Toit polynomial [35], and the precipitation rate R_i calculated by Equations (2) and (3) was then plotted.
According to these calculations, rainfall consisting only of drizzle-sized raindrops 0.4 mm in particle size would require a precipitation rate of approximately 35 mm/h to achieve a spatial transmittance equivalent to that of PC1. This is consistent with the experimental results, because 0.4 mm corresponds to the most frequent raindrop size in PC1, as shown in Figure 5.
However, matching the transmittance of PC1 with only 1 mm raindrops would require a cloudburst-level rainfall of about 216 mm/h; obviously, natural rainfall of such intensity never consists of 1 mm raindrops alone. In the case of 3 mm raindrops, the required precipitation rate is approximately 1308 mm/h, which is unrealistic on Earth.
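These figures can be reproduced approximately with a back-of-the-envelope calculation. The sketch below is not the paper's exact Equations (2)–(7): it assumes a geometric-optics extinction efficiency of 2 (raindrops are far larger than visible wavelengths) and substitutes the Atlas et al. (1973) terminal-velocity fit for the Foote–du Toit polynomial [35].

```python
import math

def rain_rate_for_transmittance(d_mm, t5=0.798, path_m=5.0):
    """Precipitation rate [mm/h] of hypothetical rain composed solely of
    drops of diameter d_mm that would yield the 5 m transmittance t5."""
    gamma = -math.log(t5) / path_m              # required absorption coefficient [1/m]
    d = d_mm * 1e-3                             # drop diameter [m]
    sigma = 2.0 * math.pi * d * d / 4.0         # extinction cross-section, Q_ext = 2 [m^2]
    n = gamma / sigma                           # required number density [1/m^3]
    v = 9.65 - 10.3 * math.exp(-0.6 * d_mm)     # terminal velocity, Atlas et al. fit [m/s]
    volume = math.pi / 6.0 * d ** 3             # drop volume [m^3]
    return n * v * volume * 1000.0 * 3600.0     # water flux [m/s] -> [mm/h]

# roughly 34 mm/h for 0.4 mm drizzle drops, ~217 mm/h for 1 mm drops,
# and ~1290 mm/h for 3 mm drops -- consistent with the trend in Figure 10
```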

5.2. Redefining Sensor Evaluation Metrics for Rainfall

From the above analysis, it is clear that the precipitation rate is not an appropriate metric for evaluating sensor performance. Instead, the spatial transmittance T_d at the measurement distance d at which the sensor is intended to be used, or the density of small particles, PD(D | D ≤ 1.0 mm), should serve as the evaluation metric. In practice, since the density of larger particles, PD(D | D > 1.0 mm), is negligibly small, the total particle density PD can be used as the indicator, because PD(D | D ≤ 1.0 mm) ≈ PD.
To provide information to sensor users, such as safety sensor system integrators, a description based on the intended maximum operating distance and its spatial transmittance, as suggested in our previous paper [3], should be used. For example, the user manual should state: “This sensor is designed for use in rainfall with a spatial transmittance greater than 80% at a sensing distance of 5 m”. Alternatives might include “… with a raindrop density PD of 50,000 or less” or “… with an MOR of 65 m or more”.
If there is a need to use precipitation rate as a metric for compatibility or ease of general understanding, it is important to distinguish between rainfall and drizzle. For example, “… for use in rainfall of less than 80 mm/h or in drizzle of less than 0.5 mm/h”.
Relying on precipitation rate alone should be avoided, as it can lead to inaccuracies under certain environmental conditions.
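A specification phrased this way is directly checkable in software. The sketch below is a hypothetical compliance check (the function name and thresholds are illustrative, not from the paper): given a measured absorption coefficient, it tests whether conditions fall inside a sensor's rated envelope.

```python
import math

def within_rated_envelope(gamma, sensing_distance_m=5.0, min_transmittance=0.80):
    """True if the measured rain satisfies a rating such as 'spatial
    transmittance greater than 80% at a sensing distance of 5 m'.
    gamma: absorption coefficient [1/m], e.g., derived from disdrometer data."""
    t = math.exp(-gamma * sensing_distance_m)  # Beer-Lambert transmittance
    return t > min_transmittance

# With the Table 1 values at 5 m: PC2 (gamma = 0.0158, T_5 = 92.4%) is within
# the envelope, while PC1 (gamma = 0.0451, T_5 = 79.8%) is not.
```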

6. Conclusions

This study investigated the object-detection performance of non-contact safety sensors in rainfall environments. Our experimental results highlighted that higher precipitation rates do not necessarily correlate with reduced visibility and that the presence of numerous small raindrops has a greater impact on sensor detection performance than the sheer volume of precipitation. These results challenged the conventional use of precipitation rate as a metric for evaluating sensor performance, providing an understanding of its inadequacy and potential for misleading conclusions in certain scenarios. We recommended the adoption of spatial transmittance or raindrop density as more reliable indicators for assessing sensor effectiveness in precipitation conditions.
In this experimental design, we selected the four precipitation conditions we judged most informative given our limited resources. Ideally, experiments should include a wider variety of precipitation conditions and a greater number of sensor devices. Additionally, validating the approach through field tests is crucial and would further reinforce the proposed methodology. These are challenges to be addressed in future research.
Integrating the findings from this research with our previous work [3,28,29,30], we conclude that spatial transmittance is a universally applicable metric for evaluating sensor performance across a spectrum of environmental conditions, including rain, drizzle, and fog. This conclusion leads us to propose that spatial transmittance may be the only metric worth considering for evaluating sensor object-detection performance, regardless of the specific weather phenomena. In addition, the applicability of this metric could potentially be extended to various aerosols and environmental particles such as blowing snow and sand dust. However, the influence of spatial transmittance on sensing capabilities can vary significantly depending on the particle types present, suggesting that the uniform application of this metric requires careful consideration and further validation. Addressing these nuanced challenges and testing the proposed hypotheses will be essential for future research in this field.

Author Contributions

Conceptualization, Y.S. and B.K.K.; Data curation, M.K. (Masato Kodama) and M.K. (Masami Kobayashi); Methodology, Y.S.; Project administration, B.K.K.; Resources, N.S. and M.K. (Masami Kobayashi); Software, Y.S. and M.K. (Masato Kodama); Validation, N.S. and M.K. (Masami Kobayashi); Visualization, Y.S.; Writing—original draft, Y.S.; Writing—review and editing, Y.S., B.K.K., T.O. and N.S. All authors have read and agreed to the published version of the manuscript.

Funding

This paper is based on results obtained from a project, JPNP22002, commissioned by the New Energy and Industrial Technology Development Organization (NEDO).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article. The software is available online.

Conflicts of Interest

Author Masato Kodama was employed by Altech Corporation, a temporary-employment agency in Japan (https://www.alpsgiken.co.jp/english/), and also affiliated as technical staff at AIST. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. ISO/IEC Directives Part 2, 9th ed.; International Organization for Standardization/International Electrotechnical Commission: Geneva, Switzerland, 2021.
  2. IEC 61496-1; Safety of Machinery—Electro-Sensitive Protective Equipment—Part 1: General Requirements and Tests. 4th ed. International Electrotechnical Commission: Geneva, Switzerland, 2020.
  3. Sumi, Y.; Kim, B.K.; Kodama, M. Evaluation of Detection Performance for Safety-Related Sensors in Low-Visibility Environments. IEEE Sens. J. 2021, 21, 18855–18863. [Google Scholar] [CrossRef]
  4. Hartley, R.; Zisserman, A. Multiple View Geometry. In Computer Vision; Cambridge University Press: Cambridge, UK, 2000. [Google Scholar]
  5. Shirai, Y. Recognition of polyhedrons with a range finder. Pattern Recognit. 1972, 4, 243–250. [Google Scholar] [CrossRef]
  6. American Meteorological Society, 2021: “Precipitation”. Glossary of Meteorology. Available online: http://glossary.ametsoc.org/wiki/Precipitation (accessed on 15 February 2024).
  7. American Meteorological Society, 2023: “Drizzle”. Glossary of Meteorology. Available online: http://glossary.ametsoc.org/wiki/Drizzle (accessed on 15 February 2024).
  8. American Meteorological Society, 2012: “Raindrop”. Glossary of Meteorology. Available online: http://glossary.ametsoc.org/wiki/Raindrop (accessed on 15 February 2024).
  9. Angulo-Martínez, M.; Beguería, S.; Latorre, B.; Fernández-Raga, M. Comparison of precipitation measurements by OTT Parsivel2 and Thies LPM optical disdrometer. Hydrol. Earth Syst. Sci. 2018, 22, 2811–2837. [Google Scholar] [CrossRef]
  10. Marshall, J.S.; Palmer, W.M.K. The distribution of raindrops with size. J. Atmos. Sci. 1948, 5, 165–166. [Google Scholar] [CrossRef]
  11. Hardy, K.R. The development of raindrop-size distributions and implications related to the physics of precipitation. J. Atmos. Sci. 1963, 20, 299–312. [Google Scholar] [CrossRef]
  12. Ulbrich, C.W. Natural variations in the analytical form of the raindrop size distribution. J. Appl. Meteorol. Climatol. 1983, 22, 1764–1775. [Google Scholar] [CrossRef]
  13. WMO. Guide to Instruments and Methods of Observation, Edition 2018; WMO: Geneva, Switzerland, 2018. [Google Scholar]
  14. Horvath, H. On the applicability of the Koschmieder visibility formula. Atmos. Environ. 1971, 5, 177–184. [Google Scholar]
  15. Chen, C.C. A Correction for Middleton’s Visible- and Infrared-Radiation Extinction Coefficients due to Rain; Rand Corporation Report; RAND Corp.: Santa Monica, CA, USA, 1974. [Google Scholar]
  16. Gultepe, I.; Milbrandt, J.A. Probabilistic parameterizations of visibility using observations of rain precipitation rate, relative humidity, and visibility. J. Appl. Meteorol. Climatol. 2010, 49, 36–46. [Google Scholar] [CrossRef]
  17. van de Hulst, H.C. Rigorous scattering theory for spheres of arbitrary size (Mie theory). In Light Scattering by Small Particles; Courier Corporation: Dover, NY, USA, 1981; pp. 114–130. [Google Scholar]
  18. IEC TS 62998-1; Safety of Machinery—Safety-Related Sensors Used for the Protection of Persons. 1st ed. International Electrotechnical Commission: Geneva, Switzerland, 2019.
  19. IEC 60721-1; Classification of Environmental Conditions—Part 1: Environmental Parameters and Their Severities, ed. 2.2. International Electrotechnical Commission: Geneva, Switzerland, 2002.
  20. ISO 13482; Robots and Robotic Devices—Safety Requirements for Personal Care Robots. 1st ed. International Organization for Standardization: Geneva, Switzerland, 2014.
  21. ISO 5110; Test Method for Flight Stability of a Multi-Copter Unmanned Aircraft System (UAS) under Wind and Rain Conditions. 1st ed. International Organization for Standardization: Geneva, Switzerland, 2023.
  22. Garg, K.; Nayar, S.K. Vision and rain. Int. J. Comput. Vis. 2007, 75, 3–27. [Google Scholar] [CrossRef]
  23. de Charette, R.; Tamburo, R.; Barnum, P.; Rowe, A.; Kanade, T.; Narasimhan, S.G. Fast reactive control for illumination through rain and snow. In Proceedings of the IEEE Conference on Computational Photography, Seattle, WA, USA, 28–29 April 2012. [Google Scholar]
  24. Lee, U.; Jung, J.; Shin, S.; Jeong, Y.; Park, K.; Shim, D.; Kweon, I. EureCar turbo: A self-driving car that can handle adverse weather conditions. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016. [Google Scholar] [CrossRef]
  25. Murase, H. Image recognition for driver assistance in intelligent vehicles. In Proceedings of the 15th IAPR International Conference on Machine Vision Applications (MVA), Nagoya, Japan, 8–12 May 2017. [Google Scholar]
  26. Xi, Y.; Jia, W.; Miao, Q.; Feng, J.; Liu, X.; Li, F. CoDerainNet: Collaborative deraining network for drone-view object detection in rainy weather conditions. Remote Sens. 2023, 15, 1487. [Google Scholar] [CrossRef]
  27. Li, X.; Wu, J. Extracting high-precision vehicle motion data from unmanned aerial vehicle video captured under various weather conditions. Remote Sens. 2022, 14, 5513. [Google Scholar] [CrossRef]
  28. Sumi, Y.; Kim, B.K.; Matsumoto, Y.; Horiuchi, E.; Matsumoto, O.; Ohba, K. Outdoor environment simulators for vision-based safety sensors—Artificial sunlight lampheads and simulated-snow chamber. In Proceedings of the IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO2013), Tokyo, Japan, 7–9 November 2013. [Google Scholar]
  29. Kim, B.K.; Sumi, Y.; Sagawa, R.; Kosugi, K.; Mochizuki, S. Visibility reduction based performance evaluation of vision-based safety sensors. In Proceedings of the IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS2015), Hamburg, Germany, 28 September–2 October 2015. [Google Scholar]
  30. Kim, B.K.; Sumi, Y. Vision-based safety-related sensors in low visibility by fog. Sensors 2020, 20, 2812. [Google Scholar] [CrossRef]
  31. Large-Scale Rainfall Simulator. Available online: https://www.bosai.go.jp/e/facilities/rainfall.html (accessed on 15 February 2024).
  32. IEC 61496-3; Safety of Machinery—Electro-Sensitive Protective Equipment—Part 3: Particular Requirements for Active Opto-electronic Protective Devices Responsive to Diffuse Reflection (AOPDDR). 3rd ed. International Electrotechnical Commission: Geneva, Switzerland, 2018.
  33. emulated_srs: A ROS Package for Obstacle Detection. Available online: https://github.com/yssmii/emulated_srs (accessed on 15 February 2024).
  34. Muraji, T.; Tanaka, K.; Funatomi, T.; Mukaigawa, Y. Depth from phasor distortions in fog. Opt. Express 2019, 27, 18858–18868. [Google Scholar] [CrossRef] [PubMed]
  35. Foote, G.B.; du Toit, P.S. Terminal velocity of raindrops aloft. J. Appl. Meteorol. Climatol. 1969, 8, 249–253. [Google Scholar] [CrossRef]
Figure 1. UAV in adverse weather conditions.
Figure 2. PSVD matrix and DSD measurements from the NIED Large-scale Rainfall Simulator on 31 October 2023. (a) PSVD matrix depicting the 300 mm/h cloudburst (raindrop counts per 10 s visualized as a heat map); (b) DSD derived from the PSVD matrix, overlaid with the Marshall–Palmer distribution for R = 300.
Figure 3. Photographs of the Large-scale Rainfall Simulator, a test facility of NIED. (a) Inside the facility; (b) rainfall simulated for precipitation condition PC4 (300 mm/h).
Figure 4. Experimental setup. (a) Test piece with raindrops on surface; (b) positioning of the test piece and a sensor.
Figure 5. Measurement results of the physical properties for each raindrop particle size class. (a) Drop size distribution, DSD. (b) Absorption coefficient, γ.
Figure 6. Examples of output images of sensors S1–S4 in the clear condition, PC0. Top: RGB images. Bottom: depth images.
Figure 7. Detection rates for each of the sensors under the precipitation conditions.
Figure 8. Example output images of the sensors S1–S4 in precipitation condition PC1 (45 mm/h). Top: RGB images. Middle: depth images showing cases where the test piece was detected. Bottom: depth images where the test piece was not detected.
Figure 9. Example output images of sensors S1–S4 in precipitation condition PC4 (300 mm/h). Top: RGB images. Middle: depth images showing cases where the test piece was detected. Bottom: depth images where the test piece was not detected.
Figure 10. Precipitation rates required to achieve a 5 m spatial transmittance T_5 of 79.8%, assuming uniform raindrop particle sizes.
Table 1. Measurement results of the physical properties of rainfall for each of the predefined precipitation conditions.

| Precipitation Condition | Precipitation Rate, R [mm/h] (Pre-Set/Measured) | Particle Density, PD [m⁻³] | Absorption Coefficient, γ [m⁻¹] | Spatial Transmittance (5 m), T_5 [%] | MOR [m] |
|---|---|---|---|---|---|
| PC1 | 45/46.2 | 64,659.2 | 0.0451 | 79.8 | 66.6 |
| PC2 | 80/59.8 | 17,043.2 | 0.0158 | 92.4 | 189.6 |
| PC3 | 200/118.4 | 72,718.7 | 0.0678 | 71.2 | 44.2 |
| PC4 | 300/334.1 | 18,949.4 | 0.0288 | 86.6 | 104.3 |
Table 2. Measurement distances, d, determined for the sensors.

| Sensor | Sensing Technology | d [mm] | DR(d, PC0) [%] |
|---|---|---|---|
| S1 | SV | 4210 | 96.86 |
| S2 | SL | 3231 | 87.76 |
| S3 | TOF | 4101 | 97.77 |
| S4 | TOF | 3596 | 67.37 * |

* For sensor S4, DR(d, PC0) is not close to 100%, which does not strictly satisfy Formula (9).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Sumi, Y.; Kim, B.K.; Ogure, T.; Kodama, M.; Sakai, N.; Kobayashi, M. Impact of Rainfall on the Detection Performance of Non-Contact Safety Sensors for UAVs/UGVs. Sensors 2024, 24, 2713. https://doi.org/10.3390/s24092713
