Fundamental Practices For Drone Remote Sensing Research Across Disciplines
Abstract
Drone remote sensing research has surged over the last few decades as the technology has become increasingly accessible. Relatively easy-to-operate drones put data collection directly in the hands of the remote sensing community. While an abundance of remote sensing studies using drones in myriad areas of application (e.g., agriculture, forestry, and geomorphology) have been published, little consensus has emerged regarding best practices for drone usage and incorporation into research. Therefore, this paper synthesizes relevant literature, supported by the collective experiences of the authors, to propose ten fundamental practices for drone remote sensing research: (1) focus on your research question, not just the tool, (2) know the law and abide by it, (3) respect privacy and be ethical, (4) be mindful consumers of technology, (5) develop or adopt a data collection protocol, (6) treat Structure from Motion (SfM) as a new form of photogrammetry, (7) consider new approaches to analyze hyperspatial data, (8) think beyond imagery, (9) be transparent and report error, and (10) work collaboratively. These fundamental practices, meant for all remote sensing researchers using drones regardless of area of interest or disciplinary background, are elaborated upon and situated within the context of broader remote sensing research.
Key words: drones, unoccupied aircraft systems, UAS, remote sensing, geographic information science, research
Introduction
Drones have significantly impacted the remote sensing community by providing a temporally flexible, relatively low-cost, and easy-to-operate platform to obtain very high spatial resolution data over small study areas. Otherwise known as unoccupied/unmanned/uncrewed aircraft systems (UAS), unoccupied aerial vehicles, or remotely piloted aircraft, among other names, drones provide a new and exciting perspective for Earth observation. Drones enable research at unprecedented geographic scales, filling a low-altitude data collection gap between ground-based field/survey data collection and moderate-altitude aerial data collection from occupied/piloted aircraft. Not surprisingly, remote sensing research on applications ranging from forestry (Wallace et al. 2012) and precision agriculture (Matese et al. 2015) to atmospheric measurement (Hemingway et al. 2017) incorporates drones as data collection devices. Beyond research applications, drones aid basic remote sensing research through multi-scale/multi-sensor data fusion analyses (Campbell et al. 2020; Alvarez-Vanhard et al. 2021). Regardless of intended use, drones (re)connect researchers and students to the remote sensing data collection process (not including in situ measurements) that has been neglected over the past several decades due to the increased availability of free satellite and aerial imagery (Rogers et al. 2022).
Simic Milas et al. (2018) envision drones as the third-generation source of remotely sensed data after moderate-/high-altitude aircraft and Earth-orbiting satellites. The popularity and importance of drones to remote sensing are undeniable and are reflected in the publication of drone-focused textbooks (e.g., Jensen 2017; Aber et al. 2019; Green et al. 2020; Frazier and Singh 2021). Teaching and learning successes attest to the many benefits of incorporating drones into the classroom (Jeziorska 2014; Cliffe 2019; Joyce et al. 2020; Yang et al. 2020). Drones, further, provide a remote sensing platform for non-traditional users, ushering in an era of "personal remote sensing" (Jensen 2017). Some go so far as to state that drones democratize remote sensing (Carrivick et al. 2016; Frazier and Singh 2021; see survey responses from Rogers et al. 2022), enabling citizen scientists to acquire and share their own geographic data aided by web-based platforms such as OpenAerialMap (openaerialmap.org) and GeoNadir (geonadir.com).
So, why should we care? Do drones change remote sensing? If so, how? By providing very high spatial, or hyperspatial, resolution data (the most substantial offering to date, both as two-dimensional images and three-dimensional point clouds), drones open new avenues for exploration and challenge traditional remote sensing methodological approaches, indicating what some see as a paradigm shift within remote sensing (Lippitt 2015). Conceptualizing the incorporation of drones within the remote sensing model (RSM; Strahler et al. 1986) and the remote sensing communication model (Lippitt et al. 2014), Lippitt and Zhang (2018) emphasize how drones impact optical remote sensing by providing ∼1–2 cm ground sampling distances, necessitating H-resolution approaches (i.e., scenes wherein individual objects can be identified) such as object-based image analysis (OBIA). This notion of a paradigm shift in the context of only optical data presents a compelling case. Given that drones enable data collection from a variety of sensors beyond imaging (e.g., lidar and radar), the argument that they present a paradigm shift is even stronger.
The sheer volume of remotely sensed data and its level of spatial detail, though, have outpaced our ability to effectively analyze it (see Li et al. 2016; Shook et al. 2019). Other problems present themselves regarding this new scale of data collection and its relationship to the scale at which geographic problems are conceptualized: rapid advancements in drone technology have made hyperspatial and other data available to researchers, and a disconnect between the two scales may result. Are we producing too much data to actually address our research objectives and problems? Much like highly accessible geo-enabled smartphones, drones contribute a great deal to the era of geospatial big data.
While there are many useful review papers and bibliographies (Hardin and Hardin 2010; Watts et al. 2012; Colomina and Molina 2014; Pajares 2015; Cummings et al. 2017c; Mathews and Frazier 2017; Singh and Frazier 2018; Hardin et al. 2019; Tmušić et al. 2020; Mathews 2022; Nex et al. 2022) focusing on drone remote sensing, these works (albeit with varying foci) do not provide both straightforward and broad-in-scope (not focused on technical specifications) guidance for those adopting drones as a remote sensing tool to conduct research. Other works help to frame and conceptualize the integration of the technology into the modern remote sensing field (e.g., Lippitt and Zhang 2018; Simic Milas et al. 2018), but likewise do not provide broad practical advice on the use of the technology. Therefore, this article provides such direction by proposing ten fundamental practices for those conducting remote sensing research with drones, regardless of research focus. There could be more than ten practices because there are myriad items to cover, but we organized this brief list for the sake of simplicity and the utility of adoption. We aim to provide a go-to resource, with both the ten fundamental practices and an extensive in-text citation/reference list, for remote sensing and non-remote sensing scientists new to drones as well as those with extensive experience. Importantly, though, we do distinguish remote sensing scientists, i.e., those with extensive background in the field who are actively publishing research, from non-remote sensing scientists adopting drone remote sensing methods to conduct research. Such a differentiation is increasingly important because drones are highly accessible (compared to more traditional forms of remote sensing), but without knowledge of remote sensing fundamentals, significant errors in the collected data will result. In this way, the latter group might benefit more from this paper. We also provide a critical remote sensing perspective on the subject by posing conceptual challenges and opportunities.
Ten fundamental practices
We propose ten fundamental practices for drone remote sensing regardless of field of study, as illustrated in Fig. 1: (1) focus on your research question, not just the tool, (2) know the law and abide by it, (3) respect privacy and be ethical, (4) be mindful consumers of technology, (5) develop or adopt a data collection protocol, (6) treat Structure from Motion (SfM) as a new form of photogrammetry, (7) consider new approaches to analyze hyperspatial data, (8) think beyond imagery, (9) be transparent and report error, and (10) work collaboratively.
Focus on your research question, not just the tool
Similar to geographic information systems (GIS) and global navigation satellite systems (GNSS) before them, the excitement associated with a new technology, i.e., the emergence of low-cost off-the-shelf remote sensing drones, has led to rapid adoption with little forethought regarding how such technology might alter the generation of research questions and challenge current methodological conventions. GIS is "sometimes accused of being technology driven, a technology in search of applications" (Goodchild 1992, p. 31). This accusation could very well be directed at drone technology. Those looking to utilize drones, a relatively new tool, in their research must first, and most importantly, focus on the research question and not just the technology. As much as possible, the tool should not dictate or influence the development of research questions, objectives, and/or hypotheses (i.e., the scientific process should not be changed based on using a drone). That said, given the history of remote sensing as a field of study, the tools (from satellite-based optical sensors to airborne lidar) impact and constrain what researchers can do. Importantly, it is up to the scientist to form suitable research objectives that align with obtainable remote sensing data. After developing research questions, consider the questions provided in Box 1 to assess whether (and to what degree) a drone is needed to address said questions.
Fig. 1. The ten fundamental practices for drone remote sensing research across disciplines, starting with the importance of
focusing on your research question and ending with the significance of working collaboratively. These practices are not an
exhaustive list but are synthesized based on peer-reviewed literature and accrued through field experience.
Box 1. Questions to assess whether (and to what degree) a drone is needed.
- Do you need drone-captured data to answer your research questions, or are there existing remote sensing datasets (e.g., satellite or aerial imagery, lidar) that meet the needs of your study?
  - Consider the necessary temporal and spatial resolutions required for your specific research purpose.
- Will the spatial resolution and scale of drone data match the detail and scale of your study topic?
- Will you be able to obtain permission to fly a drone in your study area?
- Are you planning to use data for mapping or measuring objects (e.g., heights, crown width)?
- What is the smallest size of the object (e.g., wildlife, such as birds or elephants; crop types, such as rice or orchards via individual apple tree canopies; infrastructure) you will be mapping or measuring?
- Do seasonal changes affect the object of interest (e.g., presence/absence, growth, and location)?
- Will you create 3D data from drone-captured images using SfM photogrammetry?
- Do you have access to software (e.g., flight mission planning, SfM, image stitching/orthophoto production) crucial for capturing and processing images, both in the lab and the field?
Fig. 2. Shape and size of field observations (e.g., golf bunker, road, and golf cart) with minimum mapping unit (A) and number
of 6 cm × 6 cm pixels (B), 1 m × 1 m pixels (C), and 5 m × 5 m pixels (D) for mapping these objects. Drone images of C and D
are aggregated to 1 and 5 m spatial resolutions to show the number of pixels required to map field observations.
Focusing on the research question helps narrow down the type of imagery (or other data), the seasonality of the targeted object, the spatial resolution, the image overlap (if imagery is being collected), and the frequency of data collection. If the sole purpose of drone-captured images is to map land cover types for more than 1 hectare (>10,000 m²), the availability of high-resolution satellite imagery (e.g., PlanetScope) captured at high temporal frequency might serve the mapping better than spending hours planning, capturing, and processing drone images, and should be strongly considered. Both the area and the height of the object of interest should be considered when mulling over acquiring drone imagery. Drone-captured images, due to their sub-centimeter resolution, are optimal for counting individual birds in a colony (Afán et al. 2018) or identifying wilted crops (Dang et al. 2020) compared to commercial high-resolution satellite imagery. However, users often forget their need for a minimum spatial resolution, known as the minimum mapping unit, and continue focusing on images with the highest possible resolution, which challenge their data storage and computational resources (see Fig. 2). For example, do you require drone images below sub-centimeter spatial resolution if the smallest size of the object of interest is 100 cm²? A spatial resolution between 1 and 5 cm might be optimal. As a rule of thumb, divide the smallest size of the object of interest by four to estimate the spatial resolution: the smallest feature that can be reliably identified usually needs to cover about four contiguous pixels in an image. Therefore, the spatial resolution of the drone image should be 5 cm to meet the per-pixel requirement of 25 cm².
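This rule of thumb is simple enough to express directly; the following is a minimal sketch (the function name and defaults are ours, purely for illustration):

```python
import math

def required_gsd_cm(smallest_object_area_cm2: float, pixels_per_object: int = 4) -> float:
    """Estimate the coarsest usable ground sampling distance (GSD) in cm.

    Rule of thumb: the smallest object of interest should span about four
    contiguous pixels, so each pixel may cover at most a quarter of its area.
    """
    max_pixel_area_cm2 = smallest_object_area_cm2 / pixels_per_object
    return math.sqrt(max_pixel_area_cm2)  # pixel edge length in cm

# A 100 cm2 object -> 25 cm2 per pixel -> 5 cm GSD
print(required_gsd_cm(100.0))  # 5.0
```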
Plant phenology, along with seasonal changes that affect vegetation productivity and greenness, should be part of the "focus on your question" process to improve the effectiveness of the chosen spatial resolution in mapping or measuring any aspect of vegetation. Targeting a certain growth stage of a plant (e.g., flowering) when planning drone data acquisition may offer better distinction. Focus on your question! If the end goal of a planned project is to map an object or land cover type, users should consider commercial imagery that may cover a larger area with the lowest possible geometric and radiometric errors.
Measuring any aspect of the object of interest requires many photogrammetric considerations in the drone image acquisition (i.e., % overlap [forward and side], georeferencing approach [direct, or indirect with ground control points (GCPs)], flight altitude, and terrain). To create a precise three-dimensional (3D) structure of the object of interest using the SfM photogrammetric technique, consistent overlap between successive images (i.e., forward overlap) and between flight passes (i.e., side overlap) is one of the most important considerations (Singh and Frazier 2018). The normal forward and side overlaps are 60% and 30%, respectively. Image stitching, the process of combining drone-captured images to create a seamless image or photomosaic, requires only 20% overlap between images, but that is insufficient for creating any surface models (e.g., a digital surface model (DSM)). Higher levels of overlap (i.e., 85% forward and 70% side) are suggested for creating accurate point clouds and surface models (Dandois et al. 2015; Fraser and Congalton 2018). Flat terrain may require higher overlap to extract matching points for 3D point clouds. Any increase beyond these overlaps will raise the number of swaths and images for the study area, thereby increasing the flight duration, image storage, and image processing time. The height of the object of interest and the terrain characteristics also affect the drone's cruising altitude and takeoff locations (see Thomas et al. 2020). A drone's actual altitude above ground level varies with undulating terrain; the targeted overlaps will decrease wherever the terrain is higher than the drone's takeoff point (e.g., DronesMadeEasy 2021). High-quality image stitching requires many surveyed GCPs or manually documented tie points to build a geometric relationship between the ground reference and the images for creating a seamless image. GCPs are a set of points on the Earth's surface with known coordinates. A suitable number of GCPs, representative of the landscape type (e.g., mono stands such as crops or commercial forests; high relief), contributes to better image stitching for a high-quality photomosaic (see Galván Rangel et al. 2018). Tie points (e.g., road intersections, rock outcrops, crop rows, buildings, trails, etc.) are easily recognizable objects that can be identified in overlapping images to improve the alignment between images of the mosaicked imagery. Hence, focus on your question to decide these variables in advance so that the captured images are suitable for measuring the various parameters of interest (Abdullah 2021).
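To make the flight-planning trade-off concrete, the sketch below converts a chosen overlap into exposure and flight-line spacing using a simple pinhole-camera footprint; the camera values are hypothetical and the helper names are ours:

```python
def image_footprint_m(altitude_m: float, focal_mm: float,
                      sensor_w_mm: float, sensor_h_mm: float) -> tuple[float, float]:
    """Ground footprint (across-track width, along-track height) of one nadir image."""
    scale = altitude_m / (focal_mm / 1000.0)  # pinhole camera model
    return sensor_w_mm / 1000.0 * scale, sensor_h_mm / 1000.0 * scale

def spacing_m(footprint: tuple[float, float],
              forward_overlap: float, side_overlap: float) -> tuple[float, float]:
    """Distance between exposures along track and between adjacent flight lines."""
    w, h = footprint
    return h * (1.0 - forward_overlap), w * (1.0 - side_overlap)

# Hypothetical 1-inch sensor (13.2 mm x 8.8 mm) with an 8.8 mm lens at 100 m:
fp = image_footprint_m(100.0, 8.8, 13.2, 8.8)  # ~150 m x 100 m on the ground
print(spacing_m(fp, 0.60, 0.30))               # mapping-only overlap
print(spacing_m(fp, 0.85, 0.70))               # point cloud/surface model overlap
# Moving from 60%/30% to 85%/70% overlap multiplies the number of exposures
# and flight lines severalfold, with matching growth in flight time and storage.
```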
Know the law and abide by it
The civil aviation authority (CAA) of the country within which a researcher will operate a drone dictates how, when, and where they can do so and for what purposes (e.g., Unidad Administrativa Especial de Aeronáutica Civil [Colombia], South African Civil Aviation Authority, Guyana Civil Aviation Authority [GCAA; Guyana], etc.). In the United States, the Federal Aviation Administration (FAA) permits drone operations under the Part 107 (or small UAS) Rule, requiring those conducting research to obtain a "Remote Pilot" license (FAA 2022) and follow regulations such as operating within Class G airspace (below 400 feet of altitude from ground level), maintaining a visual line of sight with the aircraft, flying at night only with anti-collision lighting, etc. In the case of the FAA, obtaining a remote pilot license only requires passing a written exam and a background check, without having to demonstrate satisfactory ability to operate a drone. It is therefore imperative for new remote pilots to train with already experienced remote pilots. Given the amount of time needed to obtain a license and the steep learning curve associated with remotely controlling aircraft, some researchers might rely on colleagues or consultants to collect data for them. The onus remains on the scientist, though, to make sure whoever operates the drone adheres to all laws and regulations.
Importantly, though, civil aviation authority regulations are fluid and continue to change. For example, flying at night was only recently permitted by the FAA. Further, the FAA instituted the Remote Identification (ID) rule in 2021, requiring drones to broadcast aircraft information such as the drone ID, location and altitude, velocity, takeoff location, and more during operation (FAA 2021). With the complicated integration of drones into the airspace, rules are expected to continue changing, and those using drones for remote sensing should check for aviation authority changes/updates before conducting any flight. Remote pilots, in most cases, have to renew licenses/certificates regularly (e.g., the FAA requires biannual recurrent training). Be cognizant, too, that regulations vary between countries (Stöcker et al. 2017) and sometimes within countries (Hodgson and Sella-Villa 2021). That is, an FAA-issued "Remote Pilot" license is only adequate for drone flight in the United States unless another country's CAA formally declares otherwise. In another variation, in Guyana, where a remote pilot license is not required, the GCAA recently issued an advisory that affects the entire chain of events for flying a drone in that country. Depending on the class of drone (micro [100 g or less], very small [100 g and less than 2 kg], small [2 kg and less than 25 kg], medium [25 kg and less than or equal to 150 kg], and large [greater than 150 kg]), someone lacking a UAS Operator's Certificate who wishes to import a drone into the country needs a letter of importation at a cost ($4,000 [US$20], $6,000 [US$30], or $10,000 [US$51], respectively; GCAA Advisory Circular, 2022). To obtain a UAS permit/flight authorization, the cost is determined by the size of the drone. Aside from fees for importation and obtaining flight authorization, upon making an application for the issue or renewal of a UAS Operator Certificate, there is a requirement for a non-refundable deposit of the basic fee based on whether the application is for a first-time license or renewal. The GCAA guidelines have also been amended to include fees for commercial operations where "dangerous goods" may be carried by the drone. This illustrates the complexity of legal flight from country to country; researchers must not overlook the amount of time and effort needed to adhere to regulations. Fortunately, the International Civil Aviation Organization (ICAO) provides guidance to its 193 member states, which have similar regulations (ICAO 2022).
Beyond adhering to CAA rules, drone users must consider within-country, state, and local (e.g., county/parish, town, city, etc.) regulations as well (see Fig. 3). Typically, these local authorities do not and cannot regulate the airspace, but other regulations can be in place, often related to where you are flying. For example, many public lands (e.g., city and county parks, state preserves) have outright bans on drone usage on their properties. Often, though, exemptions for scientific research can be approved by such authorities assuming drone operators agree to follow additional policies (i.e., fly when the park is closed to visitors [hourly/daily], fly when an endangered bird species is no longer roosting in the park [seasonal]). Data collection plans must account for these potential hindrances, which is especially important with remote sensing that requires data capture at specific times of day, such as near noon when park visitor traffic is high. To avoid any issues in the field, drone pilots should obtain written permission from all landowners, including those adjacent to study areas, before conducting flights. In the U.S., this also ensures that drone operators adhere to United States v. Causby, which granted landowners exclusive rights to airspace up to 350 feet above the ground or ground-based structures (U.S. Supreme Court 1945). Drone operators should maintain open communication with all stakeholders (e.g., park managers, private landowners) regarding when flights will be conducted (including who will be involved, i.e., an individual or team) and whether any other field data will be collected. Print out permissions and have a copy readily available when conducting data collection. The process of obtaining permission may take days, weeks, or months, depending on the site, and this should be considered when planning your research strategy.
Permissions from your employer also need to be secured. In the academic environment, many universities have drone standing committees and/or review boards that issue policies and guidelines for those using drones in research. University-specific rules should be researched and followed. Furthermore, an application may need to be filed or additional consultation with administrators may be needed prior to a flight. Reaching out to the chair of the committee/review board (if your university has one) is never a bad idea if you have questions or concerns.
Employers such as universities often handle insurance for drone users, which is also highly suggested to safeguard people (and, to a lesser extent, property) and drone equipment during use. Most, if not all, remote pilots can share stories of accidents occurring in the field due to equipment malfunction, remote pilot error, rapidly changing weather conditions, and a suite of other issues. Even the best remote pilots face complicated circumstances that can lead to an accident. Being insured alleviates the risks associated with operating a drone; some universities also require each drone to be insured prior to flight.
Fig. 3. Rules and regulations that remote sensing scientists collecting data need to adhere to, from civil aviation authorities regulating the airspace (top) to landowners and potential spectators on the ground (bottom). All these aspects are important and should be thoroughly reviewed before operating a drone at a study site.
Respect privacy and be ethical
Laws and regulations often lag behind the pace of technological change, as was the case with the FAA, which took several years to create and implement Part 107 to regulate drones in the national airspace. Such a lag is also apparent regarding additional policies and ethics. To protect privacy and conduct ethical drone remote sensing research, think beyond adherence to CAA regulations alone: also obtain permission to be on land (e.g., public, private, and Indigenous peoples' land) and consider the implications of collecting data where people are present (i.e., for safety, but also how collected data might impact people and their livelihoods). The remote sensing community must go beyond the minimum requirements, including compliance with ethical codes provided by geospatial organizations such as the ASPRS (2014), GISCI (2022), and URISA (2003).
Further, remote sensing scientists must resist assuming that all will benefit from the use of drone technology. Not everyone has positive associations with the technology, and the significance of drones used for military and surveillance purposes cannot be overstated (see Bracken-Roche 2016; Jackman and Brickell 2021). Drones, especially in military settings, cast a long shadow over the utility of these devices and their potential benefits to society and humanity overall. Scholars emphasize the negative connotations of military drone usage and how these perceptions transfer over to and persist in non-military settings (Coker 2013; Enemark 2013; Cohn 2014).
Researchers should remain open to public concerns about privacy related to drones. It is the researcher's obligation to ensure that everyone surrounding the flying of a drone is aware of: (1) the research objectives and why a drone is needed to address them, (2) what data will be obtained and how it will be used, (3) permission being obtained from the landowners prior to flying, and (4) an open review process for landowners/community members to view the data (and approve its use) obtained by the drone before the team leaves the study site. The researcher is responsible for protecting the integrity and privacy of the individuals captured in drone imagery. In other words, even though permission was obtained prior to data collection and an initial review of imagery was completed before leaving the field, the researcher must maintain the responsibility of always protecting people captured in the data.
Unlike data obtained by satellites, which fly overhead at altitudes that make them inaccessible and invisible to people on the ground, drones can change this aspect of remotely sensed data collection. The entire process of drone data acquisition provides an opportunity for people to be involved. Community members can provide input on the areas where flying is permitted and the implications for the research questions being examined, help to develop drone flight plans/missions, review data and provide insight into land-cover classification and analysis, and help to process imagery in the field to identify potential areas of concern for imagery quality and other issues. Therefore, drones provide an opportunity for people's involvement on the ground, allowing the drone data acquisition process to follow accepted ethical norms while simultaneously presenting opportunities for transforming the significance of drone data for larger societal applications. Related to this, data sharing agreements enable drone researchers to define who will have access to the data, what will be done with said data, and when/how the data will be shown (i.e., very high spatial resolution data can contain several forms of identifying information); similar themes may be explored with research subjects. Special attention should be paid to respecting sacred sites and cultural spaces (Davis et al. 2021), such as Indigenous landscapes. In all instances of work in such spaces, researchers must obtain community approval and involve community members in the research process (Cummings et al. 2017a).
The ethical operation of drones and considerations to protect human and non-human subjects during and post-operation, in any setting, should be established long before the remote pilot and team plan missions. Drone operators should approach their work with the express knowledge and understanding that the data they obtain will contain aspects and elements that will make the research subjects (people or landscapes) vulnerable in some way. Consequently, a responsible drone operator will anticipate the type of data that will be obtained, what will be contained in the imagery or other data being captured, and the likely outcomes of such data falling into the wrong hands despite the most stringent and well-intentioned data management protocols. The operator will therefore develop a workflow and protocols for data collection and management that will allow potential negative outcomes to be minimized. The drone operator should seek to separate their data management and data collection protocols into distinct phases and identify potential risks, as suggested in Box 2.
Consider, too, the storage and handling of drone-collected data (i.e., who will have access to the data). Cloud-based companies are governed by their home countries and therefore can be influenced by geopolitical tensions. This is well documented with the United States Government, where adoption and use of Chinese-made DJI drones are largely banned due to security concerns (Wright 2017; Puko and Ferek 2019; U.S. DOI 2020; U.S. DOD 2021). Data processing modality is a significant consideration (i.e., desktop/local storage vs. cloud processing/storage); for example, should the faces of onlookers or vehicle license plates be captured within images and processed using a cloud-based platform (e.g., Pix4Dcloud, etc.), identifying information is then stored on external servers. Those using such services, scientists and professionals alike, typically do not have any say in altering data agreements with technology companies.
Box 2. Hypothetical data collection effort with significant privacy and ethical concerns.
Phase 1: Pre-mission
During the pre-mission phase, the drone operator/remote pilot must consider the research subjects and how the data they are
about to obtain might negatively impact them. While flying over people is not allowed in many jurisdictions, the operator should
assume that people will be captured in the imagery. Therefore, the remote pilot should explain to human subjects what data are
likely to be captured and provide them with an opportunity to determine whether they will permit themselves and their spaces to
be imaged. The permission process should include disclosing to human subjects that: (1) data on sensitive sites may be acquired,
(2) images of people may be captured, and (3) data on property may be captured. The pre-mission assessment should also
include providing details on the members of the mission team, the actions that will be taken to protect human subjects, and,
where necessary, the mode of communication the team will employ. Prior to planning the drone mission, the operator should
explain these details to the human subjects and obtain permission before flying over property or people. Remember, consent to
fly in any space can be rescinded as desired by human subjects, including local-, national-, and international-level authorities.
But if the operator is transparent about where they intend to fly, what data they are likely to obtain, and what the data will be
used for, consent will likely remain in place. The pre-mission assessment should be included in the standard human subjects
research protocol (e.g., IRB) that researchers need to adhere to for their work.
Phase 2: Mission
The operator must be prepared to abort a mission, if necessary, to protect lives and adhere to their research protocol. As an
example, while one of the authors was completing a mission in a rural setting, a mother carrying a days-old baby suddenly
appeared on the edge of the area being flown. In this case, she happened to be aware of the mission, was curious, and wanted
to view the drone while it flew over her family’s farm. Imagine the potential disaster if the drone did something unexpected that
led to injury. Thankfully, the team operating the drone was sufficiently trained and alert to recognize the risk posed by completing
the mission. Team members recognized that pre-mission protocols would be breached if these persons were present, as consent
was given to fly only in the presence of the team members. The pilot was quickly alerted, and the mission was aborted. Drone
operators must be aware of their operating protocols and be prepared to abort a mission at the small sacrifice of battery life
and camera data storage to ensure they remain in compliance. Where first-person view capabilities are present on the drone,
the operator can ensure that the data being obtained adheres to the pre-mission standards.
Phase 3: Post-mission
Upon the completion of flights, the drone operator must review the collected data with human subjects to ensure that they are
comfortable with the images and data obtained. The operator should work with human subjects to review the captured data
to remove any images that contain human subjects or simply do not meet the requirements of the established pre-mission
planning protocols. In the production of publications (assuming involved persons approve of the use of the collected data), the
operator must remove any features that may have cultural or other significance, or that may identify individuals.
Be mindful consumers of technology
As the number of drone users has increased, so have the technologies to support drone data collection. Relatively inexpensive, off-the-shelf products (e.g., drones/aircraft, sensors, apps, and SfM software) have made drones easy to adopt and allowed users to obtain high-quality data products quickly and effectively. A growing repository of help documentation exists to support drone users' endeavors (e.g., DroneDeploy 2022; Pix4D 2022). Scientists must be cautious, though, when relying on the black-box functionality associated with many off-the-shelf products with proprietary algorithms (SfM-MVS software, especially). When the user is only concerned with the inputs and outputs of the software, uncertainty may arise in the resulting datasets. The user should be aware that even seemingly minute changes in data collection procedure (e.g., altitude, overlap, image/camera settings, etc.) or processing parameters (e.g., point cloud density) can influence resulting data accuracy (O'Connor et al. 2017; Pricope et al. 2019; Young et al. 2022). Open-source software options for flight control/operations (e.g., ArduPilot and Pixhawk) and data processing (e.g., MicMac, OpenDroneMap, VisualSfM) continue to emerge and evolve (Ebeid et al. 2017) but are sometimes constrained by inconsistent updates, difficult-to-use graphical user interfaces (including language barriers), and a lack of integration between desktop and cloud platforms.
Accuracy is a critical component of mapping products produced with drones (e.g., orthophotomosaics and digital elevation models). Without knowing better, a user may expect all drone products to be created equal, but that is not the case. Scientists report a high degree of variation in the accuracy of SfM-generated elevation data depending on the type of off-the-shelf drone used (Rogers et al. 2020; Stark et al. 2021). Most off-the-shelf drones produce outputs with high relative accuracy; that is, when features within the project are compared to other features within the project, positional accuracy is very high. However, the absolute accuracy, that is, when data are compared to a true position on the Earth's surface, ranges from ±1–2 m horizontally and vertically (Thomas et al. 2020). This is problematic when overlaying drone data products with other georeferenced data layers or if you want to compare drone outputs from the same location over time. Several off-the-shelf drones now include onboard real-time kinematic (RTK) GNSS functionality to increase positional accuracy; however, this drastically increases the cost of the drone. While RTK is useful for real-time positional processing, it is not completely necessary. Users can instead place ground control targets within their mapping area, obtain geographic coordinates for those targets using a high-precision GNSS (often already available to researchers in academic settings), and then use those GCPs to improve the absolute accuracy of their drone outputs (Sanz-Ablanedo et al. 2018; Oniga et al. 2020; Liu et al. 2022).
When conducting multi- and hyperspectral surveys with drones, radiometric calibration is another critical component that needs to be controlled for in the field (Fawcett and Anderson 2019; Xu et al. 2019). That is, it is controlled by the researcher and not the (sensor) manufacturer, which requires scientists to have in-depth knowledge of remote sensing fundamentals. Sensor-measured radiance, a portion of irradiance (or total incident energy), varies over time and space, requiring a number of corrections to obtain accurate spectral information. The same targets in a mapping scene display varied radiance values under different environmental conditions (e.g., weather/cloud cover and aerosols), viewing angles (requiring bidirectional reflectance distribution function [BRDF] modeling; Li et al. 2018; Deng et al. 2021), and more. Thus, scientists must undertake in situ calibration using a spectral target either before, or before and after, image collection occurs, depending on the sensor employed (e.g., MicaSense 2022). Calibration values are then used to correct the images during post-processing to ensure that consistent spectral data (in units of spectral reflectance) are obtained across multiple scenes or time periods (Kelcey and Lucieer 2012). This is an especially important consideration because many optical sensors developed specifically for drones are known to introduce significant radiometric error (Huang et al. 2021). In addition to these issues, calibration of camera lens distortion (interior orientation corrections) must be conducted prior to data collection.
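One widely used way to apply such target-based calibration is the empirical line method (cf. Smith and Milton 1999), which fits a linear DN-to-reflectance relationship from panels of known reflectance. The sketch below assumes two calibration panels per band; all names and values are illustrative:

```python
import numpy as np

def empirical_line(dn_band: np.ndarray,
                   panel_dns: np.ndarray,
                   panel_reflectances: np.ndarray) -> np.ndarray:
    """Convert raw digital numbers (DN) to surface reflectance for one band.

    Fits reflectance = gain * DN + offset by least squares using calibration
    panels imaged in the field, then applies the fit to every pixel.
    """
    gain, offset = np.polyfit(panel_dns, panel_reflectances, 1)
    return np.clip(gain * dn_band.astype(float) + offset, 0.0, 1.0)

# Dark (5%) and bright (50%) panels with mean in-scene DNs of 4200 and 29000:
band = np.array([[5000, 18000], [26000, 12000]])
print(empirical_line(band, np.array([4200.0, 29000.0]), np.array([0.05, 0.50])))
```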
It is true that drone aerial image data can be collected, processed, and presented with the researcher knowing little about remote sensing or photogrammetry. The best practice, however, is to be mindful of the correct data collection parameters for your research target in the field and knowledgeable about how imagery is processed with SfM software. Without considering the calculations happening behind the scenes or understanding the myriad flight collection and post-processing parameters, researchers run the risk of inaccurate results.
Beyond knowledge of data handling and processing, sometimes it becomes necessary to have knowledge of the drone itself: the characteristics of the hardware and software that you rely on for data collection. Some drone-related stakeholders have raised concerns about DJI products (Wright 2017; Puko and Ferek 2019). Such concerns may impact the relationship the drone user has crafted with local people, including being able to protect their privacy. In such instances, it may become necessary to pursue other drone options, including building drones from scratch, where you have more control over the parts and components used in data collection (see Cummings et al. 2017a). Knowledge of what goes into the drone can help to address fears that the drone will obtain data that compromises privacy.
Table 1. Checklist for drone flight operations under U.S. FAA regulations (adapted from GeoIDEA Lab 2022).

Preparation (before the data collection day):
- Prepare microSD/SD cards for data storage on the drone
- Charge batteries (drone, remote controller, tablet or smartphone)
- Prepare any other field equipment: GNSS for GCPs, spectroradiometer, spectral target, etc.
- Check for NOTAMs
- Submit for LAANC approval if needed (day before or before flight)
- Weather precheck

Preflight (on site):
- Inspect the staging/takeoff area for obstructions (trees, overhang, etc.)
- Place drone in takeoff location
- Turn on drone and controller following manufacturer specifications
- Review flight plan on app
- Confirm correct camera settings
- Verify GNSS connection is active
- Confirm skies are clear for flight

Flight:
- Use visual observers as needed throughout flight, where applicable
- Communicate with team members throughout drone flight on progress
- When flight mission or partial mission is complete, manually or autonomously bring drone back toward staging area
- Confirm staging area is clear for landing
- Land drone
- Change batteries and repeat flights if multiple are needed
Develop or adopt a data collection protocol
Drone-based data collection requires extensive planning. Developing from scratch, adopting, or adapting a data collection protocol is a fundamental practice for conducting drone remote sensing research. When preparing a protocol, the two primary components to include (not mutually exclusive) are (1) aircraft operation and (2) data capture. Myriad resources are available to support the development of a protocol. Table 1 provides an example of a drone operation checklist adapted from the GeoIDEA Lab (2022). In this case, data collection specifics are not covered in detail, so these could be formulated within another checklist or incorporated into this one. Checklists can be drone-specific or created more generically, such as in Table 1.
Specific checklists can be altered if you own more than one drone, especially when comparing rotary- to fixed-wing platforms. Importantly, operators should compartmentalize operations as shown in Table 1, with preplanning/preparation before the data collection day, and preflight, flight, and postflight items on the data collection day. Preplanning entails obtaining a remote pilot license and registering your aircraft, as well as confirming insurance coverage, permissions, and equipment preparation. Preplanning can also include practicing manual flight control prior to capturing data. There is a steep learning curve to flying drones (especially fixed-wing platforms), and adequate time should be allotted to ensure safe flight. Those not comfortable with flying, or wanting to concentrate on the data itself, are advised to outsource flight operations to a professional who regularly operates drones. Many universities now employ drone pilots for a variety of purposes (i.e., marketing/videography, within drone-focused academic departments, extension offices), which can fulfill this need for researchers uneasy with drone operation.
On the day of flight, preflight checks include weather monitoring prior to travel to the field site and confirmation on site, drone preparation and inspection prior to flight, locating and establishing a staging area for drone takeoff and landing (keeping in mind the different requirements of rotary-wing and fixed-wing aircraft), placing the drone in the staging area and turning it on along with the remote control, conducting any necessary camera calibrations, confirming flight plan details and camera settings, verifying the GNSS connection, and, lastly, checking that the immediate area and nearby skies are clear for takeoff.
The flight checklist includes actual aircraft takeoff, confirmation of full control of the aircraft and an adequate battery level for the flight mission, and then data collection. As for data collection during flight, the type of data being collected influences how to operate the aircraft (with altered checklists needed for different sensors). Most commonly, aerial images are gathered for SfM photogrammetry, which requires a flight plan that maintains consistent overlap (front and side) between images (see Figs. 4A–4C for polygon, grid, and double-grid patterns, respectively) and ensures adequate camera angle variation. Altitude of data capture and altitude variation throughout a flight also help to improve eventual data quality (Zimmerman et al. 2020; Santana et al. 2021; Swayze et al. 2022). Importantly, how drone aerial images are captured impacts the quality of the output data (Dandois et al. 2015; Young et al. 2022). Convergent imagery (i.e., integrating converging obliques) has proven time and again to remove systematic error in SfM-MVS-generated data (Wackrow and Chandler 2011).
Fig. 4. Drone-based aerial image collection using (A) polygon, (B) grid, and (C) double-grid flight plans.
The postflight checklist includes powering down the drone and other equipment, inspecting equipment for damage, logging flight details, reviewing collected data on-site, collecting other ground data to support the research project (i.e., spectroradiometer measurements of a spectral target placed in the field prior to flight, surveying GCPs), and interacting/reviewing data with community members/landowners as needed. Data processing and analyses follow the day of flight (i.e., SfM for image data, georeferencing for all data).
Fig. 5. Documented issues with SfM-MVS-generated data products, such as (A) "dishing" and (B) "doming" of generated terrain surfaces due to inadequate lens distortion corrections and/or poor flight planning (i.e., overlap and angle).
Treat Structure from Motion (SfM) as a new form of photogrammetry
A camera mounted on a moving drone requires a global shutter, or corrections to be applied to images captured with a rolling shutter (Vautherin et al. 2016; Nex et al. 2022).
In capturing aerial photos for SfM-MVS processing, variation in flying height and camera angle is recommended, and some might argue a degree of randomness is preferred. This is unlike traditional and more recent forms of digital photogrammetry, where flight planning is rigid and systematic. The amount of image overlap is important in flight planning for accurate SfM data. A meta-analysis revealed that most research utilizes greater than 75% forward overlap and 70% side overlap (Singh and Frazier 2018). Dandois et al. (2015) suggest 60% forward overlap and 60%–80% side overlap. Building consensus remains difficult because different landscape types (including ground/above-ground features) place different demands on flight planning.
While relatively low-cost compared to Earth-observing satellites and piloted aircraft, drone technology (including SfM software) is still cost-prohibitive for many within the remote sensing community, not to mention broader society (e.g., geospatial practitioners, citizen scientists, and hobbyists).
Consider new approaches to analyze hyperspatial data
The detail captured with sub-centimeter hyperspatial aerial imagery opens research avenues that were once implausible. For example, drones allow researchers to monitor bird colonies at desired spatial and temporal scales in areas inaccessible to ground surveys, repeatedly assess changes in wood productivity due to management decisions, and remotely monitor the spread of pests and pathogens in forests.
Fig. 6. Drone-collected true color orthophoto for precision viticulture at (A) vine row, (B) partial vineyard, and (C) individual
vine scales with the operational/management scale circled in orange (per-vine) and the spatial resolution/scale of ∼1 cm with
white grid boundaries (per-pixel). This scale difference results in thousands of pixels being captured for an individual vine
canopy.
Coarser data can even be more effective in plant-level management compared to 1 cm spatial resolution (see Rogers et al. 2020). As Fig. 6 illustrates, drone-collected multispectral data used to calculate NDVI at a 1 cm scale (shown in true color) do not improve per-plant management practices (assuming per-plant, precision agriculture crop management practices) compared to data at a 0.5 m scale. Therefore, establishing an optimal resolution may help remove some of these deficiencies while improving computational efficiency (Singh et al. 2012).
As is the case with precision viticulture, highlighted in Fig. 6, research confirms no advantage to very high spatial resolution imagery when NDVI or other vegetation indices are implemented (Lamb et al. 2004). Yes, we can calculate NDVI with 1 cm pixels, but does it give us anything that aerial/satellite imagery did not already provide? These issues necessitate potentially less reliance on the spectral data and more on the spatial arrangement of pixels (i.e., OBIA; Mathews 2014) and the 3D structure of SfM-MVS point clouds (i.e., volumetric analyses, point cloud analytics, and comparisons; Turner et al. 2015; Clapuyt et al. 2017; Hunt and Daughtry 2018).
Ground(like) surveys and imagery from the same sensor
Reference data are fundamental to remote sensing applications and are regularly acquired through cost-ineffective ground surveys. Drone-based low-altitude aerial surveys are highly effective and underexplored as a replacement for ground-based observation and measurement (e.g., quadrat plots to determine vegetation types and quantities). Drones can be tasked with randomly selecting plots for capturing very high spatial resolution photos that can later be analyzed visually to collect, estimate, and measure any aspect of the field plot for above-surface observations. This approach provides more reference data in a shorter time frame, access to inaccessible sites, coverage of larger extents, repeat visits of sites, and temporal congruence between ground reference observations and collected images (Kattenborn et al. 2019).
Time-series analyses
Flexibility in the repeat collection of images makes drone technology well suited for time-series analyses that may offer new ways to analyze hyperspatial images. For example, NDVI shows the effect of pests and pathogens and the application of water and fertilizers on growth and productivity throughout crop growth stages (e.g., Hunt et al. 2010; Mathews 2014; Hunt and Daughtry 2018). However, as mentioned previously, NDVI at a hyperspatial scale might not always provide useful information for addressing research objectives. Monitoring the restoration of retired cranberry farms in the northeastern United States is another example that can benefit from time-series analysis (Harvey et al. 2019). While the restoration of retired farms can contribute to sustainable agriculture, it is essential to periodically monitor conservation progress to ensure that conservation efforts are beneficial for both landowners and biodiversity. Given the utility of OBIA with hyperspatial image data, object-based change detection provides an ideal approach for those interested in identifying landscape alterations (Chen et al. 2012). Other approaches include principal component analysis to detect changes in the temporal dimension (Deng et al. 2008).
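As a minimal sketch of such a time series, plot-level NDVI can be summarized per acquisition date (the arrays below are simulated stand-ins for co-registered reflectance rasters clipped to the same plot):

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red)."""
    red, nir = red.astype(float), nir.astype(float)
    denom = nir + red
    return np.where(denom > 0.0, (nir - red) / np.where(denom == 0.0, 1.0, denom), np.nan)

rng = np.random.default_rng(0)
acquisitions = {  # date -> (red reflectance, NIR reflectance), hypothetical values
    "2022-05-01": (rng.uniform(0.05, 0.15, (100, 100)), rng.uniform(0.20, 0.40, (100, 100))),
    "2022-06-15": (rng.uniform(0.03, 0.10, (100, 100)), rng.uniform(0.40, 0.60, (100, 100))),
    "2022-08-01": (rng.uniform(0.05, 0.20, (100, 100)), rng.uniform(0.30, 0.50, (100, 100))),
}
for date, (red, nir) in acquisitions.items():
    print(date, round(float(np.nanmean(ndvi(red, nir))), 3))
# The rise and fall of plot-level mean NDVI traces growth stages, stress,
# or management effects across the season.
```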
Synopsis
In sum, scientists are restricting themselves and their work if, by default, they merely apply traditional methodologies (i.e., those developed for coarser satellite and aerial imagery) to non-traditional, in terms of scale, data. The remote sensing community must consider drone-collected data as another form of geospatial big data and treat it accordingly by collectively outlining the challenges and opportunities afforded by this new means of capturing data.
Think beyond imagery
Drones are most often utilized to capture aerial images using optical, passive sensors, with SfM-MVS as the data processing method. However, there are many types of data that can be collected with the help of a drone, including atmospheric measurements, lidar, radar, etc. (see Table 2). It is therefore important that researchers consider options beyond imagery when crafting methodologies. Of course, not all research problems require multiple data types, but awareness of the potential options is important to fully addressing research questions. Fusion across multiple data types is enabled when researchers capture additional data along with aerial imagery (e.g., lidar and multispectral imagery data fusion).
Miniaturized multi- and hyperspectral cameras are suitable for remote sensing drone applications. However, insufficient radiometric calibration methods limit their use in quantitative remote sensing applications, and thus they cannot completely replace satellite-based imagery. Radiance measured by these cameras is prone to illumination changes and sensor non-uniformities caused by the roll–pitch–yaw orientation (Smith and Milton 1999). Radiometric calibration of the camera requires accurate reflectance transformation (see Mathews 2015; Suomalainen et al. 2021). The process involves a procedure for the correction of dark current, flat field, spectral response, and absolute radiometric coefficients that allows for accurate conversion of the camera digital numbers to at-sensor radiances and/or surface reflectance (Aasen et al. 2018). The low flight altitudes of drones also limit users: unless in situ data are captured for correcting atmospheric effects in multispectral (Guo et al. 2019; Mamaghani and Salvaggio 2019) and thermal (Heinemann et al. 2020) images, the data are potentially unfit to compare or fuse with satellite-based imagery. Hyperspectral sensors, specifically one-dimensional (line) scanners, require entirely different workflows from multispectral sensors to create geometrically correct data. In sum, quantitative remote sensing applications require radiometric calibration of sensors, and this should be considered before implementing drones in a project for data acquisition.
While SfM-generated point clouds are invaluable datasets for 3D modeling and analysis (Gómez-Gutiérrez and Gonçalves 2020), image-based SfM-MVS is not an alternative to lidar. As an active remote sensing technology, lidar provides the benefit of collecting data within and beneath a tree canopy and other porous features (Wallace et al. 2012). In other words, image-based SfM data are "what you see is what you get" (i.e., if you cannot see the forest floor in the images, you will not see it in the generated data). Integrated lidar sensors for drones continue to decrease in cost, weight, and power requirements (see Almeida et al. 2019), and off-the-shelf drones even offer lidar systems (DJI 2022). Hence, if the purpose of the drone is to create a topographic surface model in a partially forested area, lidar provides the best means by which to measure elevation from the top of the tree canopies to ground level. Drone-based lidar for 3D modeling has enormous potential in a variety of applications (Sankey et al. 2017; Jaskierniak et al. 2021) but is often limited by data processing algorithms. Therefore, algorithmic advancements are needed to filter and extract information from very dense, small-area point clouds as well as to effectively and directly compare point clouds over time (as opposed to converting these data to rasters; this goes for both lidar and SfM data; see Esposito et al. 2017a, 2017b).
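The simplest form of direct point-cloud comparison is a nearest-neighbor cloud-to-cloud distance between epochs, sketched below with synthetic coordinates (more robust change-detection methods additionally account for surface orientation, roughness, and point density):

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_distance(reference: np.ndarray, compared: np.ndarray) -> np.ndarray:
    """Distance from each point in `compared` to its nearest neighbor in `reference`.

    Both inputs are (N, 3) arrays of x, y, z coordinates; no rasterization
    of either cloud is required.
    """
    tree = cKDTree(reference)
    distances, _ = tree.query(compared, k=1)
    return distances

# Synthetic example: epoch 2 is epoch 1 shifted 0.05 m upward.
rng = np.random.default_rng(1)
epoch1 = rng.uniform(0.0, 10.0, (5000, 3))
epoch2 = epoch1 + np.array([0.0, 0.0, 0.05])
print(float(np.median(cloud_to_cloud_distance(epoch1, epoch2))))  # ~0.05
```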
Table 3. Items to report in drone remote sensing publications for transparency and to support research reproducibility and replicability.

Drone operation
- Aircraft specifications: platform type (fixed- or rotary-wing), make and model, GNSS accuracy, and other onboard sensors (if applicable).
- Flight log: civil aviation authority adherence procedures, altitude(s), speed, flight time, flight plan type and characteristics, app used for flight planning.
- Study site details: areal coverage/survey extent, on-ground permissions obtained, landscape description, weather on data collection days.

Data: collection
- Sensor specifications: type (optical, lidar, radar, etc.), make and model, details (image size and type of sensor, lidar/laser scanning speed/density), radiometric (or other) calibration procedures/atmospheric corrections.
- Image characteristics: front and side overlap percentages, image angles, and number of images collected.
- Ground-based data: GNSS data for GCPs and CPs (horizontal and vertical accuracy), GCP configuration (number and distribution), GNSS type (RTK, differential corrections), terrestrial laser scanning (TLS) data, spectral calibration data (spectral target and spectroradiometer measurements), etc.

Data: processing
- Software and parameters: lens distortion correction procedures, SfM software used and options selected, image stitching method, radiometric correction method, manual interventions in data production (if applicable).
- Georeferencing approach: method (direct, indirect, or both); additional corrections (if applicable).
- Data product details: imagery (spatial, spectral, radiometric, temporal resolutions), point cloud (SfM or lidar) density and/or point spacing.

Data: error
- Terrain (assessed with point clouds or raster DEM/DTM/DSM): error assessment/validation method (point to point, point to raster, raster to raster), vertical error metrics including RMSE, MAE, SDE, and ME.
- Orthophotomosaics: horizontal accuracy, spectral inconsistencies, including level of agreement with ground-based data.
- Other data: validation of drone-collected data (radar, magnetic anomalies, atmospheric, etc.) by comparison with ground-based or other known data sources.
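As an illustration of how the spectral calibration items in Table 3 (spectral targets and spectroradiometer measurements) feed the reflectance transformation discussed above, the empirical line method (Smith and Milton 1999) fits a per-band linear mapping from image digital numbers to surface reflectance. A minimal sketch follows, with illustrative array names; it assumes at least two calibration targets (e.g., dark and bright tarps) imaged in the scene.

```python
# Sketch of the empirical line method: fit a linear model between image
# digital numbers (DNs) sampled over calibration targets and the targets'
# field-measured reflectance, then apply the model to the whole band.
import numpy as np

def empirical_line(band_dn: np.ndarray,
                   target_dns: np.ndarray,
                   target_reflectance: np.ndarray) -> np.ndarray:
    """Convert one band of DNs to reflectance via a per-band linear fit.

    target_dns: mean DN extracted over each calibration target (1-D).
    target_reflectance: spectroradiometer-measured reflectance of each
    target (1-D), i.e., the "spectral calibration data" of Table 3.
    """
    gain, offset = np.polyfit(target_dns, target_reflectance, deg=1)
    return gain * band_dn + offset

# e.g., dark and bright tarps measured at 0.04 and 0.48 reflectance:
# reflectance = empirical_line(red_band,
#                              np.array([21.0, 198.0]),
#                              np.array([0.04, 0.48]))
```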
Another active remote sensing technology, radar, has been adapted to operate from small drone platforms. Radar data (see Table 2) support remote sensing research ranging from vegetation and soil moisture to underground feature detection (see Abushakra et al. 2022 and López et al. 2022, respectively). Like the latter example, drone-mounted magnetic field sensors within an off-the-shelf smartphone were shown to accurately detect buried metal anomalies (Campbell et al. 2020). Researchers are encouraged to creatively integrate atmospheric measurements (Hemingway et al. 2017), data from onboard sensors (i.e., GNSS and IMU), and other low-cost sensors even if intended for other uses, as was commonly done with altered point-and-shoot digital cameras (Hunt et al. 2010; Mathews 2015).
Several constraints affect the quality and quantity of data from field surveys: inaccessibility of the terrain, area coverage, GNSS inaccuracy due to dense canopy cover (Valbuena et al. 2010; Kaartinen et al. 2015), and human errors in identifying and/or counting objects of interest (e.g., number of invasive plants within a plot or number of birds in a colony; Lunetta et al. 1991; Lepš and Hadincová 1992). Discrete point or plot field observations performed on the ground within a forest stand may not represent continuous observations made through sensors above the forest stand (Turner 2014; Immitzer et al. 2018; Leitão et al. 2018). Kattenborn et al. (2020) used a drone to collect reliable ground-reference data on three different invasive plants, suggesting drones as a promising alternative to cost-ineffective field surveys for ground-reference data. Drones may help to overcome many of these constraints and improve the quality and quantity of ground-reference data at broad scales. Remote sensing uses of drones in mapping and measuring landscapes are full of possibilities waiting to be explored.

Be transparent and report error

Increased transparency and clear communication of data and methods within drone remote sensing research are necessary, as studies have only sporadically reported flight planning details, image overlap, GCP placement, spectral calibration, etc. (Singh and Frazier 2018). Providing comprehensive methodological details in publications (and open data when able) will help to combat reproducibility and replicability issues within geospatial research (Kedron et al. 2021). Table 3 provides a broad overview of items to include in publications, organized into two broad categories of drone operation and data (the latter includes collection, processing, and error as subcategories). Drone operation requires reporting of aircraft specifications, flight logs, and study site details (see Table 3). The data collection portion entails a description of sensor specifications, image characteristics (if imagery was captured), and ground-based data. Publications should inform readers of data processing components such as software and parameters, georeferencing approach, and data product details (see James et al. 2019 for SfM-specific guidance).
Reporting error is especially important with terrain models (using image-based approaches and lidar), orthophotomosaics, and other data. Carrivick et al. (2016) rightly point out inconsistent reporting of errors associated specifically with SfM-derived products.
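The vertical error metrics named in Table 3 are straightforward to compute from independent check point residuals, and reporting ME, MAE, RMSE, and SDE together separates systematic bias from dispersion rather than collapsing both into a single number. A minimal sketch, with illustrative variable names:

```python
# Vertical error metrics (Table 3) from independent check points (CPs):
# residual = modelled elevation minus GNSS-surveyed elevation.
import numpy as np

def vertical_error_metrics(z_model: np.ndarray, z_check: np.ndarray) -> dict:
    residuals = z_model - z_check
    return {
        "ME": residuals.mean(),                    # mean error (systematic bias)
        "MAE": np.abs(residuals).mean(),           # mean absolute error
        "RMSE": np.sqrt((residuals ** 2).mean()),  # root mean square error
        "SDE": residuals.std(ddof=1),              # standard deviation of error
    }

# Hypothetical usage with DEM-sampled and surveyed CP elevations:
# print(vertical_error_metrics(dem_z_at_cps, surveyed_z_at_cps))
```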
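Relatedly, the earlier suggestion to integrate onboard GNSS/IMU records with other low-cost sensors often reduces, in practice, to a timestamp join between the drone's flight log and an external sensor log. The sketch below uses hypothetical file names and column layouts purely for illustration:

```python
# Sketch: attach a position/attitude to each external sensor reading by
# matching timestamps between a flight log and a sensor log.
import pandas as pd

flight_log = pd.read_csv("flight_log.csv", parse_dates=["time"])   # time, lat, lon, alt, roll, pitch, yaw
sensor_log = pd.read_csv("pm25_sensor.csv", parse_dates=["time"])  # time, pm25

# merge_asof pairs each sensor reading with the nearest GNSS fix within a
# tolerance; both frames must be sorted by the join key.
merged = pd.merge_asof(
    sensor_log.sort_values("time"),
    flight_log.sort_values("time"),
    on="time",
    tolerance=pd.Timedelta("200ms"),
    direction="nearest",
)
print(merged[["time", "lat", "lon", "alt", "pm25"]].head())
```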
Fig. 7. A conceptual framework for drones incorporating three major components (including descriptors for each): development/manufacturing/testing of drone technology, operation of unoccupied aircraft, and application of drones and drone-collected data. Contributing academic disciplines (by no means an exhaustive list) are placed outside of the triangle in proximity to their predominant focus (i.e., engineers focus on developing drone technology, whereas geographers emphasize data analysis for topographic modeling).
Operation of unoccupied aircraft encompasses piloting (autonomous and/or manual), flight planning, and data collection. Aviation, atmospheric science, physics, law, and policy contribute to drone operation. By no means are researchers from other fields unable to learn to operate a drone; these fields provide the background needed for drone operation for everyone. Application of drones is primarily focused on the drone-collected data and its use for a particular topic of interest. The application work requires remote sensing data and analysis techniques, computer vision data processing methodologies, data visualization, and mapping. A wide variety of disciplines utilize drones for applications ranging from biology and environmental science to agriculture, civil and environmental engineering, geography, and geology. Computer science advancements help all areas, but predominantly development and application (i.e., including SfM-MVS photogrammetry/computer vision). For remote sensing scientists, the incorporation of drone technology presents opportunities to collaborate with partners in other disciplines. Researchers should not overlook the importance of fields such as science and technology studies, philosophy, and ethics to engage with the critical components of working with drones (Braun et al. 2015; Jackman and Brickell 2021).
Importantly, though, synergy across disciplinary boundaries can create more impactful work (Calvario et al. 2017; Hoople et al. 2019). Practically, conducting drone remote sensing work can be time-consuming, with fieldwork, the need to obtain a remote pilot certificate, etc. Do not do it all by yourself! A well-designed drone remote sensing team brings together complementary experts to address complex research problems. Not every remote sensing scientist will find operating a drone easy, especially when they are likely more concerned with data collection. This could mean teaming up with a pilot, working with an engineer who can customize the drone and sensor technology, and/or including an agricultural specialist to better understand the crop being remotely sensed. However, multi- and interdisciplinary research can be challenging due to a lack of understanding of each other's disciplines. Such collaboration can be extremely rewarding, but collaborative relationships take time to foster. Be aware that those from fields of study that have yet to implement drones might view the technology as a magical solution capable of solving any problem. Additionally, those interested in using drones in research should appreciate that remote sensing is a field of research in which it takes many years to hone specific skills; it is very difficult to just buy a drone and be ready to collect data (as many expect). Often, drone remote sensing experts must break the news to potential collaborators about what the technology can actually do. Have resources prepared to share with colleagues about your expertise and how it relates to your planned collaborative work.
More broadly, off-campus, the power of drones lies in their ability to change and potentially increase participation in the remote sensing process. Prior to drones, remotely sensed data collection was primarily a top-down process. Managers, academics, and government authorities, with different outcomes in mind, deployed the full range of platforms (pigeons, kites, balloons, and satellites, for example) to obtain data. Imagine waking up one morning to see a strange device flying over your head, undoubtedly looking at your land, not knowing who sent it or what exactly it is doing. For satellites and airplanes, local people possess little power over how and when they fly over their landscapes. The advent of drones has the potential to change this and benefit local communities. Community-based scholars agree on the myriad benefits of drone implementation in supporting remote sensing research (Paneque-Gálvez et al. 2014, 2017; Wachowiak et al. 2017; Vargas-Ramírez and Paneque-Gálvez 2019).
It is true that the law surrounding the deployment of drones across the world has been evolving to the point where most countries have regulatory frameworks surrounding their use. But while drone users may easily meet the regulatory requirements and fly in spaces where interactions with people are not necessary, embracing local people can enhance the remote sensing process. Local people understand their landscapes, have questions drone imagery may be able to address, and, most critically, are curious about what will be done with the resulting data (see local engagement with drones to examine agricultural landscapes in Cummings et al. 2017a, 2017b). For scholars, engaging local people provides the opportunity to address such questions, thereby allowing the research activity to have positive impacts beyond academic articles. Researchers, like it or not, are invariably role models. Most researchers are associated with institutions of higher learning where peers and students observe their conduct; at the very least, their work is released in some form that can influence other people's thinking. It is therefore imperative that researchers traverse their study areas and spaces in such a manner that they leave them better than they found them. An engaged and informed local community, however defined, will more than likely bear the best testimony of the impacts of the drone data collection and handling processes and serve to enhance the credibility of drone remote sensing. In this vein, Bennett et al. (2022) highlight the importance of engaging situated knowledge and empowering marginalized actors within remote sensing work. Although many would agree with this sentiment, few drone remote sensing works have adopted this approach. Cummings et al. (2017a) provide an important example of collaborative drone remote sensing with Indigenous communities in rural Guyana, specifically focusing on empowerment through resource management practices. More broadly, though, the literature on drone-based countermapping also illustrates how mapping practices can be inclusive and empowering (see Radjawali and Pye 2017; Radjawali et al. 2017).
Remember, you do not have to learn it all when it comes to drones because there is often too much to cover. Be open to learning and working with others. Get involved and support the work of your colleagues. Acknowledge the work of your colleagues and share authorship with the groups you work with, including landowners and community members who contribute to your projects (see Cummings et al. 2017a, 2017b).

Summary

Simply put, drones are and will continue to impact and change the field of remote sensing. But will we, the remote sensing community, change with it? The growing body of
Baluja, J., Diago, M.P., Balda, P., Zorer, R., Meggio, F., Morales, F., and Tardaguila, J. 2012. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrig. Sci. 30(6): 511–522. doi:10.1007/s00271-012-0382-9.
Barnhart, R.K., Marshall, D.M., and Shappee, E. (Editors). 2021. Introduction to unmanned aircraft systems. 3rd ed. CRC Press, Boca Raton, FL. ISBN: 9780367366599.
Bekar, A., Antoniou, M., and Baker, C.J. 2022. Low-cost, high-resolution, drone-borne SAR imaging. IEEE Trans. Geosci. Remote Sens. 60: 5208811. doi:10.1109/TGRS.2021.3085235.
Bennett, M.M., Chen, J.K., Alvarez León, L.F., and Gleason, C.J. 2022. The politics of pixels: a review and agenda for critical remote sensing. Prog. Hum. Geogr. 46(3): 729–752. doi:10.1177/03091325221074691.
Bieber, P., Seifried, T.M., Burkart, J., Gratzl, J., Kasper-Giebl, A., Schmale, D.G., and Grothe, H. 2020. A drone-based bioaerosol sampling system to monitor ice nucleation particles in the lower atmosphere. Remote Sens. 12(3): 552. doi:10.3390/rs12030552.
Bolch, E.A., Hestir, E.L., and Khanna, S. 2021. Performance and feasibility of drone-mounted imaging spectroscopy for invasive aquatic vegetation detection. Remote Sens. 13(4): 582. doi:10.3390/rs13040582.
Bracken-Roche, C. 2016. Domestic drones: the politics of verticality and the surveillance industrial complex. Geogr. Helv. 71: 167–172. doi:10.5194/gh-71-167-2016.
Braun, S., Friedewald, M., and Valkenburg, G. 2015. Civilizing drones – military discourses going civil? Sci. Technol. Stud. 28(2): 73–87. doi:10.23987/sts.55351.
Calvario, G., Sierra, B., Alarcón, T.E., Hernandez, C., and Dalmau, O. 2017. A multi-disciplinary approach to remote sensing through low-cost UAVs. Sensors, 17(6): 1411. doi:10.3390/s17061411.
Campbell, M.J., Dennison, P.E., Tune, J.W., Kannenberg, S.A., Kerr, K.L., Codding, B.F., and Anderegg, W.R.L. 2020. A multi-sensor, multi-scale approach to mapping tree mortality in woodland ecosystems. Remote Sens. Environ. 245: 111853. doi:10.1016/j.rse.2020.111853.
Carbonneau, P.E., and Dietrich, J. 2017. Cost-effective non-metric photogrammetry from consumer-grade sUAS: implications for direct georeferencing of structure from motion photogrammetry. Earth Surf. Processes Landforms, 42: 473–486. doi:10.1002/esp.4012.
Carrivick, J.L., Smith, M.W., and Quincey, D.J. 2016. Structure from Motion in the geosciences. Wiley-Blackwell, Chichester, UK.
Chen, G., Hay, G.J., Carvalho, L.M.T., and Wulder, M.A. 2012. Object-based change detection. Int. J. Remote Sens. 33(14): 4434–4457. doi:10.1080/01431161.2011.648285.
Clapuyt, F., Vanacker, V., Schlunegger, F., and Van Oost, K. 2017. Unravelling earth flow dynamics with 3-D time series derived from UAV-SfM models. Earth Surf. Dyn. 5: 791–806. doi:10.5194/esurf-5-791-2017.
Cliffe, A.D. 2019. Evaluating the introduction of unmanned aerial vehicles for teaching and learning in geoscience fieldwork education. J. Geogr. Higher Educ. 43(4): 582–598. doi:10.1080/03098265.2019.1655718.
Cohn, M. 2014. Drones and targeted killing: legal, moral, and geopolitical issues. Olive Branch Press, New York.
Coker, C. 2013. Warrior geeks: how 21st century technology is changing the way we fight and think about war. Columbia University Press, New York.
Colomina, I., and Molina, P. 2014. Unmanned aerial systems for photogrammetry and remote sensing: a review. ISPRS J. Photogramm. Remote Sens. 92: 79–97. doi:10.1016/j.isprsjprs.2014.02.013.
Cummings, A.R., Cummings, G.R., Hamer, E., Moses, P., Norman, Z., Captain, V., et al. 2017a. Developing a UAV-based monitoring program with indigenous peoples. J. Unmanned Veh. Syst. 5(4): 115–125. doi:10.1139/juvs-2016-0022.
Cummings, A.R., Karale, Y., Cummings, G.R., Hamer, E., Moses, P., Norman, Z., and Captain, V. 2017b. UAV-derived data for mapping change on a swidden agriculture plot: preliminary results from a pilot study. Int. J. Remote Sens. 38(8–10): 2066–2082. doi:10.1080/01431161.2017.1295487.
Cummings, A.R., McKee, A., Kulkarni, K., and Markandey, N. 2017c. The rise of UAVs. Photogramm. Eng. Remote Sens. 83(4): 317–325. doi:10.14358/PERS.83.4.317.
Dandois, J.P., and Ellis, E.C. 2013. High spatial resolution three-dimensional mapping of vegetation spectral dynamics using computer vision. Remote Sens. Environ. 136: 259–276. doi:10.1016/j.rse.2013.04.005.
Dandois, J.P., Olano, M., and Ellis, E.C. 2015. Optimal altitude, overlap, and weather conditions for computer vision UAV estimates of forest structure. Remote Sens. 7(10): 13895–13920. doi:10.3390/rs71013895.
Dang, L.M., Wang, H., Li, Y., Min, K., Kwak, J.T., Lee, O.N., et al. 2020. Fusarium wilt of radish detection using RGB and near infrared images from unmanned aerial vehicles. Remote Sens. 12: 2863. doi:10.3390/rs12172863.
Davis, D.S., Buffa, D., Rasolondrainy, T., Creswell, E., Anyanwu, C., Ibirogba, A., et al. 2021. The aerial panopticon and the ethics of archaeological remote sensing in sacred cultural spaces. Archaeol. Prospect. 28(3): 305–320. doi:10.1002/arp.1819.
Deng, J.S., Wang, K., Deng, Y.H., and Qi, G.J. 2008. PCA-based land-use change detection and analysis using multitemporal and multisensor satellite data. Int. J. Remote Sens. 29(16): 4823–4838. doi:10.1080/01431160801950162.
Deng, L., Chen, Y., Zhao, Y., Zhu, L., Gong, H.-L., Guo, L.-J., and Zou, H.-Y. 2021. An approach for reflectance anisotropy retrieval from UAV-based oblique photogrammetry hyperspectral imagery. Int. J. Appl. Earth Observ. Geoinf. 102: 102442. doi:10.1016/j.jag.2021.102442.
DJI. 2022. Zenmuse L1. Available from https://www.dji.com/zenmuse-l1 [accessed 24 May 2022].
DroneDeploy. 2022. Drone industry resources. Available from https://www.dronedeploy.com/resources/ [accessed 25 October 2022].
DronesMadeEasy. 2021. Overlap management. Available from https://support.dronesmadeeasy.com/hc/en-us/articles/207743803-Overlap-Management [accessed 25 October 2022].
Ebeid, E., Skriver, M., and Jin, J. 2017. A survey on open-source flight control platforms of unmanned aerial vehicle. In Proceedings of the 2017 Euromicro Conference on Digital System Design. pp. 396–402. doi:10.1109/DSD.2017.30.
Enemark, C. 2013. Armed drones and the ethics of war: military virtue in a post-heroic age. Routledge, Kentucky.
Esposito, G., Mastrorocco, G., Salvini, R., Oliveti, M., and Starita, P. 2017a. Application of UAV photogrammetry for the multi-temporal estimation of surface extent and volumetric excavation in the Sa Pigada Bianca open-pit mine, Sardinia, Italy. Environ. Earth Sci. 76: 103. doi:10.1007/s12665-017-6409-z.
Esposito, G., Salvini, R., Matano, F., Sacchi, M., Danzi, M., Somma, R., and Troise, C. 2017b. Multitemporal monitoring of a coastal landslide through SfM-derived point cloud comparison. Photogramm. Rec. 32(160): 459–479. doi:10.1111/phor.12218.
FAA (Federal Aviation Administration). 2021. Part 89 – remote identification of unmanned aircraft. Available from https://www.ecfr.gov/current/title-14/chapter-I/subchapter-F/part-89 [accessed 24 May 2022].
FAA (Federal Aviation Administration). 2022. Unmanned aircraft systems (UAS). Available from https://www.faa.gov/uas/ [accessed 11 April 2022].
Fawcett, D., and Anderson, K. 2019. Investigating impacts of calibration methodology and irradiance variations on lightweight drone-based sensor derived surface reflectance products. In Proceedings of SPIE: Remote Sensing for Agriculture, Ecosystems, and Hydrology XXI, 111490D. doi:10.1117/12.2533106.
Fawcett, D., Panigada, C., Tagliabue, G., Boschetti, M., Celesti, M., Evdokimov, A., et al. 2020. Multi-scale evaluation of drone-based multispectral surface reflectance and vegetation indices in operational conditions. Remote Sens. 12(3): 514. doi:10.3390/rs12030514.
Fonstad, M.A., Dietrich, J.T., Courville, B.C., Jensen, J.L., and Carbonneau, P.E. 2013. Topographic structure from motion: a new development in photogrammetric measurement. Earth Surf. Processes Landforms, 38(4): 421–430. doi:10.1002/esp.3366.
Fraser, B.T., and Congalton, R.G. 2018. Issues in unmanned aerial systems (UAS) data collection of complex forest environments. Remote Sens. 10: 908. doi:10.3390/rs10060908.
Frazier, A.E., and Singh, K.K. (Editors). 2021. Fundamentals of capturing and processing drone imagery and data. CRC Press, New York. ISBN: 9780367245726.
Furukawa, Y., and Ponce, J. 2010. Accurate, dense and robust multiview stereopsis. IEEE Trans. Pattern Anal. Mach. Intell. 32(8): 1362–1376. doi:10.1109/tpami.2009.161.
Furukawa, Y., Curless, B., Seitz, S.M., and Szeliski, R. 2010. Towards internet-scale multi-view stereo. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, CA, USA. pp. 1434–1441. doi:10.1109/CVPR.2010.5539802.
Galván Rangel, J.M., Gonçalves, G.R., and Antonio Pérez, J. 2018. The impact of number and spatial distribution of GCPs on the positional accuracy of geospatial products derived from low-cost UASs. Int. J. Remote Sens. 39(21): 7154–7171. doi:10.1080/01431161.2018.1515508.
GeoIDEA Lab (Auburn University Geosciences). 2022. Mission checklist – UAV (P4P). Available from https://geolightbulbcom.files.wordpress.com/2022/10/uas-mission-checklist-p4p.docx [accessed 3 November 2022].
GISCI (Geographic Information Systems Certification Institute). 2022. The GIS Certification Institute: code of ethics. Available from https://www.gisci.org/Portals/0/Ethics/CodeOfEthics_PR.pdf [accessed 31 July 2022].
Gómez-Gutiérrez, A., and Gonçalves, G.R. 2020. Surveying coastal cliffs using two UAV platforms (multirotor and fixed-wing) and three different approaches for the estimation of volumetric changes. Int. J. Remote Sens. 41(21): 8143–8175. doi:10.1080/01431161.2020.1752950.
Goodchild, M.F. 1992. Geographical information science. Int. J. Geogr. Inf. Syst. 6(1): 31–45. doi:10.1080/02693799208901893.
Green, D.R., Gregory, B.J., and Karachok, A. (Editors). 2020. Unmanned aerial remote sensing: UAS for environmental applications. CRC Press (Taylor & Francis Group), New York. ISBN: 9781482246070.
Griffiths, D., and Burningham, H. 2019. Comparison of pre- and self-calibrated camera calibration models for UAS-derived nadir imagery for a SfM application. Prog. Phys. Geogr. 43(2): 215–235. doi:10.1177/0309133318788964.
Guo, Y., Senthilnath, J., Wu, W., Zhang, X., Zeng, Z., and Huang, H. 2019. Radiometric calibration for multispectral camera of different imaging conditions mounted on a UAV platform. Sustainability, 11: 978. doi:10.3390/su11040978.
Hardin, P.J., and Hardin, T.J. 2010. Small-scale remotely piloted vehicles in environmental research. Geogr. Compass, 4(9): 1297–1311. doi:10.1111/j.1749-8198.2010.00381.x.
Hardin, P.J., Lulla, V., Jensen, R.R., and Jensen, J.R. 2019. Small unmanned aerial systems (sUAS) for environmental remote sensing: challenges and opportunities revisited. GIScience Remote Sens. 56(2): 309–322. doi:10.1080/15481603.2018.1510088.
Harvey, M.C., Hare, D.K., Hackman, A., Davenport, G., Haynes, A.B., Helton, A., et al. 2019. Evaluation of stream and wetland restoration using UAS-based thermal infrared mapping. Water, 11: 1568. doi:10.3390/w11081568.
Harwin, S., Lucieer, A., and Osborn, J. 2015. The impact of calibration method on the accuracy of point clouds derived using unmanned aerial vehicle multi-view stereopsis. Remote Sens. 7(9): 11933–11953. doi:10.3390/rs70911933.
Heinemann, S., Siegmann, B., Thonfeld, F., Muro, J., Jedmowski, C., Kemna, A., et al. 2020. Land surface temperature retrieval for agricultural areas using a novel UAV platform equipped with a thermal infrared and multispectral sensor. Remote Sens. 12: 1075. doi:10.3390/rs12071075.
Hemingway, B.L., Frazier, A.E., Elbing, B.R., and Jacob, J.D. 2017. Vertical sampling scales for atmospheric boundary layer measurements from small unmanned aircraft systems (sUAS). Atmosphere, 8(9): 176. doi:10.3390/atmos8090176.
Hodgson, M.E., and Sella-Villa, D. 2021. State-level statutes governing unmanned aerial vehicle use in academic research in the United States. Int. J. Remote Sens. 42(14): 5366–5395. doi:10.1080/01431161.2021.1916121.
Hoople, G., Choi-Fitzpatrick, A., and Reddy, E. 2019. Drones for good: interdisciplinary project-based learning between engineering and peace studies. Int. J. Eng. Educ. 35(5): 1378–1391.
Huang, S., Tang, L., Hupy, J.P., Wang, Y., and Shao, G. 2021. A commentary on the use of normalized difference vegetation index (NDVI) in the era of popular remote sensing. J. For. Res. 32(1): 1–6. doi:10.1007/s11676-020-01155-1.
Hunt, E.R., and Daughtry, C.S.T. 2018. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 39(15–16): 5345–5376. doi:10.1080/01431161.2017.1410300.
Hunt, E.R., Hively, W.D., Fujikawa, S.J., Linden, D.S., Daughtry, C.S.T., and McCarty, G.W. 2010. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2: 290–305. doi:10.3390/rs2010290.
ICAO (International Civil Aviation Organization). 2022. ICAO model UAS regulations – introduction to ICAO model regulations and advisory circulars. Available from https://www.icao.int/safety/UA/Pages/ICAO-Model-UAS-Regulations.aspx [accessed 12 April 2022].
Immitzer, M., Böck, S., Einzmann, K., Vuolo, F., Pinnel, N., Wallner, A., and Atzberger, C. 2018. Fractional cover mapping of spruce and pine at 1 ha resolution combining very high and medium spatial resolution satellite imagery. Remote Sens. Environ. 204: 690–703. doi:10.1016/j.rse.2017.09.031.
Jackman, A., and Brickell, K. 2021. 'Everyday droning': towards a feminist geopolitics of the drone-home. Prog. Hum. Geogr. doi:10.1177/03091325211018745.
James, M.R., and Robson, S. 2014. Mitigating systematic error in topographic models derived from UAV and ground-based image networks. Earth Surf. Processes Landforms, 39: 1413–1420. doi:10.1002/esp.3609.
James, M.R., Chandler, J.H., Eltner, A., Fraser, C., Miller, P.E., Mills, J.P., et al. 2019. Guidelines on the use of structure-from-motion photogrammetry in geomorphic research. Earth Surf. Processes Landforms, 44(10): 2081–2084. doi:10.1002/esp.4637.
James, M.R., Robson, S., d'Oleire-Oltmanns, S., and Niethammer, U. 2017. Optimising UAV topographic surveys processed with structure-from-motion: ground control quality, quantity and bundle adjustment. Geomorphology, 280: 51–66. doi:10.1016/j.geomorph.2016.11.021.
Jaskierniak, D., Lucieer, A., Kuczera, G., Turner, D., Lane, P.N.J., Benyon, R.G., and Haydon, S. 2021. Individual tree detection and crown delineation from unmanned aircraft system (UAS) LiDAR in structurally complex mixed species eucalypt forests. ISPRS J. Photogramm. Remote Sens. 171: 171–187. doi:10.1016/j.isprsjprs.2020.10.016.
Jensen, J.R. 2017. Drone aerial photography and videography: data collection and image interpretation. eBook.
Jenssen, R.O.R., and Jacobsen, S.K. 2021. Measurement of snow water equivalent using drone-mounted ultra-wide-band radar. Remote Sens. 13(13): 2610. doi:10.3390/rs13132610.
Jeziorska, J. 2014. Unmanned aerial vehicle – a tool for acquiring spatial data for research and commercial purposes. New course in the geography and cartography curriculum in higher education. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. XL-6: 37–42. doi:10.5194/isprsarchives-XL-6-37-2014.
Jiang, S., and Jiang, W. 2017. On-board GNSS/IMU assisted feature extraction and matching for oblique UAV images. Remote Sens. 9(8): 813. doi:10.3390/rs9080813.
Joyce, K.E., Duce, S., Leahy, S.M., Leon, J., and Maier, S.W. 2018. Principles and practice of acquiring drone-based image data in marine environments. Mar. Freshwater Res. 70(7): 952–963. doi:10.1071/MF17380.
Joyce, K.E., Meiklejohn, N., and Mead, P.C.H. 2020. Using minidrones to teach geospatial technology fundamentals. Drones, 4(3): 57. doi:10.3390/drones4030057.
Kaartinen, H., Hyyppä, J., Vastaranta, M., Kukko, A., Jaakkola, A., Yu, X., et al. 2015. Accuracy of kinematic positioning using global satellite navigation systems under forest canopies. Forests, 6(9): 3218–3236. doi:10.3390/f6093218.
Kaminsky, R.S., Snavely, N., Seitz, S.M., and Szeliski, R. 2009. Alignment of 3D point clouds to overhead images. In Proceedings of the 2009 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Miami, FL, USA. Vol. 1. pp. 63–70. doi:10.1109/CVPRW.2009.5204180.
Kattenborn, T., Eichel, J., Wiser, S., Burrows, L., Fassnacht, F.E., and Schmidtlein, S. 2020. Convolutional neural networks accurately predict cover fractions of plant species and communities in unmanned aerial vehicle imagery. Remote Sens. Ecol. Conserv. 6: 472–486. doi:10.1002/rse2.146.
Kattenborn, T., Lopatin, J., Forster, M., Braun, A.C., and Fassnacht, F.E. 2019. UAV data as alternative to field sampling to map woody invasive species based on combined Sentinel-1 and Sentinel-2 data. Remote Sens. Environ. 227: 61–73. doi:10.1016/j.rse.2019.03.025.
Kedron, P., Li, W., Fotheringham, S., and Goodchild, M. 2021. Reproducibility and replicability: opportunities and challenges for geospatial research. Int. J. Geogr. Inf. Sci. 35(3): 427–445. doi:10.1080/13658816.2020.1802032.
Kelcey, J., and Lucieer, A. 2012. Sensor correction of a 6-band multispectral imaging sensor for UAV remote sensing. Remote Sens. 4(5): 1462–1493. doi:10.3390/rs4051462.
Lamb, D.W., Hall, A., Louis, J., and Frazier, P. 2004. Remote sensing for vineyard management. Does (pixel) size really matter? Aust. NZ Grapegrower Winemaker, 485: 139–142.
Leitão, P., Schwieder, M., Pötzschner, F., Pinto, J.R.R., Teixeira, A.M.C., Pedroni, F., et al. 2018. From sample to pixel: multi-scale remote sensing data for upscaling aboveground carbon data in heterogeneous landscapes. Ecosphere, 9(8): e02298. doi:10.1002/ecs2.2298.
Lepš, J., and Hadincová, V. 1992. How reliable are our vegetation analyses? J. Veg. Sci. 3(1): 119–124. doi:10.2307/3236006.
Li, D., Zheng, H., Xu, X., Lu, N., Yao, X., Jiang, J., et al. 2018. BRDF effect on the estimation of canopy chlorophyll content in paddy rice from UAV-based hyperspectral imagery. In 2018 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Valencia, Spain. pp. 6464–6467. doi:10.1109/IGARSS.2018.8517684.
Li, S., Dragicevic, S., Antón Castro, F., Sester, M., Winter, S., Coltekin, A., et al. 2016. Geospatial big data handling theory and methods: a review and research challenges. ISPRS J. Photogramm. Remote Sens. 115: 119–133. doi:10.1016/j.isprsjprs.2015.10.012.
Lippitt, C.D. 2015. Remote sensing from small unmanned platforms: a paradigm shift. Environ. Pract. 17(3): 235–236. doi:10.1017/S1466046615000204.
Lippitt, C.D., and Zhang, S. 2018. The impact of small unmanned airborne platforms on passive optical remote sensing: a conceptual perspective. Int. J. Remote Sens. 39(15–16): 4852–4868. doi:10.1080/01431161.2018.1490504.
Lippitt, C.D., Stow, D.A., and Clarke, K.C. 2014. On the nature of models for time-sensitive remote sensing. Int. J. Remote Sens. 35(18): 6815–6841. doi:10.1080/01431161.2014.965287.
Liu, X., Lian, X., Yang, W., Wang, F., Han, Y., and Zhang, Y. 2022. Accuracy assessment of a UAV direct georeferencing method and impact of the configuration of ground control points. Drones, 6: 30. doi:10.3390/drones6020030.
López, Y.A., García-Fernández, M., Álvarez-Narciandi, G., and Las-Heras Andrés, F. 2022. Unmanned aerial vehicle-based ground-penetrating radar systems: a review. IEEE Geosci. Remote Sens. Mag. 10(2): 66–86. doi:10.1109/MGRS.2022.3160664.
Lowe, D. 2004. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vision, 60(2): 91–110. doi:10.1023/B:VISI.0000029664.99615.94.
Ludwig, M., Runge, C.M., Friess, N., Koch, T.L., Richter, S., Seyfried, S., et al. 2020. Quality assessment of photogrammetric methods – a workflow for reproducible UAS orthomosaics. Remote Sens. 12(22): 3831. doi:10.3390/rs12223831.
Lunetta, R.S., Congalton, R.G., Fenstermaker, L.K., Jensen, J.R., McGwire, K.C., and Tinney, L.R. 1991. Remote sensing and geographic information system data integration: error sources and research issues. Photogramm. Eng. Remote Sens. 57(6): 677–687.
Madokoro, H., Kiguchi, O., Nagayoshi, T., Chiba, T., Inoue, M., Chiyonobu, S., et al. 2021. Development of drone-mounted multiple sensing system with advanced mobility for in situ atmospheric measurement: a case study focusing on PM2.5 local distribution. Sensors, 21(14): 4881. doi:10.3390/s21144881.
Mamaghani, B., and Salvaggio, C. 2019. Multispectral sensor calibration and characterization for sUAS remote sensing. Sensors, 19: 4453. doi:10.3390/s19204453.
Matese, A., and Di Gennaro, S.F. 2018. Practical applications of a multisensor UAV platform based on multispectral, thermal and RGB high resolution images in precision viticulture. Agriculture, 8(7): 116. doi:10.3390/agriculture8070116.
Matese, A., Toscano, P., Di Gennaro, S.F., Genesio, L., Vaccari, F.P., Primicerio, J., et al. 2015. Intercomparison of UAV, aircraft and satellite remote sensing platforms for precision viticulture. Remote Sens. 7(3): 2971–2990. doi:10.3390/rs70302971.
Mathews, A.J. 2014. Object-based spatiotemporal analysis of vine canopy vigor using an inexpensive UAV remote sensing system. J. Appl. Remote Sens. 8(1): 085199. doi:10.1117/1.JRS.8.085199.
Mathews, A.J. 2015. A practical UAV remote sensing methodology to generate multispectral orthophotos for vineyards: estimation of spectral reflectance using compact digital cameras. Int. J. Appl. Geospatial Res. 6(4): 65–87. doi:10.4018/ijagr.2015100104.
Mathews, A.J. 2021. Structure from motion (SfM) workflow for processing drone imagery. In Fundamentals of capturing and processing drone imagery and data. Edited by A.E. Frazier and K.K. Singh. CRC Press (Taylor & Francis Group), New York. ISBN: 9780367245726.
Mathews, A.J. 2022. Unoccupied aircraft systems. In Oxford bibliographies: geography. Edited by B. Warf. Oxford University Press.
Mathews, A.J., and Frazier, A.E. 2017. Unmanned aerial systems. In Geographic information science & technology body of knowledge. 2nd quarter 2017 ed. Edited by J.P. Wilson. doi:10.22224/gistbok/2017.2.4.
McCarthy, E.D., Martin, J.M., Boer, M.M., and Welbergen, J.A. 2021. Drone-based thermal remote sensing provides an effective new tool for monitoring the abundance of roosting fruit bats. Remote Sens. Ecol. Conserv. 7: 461–474. doi:10.1002/rse2.202.
Meinen, B.U., and Robinson, D.T. 2020. Mapping erosion and deposition in an agricultural landscape: optimization of UAV image acquisition schemes for SfM-MVS. Remote Sens. Environ. 239: 111666. doi:10.1016/j.rse.2020.111666.
MicaSense. 2022. Best practices: collecting data with MicaSense sensors. Available from https://support.micasense.com/hc/en-us/articles/224893167-Best-practices-Collecting-Data-with-MicaSense-Sensors [accessed 25 October 2022].
Nababan, B., Mastu, L.O.K., Idris, N.H., and Panjaitan, J.P. 2021. Shallow-water benthic habitat mapping using drone with object-based image analyses. Remote Sens. 13(21): 4452. doi:10.3390/rs13214452.
Nevalainen, O., Honkavaara, E., Tuominen, S., Viljanen, N., Hakala, T., Yu, X., et al. 2017. Individual tree detection and classification with UAV-based photogrammetric point clouds and hyperspectral imaging. Remote Sens. 9(3): 185. doi:10.3390/rs9030185.
Nex, F., Armenakis, C., Cramer, M., Cucci, D.A., Gerke, M., Honkavaara, E., et al. 2022. UAV in the advent of the twenties: where we stand and what is next. ISPRS J. Photogramm. Remote Sens. 184: 215–242. doi:10.1016/j.isprsjprs.2021.12.006.
Nezami, S., Khoramshahi, E., Nevalainen, O., Pölönen, I., and Honkavaara, E. 2020. Tree species classification of drone hyperspectral and RGB imagery with deep learning convolutional neural networks. Remote Sens. 12(7): 1070. doi:10.3390/rs12071070.
Niu, Y., Zhang, L., Zhang, H., Han, W., and Peng, X. 2019. Estimating above-ground biomass of maize using features derived from UAV-based RGB imagery. Remote Sens. 11(11): 1261. doi:10.3390/rs11111261.
O'Connor, J., Smith, M.J., and James, M.R. 2017. Cameras and settings for aerial surveys in the geosciences: optimising image data. Prog. Phys. Geogr. 41(3): 325–344. doi:10.1177/0309133317703092.
Oniga, V.E., Breaban, A.I., Pfeifer, N., and Chirila, C. 2020. Determining the suitable number of ground control points for UAS images georeferencing by varying number and spatial distribution. Remote Sens. 12: 876. doi:10.3390/rs12050876.
Oré, G., Alcântara, M.S., Góes, J.A., Teruel, B., Oliveira, L.P., Yepes, J., et al. 2022. Predicting sugarcane harvest date and productivity with a drone-borne tri-band SAR. Remote Sens. 14(7): 1734. doi:10.3390/rs14071734.
Pajares, G. 2015. Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs). Photogramm. Eng. Remote Sens. 81(4): 281–329. doi:10.14358/PERS.81.4.281.
Paneque-Gálvez, J., McCall, M.K., Napoletano, B.M., Wich, S.A., and Koh, L.P. 2014. Small drones for community-based forest monitoring: an assessment of their feasibility and potential in tropical areas. Forests, 5(6): 1481–1507. doi:10.3390/f5061481.
Paneque-Gálvez, J., Vargas-Ramírez, N., Napoletano, B.M., and Cummings, A. 2017. Grassroots innovation using drones for indigenous mapping and monitoring. Land, 6: 86. doi:10.3390/land6040086.
Pix4D. 2022. Pix4D documentation. Available from https://support.pix4d.com/ [accessed 25 October 2022].
Pricope, N.G., Halls, J.N., Mapes, K.L., Baxley, J.B., and Wu, J.J. 2020. Quantitative comparison of UAS-borne LiDAR systems for high-resolution forested wetland mapping. Sensors, 20: 4453. doi:10.3390/s20164453.
Pricope, N.G., Mapes, K.L., Woodward, K.D., Olsen, S.F., and Baxley, J.B. 2019. Multi-sensor assessment of the effects of varying processing parameters on UAS product accuracy and quality. Drones, 3: 63. doi:10.3390/drones3030063.
Puko, T., and Ferek, K.S. 2019. Interior department grounds aerial drone fleet, citing risk from Chinese manufacturers. Wall Street J. Available from https://www.wsj.com/articles/interior-dept-grounds-aerial-drone-fleet-citing-risk-from-chinese-manufacturers-11572473703 [accessed 2 August 2022].
Radjawali, I., and Pye, O. 2017. Drones for justice: inclusive technology and river-related action research along the Kapuas. Geogr. Helv. 72: 17–27. doi:10.5194/gh-72-17-2017.
Radjawali, I., Pye, O., and Flitner, M. 2017. Recognition through reconnaissance? Using drones for counter-mapping in Indonesia. J. Peasant Stud. 44(4): 817–833. doi:10.1080/03066150.2016.1264937.
Rogers, S., Livingstone, W., and Manning, I. 2020. Comparing the spatial accuracy of digital surface models from four unoccupied aerial systems: photogrammetry versus LiDAR. Remote Sens. 12(17): 2806. doi:10.3390/rs12172806.
Rogers, S., Singh, K.K., Mathews, A.J., and Cummings, A.R. 2022. Drones and geography: who is using them and why? Prof. Geogr. doi:10.1080/00330124.2021.2000446.
Rosnell, T., and Honkavaara, E. 2012. Point cloud generation from aerial image data acquired by a quadrocopter type micro unmanned aerial vehicle and a digital still camera. Sensors, 12: 453–480. doi:10.3390/s120100453.
Sankey, T., Donager, J., McVay, J., and Sankey, J.B. 2017. UAV lidar and hyperspectral fusion for forest monitoring in the southwestern USA. Remote Sens. Environ. 195: 30–43. doi:10.1016/j.rse.2017.04.007.
Santana, L.S., Ferraz, G.A.E.S., Marin, D.B., Barbosa, B.D.S., Dos Santos, L.M., Ferraz, P.F.P., et al. 2021. Influence of flight altitude and control points in the georeferencing of images obtained by unmanned aerial vehicle. Eur. J. Remote Sens. 54(1): 59–71. doi:10.1080/22797254.2020.1845104.
Sanz-Ablanedo, E., Chandler, J.H., Rodríguez-Pérez, J.R., and Ordóñez, C. 2018. Accuracy of unmanned aerial vehicle (UAV) and SfM photogrammetry survey as a function of the number and location of ground control points used. Remote Sens. 10: 1606. doi:10.3390/rs10101606.
Schaeffer, S.E., Jiménez-Lizárraga, M., Rodriguez-Sanchez, S.V., Cuellar-Rodríguez, G., Aguirre-Calderón, O.A., Reyna-González, A.M., and Escobar, A. 2021. Detection of bark beetle infestation in drone imagery via thresholding cellular automata. J. Appl. Remote Sens. 15(1): 016518. doi:10.1117/1.JRS.15.016518.
Shahbazi, M. 2021. Professional drone mapping. In Unmanned aerial systems. Edited by A. Koubaa and A.T. Azar. Academic Press (Elsevier), London. pp. 439–464. ISBN: 9780128202760.
Shook, E., Bowlick, F.J., Kemp, K.K., Ahlqvist, O., Carbajeles-Dale, P., DiBiase, D., et al. 2019. Cyber literacy for GIScience: toward formalizing geospatial computing education. Prof. Geogr. 71(2): 221–238. doi:10.1080/00330124.2018.1518720.
Simic Milas, A., Cracknell, A.P., and Warner, T.A. 2018. Drones – the third generation source of remote sensing data. Int. J. Remote Sens. 39(21): 7125–7137. doi:10.1080/01431161.2018.1523832.
Singh, K.K., and Frazier, A.E. 2018. A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. Int. J. Remote Sens. 39(15–16): 5078–5098. doi:10.1080/01431161.2017.1420941.
Singh, K.K., Vogler, J.B., Shoemaker, D.A., and Meentemeyer, R.K. 2012. LiDAR-Landsat data fusion for large-area assessment of urban land cover: balancing spatial resolution, data volume and mapping accuracy. ISPRS J. Photogramm. Remote Sens. 74: 110–121. doi:10.1016/j.isprsjprs.2012.09.009.
Slonecker, E.T., Shaw, D.M., and Lillesand, T.M. 1998. Emerging legal and ethical issues in advanced remote sensing technology. Photogramm. Eng. Remote Sens. 64(6): 589–595.
Smith, G.M., and Milton, E.J. 1999. The use of the empirical line method to calibrate remotely sensed data to reflectance. Int. J. Remote Sens. 20(13): 2653–2662. doi:10.1080/014311699211994.
Snavely, N., Seitz, S.M., and Szeliski, R. 2008. Modeling the world from internet photo collections. Int. J. Comput. Vision, 80(2): 189–210. doi:10.1007/s11263-007-0107-3.
Stark, M., Heckmann, T., Piermattei, L., Dremel, F., Kaiser, A., Machowski, P., et al. 2021. From consumer to enterprise grade: how the choice of four UAS impacts point cloud quality. Earth Surf. Processes Landforms, 46: 2019–2043. doi:10.1002/esp.5142.
Stefanik, K.V., Gassaway, J.C., Kochersberger, K., and Abbott, A.L. 2011. UAV-based stereo vision for rapid aerial terrain mapping. GISci. Remote Sens. 48(1): 24–49. doi:10.2747/1548-1603.48.1.24.
Stöcker, C., Bennett, R., Nex, F., Gerke, M., and Zevenbergen, J. 2017. Review of the current state of UAV regulations. Remote Sens. 9(5): 459. doi:10.3390/rs9050459.
Stöcker, C., Nex, F., Koeva, M., and Gerke, M. 2020. High-quality UAV-based orthophotos for cadastral mapping: guidance for optimal flight configurations. Remote Sens. 12(21): 3625. doi:10.3390/rs12213625.
Strahler, A.H., Woodcock, C.E., and Smith, J.A. 1986. On the nature of models in remote sensing. Remote Sens. Environ. 20(2): 121–139. doi:10.1016/0034-4257(86)90018-0.
Suomalainen, J., Oliveira, R.A., Hakala, T., Koivumäki, N., Markelin, L., Näsi, R., and Honkavaara, E. 2021. Direct reflectance transformation methodology for drone-based hyperspectral imaging. Remote Sens. Environ. 266: 112691. doi:10.1016/j.rse.2021.112691.
Swayze, N.C., Tinkham, W.T., Creasy, M.B., Vogeler, J.C., Hoffman, C.M., and Hudak, A.T. 2022. Influence of UAS flight altitude and speed on aboveground biomass prediction. Remote Sens. 14: 1989. doi:10.3390/rs14091989.
Thomas, A.F., Frazier, A.E., Mathews, A.J., and Cordova, C.E. 2020. Impacts of abrupt terrain changes and grass cover on vertical accuracy of UAS-SfM derived elevation models. Pap. Appl. Geogr. 6(4): 336–351. doi:10.1080/23754931.2020.1782254.
Tmušić, G., Manfreda, S., Aasen, H., James, M.R., Gonçalves, G., Ben-Dor, E., et al. 2020. Current practices in UAS-based environmental monitoring. Remote Sens. 12(6): 1001. doi:10.3390/rs12061001.
Turner, D., Lucieer, A., and De Jong, S.M. 2015. Time series analysis of landslide dynamics using an unmanned aerial vehicle (UAV). Remote Sens. 7: 1736–1757. doi:10.3390/rs70201736.
Turner, W. 2014. Sensing biodiversity: sophisticated networks are required to make the best use of biodiversity data from satellites and in situ sensors. Science, 346(6207): 301–302. doi:10.1126/science.1256014.
U.S. DOD (Department of Defense). 2021. Department statement on DJI systems. Available from https://www.defense.gov/News/Releases/Release/Article/2706082/department-statement-on-dji-systems/ [accessed 2 August 2022].
U.S. DOI (Department of the Interior). 2020. Secretarial order 3379: temporary cessation of non-emergency unmanned aircraft systems fleet operations. Available from https://www.doi.gov/sites/doi.gov/files/elips/documents/signed-so-3379-uas-1.29.2020-508.pdf [accessed 2 August 2022].
U.S. Supreme Court. 1945. United States v. Causby, 328 U.S. 256. Available from https://tile.loc.gov/storage-services/service/ll/usrep/usrep328/usrep328256/usrep328256.pdf [accessed 26 July 2023].
Ullman, S. 1979. The interpretation of structure from motion. Proc. R. Soc. Lond. 203(1153): 405–426. doi:10.1098/rspb.1979.0006.
URISA (Urban and Regional Information Systems Association). 2003. GIS code of ethics. Available from https://www.urisa.org/about-us/gis-code-of-ethics/ [accessed 31 July 2022].
Valbuena, R., Mauro, F., Rodriguez-Solano, R., and Manzanera, J.A. 2010. Accuracy and precision of GPS receivers under forest canopies in a mountainous environment. Spanish J. Agric. Res. 8(4): 1047–1057. doi:10.5424/sjar/2010084-1242.
Vargas-Ramírez, N., and Paneque-Gálvez, J. 2019. The global emergence of community drones (2012–2017). Drones, 3: 76. doi:10.3390/drones3040076.
Vautherin, J., Rutishauser, S., Schneider-Zapp, K., Choi, H.F., Chovancova, V., Glass, A., and Strecha, C. 2016. Photogrammetric accuracy and modeling of rolling shutter cameras. ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci. III-3: 139–146. doi:10.5194/isprsannals-III-3-139-2016.
Ventura, D., Bonifazi, A., Gravina, M.F., Belluscio, A., and Ardizzone, G. 2018. Mapping and classification of ecologically sensitive marine habitats using unmanned aerial vehicle (UAV) imagery and object-based image analysis (OBIA). Remote Sens. 10(9): 1331. doi:10.3390/rs10091331.
Vergnano, A., Franco, D., and Godio, A. 2022. Drone-borne ground-penetrating radar for snow cover mapping. Remote Sens. 14(7): 1763. doi:10.3390/rs14071763.
Wachowiak, M.P., Walters, D.F., Kovacs, J.M., Wachowiak-Smolíková, R., and James, A.L. 2017. Visual analytics and remote sensing imagery to support community-based research for precision agriculture in emerging areas. Comput. Electron. Agric. 143: 149–164. doi:10.1016/j.compag.2017.09.035.
Wackrow, R., and Chandler, J.H. 2008. A convergent image configuration for DEM extraction that minimises the systematic effects caused by an inaccurate lens model. Photogramm. Rec. 23: 6–18. doi:10.1111/j.1477-9730.2008.00467.x.
Wackrow, R., and Chandler, J.H. 2011. Minimising systematic error surfaces in digital elevation models using oblique convergent imagery. Photogramm. Rec. 26: 16–31. doi:10.1111/j.1477-9730.2011.00623.x.
Wallace, L., Lucieer, A., Malenovský, Z., Turner, D., and Vopěnka, P. 2016. Assessment of forest structure using two UAV techniques: a comparison of airborne laser scanning and structure from motion (SfM) point clouds. Forests, 7: 62. doi:10.3390/f7030062.
Wallace, L., Lucieer, A., Watson, C., and Turner, D. 2012. Development of a UAV-lidar system with application to forest inventory. Remote Sens. 4(6): 1519–1543. doi:10.3390/rs4061519.
Young, D.J., Koontz, M.J., and Weeks, J. 2022. Optimizing aerial imagery collection and processing parameters for drone-based individual tree mapping in structurally complex conifer forests. Methods Ecol. Evol. 13: 1447–1463. doi:10.1111/2041-210X.13860.
Zimmerman, T., Jansen, K., and Miller, J. 2020. Analysis of UAS flight altitude and ground control point parameters on DEM accuracy along a complex, developed coastline. Remote Sens. 12: 2305. doi:10.3390/rs12142305.

Additional resources

GeoTED-UAS, https://vsgc.odu.edu/geoted-uas/
Guyana Civil Aviation Authority (GCAA). 2022. GCAA Advisory Circular: Unmanned Aerial Vehicles (UAVs). AC No: GCAA AC/UAV/001.
ICAO (International Civil Aviation Organization). The ICAO UAS Toolkit. https://www.icao.int/safety/UA/UASToolkit