3.2.2.1 RGB Cameras
Visible-spectrum (RGB) cameras are the most commonly used sensors paired with UAS platforms. These cameras generally collect high spatial resolution color imagery that can be used to generate digital elevation models (DEMs) and derive orthophoto mosaics.
3.2.2.2 Multispectral Sensors
Multispectral sensors extend beyond the visible portion of the electromagnetic spectrum. Multispectral images can be used to derive vegetation indices such as the Normalized Difference Vegetation Index (NDVI) and the Enhanced Normalized Difference Vegetation Index (ENDVI). This type of sensor is primarily used in vegetation and agricultural studies (Adam et al., 2010). With much higher spatial resolution than traditional multispectral sensors mounted on airplanes or satellites, multispectral data collected with UAS allow for detailed examinations of phenomena such as leaf-level farming (Calderón et al., 2014) and localized water pollution issues (Kislik et al., 2018). However, multispectral sensors are significantly more expensive than RGB cameras, and there is currently a lack of processing software that can efficiently handle the various formats of multispectral data (Yao et al., 2019).
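As an illustration of how such indices are computed from co-registered bands, the short sketch below derives NDVI and ENDVI with NumPy. It is a minimal example that assumes the multispectral bands have already been radiometrically calibrated and aligned as arrays of equal shape; the function names and the small epsilon guard against division by zero are illustrative choices, not details from the studies cited above.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-10) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + eps)

def endvi(nir: np.ndarray, green: np.ndarray, blue: np.ndarray,
          eps: float = 1e-10) -> np.ndarray:
    """Enhanced NDVI, commonly written ((NIR + G) - 2B) / ((NIR + G) + 2B)."""
    return ((nir + green) - 2.0 * blue) / ((nir + green) + 2.0 * blue + eps)

# Example with synthetic reflectance bands scaled to [0, 1]
nir, red = np.random.rand(256, 256), np.random.rand(256, 256)
print(float(ndvi(nir, red).mean()))
```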
3.2.2.3 Hyperspectral Sensors
Hyperspectral sensors capture the spectral response in many narrow bands. With such high spectral resolution, hyperspectral data are useful in many applications, including vegetation analyses (Adam et al., 2010), precision agriculture (Haboudane et al., 2004), and urban mapping (Benediktsson et al., 2005). However, the high spectral resolution is often achieved at the cost of spatial resolution, and it is challenging to derive high-accuracy products with the limited meta-information provided by the sensor manufacturer (Yao et al., 2019).
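Because hyperspectral bands are narrow and numerous, analyses typically begin by locating the bands closest to the wavelengths of interest. The sketch below shows this step for a cube stored as a (rows, columns, bands) array and then forms a simple narrow-band normalized difference; the cube layout, wavelength grid, and the 800/670 nm band choice are assumptions made purely for the example.

```python
import numpy as np

def nearest_band(wavelengths_nm: np.ndarray, target_nm: float) -> int:
    """Index of the band whose center wavelength is closest to the target."""
    return int(np.argmin(np.abs(wavelengths_nm - target_nm)))

def narrowband_nd(cube: np.ndarray, wavelengths_nm: np.ndarray,
                  a_nm: float, b_nm: float) -> np.ndarray:
    """Normalized difference of two narrow bands, e.g. 800 nm and 670 nm."""
    a = cube[:, :, nearest_band(wavelengths_nm, a_nm)].astype(float)
    b = cube[:, :, nearest_band(wavelengths_nm, b_nm)].astype(float)
    return (a - b) / (a + b + 1e-10)

# Synthetic cube: 200 x 200 pixels, 120 bands spanning 400-1000 nm
wavelengths = np.linspace(400, 1000, 120)
cube = np.random.rand(200, 200, 120)
index_map = narrowband_nd(cube, wavelengths, 800, 670)
```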
3.2.2.4 Thermal Cameras
Thermal cameras are designed to detect thermal emission in the mid-infrared range (Prakash, 2000). They are commonly used for temperature measurement in vegetation studies (Berni et al., 2009), environmental applications (Zarco-Tejada et al., 2012), and real-time detection of objects. Given the low flying height of UAS, the resulting products can have much higher spatial resolution and negligible atmospheric influence. However, UAS-based thermal cameras usually lack cooled detectors because of size constraints, which can lead to low sensitivity and capture rates (Yao et al., 2019).
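For readers unfamiliar with how radiometric thermal imagery is turned into temperature, the sketch below inverts the Planck function to convert at-sensor spectral radiance into brightness temperature. It is a simplified illustration: the 10 µm wavelength, the radiance units (W m⁻² sr⁻¹ µm⁻¹), and the omission of emissivity and atmospheric corrections are assumptions, and commercial thermal cameras normally apply their own factory calibration rather than this direct inversion.

```python
import numpy as np

# Planck radiation constants for wavelengths expressed in micrometers
C1 = 1.19104e8   # W um^4 m^-2 sr^-1
C2 = 1.43877e4   # um K

def brightness_temperature_k(radiance, wavelength_um: float = 10.0):
    """Invert Planck's law: T = C2 / (lambda * ln(C1 / (lambda^5 * L) + 1))."""
    radiance = np.asarray(radiance, dtype=float)
    return C2 / (wavelength_um * np.log(C1 / (wavelength_um ** 5 * radiance) + 1.0))

# A radiance of ~9.9 W m^-2 sr^-1 um^-1 at 10 um corresponds to roughly 300 K
print(float(brightness_temperature_k(9.93)))
```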
3.2.2.5 LiDAR
LiDAR is an active remote sensing technology that emits light pulses and records their reflections to measure distances. LiDAR is well known for its high geometric accuracy and its ability to penetrate forest canopies (Dalponte et al., 2009). Because LiDAR accuracy depends heavily on the positional accuracy of the platform, and because UAS are often unstable in flight and their onboard GPS is coarse relative to the sensor resolution, it is difficult to obtain accurate point clouds from UAS LiDAR without differential GPS stations. If cost is not a constraint, integrating RGB and LiDAR data is a promising way to improve measurement and interpretation accuracy (Campos-Taberner et al., 2016).
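The ranging principle itself is simple: the sensor measures the round-trip travel time of each pulse, and the range is half that time multiplied by the speed of light. A minimal sketch follows; the 400 ns timing value is purely illustrative.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def pulse_range_m(round_trip_time_s: float) -> float:
    """Range to a LiDAR return: half the round-trip time times the speed of light."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A 400 ns round trip corresponds to a target roughly 60 m away
print(pulse_range_m(400e-9))
```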
3.3 DATA COLLECTION AND PROCESSING
When using UAS for remote data collection, there are several approaches one can take depending on the desired data outcomes and the specific UAS platform and sensor available. UAS are versatile in the data collection techniques they support, but the types of data one can collect are highly dependent on the specific platform and sensors being used. Therefore, UAS are increasingly being designed and manufactured for specific data collection applications, such as vegetation monitoring in rural areas and 3D modeling of building construction in urban areas. Because of the diversity of scenarios in which UAS can be used, professionals should pay close attention to the methods they use to collect data, as there is no one-size-fits-all approach. This does not mean, however, that there are no best practices associated with UAS data collection. In recent years, UAS and remote sensing researchers have identified effective methodologies and best practices for UAS data collection (Hodgson and Koh, 2016; Pepe et al., 2018; Wu and An, 2019; Stecz and Gromada, 2020). In addition to familiarizing oneself with the latest best practices for a specific application, individuals who intend to use UAS for data collection should pay attention to three stages: mission planning, flight operations, and data processing.
3.3.1 MISSION PLANNING (PREFLIGHT)
When using a UAS for remote sensing purposes, operators first need to think about their specific project and its goals. Because there is no one-size-fits-all approach to using UAS for remote sensing, operators need to situate the technology within the intended project or application. This can often be accomplished by addressing several questions: What am I trying to address in my project? What type of data can a UAS provide to my project? Do I need a particular platform or sensor to acquire that type of data? Where will I be collecting my data (i.e. environmental context)? What potential obstacles could prevent me from acquiring those data? Are these potential obstacles physical (tall buildings, trees, powerlines), regulatory (illegal to fly in that location, limitations on altitude), or a combination of the two? By answering these questions, operators will be able to put together a cohesive and well-structured mission plan for their project. While there are various ways one can conceptualize a mission plan, Pepe et al. (2018) proposed a useful mission planning framework consisting of several integral components, such as determining the suitable UAS platform and sensor for the application, selecting a suitable flight plan design, and analyzing the user-determined factors that can impact the flight process. The first component, the choice among the various types of UAS platforms and sensors, was covered in Section 3.2, so here we focus on the last two components: flight design and flight factors.
The selection of a suitable flight plan design is critical not only to the quality of the final data output but also to the time-efficiency and cost-effectiveness of remote sensing projects. Flights for data collection are typically conducted in a manual, assisted, or autonomous fashion depending on the mission's specifications (Nex and Remondino, 2014). Manual flight plans refer to when an individual is in direct control of the UAS during the data collection process without the assistance of an autopilot system. Manual data collection allows the operator to have more direct control over the imaging process but is prone to pilot-induced errors, such as uneven image overlap or skewed image orientations. Assisted flight plans refer to when an individual is in partial control of the system but still has the assistance of an onboard autopilot, such as GPS-assisted hovering. These types of flight operations are useful when an operator needs to collect imagery in a precarious location where a completely autonomous flight might not be safe or efficient. Autonomous flight plans are the most robust and involve little to no direct user input during flight operations while the data are being collected. The automated nature of these flight plans allows the UAS to achieve higher precision in the data collection process (e.g. equal imaging intervals and/or consistent flight speed) and to feasibly collect larger datasets than are possible with manual operations. Autonomous flight plans utilize a user-created flight plan design, often generated with computer software or smartphone/tablet applications, which makes use of the UAS's hardware and autopilot functionality to perform flight operations without direct input from the pilot. These plans use waypoint functionality and user-defined flight parameters to make the UAS fly along a predetermined path at a set altitude and speed. After the pilot uploads the flight plan from the controller to the UAS itself, the UAS flies along the defined routes, collecting overlapping images of the target AOI. Autonomous flight plan designs vary depending on the desired data outputs after processing. However, several common factors impact all autonomous flight plan designs, such as image overlap, acquisition altitude, and sensor orientation.
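To make these factors concrete, the sketch below derives ground sample distance (GSD), image footprint, along-track photo spacing, and flight-line spacing from flying height, camera geometry, and the desired front and side overlap. The camera parameters and the 80%/70% overlap values are illustrative assumptions loosely modeled on a small-format RGB camera, not figures taken from the cited studies.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    focal_length_mm: float   # lens focal length
    sensor_width_mm: float   # physical sensor width (across track)
    image_width_px: int
    image_height_px: int

def flight_geometry(cam: Camera, altitude_m: float,
                    front_overlap: float = 0.8, side_overlap: float = 0.7):
    """Return (GSD in m/px, along-track photo spacing in m, flight-line spacing in m)."""
    gsd = (cam.sensor_width_mm / 1000.0) * altitude_m / (
        (cam.focal_length_mm / 1000.0) * cam.image_width_px)
    footprint_across = gsd * cam.image_width_px   # m, across track
    footprint_along = gsd * cam.image_height_px   # m, along track
    photo_spacing = footprint_along * (1.0 - front_overlap)
    line_spacing = footprint_across * (1.0 - side_overlap)
    return gsd, photo_spacing, line_spacing

# Illustrative 20 MP camera flown at 100 m above ground level
cam = Camera(focal_length_mm=8.8, sensor_width_mm=13.2,
             image_width_px=5472, image_height_px=3648)
print(flight_geometry(cam, altitude_m=100.0))  # roughly 2.7 cm/px GSD
```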