Imagery and GIS. Kass Green

Title: Imagery and GIS
Author: Kass Green
Genre: Geography
ISBN: 9781589484894

values are interpolated for two of every three bands at each pixel, Bayer filters will always have lower spectral resolution than multiheaded frame cameras or systems using dispersing elements.
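The two-of-three interpolation described above can be sketched in code. The following is a toy "block replication" demosaic for an RGGB Bayer mosaic, not the bilinear or edge-aware algorithms real cameras use; the array layout and function name are illustrative assumptions. Note that for every pixel, only one band value is measured and the other two are estimated.

```python
import numpy as np

def demosaic_block(mosaic):
    """Toy demosaic of an RGGB Bayer mosaic (even height and width).

    Each 2x2 cell holds one red, two green, and one blue sample;
    the measured values are replicated across the cell, so two of
    every three output values at each pixel are interpolated rather
    than measured.
    """
    h, w = mosaic.shape
    rgb = np.empty((h, w, 3), dtype=mosaic.dtype)
    r  = mosaic[0::2, 0::2]          # red samples
    g1 = mosaic[0::2, 1::2]          # green samples, first row of cell
    g2 = mosaic[1::2, 0::2]          # green samples, second row of cell
    b  = mosaic[1::2, 1::2]          # blue samples
    cell = np.ones((2, 2), dtype=mosaic.dtype)
    rgb[:, :, 0] = np.kron(r, cell)              # replicate red over each cell
    rgb[:, :, 1] = np.kron((g1 + g2) // 2, cell) # average the two greens, replicate
    rgb[:, :, 2] = np.kron(b, cell)              # replicate blue over each cell
    return rgb
```

A real pipeline would interpolate from neighboring cells as well, but the proportion of estimated values is the same, which is the source of the spectral-resolution penalty noted above.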


      Figure 3.6. How a Bayer filter framing camera system works. While the figure shows a true color image, Bayer filters can also be used to collect in the near-infrared portions of the electromagnetic spectrum, resulting in infrared imagery.


      Figure 3.7. How a multilens multispectral framing camera system works


      Figure 3.8. How a push broom multispectral scanner works with a dispersing element

      Active Sensors

      The most common active remote sensors are lidar and radar systems. As mentioned earlier, all active instruments work similarly by transmitting electromagnetic energy that is bounced back to the sensor from the surface of the earth. Because active sensors generate their own energy, they can capture imagery at any time of the day or night.

      Radar imagery is often used to create digital surface and digital elevation models over large regions, and to map sea or land cover in perpetually cloudy areas where optical imagery can’t be effectively collected. Figure 3.9 shows an example of a radar image of Los Angeles, California. Radar imagery is collected over a variety of microwave bands, which are denoted by letters and measured in centimeters as follows: Ka, 0.75 to 1.1 cm; K, 1.1 to 1.67 cm; Ku, 1.67 to 2.4 cm; X, 2.4 to 3.75 cm; C, 3.75 to 7.5 cm; S, 7.5 to 15 cm; L, 15 to 30 cm; and P, 30 to 100 cm. Usually, radar imagery is collected in just one band, resulting in a single band image. Bands X, C, and L are the most common ranges used in remote sensing. Some radar systems are able to collect imagery in several bands, resulting in multispectral radar imagery.
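The band boundaries listed above amount to a simple lookup from wavelength to letter designation. As a sketch (the table data comes straight from the ranges above; the function name is an illustrative assumption):

```python
# Microwave band designations and their wavelength ranges in centimeters,
# as given in the text: Ka through P, shortest wavelength first.
RADAR_BANDS = [
    ("Ka", 0.75, 1.1), ("K", 1.1, 1.67), ("Ku", 1.67, 2.4),
    ("X", 2.4, 3.75), ("C", 3.75, 7.5), ("S", 7.5, 15.0),
    ("L", 15.0, 30.0), ("P", 30.0, 100.0),
]

def radar_band(wavelength_cm):
    """Return the letter designation for a microwave wavelength in cm."""
    for name, lo, hi in RADAR_BANDS:
        if lo <= wavelength_cm < hi:
            return name
    raise ValueError(f"{wavelength_cm} cm is outside the Ka-P range")
```

For example, a sensor operating at about 5.6 cm falls in the C band, one of the three ranges (X, C, and L) the text notes as most common in remote sensing.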

      Varying antenna lengths are required to create the radar signal at these different wavelengths. Because it is often not viable to have a long antenna on a platform moving through the air or space, the length of the antenna is extended electronically through a process called synthetic aperture radar.

      Radar signals can also be transmitted and received in either horizontal or vertical polarizations or a combination of both. HH imagery is both transmitted and received in a horizontal polarization, and VV imagery is both transmitted and received in a vertical polarization (i.e., like-polarized). HV imagery is transmitted horizontally and received vertically, and VH imagery is transmitted vertically and received horizontally (i.e., cross-polarized). The different polarizations can be combined to create a multipolarized image, which is similar to a multispectral image as each polarization collects different data about the ground.
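The like- versus cross-polarized distinction described above follows directly from the two-letter transmit/receive code. A minimal helper (the function name is an illustrative assumption):

```python
def polarization_type(code):
    """Classify a transmit/receive code ('HH', 'HV', 'VH', or 'VV')
    as like-polarized (same letters) or cross-polarized (different)."""
    tx, rx = code[0], code[1]
    if tx not in "HV" or rx not in "HV" or len(code) != 2:
        raise ValueError("code must be two letters, each H or V")
    return "like-polarized" if tx == rx else "cross-polarized"
```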


      Figure 3.9. An example radar image captured over Los Angeles, California (esriurl.com/IG39). Source: NASA

      Over the last 20 years in much of the world, airborne lidar has surpassed photogrammetric methods for measuring the 3-dimensional world. Lidar imagery is used to develop digital elevation models (DEMs), digital terrain models (DTMs), digital surface models (DSMs), digital height models (DHMs), elevation contours, and other derived datasets (chapter 8 provides more detail on the creation of DEMs). Additionally, NASA uses low-spatial-resolution satellite lidar to monitor ice sheet mass balance and aerosol heights and has recently initiated the Global Ecosystem Dynamics Investigation (GEDI) mission, which will result in the first global, moderate-spatial-resolution, spaceborne topographic lidar (http://science.nasa.gov/missions/gedi/).

      Lidar sensors emit discrete pulses of electromagnetic energy that illuminate a given spot on the earth for an instant (less than 1/100,000 of a second). The energy emitted can be of ultraviolet through near-infrared wavelengths (250 nm to 10 μm), which are much shorter than those of radar pulses. The pulses of light then bounce back and are recaptured by the lidar instrument where the durations of their paths are recorded and analyzed to extract elevation information. The number of returns per unit area for discrete return lidar can be much higher than the number of pulses sent earthward, because each pulse can have multiple (typically three to five) returns.
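In its simplest form, the timing analysis described above converts a pulse's round-trip travel time into a range, and a nadir-viewed return's elevation into flying height minus range. The sketch below ignores scan angle, atmospheric refraction, and the geodetic corrections real systems apply; the function names are illustrative assumptions.

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def return_range_m(round_trip_s):
    """Sensor-to-target distance from a pulse's round-trip travel time.
    The time is halved because the pulse travels out and back."""
    return C * round_trip_s / 2.0

def ground_elevation_m(sensor_alt_m, round_trip_s):
    """Elevation of a nadir-viewed return: flying height minus range.
    A simplification for illustration only."""
    return sensor_alt_m - return_range_m(round_trip_s)
```

At these speeds a 950 m range corresponds to a round trip of about 6.3 microseconds, which is why the text can describe each spot as illuminated for only an instant.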

      There are two types of airborne lidar: topographic and bathymetric. Topographic lidar uses an infrared laser to measure elevations across the surface of the earth. Bathymetric lidar employs green laser light to penetrate water and measure the depth of water bodies. In topographic lidar, pulses that encounter porous objects, such as vegetation, will have multiple returns. For example, as shown in figure 3.10, a selected single pulse from this discrete return airborne lidar system has three returns from branches and a fourth return (the final return) from the ground. DTMs are generated from the last returns, DSMs from the first returns (buildings must be removed using specialized algorithms), and DHMs from the difference between the digital surface model and the digital terrain model. Lidar returns collectively form a lidar “point cloud” consisting of millions to billions of points that each contain the point’s latitude, longitude, and elevation.
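The first-return/last-return logic behind these derived models can be sketched with a toy gridding routine. The tuple layout and function name are assumptions for illustration; production workflows also filter buildings, noise, and outliers, and interpolate empty cells, all of which this sketch omits.

```python
import numpy as np

def grid_models(points, cell=1.0):
    """Build toy DSM, DTM, and DHM rasters from a discrete-return point cloud.

    `points` is a list of (x, y, z, return_num, num_returns) tuples.
    Per grid cell: the DSM keeps the highest first-return elevation, the
    DTM keeps the lowest last-return elevation, and the DHM is their
    difference (surface height above terrain).
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    nx = int(max(xs) // cell) + 1
    ny = int(max(ys) // cell) + 1
    dsm = np.full((ny, nx), np.nan)
    dtm = np.full((ny, nx), np.nan)
    for x, y, z, rnum, ntot in points:
        i, j = int(y // cell), int(x // cell)
        if rnum == 1:        # first return -> top of surface
            dsm[i, j] = z if np.isnan(dsm[i, j]) else max(dsm[i, j], z)
        if rnum == ntot:     # last return -> nearest to terrain
            dtm[i, j] = z if np.isnan(dtm[i, j]) else min(dtm[i, j], z)
    dhm = dsm - dtm          # digital height model
    return dsm, dtm, dhm
```

In open ground a pulse's single return is both first and last, so the DSM and DTM coincide there and the DHM is zero, matching the behavior described above.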


      Figure 3.10. Illustration of the returns from a topographic lidar system. Source: Dr. Maggi Kelly


      Figure 3.11. Comparison of a hillshade derived from 1.2 pulses/m² lidar to one derived from eight pulses/m² lidar. Source: Quantum Geospatial, Inc.

      There are two common types of airborne topographic lidar: discrete return and waveform. Discrete return lidar provides elevation values at the peak intensity of each return. Typically, a maximum of between three and five returns is possible where there is vegetation, but only one return will occur in open areas. Each of the multiple returns is stored as a point in the point cloud, with its associated latitude, longitude, and elevation.

      Full waveform lidar—which is mostly still in the R&D phase—provides the entire “waveform” graph associated with a lidar pulse. Because it records the entire waveform of a lidar pulse’s returns and not just