Review

A Review of LIDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration

School of Civil and Construction Engineering, Oregon State University, 101 Kearney Hall, Corvallis, OR 97331, USA
*
Author to whom correspondence should be addressed.
Sensors 2015, 15(11), 28099-28128; https://doi.org/10.3390/s151128099
Submission received: 6 August 2015 / Revised: 23 October 2015 / Accepted: 2 November 2015 / Published: 6 November 2015
(This article belongs to the Section Remote Sensors)

Abstract

In addition to precise 3D coordinates, most light detection and ranging (LIDAR) systems also record “intensity”, loosely defined as the strength of the backscattered echo for each measured point. To date, LIDAR intensity data have proven beneficial in a wide range of applications because they are related to surface parameters, such as reflectance. While numerous procedures have been introduced in the scientific literature, and even commercial software, to enhance the utility of intensity data through a variety of “normalization”, “correction”, or “calibration” techniques, the current situation is complicated by a lack of standardization, as well as confusing, inconsistent use of terminology. In this paper, we first provide an overview of basic principles of LIDAR intensity measurements and applications utilizing intensity information from terrestrial, airborne topographic, and airborne bathymetric LIDAR. Next, we review the parameters that affect intensity measurements, the underlying theory, and current intensity processing methods. We define terminology adopted from the most commonly-used conventions based on a review of current literature. Finally, we identify topics in need of further research. Ultimately, the presented information helps lay the foundation for future standards and specifications for LIDAR radiometric calibration.

1. Introduction

Across a wide range of applications, the usefulness of light detection and ranging (LIDAR) data is enhanced by the availability of “intensity” values. To date, LIDAR intensity data have proven beneficial in data registration, feature extraction, classification, surface analysis, segmentation, and object detection and recognition, to name just a few examples. The list of applications also continues to grow rapidly, as LIDAR researchers and practitioners develop new and innovative uses of these data. The primary benefit of LIDAR intensity lies in the fact that it is related to surface reflectance and other surface characteristics. Unfortunately, there are also a number of confounding variables to which intensity is related, including parameters related to the data acquisition geometry, the scanning environment, and the sensors themselves. To overcome this issue, a number of techniques have been developed to calibrate, normalize, or otherwise correct the recorded intensity values to produce values that are more useful and more closely related to true surface characteristics.
Despite the rapid progress that has been made, and the wealth of published literature on this topic, there is very little consistency across efforts. Multiple individuals, groups, and organizations are currently applying vastly different processing approaches to LIDAR intensity, and using differing terminology to describe these procedures. Radiometric calibration, intensity normalization, and intensity correction are just a few of the terms used to refer to different processing approaches. The outputs are also given diverse names, including reflectance, albedo, amplitude, normalized intensity, corrected intensity, pseudo-reflectance, and relative reflectance. Even the term “intensity” itself is debated and variously defined. Not surprisingly, researchers, clients, and end users are often confused by these products and the terminology used by data providers to describe them.
In this paper, we seek to address these pressing challenges. Our specific goals are to: (1) provide an overview of basic principles in LIDAR radiometric measurements and data processing; (2) discuss examples of how intensity values are being utilized for representative applications; (3) define consistent terminology (which we accomplish not by inventing new terms or insisting on a purist’s adherence to strict radiometric or photometric usage, but by adopting the most commonly-used conventions based on a review of current literature); (4) lay the foundations for future standards and specifications for LIDAR radiometric calibration; and (5) identify topics in need of further research.
While it is hoped that this paper will prove useful to a broad range of users, the primary intended audience consists of practitioners who want to evaluate different radiometric processing approaches from an implementation perspective and/or LIDAR data consumers who want to better understand (and possibly control, through appropriate contract wording) the types of intensity-derived products that are delivered by LIDAR service providers. Hence, we avoid an elaborate theoretical formulation, while providing implementation-level details and extensive references for interested readers.

2. Basics of LIDAR Intensity Measurement

While LIDAR system designs differ markedly between different manufacturers and models, most current systems employ one or more receiver channels using an avalanche photodiode (APD), photomultiplier tube (PMT), or other photodetector to convert the received optical signal to an electrical signal, to which various ranging strategies can be applied. For example, the ranging can be performed in hardware, using a constant fraction discriminator (CFD) and time interval meter, or by digitizing the received signal and applying any of a number of range detection algorithms to the output. Leading edge detection, centroid analysis, and deconvolution are just a few of the methods used to extract individual ranges from the return signal. Many of the photodetectors used in commercial topographic LIDAR systems are designed to be linear, meaning that the output photocurrent is linearly proportional to the input optical power over the detector’s operating range [1].
In addition to being used to extract ranges (which can be subsequently georeferenced) through any of the methods listed above, the received signal can be used to extract “intensity” values. While somewhat inconsistent with strict radiometric usage, the term intensity in this context refers to the amplitude of the return signal, which can be the analog electrical signal output from the photodetector or the digitized waveform. Figure 1a shows an example of the shape of a waveform emitted and returned. Usually the peak amplitude is used, but it is important to note that the point(s) selected on the return waveform (analog or digital) for intensity measurement vary from one manufacturer to another and are not necessarily coincident with the points used for range measurement. Figure 1b shows the difference in point selection on the return waveform from the peak detection and leading edge detection methods. For discrete-return systems employing hardware-based ranging and leading-edge detection, another strategy is to delay the intensity measurement by a fixed time after the range measurement. Interested readers are referred to [1] for a detailed discussion.
Figure 1. (a) Example of the shape of a waveform emitted and returned; (b) point selection on the return waveform in the peak detection and leading edge detection methods; (c) saturation impact resulting from highly reflective objects close to the scanner, exceeding detection thresholds.
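The contrast between peak detection and leading-edge detection illustrated in Figure 1b can be sketched on a synthetic return pulse. The following is a Python/NumPy illustration only: the Gaussian pulse model, the function names, and the fixed detection threshold are assumptions for demonstration, not the processing of any particular scanner.

```python
import numpy as np

def gaussian_pulse(t, t0, sigma, amp):
    """Synthetic return pulse: a Gaussian centered at time t0."""
    return amp * np.exp(-0.5 * ((t - t0) / sigma) ** 2)

def peak_detect(t, w):
    """Pick the range point and intensity at the waveform maximum."""
    i = int(np.argmax(w))
    return t[i], w[i]

def leading_edge_detect(t, w, threshold):
    """Pick the point where the rising edge first crosses a threshold."""
    i = np.nonzero(w >= threshold)[0][0]
    return t[i], w[i]

t = np.linspace(0.0, 100.0, 1001)                      # time axis, ns
w = gaussian_pulse(t, t0=50.0, sigma=3.0, amp=200.0)   # simulated return

t_peak, i_peak = peak_detect(t, w)
t_edge, i_edge = leading_edge_detect(t, w, threshold=50.0)
```

Because the leading-edge crossing occurs before the peak, the two methods report different time (range) points and sample the waveform at different amplitudes, which is one reason intensity values are not directly comparable across systems.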
The amplitude—however and wherever it is measured—is then typically scaled to an 8-, 12-, or 16-bit dynamic range, at which point it can be provided as an additional parameter in an output LIDAR point cloud, for example, via the “intensity” field in the point data records of a LAS file.
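This scaling step can be sketched as follows. This is a minimal Python/NumPy illustration assuming a simple linear min–max stretch; actual vendor scaling functions vary and are often undocumented, and `scale_intensity` is a hypothetical helper, not a real API.

```python
import numpy as np

def scale_intensity(raw, bit_depth=16):
    """Linearly stretch raw amplitudes to fill an unsigned-integer range.

    Mirrors the common practice of rescaling the measured amplitude to
    the dynamic range of the output format (e.g., the 16-bit intensity
    field in a LAS point record). Assumes a min-max stretch; the exact
    scaling applied by vendor software differs and is often unknown.
    """
    raw = np.asarray(raw, dtype=float)
    lo, hi = raw.min(), raw.max()
    dn_max = 2 ** bit_depth - 1
    scaled = (raw - lo) / (hi - lo) * dn_max
    return np.round(scaled).astype(np.uint16)

amplitudes = [0.02, 0.10, 0.35, 0.80]           # arbitrary detector units
dn = scale_intensity(amplitudes, bit_depth=8)   # digital numbers in [0, 255]
```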
The received optical signal at the detector and, hence, the derived intensity values, are related to the properties of the surface from which the laser pulse was reflected. Therefore, intensity is a potentially useful parameter in that it contains information about the surface. For example, Figure 2 illustrates histograms of intensity values measured by a terrestrial scanner on different surfaces in a street scene. Note that many objects have distinctly different intensity ranges and potentially could be segmented or classified using intensity information.
Figure 2. (a) Panoramic representation of a scanned scene near an intersection; (b,c) histograms of intensity values measured on different surfaces.
However, these intensity values (regardless of the specific details of how they are measured and recorded) are also affected by a number of other parameters, including transmitted power, range, angle of incidence, atmospheric transmittance, beam divergence, and detector responsivity (Section 4 will describe these in more detail). For users who wish to use intensity to analyze surface characteristics, these additional system and environmental variables can be considered nuisance parameters. Therefore, processing strategies which aim to remove the effects of these parameters on the intensity data may be desirable to enhance the utility of the data for the user’s intended application. These strategies, the terminology used to describe them, and the characteristics of the output, are the primary focus of the following sections.

3. Applications of LIDAR Intensity

The radiometric information provided by scanners has been used alone or as a supplement to other spatial and spectral remote sensing data in a variety of applications. Table 1 presents a summary of some currently-studied applications of LIDAR intensity including remote sensing data registration, land cover classification, natural environment sensing, bathymetry, structural damage detection, and transportation asset management. This list is by no means comprehensive; new applications continue to emerge at a high rate.
Figure 3 shows several datasets as examples of applications of LIDAR intensity. Theoretically, materials have different spectral reflectance properties, resulting in different backscattered laser intensities. Therefore, LIDAR intensity can be used as a means to classify and detect different materials in scans of natural or urban environments.
Although some methods involve simply using intensity values to “colorize” the point cloud, another common approach is to generate georeferenced 2D intensity images (Figure 3a). These images can be produced by gridding the data and using an operation such as the (weighted) mean, maximum, or minimum to assign an intensity value to each cell containing multiple points. In an extension to 3D, it is also possible to create voxelized representations, in which each voxel stores an intensity value.
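A minimal sketch of this gridding approach, assuming a uniform cell size and NumPy arrays of point coordinates (`intensity_image` and its arguments are illustrative, not from any particular software package):

```python
import numpy as np

def intensity_image(x, y, intensity, cell=1.0, reducer=np.mean):
    """Grid scattered LIDAR points into a 2D intensity image.

    Each cell takes the reducer (mean, max, min, ...) of the intensities
    of all points falling inside it; empty cells are left as NaN.
    """
    x, y, intensity = map(np.asarray, (x, y, intensity))
    col = ((x - x.min()) / cell).astype(int)
    row = ((y - y.min()) / cell).astype(int)
    img = np.full((row.max() + 1, col.max() + 1), np.nan)
    for r, c in set(zip(row.tolist(), col.tolist())):
        mask = (row == r) & (col == c)
        img[r, c] = reducer(intensity[mask])
    return img

# Four points falling into two 1 m cells:
x = [0.2, 0.8, 1.5, 1.6]
y = [0.1, 0.4, 0.2, 0.3]
inten = [100, 140, 60, 80]
img_mean = intensity_image(x, y, inten, cell=1.0, reducer=np.mean)
img_max = intensity_image(x, y, inten, cell=1.0, reducer=np.max)
```

The same loop structure extends to 3D voxels by adding a third index computed from the z coordinate.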
A major application of LIDAR intensity that has been widely studied is the classification of natural and urban cover surfaces. In initial efforts, Song et al. [2] and Charaniya et al. [3] indicated that intensity data enable the separation of typical land cover surfaces, such as asphalt roads, grass, trees, and house roofs, captured in ALS scans. Brennan and Webster [4] and Matikainen et al. [5] developed methods for the detection and classification of building structures. Arnold et al. [6] used intensity data to discriminate snow-covered areas from bare ice in a glacier. Im et al. [7] evaluated different features for land cover classification and found that adding LIDAR intensity to the classification features results in a 10% to 20% increase in accuracy. LIDAR intensity has also been used as a supplementary feature alongside other remote sensing data for land cover classification. Zhou et al. [8] used LIDAR intensity data to facilitate land cover classification of shaded areas in aerial images. MacFaden et al. [9] used LIDAR intensity for detecting impervious surfaces not detectable in aerial images.
LIDAR intensity is also used to detect common features in multiple sets of remote sensing data for registration. Methods using LIDAR intensity data have been developed for segmentation of multiple scans [10,11,12,13,14] and co-registration of scans and images [15,16,17,18,19,20].
Table 1. Example applications utilizing LIDAR intensity information.
Category | Application | References
Cultural Heritage/Virtual Tourism | Analysis of historical paintings/artifacts; digital preservation | [21,22]
Land Cover Classification | Classification of urban surfaces | [2,3,7]
 | Detection and classification of buildings | [4,5]
 | Classification of glacier surfaces | [6]
 | Supplementing image-based land cover classifications | [8,9]
Remote Sensing Data Registration | Registration of multiple scans by identifying common features | [10,11,12,13,14]
 | Integration of scans and images by identifying common features | [15,16,17,18,19,20]
Sensing Natural Environments | Flood modeling and wetland hydrology | [23,24]
 | Tree classification, snag detection, and forest understory vegetation cover | [25,26,27,28,29,30]
 | Identification of different rock and soil layers | [31]
 | Dating lava flows | [32]
 | Snow cover change detection | [33]
 | Coastal land cover mapping | [34]
Bathymetry (using bathymetric LIDAR) | Benthic habitat mapping | [35,36,37,38,39]
 | Hydrodynamic and sedimentological properties | [40]
Structural Damage Detection | Assessment of historic buildings | [41]
 | Crack detection in concrete structures | [42,43,44]
 | Detection of bridge surface degradation | [45]
 | Detection of wind-induced cladding damage | [46,47,48]
Transportation Asset Management | Detection of road objects and features (e.g., markings, signs, manholes, culverts) | [49,50,51,52,53,54]
 | Pavement and tunnel damage detection | [55,56]
 | Extraction of road profiles | [57]
LIDAR intensity has also proven useful for sensing natural environments. Antonarakis et al. [23] and Lang et al. [24] developed methods for extracting natural surface roughness information for flood modeling and wetland hydrology. Mazzarini et al. [32] and Burton et al. [31] used LIDAR intensity data for dating lava flows and sensing rock properties, respectively. Several researchers have indicated the potential of LIDAR intensity in forest canopy classification and sensing [25,26,27,28,29,30]. LIDAR intensity data have also been used for snow cover change detection [33] and coastal land cover mapping [34].
The water-penetrating capabilities of bathymetric LIDAR enable the intensity returns to be used in detecting various seafloor features. For example, benthic habitat types can be classified using the relative reflectance values of the returns. Collin et al. [35] classified seabed features into four primary categories and used underwater photography to ground-truth the data. Several researchers examined the fusion of bathymetric LIDAR data with hyperspectral data to measure environmental parameters, including seafloor reflectance, water depth, and water column parameters [36,38,39]. Narayanan et al. [37] classified seafloor intensity values into habitat types using decision trees. Figure 3d shows relative reflectance values in which seafloor features can be distinguished. Long et al. [40] demonstrated that bathymetric LIDAR intensity waveforms can be used to determine sedimentological and hydrodynamic characteristics. Seafloor reflectance data from bathymetric LIDAR can be even more valuable than their terrestrial/topographic counterparts, due to the challenges typically encountered in obtaining detailed imagery of the seafloor from other airborne or spaceborne remote sensing technologies.
Figure 3. Example applications of LIDAR intensity. (a) Intensity image from ALS data; (b) intensity-shaded point cloud showing damage to concrete in a tunnel (data courtesy of Oregon DOT); (c) intensity-shaded point cloud showing pavement lines and striping; (d) corrected bottom intensity image for mapping the seafloor; (e) intensity-colored point cloud showing different geologic layers in a cliff; (f) detection of reflective signs based on intensity values; (g) intensity-colored point cloud showing damage to concrete walls after an earthquake; and (h) intensity-colored point cloud showing damage to roof cladding after a tornado.
Variations in LIDAR intensity backscattered from intact and damaged materials have enabled advanced structural damage detection and quantification using LIDAR. Armesto-González et al. [41] employed LIDAR intensity data to detect degraded stony materials in scans of historic buildings. Several researchers used LIDAR intensity to detect cracks in concrete structural components in laboratory tests or post-disaster field investigations [42,43,44]. Kashani et al. [46,47,48] indicated that LIDAR intensity data are an appropriate means to automatically detect cladding damage to buildings after wind storm events.
LIDAR intensity data were used directly, without any radiometric processing, in some early studies [2,3,4], while subsequent work considered the impacts of radiometric calibration and correction. Gatziolis [26] and Korpela et al. [27] indicated that correcting the intensity data for range resulted in 9% and 31% improvements, respectively, in their LIDAR-based canopy classification results. Yan et al. [58] demonstrated that applying radiometric correction to scans of an urban area resulted in a 9% to 13% improvement in the accuracy of their land cover classification. Kaasalainen et al. [59] compared LIDAR intensity data captured from a number of calibration reference targets with their “true” reflectance values obtained by a near-infrared digital camera. The study indicated that radiometric calibration improves the accuracy of LIDAR-based target reflectance measurements.

4. Effective Parameters Influencing Intensity Measurements

As mentioned previously, several factors influence LIDAR intensity values and can distort their relationship to surface reflectance. Table 2 lists these factors with a brief description of their influence. As shown in the first column of Table 2, the factors to which intensity values are related can be divided into four main categories: (1) target surface characteristics; (2) data acquisition geometry; (3) instrumental effects; and (4) environmental effects. These factors are discussed in the following subsections. Figure 4 shows examples of variation in intensity values caused by some of these factors.
Table 2. Effective factors influencing LIDAR intensity measurements.
Category | Factor | Description | Related References
Target Surface Characteristics | Reflectance (ρ) | By definition, surfaces of higher reflectance reflect a greater portion of the incident laser radiation, increasing the received signal power. In radiometric calibration, this is typically the parameter of interest. | [59,60,61,62,63,64,65]
 | Roughness (ɳ) | Surface roughness dictates the type of reflection (e.g., specular vs. diffuse). | [62,66,67]
Acquisition Geometry | Range (R) | The pulse energy decays as a function of range (distance traveled). | [27,58,63,64,65,68,69,70,71,72,73]
 | Angle of Incidence (α) | Greater angles of incidence typically result in less of the incident laser energy being backscattered toward the receiver, reducing received optical power. An obliquely incident beam also spreads over a larger area, increasing the backscattering cross section. | [58,62,63,64,65,66,68,69,70,71,72]
 | Multiple Returns | When a single laser pulse is split among multiple objects, the pulse energy is divided between the resulting returns; an attenuation correction can be applied to compensate. | [74,75,76]
Instrumental Effects | Transmitted Energy (E) | The amount of energy backscattered from targets is related to the amount of energy transmitted with each pulse. Transmitted pulse energy is related to peak transmitted power (which varies with pulse repetition frequency in many systems) and transmit pulse width. | [59,61,65,77]
 | Intensity Bit Depth and Scaling | Scanners digitize the return signal at varying bit depths (e.g., 8-bit, 12-bit, or 16-bit). Recorded digital numbers (DNs) are typically scaled to fill the available dynamic range. | [70,78]
 | Amplifier for Low-Reflectance Surfaces | Some scanners amplify the intensity values measured on surfaces of low reflectance. | [59,60,61,72]
 | Automatic Gain Control (Ω) | Some systems (e.g., Leica ALS systems) employ automatic gain control (AGC), which increases the dynamic range that can be accommodated but can also introduce discontinuities in the intensity signal if not compensated. | [27,65,79]
 | Brightness Reducer for Near Distances | Some scanners reduce the intensity values measured on close objects (e.g., at less than 10 m distance). | [21,54,72]
 | Aperture Size (Dr) | A larger aperture admits more light, increasing received signal strength. | [60]
Environmental Effects | Atmospheric Transmittance (T or ηatm) | Radiant energy is attenuated in propagating through the atmosphere as a function of humidity, temperature, pressure, and other variables. | [58,65,69,70]
 | Wetness | Wet surfaces absorb more of the pulse energy (particularly at the 1.5 μm wavelength used in some systems), resulting in weaker returns. | [61,69]
Figure 4. Examples of factors that influence intensity values. (a) Intensity values degrading with range on objects such as street lights and asphalt pavement; (b) dissimilar intensity values captured on walls with different angles of incidence (larger view in Figure 3g); (c) lower intensity values for multipath returns from reflections of the laser off of the water surface; and (d) degraded intensity values (blue) due to wet surfaces at a rocky intertidal site.

4.1. Target Surface Characteristics

All other parameters being equal, intensity values increase with surface reflectance, because a more reflective surface returns more energy from the pulse. An exception is highly reflective surfaces (e.g., mirrors, glass, and water), which can cause specular reflection (mirror effect) and/or multipath. In the case of multipath, range and intensity values are derived from pulses reflected from more than one surface and do not represent “true” surface properties [80].
Highly reflective objects can present challenges with LIDAR, such as saturation and blooming [80]. Figure 1c shows an example of pulse saturation, which occurs with highly reflective objects located close to the scanner. Because the detectors are calibrated with high sensitivity to detect the weaker returns from less reflective objects, such as topography or buildings, the strong returns from nearby reflective objects exceed the detection threshold, truncating the peak of the actual pulse. As a result, the range to the object is often underestimated.
Blooming, in contrast, occurs with highly reflective objects located far from the scanner [80]. These objects appear larger in the point cloud than they actually are because of a bleeding effect of the laser pulse. The laser pulse diverges with distance, resulting in a larger spot size on the object as well as on neighboring objects. Hence, the edge of a laser pulse aimed at a neighboring object can partially encompass the highly reflective object. This produces a much higher intensity return on the adjacent object than would occur if the reflective object were closer.
Reflectance is typically the parameter of interest in radiometric calibration. Reference targets with known reflectance are often used to analyze the impact of material reflectance on LIDAR intensity measurements and for radiometric calibration [59,62,63,64,65,66,72,77,81]. Some researchers have also investigated LIDAR intensity values obtained from common natural and artificial materials, such as sand, gravel, asphalt, concrete, and brick [59,61,62,65,66,72]. More specifics on these methods are discussed later in Section 6. In order to extract “true” surface material parameters from intensity, the influence of the other factors shown in Table 2 needs to be eliminated or otherwise reduced.

4.2. Data Acquisition Geometry

Factors related to data acquisition geometry, such as range (i.e., the distance between the sensor and the target) and angle of incidence (i.e., the angle between the emitted laser beam and the target surface normal), greatly influence LIDAR intensity (see Figure 4a,b). The majority of current intensity correction and calibration methods address range and angle of incidence [27,58,62,63,64,65,66,68,69,70,71,72,73]. Range influences intensity primarily through spreading loss: the received pulse strength diminishes with distance traveled, with additional attenuation from the longer atmospheric path. Increases in range and angle of incidence also result in larger target backscattering cross sections; the beam width increases with range, enlarging the laser footprint and the effective backscattering cross section. However, the influence of laser beam divergence on the backscattering cross section depends on the shape of the target. It is greater for extended targets, where the target is larger than the laser footprint, than for point targets (e.g., a leaf) or linear targets (e.g., a wire), where the target area is much smaller than the laser footprint.
The influence of range and angle of incidence differs between ALS and TLS. Ranges are typically much greater, and exhibit less percent variability, in ALS than in TLS. The scanning range (a function of flying height) in ALS is typically 600 m to 3000 m, sometimes lower with bathymetric LIDAR. In contrast, most TLS systems are capable of measurements of less than 300 m in range, except for rare cases of specific time-of-flight scanners that can capture points at up to 6000 m. TLS data, even those collected within a single scan, contain points with substantially variable ranges and angles of incidence. TLS data also include much more data at oblique angles, particularly across ground surfaces [57]. Especially in close-range scanning, objects such as walls can be found where some data points have a near-orthogonal angle of incidence while others transition to oblique angles. Additionally, in TLS, several scans are often merged and positioned with substantial overlap. This results in an object appearing in one scan at a different angle and range than in another, leading to a mix of intensity values on the object in the merged dataset.
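As a sketch of the kind of geometric normalization discussed above (the detailed models appear later in the paper), the following applies the commonly cited range-squared and cosine correction for extended Lambertian targets. The function name, the reference range, and the assumed 1/R² range dependence and cos(α) incidence dependence are illustrative simplifications; real systems may need different range exponents or empirical lookup tables.

```python
import numpy as np

def normalize_intensity(raw, rng, incidence_deg, r_ref=100.0):
    """Refer raw intensities to a reference range at normal incidence.

    Assumes received power falls off with 1/R^2 and cos(alpha) for an
    extended Lambertian target, so raw values are multiplied by
    (R / r_ref)^2 / cos(alpha). A sketch only, not a vendor algorithm.
    """
    alpha = np.radians(np.asarray(incidence_deg, dtype=float))
    rng = np.asarray(rng, dtype=float)
    return np.asarray(raw, dtype=float) * (rng / r_ref) ** 2 / np.cos(alpha)

# The same Lambertian surface seen at two geometries: the raw readings
# differ, but the normalized values agree.
raw = [100.0, 100.0 * (100.0 / 200.0) ** 2 * np.cos(np.radians(60.0))]
corr = normalize_intensity(raw, rng=[100.0, 200.0], incidence_deg=[0.0, 60.0])
```

This kind of normalization is what makes intensity values from overlapping TLS scans, acquired at different ranges and angles, comparable on the same object.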
In addition to range and angle of incidence, intensity values are affected when the beam is split among multiple objects within the path of a single laser pulse, resulting in multiple returns. Attenuation correction processes have been proposed in [74,75] for correcting the intensity values of laser returns obtained on forest floors. Reference intensity values are estimated from nearby single-peak waveforms that are close to nadir. For simplicity, these points are taken from the same scanline, with several screening criteria. The forest floor return intensity values are then adjusted based on an analysis of the integrals of the signals in the waveforms.

4.3. Instrumental Effects

Instrumental effects result in different intensity measurements from the same target when different sensors are used. Instrument-specific parameters must be known or estimated to develop a LIDAR-based reflectance measurement method that is consistent across instruments. The aperture size, laser wavelength, beam divergence, and emitted power vary between scanners and can influence the intensity measurement. The aperture size impacts the angular resolution of backscatter measurements [60]. Airborne laser scanners typically have larger aperture sizes (8 < Dr < 15 cm) than terrestrial laser scanners (a few cm) [60]. Laser wavelengths typically range from 600 nm to 1550 nm.
The received power is measured, digitized, scaled, and modified internally by the sensor; however, this process varies between sensors. Discrete-return scanners may use different peak detection methods, causing differences in the range and intensity measurements. Once a returned pulse is detected, the pulse power is digitized to produce an intensity value encoded as an integer. Typically, the intensity is scaled to a 16-bit value when LAS files are created [78]. Riegl scanners further modify the digitized intensity values and provide two additional fields in their LAS exports, amplitude and reflectance, which are explained in [78]. Some scanners further modify intensity measurements, e.g., applying amplifiers for areas of low reflectance or reducers for near-distance areas [72]. Some ALS systems can adjust the gain using automatic gain control (AGC), which alters the intensity measurements [65]. In addition to the sensor itself, data processing software may apply further scaling or modification that influences intensity measurements. For instance, some software can apply unpredictable intensity scaling to enhance visual appearance [72]. If these internal processing steps are not known, documented, and adjusted for, it can be difficult, if not impossible, to calibrate intensity values and measure “true” reflectance values.

4.4. Environmental Effects

Atmospheric effects and wetness are the main environmental influences on LIDAR intensity values (except in bathymetric LIDAR, where water column effects dominate). The laser energy is attenuated while passing through the atmosphere due to scattering and absorption of the laser photons [58,70]. Small particles suspended in the air, such as dust or smoke, cause aerosol scattering, while air molecules cause Rayleigh scattering. Additionally, atmospheric gases such as water vapor, carbon dioxide, and oxygen absorb laser energy. Atmospheric effects are more influential in ALS than in TLS because the laser travels over longer ranges and through vertically varying atmospheric layers; hence, there is more variation in atmospheric conditions between the scanner and targets.
Another environmental effect is wetness. Kaasalainen et al. [61] indicated that moisture can cause a 30% to 50% drop in reflectance of brick samples. For example, Figure 4d shows degraded intensity values (blue) due to wet surfaces at a rocky intertidal site. Intensity values can be used as a filter of erroneous data points. For example, Figure 4c shows lower intensity values on multipath returns that have reflected off of the water surface and onto the cliff before returning to the scanner. Hence, they create a reflection below the water surface that is not representative of the scene. Similarly, intensity information can also be used to filter spurious moisture points within a scan.
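In practice, this kind of intensity-based filtering reduces to masking points below a scene-specific threshold. A minimal sketch (the coordinates, intensity values, and threshold below are hypothetical, and the threshold would normally be chosen by inspection of the scene):

```python
import numpy as np

def filter_low_intensity(points, intensity, threshold):
    """Drop points whose intensity falls below a user-chosen threshold.

    A crude screen for multipath returns off water or moisture-darkened
    surfaces; the threshold is scene-specific.
    """
    keep = intensity >= threshold
    return points[keep], intensity[keep]

# Three example points (x, y, z); the middle one is a spurious low-intensity
# return below the water surface.
pts = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, -0.5], [2.0, 0.0, 0.9]])
inten = np.array([180, 12, 150])   # 8-bit-style intensity values
kept_pts, kept_int = filter_low_intensity(pts, inten, threshold=50)
```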

4.5. Effective Factors in Bathymetric LIDAR

While the intent of this paper is to focus primarily on topographic LIDAR, we provide a brief treatment of the corresponding processing of bathymetric LIDAR to illuminate the similarities and differences between the two. It is also important to note that, due to the difficulty of capturing imagery of the seafloor with conventional imaging techniques, radiometric calibration of bathymetric LIDAR is particularly important in enabling detailed seafloor characterization.
Working with bathymetric LIDAR intensity requires additional parameters to be considered. The effect of the parameters listed in Table 3 on the return intensity value is demonstrated in Equation (4) (Section 5.2). Acquisition geometry parameters must now include factors such as the bathymetric angle of incidence (the angle off nadir at which the pulse is transmitted from the aircraft), aircraft altitude, refracted beam angle, and the receiver field of view [82,83,84,85,86,87]. Water depth has a significant effect on the power of the return pulse, as the power decays exponentially with depth [35,84]. Because depth has such a pronounced effect on intensity values, accurate depth estimates are essential when calculating the bottom reflectance. The rate at which the return power decays with increasing depth is described by the diffuse attenuation coefficient, which is defined by [85,86] as the sum of the absorption coefficient and the backward scattering coefficient. For systems with a smaller receiver field of view, it is also important to consider a forward scattering coefficient [82,83,84,85,86]. Figure 5 demonstrates typical acquisition geometry.
Figure 5. Bathymetric LIDAR acquisition geometry (adapted from [74]).
Table 3. Effective factors influencing bathymetric LIDAR intensity measurements.

| Category | Factor | Description | Related References |
| --- | --- | --- | --- |
| Acquisition Geometry | Water depth (D) | In bathymetric LIDAR, pulse power decays exponentially with the product of water depth and the diffuse attenuation coefficient. | [35,84] |
| Acquisition Geometry | Off-nadir transmit angle (θ) | Affects the signal return due to pulse stretching and retro-reflectance of the surface material. | [83,84] |
| Acquisition Geometry | Receiver field-of-view loss factor (Fp) | Loss factor when the receiver FOV is insufficient to accommodate the spreading of the pulse in the water column. | [82,87] |
| Acquisition Geometry | Aircraft altitude (H), refracted beam angle (Φ), effective area of receiver optics (Ar) | Other acquisition geometry factors which have an effect on the return power, as shown in the bathymetric LIDAR equation (Equation (4)). | [82,85] |
| Environment | Diffuse attenuation coefficient (K) | Light traveling through the water column is exponentially attenuated due to absorption and scattering by particles in the water. | [83,84,86] |
| Environment | Pulse stretching factor (n) | Stretching of the pulse due to acquisition geometry and the scattering properties of the water. | [84,85] |

5. Basic Theory

5.1. LIDAR Range Equation

Theoretical or model-driven intensity processing methods are typically based on some form of the LIDAR range equation (also referred to as the laser radar range equation or simply the radar equation), the origins of which lie in the field of microwave radar [88]. This equation relates the received optical power to the transmitted power and other parameters related to the system, acquisition geometry, environment and target characteristics. Numerous forms of the LIDAR range equation can be found in the published literature (e.g., [70,72,88,89,90,91]), but most are equivalent or similar to that given in Equation (1):
$$P_r = \frac{P_t\, D_r^2\, \eta_{atm}\, \eta_{sys}\, \sigma}{4\pi R^4 \beta_t^2}$$
where Pr = received optical power (watts), Pt = transmitted power (watts), Dr = receiver aperture diameter (meters), σ = effective target cross section (square meters), ηatm = atmospheric transmission factor (dimensionless), ηsys = system transmission factor (dimensionless), R = range (meters), and βt = transmit beamwidth (radians). The effective target cross section describes the target characteristics and is given by:
$$\sigma = \frac{4\pi}{\Omega}\, \rho\, A_t$$
where ρ = target reflectance at the laser wavelength (dimensionless), Ω = scattering solid angle (steradians), and At = target area (square meters). Under the assumptions of an extended target (i.e., one that intercepts the entire laser beam) and Lambertian reflectance, a simplified form of the LIDAR range equation can be obtained (e.g., [70]):
$$P_r = \frac{P_t\, D_r^2\, \eta_{atm}\, \eta_{sys}\, \rho}{4 R^2} \cos\alpha_i$$
where αi = angle of incidence, and all other variables are defined previously (for a discussion on non-Lambertian surfaces, please see [92]). Solving Equation (3) for reflectance, ρ, is mathematically trivial, but, in practice, the challenge lies in obtaining reliable estimates of all other parameters in the equation. Some general approaches include: (1) combining parameters that are unknown but can be assumed constant over a single flight (or, at least, over a single flightline) to create a combined constant; (2) using manufacturer’s system specifications, when available; and (3) using assumed or empirically-determined values. Sometimes, radiometric processing methods that start out with a rigorous theoretical formulation can become more ad hoc through the introduction of a number of empirically-determined parameters.
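Approach (1) above can be sketched by folding the unknown terms of Equation (3) into a single lumped constant and inverting for a pseudo-reflectance. Here `c_sys` is a hypothetical constant standing in for Pt, Dr², ηatm, ηsys, and the factor 1/4, assumed constant over a flightline:

```python
import math

def pseudo_reflectance(p_r, r, alpha_deg, c_sys=1.0):
    """Invert the simplified LIDAR range equation (Equation (3)) for a
    pseudo-reflectance.

    p_r: received power, r: range (m), alpha_deg: angle of incidence (deg).
    c_sys lumps the transmitted power, aperture, and transmission factors
    into one constant (hypothetical; assumed fixed over a flightline).
    """
    alpha = math.radians(alpha_deg)
    return p_r * r**2 / (c_sys * math.cos(alpha))

# A target at twice the range returning one quarter of the power yields the
# same pseudo-reflectance, as the 1/R^2 dependence predicts.
rho_near = pseudo_reflectance(1.0, 100.0, 0.0)
rho_far = pseudo_reflectance(0.25, 200.0, 0.0)
```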

5.2. Bathymetric LIDAR Equation

A version of the bathymetric LIDAR equation, adapted from [85], is provided in Equation (4). As with the LIDAR range equation shown above (Equations (1) and (3)), there are also numerous versions of this equation that contain different parameters and are based on different sets of assumptions and simplifications [35,37,38,87,93].
$$P_r = (m)\, \frac{P_T\, \eta\, \rho\, F_p\, A_r \cos^2\theta}{\pi\, (n_w H + D)^2}\, \exp\!\left(-2\, n(s, \omega_0, \theta)\, K D \sec\phi\right)$$
where, Pr = received power, PT = transmitted power, η = system optical efficiency factor, ρ = reflectance of bottom, Fp = loss due to insufficient FOV, Ar = effective area of receiver optics, θ = off nadir transmit angle, nw = index of refraction of water, H = altitude of LIDAR above water, D = bottom depth, n(s, ω0, θ) = pulse stretching factor, s = scattering coefficient, ω0 = single scattering albedo, K = diffuse attenuation coefficient of water, and ϕ = nadir angle of LIDAR after entering the water.
The bathymetric LIDAR equation can be used in a similar manner to its topographic-only counterpart in radiometric calibration. However, the situation in bathymetric LIDAR is even more complex, due to the greater number of system and environmental parameters, for which reliable estimates may be difficult to obtain.
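As a rough illustration of how these parameters interact, Equation (4) can be forward-evaluated. Every numeric default below is an illustrative placeholder, not a calibrated system constant; the in-water nadir angle is obtained from Snell's law:

```python
import math

def bathy_received_power(p_t, rho, depth, h=300.0, theta_deg=20.0,
                         eta=0.9, f_p=1.0, a_r=0.2, n_w=1.33,
                         k=0.2, n_stretch=1.0):
    """Forward-evaluate the bathymetric LIDAR equation (Equation (4)).

    All default parameter values are illustrative placeholders. The refracted
    nadir angle phi follows from Snell's law: sin(theta) = n_w * sin(phi).
    """
    theta = math.radians(theta_deg)
    phi = math.asin(math.sin(theta) / n_w)
    geom = (p_t * eta * rho * f_p * a_r * math.cos(theta)**2
            / (math.pi * (n_w * h + depth)**2))
    attenuation = math.exp(-2.0 * n_stretch * k * depth / math.cos(phi))
    return geom * attenuation

# The exponential water-column attenuation dominates: the deep return is
# orders of magnitude weaker than the shallow one for the same bottom.
shallow = bathy_received_power(1.0, 0.2, depth=2.0)
deep = bathy_received_power(1.0, 0.2, depth=20.0)
```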

6. Processing Methods

Raw LIDAR intensity data often undergo processing steps to reduce variation caused by the parameters discussed above and, in some cases, to extract true reflectance information. Unfortunately, the terminology used in the literature to describe these processing steps and procedures is inconsistent. These inconsistencies are further compounded by the fact that LIDAR data have become an important data source for a wide variety of applications and are utilized by people from diverse backgrounds. Hence, as of yet, there are no standardized definitions for terminology associated with intensity modification methods. Given the wide range of applications supported by intensity information and the significant effort required to determine and apply modifications, it is not always clear what adjustments have been applied to the intensity information in a particular dataset. For the purposes of this paper, we distinguish four levels of intensity processing. Each level increases not only the accuracy and quality of the information but also the effort required:
Level 0: 
No modification (raw intensity): These are the basic intensity values directly provided by the manufacturer or vendor in their native storage format. They are typically scaled to values of 0–1 (floating point), 0–255 (8-bit integer), or 0–65,535 (16-bit integer), depending on the manufacturer. However, the processes used for scaling the sensor voltages and any adjustments applied are often unknown. Similar results can be obtained for the same scanner model by the same manufacturer; however, there typically is no direct agreement or relationship between values provided by different systems or manufacturers. In this paper, we refer to this as intensity, generically.
Level 1: 
Intensity correction: In this process an adjustment is made to the intensity values to reduce or ideally eliminate variation caused by one or more effective parameters (e.g., range, angle of incidence, etc.). This process is performed by either a theoretical or empirical correction model. Intensity correction ultimately can result in pseudo-reflectance values.
Level 2: 
Intensity normalization: In this process an intensity image is normalized through scaling to adjust the contrast and/or a shift to adjust the overall “brightness” to improve matching with a neighboring tile or overlapping strip (i.e., a histogram matching or normalization).
Level 3: 
Rigorous radiometric correction and calibration: In this meticulous process, the intensity values from the LIDAR system are first evaluated on targets with known reflectance, resulting in the determination of calibration constants for the sensor. The calibration constants are then applied to future data collected with the system, including additional Level 1 intensity corrections to account for any deviations in parameters (e.g., range, angle of incidence). When completed rigorously, this process results in “true” reflectance information. Hence, when radiometric calibration has been applied, consistent data can be obtained from different systems, operated with different parameter settings, and in different conditions. In this paper, we refer to these as reflectance values.
The outputs of Levels 1 and 2 are typically referred to as “relative reflectance” values (sometimes “pseudo-reflectance”), while Level 3 is intended to generate “true” or “absolute” surface reflectance. It is also important to note that the processing levels are not necessarily intended to indicate a particular sequence of processing steps.
Another common approach is to simply apply an ad hoc normalization with no preceding correction. This process is similar to histogram adjustments in image processing software. This workflow is not considered one of the defined levels, as it is primarily arbitrary and visual in nature, serving only to improve the intensity values for visual interpretation.
The uses of the different processing levels are application specific and far too numerous to describe in detail. Briefly, however, Levels 1 and 2 are often sufficient for visual analysis and automated land cover classification. On the other hand, combining or comparing reflectance data acquired with different systems and in different conditions may require a full Level 3 radiometric calibration. Similarly, extraction of true surface albedo requires Level 3 processing. In general, the higher the level, the better the results will be for a range of applications. However, lower processing levels can typically be achieved more economically and may prove sufficient for a particular user application.
Table 4 and Table 5 summarize some of the intensity correction and radiometric calibration methods reported in the literature. The tables are organized by the level of intensity processing and the type of scanner used. They also show the types of targets used, as well as the theoretical and empirical models developed. Note that the tables do not include examples of processing Levels 0 and 2: Level 0 is typically completed by the sensor itself, and Level 2 processes are summarized separately in Section 6.4. The following sections review these methods and the basic theory behind them.
Table 4. Selected intensity correction and calibration methods (A, B, C, D denote empirical coefficients, ref denotes a reference).

| Reference | Scanner | Level | Targets | Parameters | Theoretical Model | Empirical Model |
| --- | --- | --- | --- | --- | --- | --- |
| Luzum et al. [94] | (ALS) Optech ALTM 1233 | 1 | n/a | range (R) | Ic = I · Ri²/Rref² | n/a |
| Coren & Sterzai [68] | (ALS) Optech ALTM 3033 | 1 | homogeneous surface (asphalt road) | range (R), angle of incidence (α), atm. attenuation coeff. (a) | Ic = I · Ri²/Rref² · 1/cos α | Ic = I · e^(A·R) |
| Starek et al. [73] | (ALS) Optech ALTM 1233 | 1 | n/a | range (R) | Ic = I · Ri²/Rref² | n/a |
| Höfle & Pfeifer [70] | (ALS) Optech ALTM 3100 | 1 | homogeneous surface (asphalt road) | range (R), angle of incidence (α), atm. attenuation coeff. (a), transmitted energy (ET) | Ic = I · Ri²/Rref² · 1/cos α · 10^(2aR) · ET,ref/ET,j | Ic = I / f(R), f(R) = A·R² + B·R + (1 − 1000²·A − 1000·B), normalized so that f(1000) = 1 |
| Jutzi and Gross [71] | (ALS) RIEGL LMS-Q560 | 1 | homogeneous surfaces (roof planes) | range (R), angle of incidence (α), atm. attenuation coeff. (a) | n/a | Ic = I · R^A · e^(2·B·R) · cos^C(α) · e^D |
| Korpela et al. [27] | (ALS) Optech ALTM 3100, Leica ALS50 | 1 | homogeneous surface | range (R), automatic gain control (Gc) | n/a | Ic = I · Ri^A/Rref^A + I · B · (C − Gc) |
| Vain et al. [95] | (ALS) Leica ALS50-II | 1 | brightness calibration targets (tarps) | automatic gain control (Gc) | n/a | Ic = A + B·I + C·I·Gc |
| Habib et al. [96] | (ALS) Leica ALS50 | 1 | n/a | range (R), angle of incidence (α) | Ic = I · Ri²/Rref² · 1/cos α | n/a |
| Yan et al. [58] | (ALS) Leica ALS50 | 1 | n/a | range (R), angle of incidence (α), atm. attenuation coeff. (a) | Ic = I · Ri²/Rref² · 1/cos α · e^(2aR) | n/a |
| Ding et al. [69] | (ALS) Leica ALS50-I | 1 | overlapping scan areas | range (R), angle of incidence (α), atm. attenuation coeff. (a) | Ic = I · Ri²/Rref² · 1/cos α · 10^(2aR) | Ic* = Ic · R^A · 10^(2·B·R) · cos^C(α) · e^D, plus the Phong model |
| Ahokas et al. [77] | (ALS) Optech ALTM 3100 | 3 | brightness calibration targets (tarps) | range (R), atm. attenuation coeff. (a), transmitted energy (ET), reflectance (ρ) | Ic = I · Ri²/Rref² · ET,ref/ET,j | ρ = A·Ic + B |
| Kaasalainen et al. [61] | (ALS) Optech ALTM 3100, Topeye MK, Leica ALS50 | 3 | sand and gravel | range (R), angle of incidence (α), total atmosphere transmittance (T), pulse energy (ET) | method described by Vain et al. (2009) | ρ = Ic / Iref, where Iref is the reference intensity measured at the same range as the targets |
| Vain et al. [65] | (ALS) above scanners + Optech ALTM 2033 | 3 | natural & commercial targets, brightness calibration targets (tarps) | range (R), angle of incidence (α), total atmosphere transmittance (T), pulse energy (ET) | Ic = I · Ri²/Rref² · 1/cos α · 1/T² · ET,ref/ET,j | ρ = Ic · ρref / Ic,ref |
| Briese et al. [97] | (ALS) RIEGL VQ-820-G, LMS-Q680i, VQ-580 | 3 | asphalt road, stone pavement | range (R), angle of incidence (α), detected power (Pr), empirical calibration constant (Ccal), reflectance (ρ) | ρ = Ccal · Ri² / cos α | Ccal = ρref · cos αref / Rref² |
| Errington et al. [98] | (TLS) 3DLS-K2 | 1 | overlapping scan areas | range (R), angle of incidence (α), pseudo-reflectance (ρ) | n/a | the separation model proposed by Pfeifer et al. (2008) |
| Fang et al. [21] | (TLS) Z+F Imager 5006i | 1 | white paper targets | range (R), angle of incidence (α), near-distance effect (n(R)) | n/a | I = n(R) · A · ((1 − B) + B·cos α) / R² |
| Pfeifer et al. [63,64] | (TLS) Riegl LMS-Z420i & Optech ILRIS-3D | 3 | brightness calibration targets (Spectralon®) | range (R), angle of incidence (α), reflectance (ρ) | n/a | (1) I = g1(R) · g2(ρ·cos α); (2) I = g3(ρ·cos α, g4(R)), where g1 is linear, g2 = x^A, g3 is a cubic polynomial, and g4 is vector valued |
| Kaasalainen et al. [59,60] | (TLS) FARO LS HE80 | 3 | brightness calibration targets (Spectralon®) | range (R), reflectance (ρ) | n/a | ρ = 10^(I − Iref) with empirical constants A and B [59,60], where Iref is the 99% Spectralon® reference intensity measured at the same range as the targets |
| Kaasalainen et al. [59] | (TLS) Leica HDS6000 | 3 | brightness calibration targets (Spectralon®), gravel | range (R) | n/a | ρ = I / Iref, where Iref is the 99% Spectralon® reference intensity measured at the same range as the targets |
Table 5. Selected intensity correction and calibration methods exclusively for bathymetric LIDAR.

| Reference | Scanner | Level | Targets | Parameters | Theoretical Model | Empirical Model |
| --- | --- | --- | --- | --- | --- | --- |
| Tuell et al. [86] | (ALB) Optech SHOALS | 3 | homogeneous surface (wall covered in painted tiles) | see [86] for derivations of the parameters applied | see Equation (28) in [86] for the final model | n/a |
| Collin et al. [35] | (ALB) Optech SHOALS | 1 | n/a | received power (PR), constant combining loss factors (W), transmitted power (PT), benthic reflectance (ρ), diffuse attenuation coeff. (K), depth (D) | PR = W · PT · ρ · e^(−2KD) | Fourier transform with low-pass filtering, then a nonlinear least-squares regression correction for depth |
| Wang & Philpot [84] | (ALB) Optech SHOALS | 1 | n/a | bathymetric angle of incidence (θi), derived coefficients (C) | n/a | correction for bottom reflectance: f(θi) = C1·θi + C2; correction for pulse stretching: g(θi) = C3·e^(C4·θi) for −90° < θi ≤ 0°, and g(θi) = C5·e^(C6·θi) for 0° ≤ θi ≤ 90° |

6.1. Theoretical Correction Methods

Many theoretical corrections have been developed from the LIDAR range equation (Equation (3)). Most theoretical correction methods compensate for variation in intensity data caused by range (R) and angle of incidence (α) [58,65,68,69,70,73,77,88,96]. Based on Equation (3), the received power reflected from extended targets is a function of the inverse range squared (Figure 6a) and the cosine of the angle of incidence (Figure 6b). Therefore, in theoretical correction methods, the raw intensity data are multiplied by R²/cos α and then normalized by dividing by a user-defined squared reference range (Rref²) (Equation (5)). The corrected intensity values are equivalent to those that would have been measured if, for all points, the range were the defined reference range and the angle of incidence were zero.
$$I_c = I \cdot \frac{R_i^2}{R_{ref}^2} \cdot \frac{1}{\cos\alpha}$$
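Equation (5) translates directly into code. The reference range and the sample geometries below are arbitrary choices for illustration:

```python
import numpy as np

def correct_intensity(i_raw, r, alpha_deg, r_ref=1000.0):
    """Apply the range-squared and cosine corrections of Equation (5)."""
    alpha = np.radians(alpha_deg)
    return i_raw * (r**2 / r_ref**2) / np.cos(alpha)

# Two returns from the same surface at different geometries converge after
# correction: the weaker far/oblique return is scaled back up.
i1 = correct_intensity(100.0, 1000.0, 0.0)
i2 = correct_intensity(43.77, 1500.0, 10.0)   # weaker raw return
```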
Figure 6. (a) Theoretical relationship of intensity measurements vs. range shown for two materials with different reflectance (ρ); and (b) theoretical relationship of intensity measurements vs. angle of incidence shown for two materials with different reflectance (ρ).
Some limitations should be considered when using the range-squared correction. First, it works for extended targets, but non-extended targets, such as leaves and wires, with an area smaller than the laser footprint show a different range dependency [88]. Based on the LIDAR theory described above, the intensity recorded from non-extended targets is a function of inverse range with higher powers (e.g., 1/R³, 1/R⁴). Second, some terrestrial scanners are equipped with brightness reducers for near distances (e.g., less than 10 m) that cause a strong deviation between the recorded intensities at near distances and the values predicted by the LIDAR range equation [21,54,72]. Therefore, the range-squared correction is not applicable to near-distance intensities (e.g., less than 10 m) recorded by those scanners.
Several approaches to compensate for atmospheric effects have been reported. Generally, detailed atmospheric conditions and effects are impractical to obtain; instead, an approximate value should be chosen that represents the average conditions between the sensor and targets. A common approach is to use radiative transfer simulation models such as MODTRAN, which can estimate atmospheric transmittance based on atmospheric visibility parameters. Vain et al. [65] used MODTRAN to estimate the total atmospheric transmittance (T) and corrected intensity values by multiplying by 1/T². Other studies applied more rigorous models based on the Beer-Lambert Law, in which the atmospheric transmittance is a function of range. Höfle and Pfeifer [70] and Ding et al. [69] defined the antilog function shown in Equation (6), and Yan et al. [58] defined the exponential decay function shown in Equation (7). In both models, the attenuation coefficient a must be determined either by simulation models such as MODTRAN or by empirical methods.
$$I_c = I \cdot 10^{\frac{2aR}{1000}}$$
$$I_c = I \cdot e^{2aR}$$
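Equations (6) and (7) can be sketched as follows, assuming the attenuation coefficient a has already been estimated (e.g., via MODTRAN or empirically). The numeric values in the calls are illustrative only:

```python
import math

def correct_atm_antilog(i, r, a):
    """Equation (6): antilog atmospheric correction, I_c = I * 10**(2*a*r/1000)."""
    return i * 10 ** (2.0 * a * r / 1000.0)

def correct_atm_exponential(i, r, a):
    """Equation (7): exponential decay correction, I_c = I * exp(2*a*r)."""
    return i * math.exp(2.0 * a * r)

# Both models undo a two-way attenuation; with a = 0 the intensity is
# unchanged, and any positive a scales the intensity up with range.
i6 = correct_atm_antilog(100.0, 500.0, 0.0002)
i7 = correct_atm_exponential(100.0, 500.0, 0.0002)
```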
Some correction models account for transmitted energy (ET), specifically when the pulse repetition frequency varies or different types of scanners were used during data acquisition [65,70,77]. Correlations between pulse repetition frequency and transmitted pulse energy, such as those developed by [89,99], can be used to estimate the transmitted energy in each scan. Intensity values are then divided by the transmitted energy and multiplied by a user-defined reference energy value (Equation (8)). In order to apply this correction, the correlation between pulse repetition frequency and transmitted energy must be known for the scanning sensor used.
$$I_c = I \cdot \frac{E_{T_{ref}}}{E_{T_i}}$$
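A sketch of Equation (8): a (hypothetical) lookup table relating pulse repetition frequency to pulse energy stands in here for the sensor's published correlation, which must be obtained from the manufacturer:

```python
# Hypothetical PRF-to-pulse-energy mapping (microjoules); real values must
# come from the scanner's documented PRF/energy correlation.
PRF_TO_ENERGY_UJ = {50_000: 120.0, 100_000: 66.0, 150_000: 45.0}

def correct_transmitted_energy(i, prf, prf_ref=100_000):
    """Equation (8): scale intensity by E_T,ref / E_T,i."""
    return i * PRF_TO_ENERGY_UJ[prf_ref] / PRF_TO_ENERGY_UJ[prf]

# A scan flown at a lower PRF transmits more energy per pulse, so its
# intensities are scaled down toward the reference setting.
i_ref = correct_transmitted_energy(100.0, 100_000)
i_low_prf = correct_transmitted_energy(100.0, 50_000)
```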

6.2. Empirical Correction Methods

In an empirical correction method, the corrected intensity is defined as a function of one or more variables (e.g., range) and derived from data correlations rather than physical information or a theoretical equation. Empirical methods require homogeneous surfaces, such as asphalt roads, roof planes, or brightness tarps, that are captured in multiple scans with varying settings (e.g., varying ranges). Data captured from these overlapping areas are used to estimate the constant parameters in an empirical correction function, which is then applied to the whole dataset. Empirical methods are suitable when the physical and sensor-related parameters in the LIDAR range equation (Equation (3)) are unknown.
Examples of empirical correction methods have been reported in [27,54,68,69,70,71,95,98,100] (see Table 4). Coren and Sterzai [68] used data captured from asphalt roads to estimate the atmospheric attenuation coefficient in an exponential decay correction function, as shown in Equation (7). Höfle and Pfeifer [70] adopted an empirical quadratic correction function correlating intensity and range values. Jutzi and Gross [71] and Ding et al. [69] developed empirical intensity correction models including range and angle of incidence. Some studies presented empirical models to compensate for intensity variation caused by automatic gain control systems [27,95].
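A minimal example of the empirical approach: fit a purely data-driven range-dependence model to observations of one homogeneous surface, then normalize all intensities to a reference range. The synthetic 1/R² ground truth and noise level below are fabricated for illustration; real coefficients would come from overlapping scans:

```python
import numpy as np

# Synthetic observations of a single homogeneous surface (e.g., an asphalt
# road) seen at many ranges; in practice these come from overlapping scans.
rng = np.random.default_rng(0)
r = rng.uniform(200.0, 1500.0, 300)                # ranges in meters
i_obs = 5.0e7 / r**2 + rng.normal(0.0, 2.0, 300)   # 1/R^2 falloff plus noise

# Fit a data-driven range model f(R) = A/R^2 + B by linear least squares.
design = np.column_stack([1.0 / r**2, np.ones_like(r)])
(a_hat, b_hat), *_ = np.linalg.lstsq(design, i_obs, rcond=None)

def f(rr):
    return a_hat / rr**2 + b_hat

# Normalize every intensity to a user-chosen reference range.
r_ref = 1000.0
i_corr = i_obs * f(r_ref) / f(r)
```

After correction the range-driven spread collapses, leaving only sensor noise around the reference-range intensity.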

6.3. Bathymetric LIDAR Correction Methods

Corrections for bathymetric LIDAR primarily involve determining systematic and environmental parameters. As with topographic LIDAR, the most rigorous method requires calibrating the system on a surface with known reflectance. Reference [86] offers an example of a rigorous radiometric calibration in which the system is first calibrated against a target, and the water optical properties are then determined by fitting simulated waveforms to the measured waveform. In [35], the correction approach combines a theoretical approach, using a simplified version of the bathymetric LIDAR equation shown above, with an empirical approach, using a Fourier transform with low-pass filtering followed by a depth correction from a nonlinear least-squares regression fit of the data. Reference [84] corrects for bottom reflectance using laboratory observations and for pulse stretching using the analytical simulation of [101].
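The simplified model of [35], PR = W · PT · ρ · exp(−2KD), can be inverted for benthic reflectance once the diffuse attenuation coefficient K and depth D are known. Here `w` is a placeholder for the lumped loss factor, and all numeric values are illustrative:

```python
import math

def benthic_reflectance(p_r, p_t, depth, k, w=1.0):
    """Invert the simplified bathymetric model of [35],
    P_R = W * P_T * rho * exp(-2*K*D), for the bottom reflectance rho.

    w lumps the system loss factors (placeholder value); k is the diffuse
    attenuation coefficient and depth is the water depth D.
    """
    return p_r / (w * p_t * math.exp(-2.0 * k * depth))

# Round trip: forward-model a return for an assumed reflectance, then
# recover that reflectance from the received power.
rho_true, k, d = 0.15, 0.2, 10.0
p_r = 1.0 * rho_true * math.exp(-2.0 * k * d)
rho_est = benthic_reflectance(p_r, 1.0, d, k)
```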

6.4. Intensity Normalization Procedures

Following the aforementioned correction procedures, a normalization process is sometimes completed to compensate for differences in overlapping areas between flight-lines or individual scans. This normalization process is also sometimes completed to provide consistency between different sensors.
Qin et al. [102] present a normalized reflective factor (NRF), which characterizes the radiometric attributes of a point cloud. In their approach, they apply corrections to intensity values based on energy, geometry and atmospheric effects. A visual analysis of the radiometric attributes is conducted for areas of overlap for quality control purposes. They also utilize hyperspectral imagery to compare the normalized intensity values in specific land cover classes for normalization.
Yan and Shaker [79] propose a sub-histogram matching approach, primarily focused on minimizing the effects of automatic gain control. They first perform corrections for geometric and environmental factors, and also provide a slope correction for steep slopes. Next, they identify regions of overlap with wide variability in intensity values. A histogram is generated for each strip, and Gaussian components are fit to sub-histograms found within the histogram. The intersections of these individual Gaussian components are used as match points between strips. The process is repeated for all overlapping strips, and histogram equalization techniques are applied such that the data are consistent between the strips.
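A simplified stand-in for such strip-to-strip normalization is plain quantile (histogram) matching over the overlap region. This sketch omits the Gaussian sub-histogram decomposition of Yan and Shaker and matches the full empirical distributions instead; the strip statistics are synthetic:

```python
import numpy as np

def match_histogram(src, ref):
    """Map src intensities onto ref's distribution via quantile matching."""
    src_sorted = np.sort(src)
    ref_sorted = np.sort(ref)
    quantiles = np.searchsorted(src_sorted, src) / len(src)
    return np.interp(quantiles, np.linspace(0.0, 1.0, len(ref)), ref_sorted)

rng = np.random.default_rng(1)
strip_a = rng.normal(120.0, 10.0, 1000)   # overlap region, strip A
strip_b = rng.normal(90.0, 15.0, 1000)    # same area, strip B (gain shift)
strip_b_matched = match_histogram(strip_b, strip_a)
```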
Teo and Yu [54] propose a normalization approach that considers adjacent strips as well as multiple scanners in a mobile laser scanner. In their approach, they employ an empirically-derived piecewise polynomial function for the range correction to account for close-range effects (e.g., <10 m). They then compare the maximum and minimum amplitude differences between strips for the intensity normalization to compute the normalization adjustment. The improvement is evaluated by comparing the mean amplitude differences before and after correction.

6.5. Radiometric Calibration with Reference Targets

Reference targets with known reflectance values are required for extracting true reflectance values from intensity data. Brightness reflectance targets such as tarps and Spectralon® targets with known nominal reflectance have been used in several studies [59,60,63,64,65,77,95]. Some studies performed laboratory or in situ measurements to determine reflectance values for available natural and commercial objects such as sand, gravel, asphalt, concrete, and brick materials and then used them as reflectance calibration targets [61,62,65,97].
Two main calibration procedures are reported in the literature. A common procedure is to separate the correction and calibration steps [61,65,69,77,97]. In this approach, a theoretical or empirical correction model (explained in Section 6.1 and Section 6.2) is first applied to reduce variations in the intensity data. Next, the corrected intensity values are converted to reflectance values using empirical correlation functions derived from data captured on calibration targets. Another approach [59,60,63,64] combines the correction and calibration steps.
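The two-step procedure can be sketched as a linear calibration fit in the spirit of ρ = A·Ic + B (Table 4, Ahokas et al. [77]); the corrected tarp intensities and nominal reflectances below are hypothetical:

```python
import numpy as np

def fit_calibration(i_corrected, rho_known):
    """Fit the empirical calibration rho = A * I_c + B from corrected
    intensities measured over targets of known reflectance."""
    a, b = np.polyfit(i_corrected, rho_known, 1)
    return a, b

# Hypothetical tarp measurements: Level 1 corrected intensity vs. nominal
# reflectance of each calibration target.
i_c = np.array([520.0, 1040.0, 2080.0, 3120.0])
rho = np.array([0.10, 0.20, 0.40, 0.60])
a, b = fit_calibration(i_c, rho)

# Apply the fitted constants to corrected intensities elsewhere in the scene.
rho_scene = a * np.array([1500.0, 2500.0]) + b
```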

7. Summary of Challenges and Future Direction

As LIDAR intensity information becomes useful in an ever wider range of applications with a diverse audience, greater emphasis is being placed on correction and calibration processes. A significant amount of insightful work has been completed to date to improve the utility of these intensity values. However, there is still no standard approach for correction or calibration implemented across manufacturers. In some cases, this is further compounded by a lack of consistency between different systems from the same manufacturer. The numerical scaling and units used to represent intensity are also not consistent between manufacturers of hardware and software. While relative differences between intensity values tend to be consistent for a particular system and scan, they may be vastly different across systems, scans, acquisition geometries, etc. This creates challenges when using or developing filtering or classification algorithms based on intensity values.
While a significant amount of research has been conducted on correction or calibration methods, the selection and usage of parameters, as well as simplifying assumptions in the models, are still widely inconsistent between studies. In particular, no methods were found that can consider the influence of some key parameters (e.g., pulses with multiple returns will have lower intensity values since the beam footprint is spread across multiple objects).
While formulating this review of current literature, several knowledge gaps were identified, including the need to:
  • Develop relationships and unifying research for consistent intensity values/measures between LIDAR systems designed for platforms such as airborne, mobile, and terrestrial. Currently much research between these systems remains distinct; however, there are many similarities between these systems.
  • Evaluate and account for the influences of surface characteristics such as roughness or wetness.
  • Clarify what level of intensity processing is needed (or useful) for specific applications. For some applications, a Level 0 intensity value may prove sufficient. However, for advanced classifications (e.g., determination of plant species), Level 3 calibration may be required.
  • Account for the variance of intensity across wavelengths. The wavelength of LIDAR systems can vary significantly, and even if a “true” reflectance is calculated from the intensity values, such a reflectance only applies at the specific wavelength of the system. Many of the parameters described in this review are a function of wavelength. Hence, we recommend that future studies include the wavelength as a subscript of reported reflectance values (e.g., ρ532) obtained via LIDAR.
In addition to future research efforts, we provide the following recommendations to future data exchange standard formats such as the ASPRS “LAS” or the ASTM “E57” formats to better communicate processing completed with intensity data from a LIDAR scan. First, the addition of an attribute in the header could indicate the level of intensity processing applied. Second, an attribute field for the wavelength of the system would be helpful to provide context to these intensity values. Additionally, for critical applications both corrected and original values could be stored. Finally, a detailed description of the process should be documented in the metadata.
As calibration methods continue to evolve, it is likely that future LIDAR systems will be capable of directly providing reflectance values on-board the hardware. Software solutions will continue to utilize this information, further improving processing workflows for a wide range of applications. The improved sensitivity will also result in LIDAR being utilized for a host of new applications that have not yet been envisioned.

Acknowledgments

This material is based upon work supported by the National Science Foundation under Grant No. 1351487. Leica Geosystems and Maptek I-Site also provided software that was used to generate several images in this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shan, J.; Toth, C.K. Topographic Laser Ranging and Scanning: Principles and Processing; CRC Press: Boca Raton, FL, USA, 2008.
  2. Song, J.H.; Han, S.H.; Yu, K.Y.; Kim, Y.I. Assessing the possibility of land-cover classification using LIDAR intensity data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2002, 34, 259–262.
  3. Charaniya, A.P.; Manduchi, R.; Lodha, S.K. Supervised parametric classification of aerial LIDAR data. In Proceedings of the IEEE 2004 Conference on Computer Vision and Pattern Recognition Workshop, Washington, DC, USA, 27 June–2 July 2004.
  4. Brennan, R.; Webster, T.L. Object-oriented land cover classification of LIDAR-derived surfaces. Can. J. Remote Sens. 2006, 32, 162–172.
  5. Matikainen, L.; Hyyppä, J.; Hyyppä, H. Automatic detection of buildings from laser scanner data for map updating. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2003, 34, 218–224.
  6. Arnold, N.S.; Rees, W.G.; Devereux, B.J.; Amable, G.S. Evaluating the potential of high-resolution airborne LIDAR data in glaciology. Int. J. Remote Sens. 2006, 27, 1233–1251.
  7. Im, J.; Jensen, J.R.; Hodgson, M.E. Object-based land cover classification using high-posting-density LIDAR data. GIScience Remote Sens. 2008, 45, 209–228.
  8. Zhou, W.; Huang, G.; Troy, A.; Cadenasso, M.L. Object-based land cover classification of shaded areas in high spatial resolution imagery of urban areas: A comparison study. Remote Sens. Environ. 2009, 113, 1769–1777.
  9. MacFaden, S.W.; O’Neil-Dunne, J.P.; Royar, A.R.; Lu, J.W.; Rundle, A.G. High-resolution tree canopy mapping for New York City using LIDAR and object-based image analysis. J. Appl. Remote Sens. 2012, 6.
  10. Alba, M.; Barazzetti, L.; Scaioni, M.; Remondino, F. Automatic registration of multiple laser scans using panoramic RGB and intensity images. In Proceedings of the ISPRS Workshop Laser Scanning 2011, Calgary, AB, Canada, 29–31 August 2011.
  11. Barnea, S.; Filin, S. Geometry-image-intensity combined features for registration of terrestrial laser scans. In Photogrammetry and Computer Vision, ISPRS Commission III; ISPRS: Saint-Mandé, France, 2010; Volume 2, pp. 145–150.
  12. Boehm, J.; Becker, S. Automatic Marker-free Registration of Terrestrial Laser Scans using Reflectance Features. In Proceedings of the 8th Conference on Optical 3-D Measurement Techniques, Zurich, Switzerland, 9–12 July 2007; Volume I, pp. 338–343.
  13. Kang, Z.; Li, J.; Zhang, L.; Zhao, Q.; Zlatanova, S. Automatic registration of terrestrial laser scanning point clouds using panoramic reflectance images. Sensors 2009, 9, 2621–2646.
  14. Wang, Z.; Brenner, C. Point based registration of terrestrial laser data using intensity and geometry features. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 37, 583–590.
  15. Abedini, A.; Hahn, M.; Samadzadegan, F. An investigation into the registration of LIDAR intensity data and aerial images using the SIFT approach. In Proceedings of the XXI ISPRS Congress, Beijing, China, 3–11 July 2008.
  16. González-Aguilera, D.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; Lerma, J.L. A robust and hierarchical approach for the automatic co-registration of intensity and visible images. Opt. Laser Technol. 2012, 44, 1915–1923.
  17. González-Aguilera, D.; Rodríguez-Gonzálvez, P.; Gómez-Lahoz, J. An automatic procedure for co-registration of terrestrial laser scanners and digital cameras. ISPRS J. Photogramm. Remote Sens. 2009, 64, 308–316.
  18. Parmehr, E.G.; Fraser, C.S.; Zhang, C.; Leach, J. Automatic registration of optical imagery with 3D LIDAR data using statistical similarity. ISPRS J. Photogramm. Remote Sens. 2014, 88, 28–40.
  19. Wong, A.; Orchard, J. Efficient FFT-accelerated approach to invariant optical–LIDAR registration. IEEE Trans. Geosci. Remote Sens. 2008, 46, 3917–3925.
  20. Han, J.Y.; Perng, N.H.; Lin, Y. Feature conjugation for intensity-coded LIDAR point clouds. J. Surv. Eng. 2013, 139, 135–142.
  21. Fang, W.; Huang, X.; Zhang, F.; Li, D. Intensity Correction of Terrestrial Laser Scanning Data by Estimating Laser Transmission Function. IEEE Trans. Geosci. Remote Sens. 2015, 53, 942–951.
  22. Olsen, M.J.; Chen, Z.; Hutchinson, T.; Kuester, F. Optical techniques for multiscale damage assessment. Geomat. Nat. Hazards Risk 2013, 4, 49–70.
  23. Antonarakis, A.S.; Richards, K.S.; Brasington, J. Object-based land cover classification using airborne LIDAR. Remote Sens. Environ. 2008, 112, 2988–2998.
  24. Lang, M.W.; McCarty, G.W. LIDAR intensity for improved detection of inundation below the forest canopy. Wetlands 2009, 29, 1166–1178.
  25. Donoghue, D.N.; Watt, P.J.; Cox, N.J.; Wilson, J. Remote sensing of species mixtures in conifer plantations using LIDAR height and intensity data. Remote Sens. Environ. 2007, 110, 509–522.
  26. Gatziolis, D. Dynamic range-based intensity normalization for airborne, discrete return LIDAR data of forest canopies. Photogramm. Eng. Remote Sens. 2011, 77, 251–259.
  27. Korpela, I.; Ørka, H.O.; Hyyppä, J.; Heikkinen, V.; Tokola, T. Range and AGC normalization in airborne discrete-return LIDAR intensity data for forest canopies. ISPRS J. Photogramm. Remote Sens. 2010, 65, 369–379.
  28. Wing, B.M.; Ritchie, M.W.; Boston, K.; Cohen, W.B.; Olsen, M.J. Individual snag detection using neighborhood attribute filtered airborne LIDAR data. Remote Sens. Environ. 2015, 163, 165–179.
  29. Wing, B.M.; Ritchie, M.W.; Boston, K.; Cohen, W.B.; Gitelman, A.; Olsen, M.J. Prediction of understory vegetation cover with airborne LIDAR in an interior ponderosa pine forest. Remote Sens. Environ. 2012, 124, 730–741.
  30. Barnea, S.; Filin, S. Extraction of objects from terrestrial laser scans by integrating geometry image and intensity data with demonstration on trees. Remote Sens. 2012, 4, 88–110.
  31. Burton, D.; Dunlap, D.B.; Wood, L.J.; Flaig, P.P. LIDAR intensity as a remote sensor of rock properties. J. Sediment. Res. 2011, 81, 339–347.
  32. Mazzarini, F.; Pareschi, M.T.; Favalli, M.; Isola, I.; Tarquini, S.; Boschi, E. Lava flow identification and aging by means of LIDAR intensity: Mount Etna case. J. Geophys. Res. Solid Earth 2007, 112.
  33. Kaasalainen, S.; Kaartinen, H.; Kukko, A. Snow cover change detection with laser scanning range and brightness measurements. EARSeL eProc. 2008, 7, 133–141.
  34. Chust, G.; Galparsoro, I.; Borja, A.; Franco, J.; Uriarte, A. Coastal and estuarine habitat mapping, using LIDAR height and intensity and multi-spectral imagery. Estuar. Coast. Shelf Sci. 2008, 78, 633–643.
  35. Collin, A.; Archambault, P.; Long, B. Mapping the shallow water seabed habitat with the SHOALS. IEEE Trans. Geosci. Remote Sens. 2008, 46, 2947–2955.
  36. Macon, C.; Wozencraft, J.; Park, J.Y.; Tuell, G. Seafloor and land cover classification through airborne LIDAR and hyperspectral data fusion. In Proceedings of the 2008 IEEE International Geoscience and Remote Sensing Symposium, Boston, MA, USA, 7–11 July 2008; Volume 2.
  37. Narayanan, R.; Kim, H.B.; Sohn, G. Classification of SHOALS 3000 bathymetric LIDAR signals using decision tree and ensemble techniques. In Proceedings of the 2009 IEEE Toronto International Conference Science and Technology for Humanity (TIC-STH), Toronto, ON, Canada, 26–27 September 2009; pp. 462–467.
  38. Tuell, G.; Park, J.Y.; Aitken, J.; Ramnath, V.; Feygels, V.; Guenther, G.; Kopilevich, Y. SHOALS-enabled 3D Benthic Mapping. Available online: http://proceedings.spiedigitallibrary.org/proceeding.aspx?articleid=864434 (accessed on 5 November 2015).
  39. Tuell, G.H.; Ramnath, V.; Park, J.Y.; Feygels, V.; Aitken, J.; Kopelivich, Y. Fusion of SHOALS bathymetric LIDAR and passive spectral data for shallow water rapid environmental assessment. In Proceedings of the Oceans 2005-Europe, Brest, France, 20–23 June 2005; Volume 2, pp. 1046–1051.
  40. Long, B.; Aucoin, F.; Montreuil, S.; Robitaille, V.; Xhardé, R. Airborne LIDAR bathymetry applied to coastal hydrodynamic processes. In Proceedings of the International Conference on Coastal Engineering, Shanghai, China, 30 June–5 July 2010.
  41. Armesto-González, J.; Riveiro-Rodríguez, B.; González-Aguilera, D.; Rivas-Brea, M.T. Terrestrial laser scanning intensity data applied to damage detection for historical buildings. J. Archaeol. Sci. 2010, 37, 3037–3047.
  42. Guldur, B.; Hajjar, J.F. Damage Detection on Structures Using Texture Mapped Laser Point Clouds. In Proceedings of the Structures Congress, Boston, MA, USA, 3–5 April 2014.
  43. Olsen, M.J.; Cheung, K.F.; Yamazaki, Y.; Butcher, S.M.; Garlock, M.; Yim, S.C.; Young, Y.L. Damage Assessment of the 2010 Chile Earthquake and Tsunami using ground-based LIDAR. Earthq. Spectra 2012, 28, S179–S197.
  44. Olsen, M.J.; Kuester, F.; Chang, B.J.; Hutchinson, T.C. Terrestrial laser scanning-based structural damage assessment. J. Comput. Civil Eng. 2009, 24, 264–272.
  45. Masiero, A.; Guarnieri, A.; Pirotti, F.; Vettore, A. Semi-Automated Detection of Surface Degradation on Bridges Based on a Level Set Method. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 1, 15–21.
  46. Kashani, A.G.; Olsen, M.; Graettinger, A.J. Laser Scanning Intensity Analysis for Automated Building Wind Damage Detection. In Proceedings of the International Workshop on Computing in Civil Engineering, Austin, TX, USA, 21–23 June 2015.
  47. Kashani, A.G.; Graettinger, A.J. Cluster-Based Roof Covering Damage Detection in Ground-Based LIDAR Data. Autom. Constr. 2015, 58, 19–27.
  48. Kashani, A.G. Automated Assessment of Tornado-Induced Building Damage Based on Terrestrial Laser Scanning. Ph.D. Thesis, University of Alabama, Tuscaloosa, AL, USA, September 2014.
  49. Guan, H.; Yu, Y.; Li, J.; Liu, P.; Zhao, H.; Wang, C. Automated extraction of manhole covers using mobile LIDAR data. Remote Sens. Lett. 2014, 5, 1042–1050.
  50. Guan, H.; Li, J.; Yu, Y.; Wang, C.; Chapman, M.; Yang, B. Using mobile laser scanning data for automated extraction of road markings. ISPRS J. Photogramm. Remote Sens. 2014, 87, 93–107.
  51. Yang, B.; Fang, L.; Li, Q.; Li, J. Automated extraction of road markings from mobile LIDAR point clouds. Photogramm. Eng. Remote Sens. 2012, 78, 331–338.
  52. Lin, Y.; Hyyppa, J. Geometry and intensity based culvert detection in mobile laser scanning point clouds. J. Appl. Remote Sens. 2010, 4.
  53. Williams, K.; Olsen, M.J.; Roe, G.V.; Glennie, C. Synthesis of transportation applications of mobile LIDAR. Remote Sens. 2013, 5, 4652–4692.
  54. Teo, T.A.; Yu, H.L. Empirical Radiometric Normalization of Road Points from Terrestrial Mobile LIDAR System. Remote Sens. 2015, 7, 6336–6357.
  55. Johnson, W.H.; Johnson, A.M. Operational considerations for terrestrial laser scanner use in highway construction applications. J. Surv. Eng. 2012, 138, 214–222.
  56. Tsai, Y.C.J.; Li, F. Critical assessment of detecting asphalt pavement cracks under different lighting and low intensity contrast conditions using emerging 3D laser technology. J. Transp. Eng. 2012, 138, 649–656.
  57. Chin, A.; Olsen, M.J. Evaluation of Technologies for Road Profile Capture, Analysis, and Evaluation. J. Surv. Eng. 2014, 141.
  58. Yan, W.Y.; Shaker, A.; Habib, A.; Kersting, A.P. Improving classification accuracy of airborne LIDAR intensity data by geometric calibration and radiometric correction. ISPRS J. Photogramm. Remote Sens. 2012, 67, 35–44.
  59. Kaasalainen, S.; Krooks, A.; Kukko, A.; Kaartinen, H. Radiometric calibration of terrestrial laser scanners with external reference targets. Remote Sens. 2009, 1, 144–158.
  60. Kaasalainen, M.; Kaasalainen, S. Aperture size effects on backscatter intensity measurements in Earth and space remote sensing. JOSA A 2008, 25, 1142–1146.
  61. Kaasalainen, S.; Hyyppä, H.; Kukko, A.; Litkey, P.; Ahokas, E.; Hyyppä, J.; Pyysalo, U. Radiometric calibration of LIDAR intensity with commercially available reference targets. IEEE Trans. Geosci. Remote Sens. 2009, 47, 588–598.
  62. Kukko, A.; Kaasalainen, S.; Litkey, P. Effect of incidence angle on laser scanner intensity and surface data. Appl. Opt. 2008, 47, 986–992.
  63. Pfeifer, N.; Dorninger, P.; Haring, A.; Fan, H. Investigating terrestrial laser scanning intensity data: Quality and functional relations. In Proceedings of the 8th Conference on Optical 3-D Measurement Techniques, Zurich, Switzerland, 9–12 July 2007; pp. 328–337.
  64. Pfeifer, N.; Höfle, B.; Briese, C.; Rutzinger, M.; Haring, A. Analysis of the backscattered energy in terrestrial laser scanning data. Int. Arch. Photogramm. Remote Sens. 2008, 37, 1045–1052.
  65. Vain, A.; Kaasalainen, S.; Pyysalo, U.; Krooks, A.; Litkey, P. Use of naturally available reference targets to calibrate airborne laser scanning intensity data. Sensors 2009, 9, 2780–2796.
  66. Krooks, A.; Kaasalainen, S.; Hakala, T.; Nevalainen, O. Correction of intensity incidence angle effect in terrestrial laser scanning. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, 2, 145–150.
  67. Pesci, A.; Teza, G. Effects of surface irregularities on intensity data from laser scanning: An experimental approach. Ann. Geophys. 2008, 51, 839–848.
  68. Coren, F.; Sterzai, P. Radiometric correction in laser scanning. Int. J. Remote Sens. 2006, 27, 3097–3104.
  69. Ding, Q.; Chen, W.; King, B.; Liu, Y.; Liu, G. Combination of overlap-driven adjustment and Phong model for LIDAR intensity correction. ISPRS J. Photogramm. Remote Sens. 2013, 75, 40–47.
  70. Höfle, B.; Pfeifer, N. Correction of laser scanning intensity data: Data and model-driven approaches. ISPRS J. Photogramm. Remote Sens. 2007, 62, 415–433.
  71. Jutzi, B.; Gross, H. Normalization of LIDAR intensity data based on range and surface incidence angle. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2009, 38, 213–218.
  72. Kaasalainen, S.; Jaakkola, A.; Kaasalainen, M.; Krooks, A.; Kukko, A. Analysis of incidence angle and distance effects on terrestrial laser scanner intensity: Search for correction methods. Remote Sens. 2011, 3, 2207–2221.
  73. Starek, M.; Luzum, B.; Kumar, R.; Slatton, K.C. Normalizing LIDAR Intensities; GEM Center Report No. Rep_2006-12-001; University of Florida: Gainesville, FL, USA, 2006.
  74. Richter, K.; Blaskow, R.; Stelling, N.; Maas, H.G. Reference Value Provision Schemes for Attenuation Correction of Full-Waveform Airborne Laser Scanner Data. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 1, 65–72.
  75. Richter, K.; Stelling, N.; Maas, H.G. Correcting attenuation effects caused by interactions in the forest canopy in full-waveform airborne laser scanner data. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 1, 273–280.
  76. Romanczyk, P.; van Aardt, J.A.; Cawse-Nicholson, K.; Kelbe, D.; Strahler, A.H.; Schaaf, C.; Ramond, T. Quantifying the attenuation due to geometry interactions in waveform LIDAR signals. In Proceedings of the American Geophysical Union Fall Meeting 2013, San Francisco, CA, USA, 9–13 December 2013.
  77. Ahokas, E.; Kaasalainen, S.; Hyyppä, J.; Suomalainen, J. Calibration of the Optech ALTM 3100 laser scanner intensity data using brightness targets. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2006, 34, 3–6.
  78. LAS Extrabytes Implementation in RIEGL Software Whitepaper. Available online: http://www.riegl.com/uploads/tx_pxpriegldownloads/Whitepaper_-_LAS_extrabytes_implementation_in_Riegl_software_01.pdf (accessed on 5 November 2015).
  79. Yan, W.Y.; Shaker, A. Radiometric Correction and Normalization of Airborne LIDAR Intensity Data for Improving Land-Cover Classification. IEEE Trans. Geosci. Remote Sens. 2014, 52, 7658–7673.
  80. Lichti, D.; Gordon, S.J.; Tipdecho, T. Error models and propagation in directly georeferenced terrestrial laser scanner networks. J. Surv. Eng. 2005, 131, 135–142.
  81. Blaskow, R.; Schneider, D. Analysis and correction of the dependency between laser scanner intensity values and range. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Riva del Garda, Italy, 23–25 June 2014.
  82. Guenther, G.C. Airborne LIDAR Bathymetry. In Digital Elevation Model Technologies and Applications: The DEM User’s Manual, 2nd ed.; Maune, D.F., Ed.; American Society for Photogrammetry and Remote Sensing: Bethesda, MD, USA, 2007; pp. 253–320.
  83. Kopilevich, Y.I.; Feygals, V.I.; Tuell, G.H.; Surkov, A. Measurement of ocean water optical properties and seafloor reflectance with Scanning Hydrographic Operational Airborne LIDAR Survey (SHOALS): I. Theoretical Background. In Optics & Photonics 2005; International Society for Optics and Photonics: San Diego, CA, USA, 2005; pp. 58850D-1–58850D-9.
  84. Wang, C.K.; Philpot, W.D. Using airborne bathymetric LIDAR to detect bottom type variation in shallow waters. Remote Sens. Environ. 2007, 106, 123–135.
  85. Tuell, G.H.; Park, J.Y. Use of SHOALS bottom reflectance images to constrain the inversion of a hyperspectral radiative transfer model. In Defense and Security; International Society for Optics and Photonics: San Diego, CA, USA, 2004; pp. 185–193.
  86. Tuell, G.H.; Feygels, V.; Kopilevich, Y.; Weidemann, A.D.; Cunningham, A.G.; Mani, R.; Aitken, J. Measurement of ocean water optical properties and seafloor reflectance with Scanning Hydrographic Operational Airborne LIDAR Survey (SHOALS): II. Practical results and comparison with independent data. In Optics & Photonics 2005; International Society for Optics and Photonics: San Diego, CA, USA, 2005; pp. 58850E-1–58850E-13.
  87. Tuell, G.; Carr, D. New Procedure for Estimating Field-of-View Loss in Bathymetric LIDAR. In Imaging Systems and Applications; Optical Society of America: Arlington, VA, USA, 2013.
  88. Jelalian, A.V. Laser Radar Systems; Artech House: Norwood, MA, USA, 1992.
  89. Baltsavias, E.P. Airborne laser scanning: Basic relations and formulas. ISPRS J. Photogramm. Remote Sens. 1999, 54, 199–214.
  90. Wagner, W.; Hyyppä, J.; Ullrich, A.; Lehner, H.; Briese, C.; Kaasalainen, S. Radiometric calibration of full-waveform small-footprint airborne laser scanners. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 37, 163–168.
  91. Wagner, W. Radiometric calibration of small-footprint full-waveform airborne laser scanner measurements: Basic physical concepts. ISPRS J. Photogramm. Remote Sens. 2010, 65, 505–513.
  92. Li, X.; Liang, Y.; Xu, L. Bidirectional reflectance distribution function based surface modeling of non-Lambertian using intensity data of light detection and ranging. JOSA A 2014, 31, 2055–2063.
  93. Wang, C.K.; Philpot, W.D. Using SHOALS LIDAR system to detect bottom material change. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Toronto, ON, Canada, 24–28 June 2002; Volume 5, pp. 2690–2692.
  94. Luzum, B.; Starek, M.; Slatton, K.C. Normalizing ALSM Intensities; Geosensing Engineering and Mapping (GEM) Center Report No. Rep_2004-07-01; Civil and Coastal Engineering Department, University of Florida: Gainesville, FL, USA, 2004.
  95. Vain, A.; Yu, X.; Kaasalainen, S.; Hyyppä, J. Correcting airborne laser scanning intensity data for automatic gain control effect. Geosci. Remote Sens. Lett. 2010, 7, 511–514.
  96. Habib, A.F.; Kersting, A.P.; Shaker, A.; Yan, W.Y. Geometric calibration and radiometric correction of LIDAR data and their impact on the quality of derived products. Sensors 2011, 11, 9069–9097.
  97. Briese, C.; Pfennigbauer, M.; Lehner, H.; Ullrich, A.; Wagner, W.; Pfeifer, N. Radiometric Calibration of Multi-Wavelength Airborne Laser Scanning Data; ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences: Melbourne, Australia, 2012; Volume 25, pp. 335–340.
  98. Errington, A.F.; Daku, B.L.; Prugger, A.F. A model based approach to intensity normalization for terrestrial laser scanners. In Proceedings of the International Symposium on LIDAR and Radar Mapping Technologies, Hong Kong, China, 26–29 May 2011.
  99. Chasmer, L.; Hopkinson, C.; Smith, B.; Treitz, P. Examining the influence of laser pulse repetition frequencies on conifer forest canopy returns. Photogramm. Eng. Remote Sens. 2006, 72, 1359–1367.
  100. Hartzell, P.J.; Glennie, C.L.; Finnegan, D.C. Empirical waveform decomposition and radiometric calibration of a terrestrial full-waveform laser scanner. IEEE Trans. Geosci. Remote Sens. 2015, 53, 162–172.
  101. Steinvall, O.K.; Koppari, K.R. Depth sounding LIDAR: An overview of Swedish activities and future prospects. In CIS Selected Papers: Laser Remote Sensing of Natural Waters—From Theory to Practice; International Society for Optics and Photonics: St. Petersburg, Russia, 1996; Volume 2964, pp. 2–24.
  102. Qin, Y.; Yao, W.; Vu, T.T.; Li, S.; Niu, Z.; Ban, Y. Characterizing Radiometric Attributes of Point Cloud Using a Normalized Reflective Factor Derived From Small Footprint LIDAR Waveform. IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens. 2015, 8, 740–749.

Share and Cite

Kashani, A.G.; Olsen, M.J.; Parrish, C.E.; Wilson, N. A Review of LIDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration. Sensors 2015, 15, 28099-28128. https://0-doi-org.brum.beds.ac.uk/10.3390/s151128099