Article

Modeling and Testing of Growth Status for Chinese Cabbage and White Radish with UAV-Based RGB Imagery

1 Department of Biosystems & Biomaterials Science and Engineering, College of Agriculture and Life Sciences, Seoul National University, Seoul KS013, Korea
2 ADAS Segment, Software Solution R&D Center, PLK Technologies, Seoul KS013, Korea
3 Department of Horticultural Crop Research, National Institute of Horticultural and Herbal Science, Muan KS005, Korea
4 Geospatial Information Co., Ltd., Naju KS005, Korea
5 Department of Agricultural and Biological Engineering, University of Florida, Gainesville, FL 32611, USA
6 Research Institute of Agriculture and Life Sciences, Seoul National University, Seoul KS013, Korea
* Author to whom correspondence should be addressed.
Submission received: 28 February 2018 / Revised: 1 March 2018 / Accepted: 3 March 2018 / Published: 5 April 2018
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)

Abstract: Conventional crop-monitoring methods are time-consuming and labor-intensive, necessitating new techniques to provide faster measurements and higher sampling intensity. This study reports on mathematical modeling and testing of growth status for Chinese cabbage and white radish using unmanned aerial vehicle-red, green and blue (UAV-RGB) imagery for measurement of their biophysical properties. Chinese cabbage seedlings and white radish seeds were planted at 7–10-day intervals to provide a wide range of growth rates. Remotely sensed digital imagery data were collected for test fields at approximately one-week intervals using a UAV platform equipped with an RGB digital camera flying at 2 m/s at 20 m above ground. Radiometric calibrations for the RGB band sensors were performed on every UAV flight using standard calibration panels to minimize the effect of ever-changing light conditions on the RGB images. Vegetation fractions (VFs) of crops in each region of interest from the mosaicked ortho-images were calculated as the ratio of pixels classified as crops segmented using the Otsu threshold method and a vegetation index of excess green (ExG). Plant heights (PHs) were estimated using the structure from motion (SfM) algorithm to create 3D surface models from crop canopy data. Multiple linear regression equations consisting of three predictor variables (VF, PH, and VF × PH) and four different response variables (fresh weight, leaf length, leaf width, and leaf count) provided good fits with coefficients of determination (R2) ranging from 0.66 to 0.90. The validation results using a dataset of crop growth obtained in a different year also showed strong linear relationships (R2 > 0.76) between the developed regression models and standard methods, confirming that the models make it possible to use UAV-RGB images for quantifying spatial and temporal variability in biophysical properties of Chinese cabbage and white radish over the growing season.


1. Introduction

On-site monitoring of crop growth throughout the growing season plays an important role in assessing overall crop conditions, determining when to irrigate, and forecasting potential yields [1,2,3,4]. In particular, periodic monitoring of various biophysical properties of crops grown in a field, such as biomass, leaf area index, and plant height, can help growers optimize inputs such as fertilizers and herbicides and accurately estimate final yields [5,6,7]. Traditionally, crop-monitoring studies have used in-field measurements or airborne/satellite data to cover wide areas. Field-based methods involving on-site sampling and laboratory analysis are often destructive, labor-intensive, costly, and time-consuming, thereby limiting the number of samples available for establishing efficient crop growth management [8,9].
Precision agriculture is a site-specific soil and crop management system that assesses variability in soil properties (e.g., pH, organic matter, and soil nutrient levels), field parameters (e.g., slope and elevation), and crop parameters (e.g., yield and biomass) using various tools including the global positioning system (GPS), geographic information systems (GIS), and remote sensing (RS). To manage crops site-specifically, it is necessary to collect information such as crop and soil conditions and weed distribution at different locations in a field. Remote sensing of crops can be more attractive than traditional methods of crop monitoring because it covers large areas rapidly and repeatedly. Remote sensing techniques from manned airborne or satellite platforms have been widely adopted for crop monitoring [3,10] since measurements are non-destructive and non-invasive and enable scalable implementation in space and time [11]. A common use of remote sensing is evaluation of crop growth status based on canopy greenness by quantifying the distribution of a vegetation index (VI) across the crop field. Various vegetation indices, including the Normalized Difference Vegetation Index (NDVI) and Excess Green (ExG), have been defined as representative reflectance values of the vegetation canopy [12].
In recent years, unmanned aerial vehicles (UAVs) have been commonly used for low-altitude, high-resolution remote sensing applications due to advantages such as versatility, light weight, and low operational costs [2,13]. In addition, UAVs offer a customizable aerial platform on which a variety of sensors can be mounted and flown to collect aerial imagery with much finer spatial and temporal resolutions than piloted aircraft or satellite remote sensing systems, despite several limitations such as relatively short flight time, lower payload, and sensitivity to weather and terrain conditions. Advancements in the accuracy, economic efficiency, and miniaturization of many technologies, including GPS receivers and computer processors, have turned UAV systems into a cost-effective, innovative remote sensing platform [14]. In particular, multi-rotor UAVs have been commonly used to assess the vegetation status of crops and predict their yields because the flexibility of vertical takeoff and landing platforms carrying various image sensors makes it easy to fly over agricultural fields [15]. The acquired aerial images can help farmers evaluate crop growth status, such as canopy greenness, leaf area, and water stress, as well as various geographic conditions including crop area, digital surface models (DSMs), and depth contour lines [16,17].
Several review articles have highlighted the wide range of applications for UAVs and mounted sensors. In agriculture, where optical diffuse reflectance sensing in the visible and near-infrared (NIR) ranges exploits the interaction between incident light and crop surface properties, UAVs have been adopted for monitoring water status and drought stress in fruit trees using NIR band data [18]; additionally, they have been used for collecting multispectral and hyperspectral imagery for use in spectral indices [19] and even chlorophyll fluorescence [20]. Baluja et al. analyzed the relationships between various indices derived from UAV imagery for assessing the water status variability of a commercial vineyard [21]. RGB (red, green, and blue) data in the visible range have also been used by several researchers to investigate the relationships between biophysical parameters of various crops and their UAV imagery. In a study by Torres-Sánchez et al., visible spectral indices were calculated for multi-temporal mapping of the vegetation fraction from UAV images, and an automatic object-based method was proposed to detect vegetation in herbaceous crops [15,22]. Yun et al. conducted multi-temporal monitoring of soybean vegetation fraction to evaluate crop conditions using UAV-RGB images [23]. Additionally, Bendig et al. estimated biomass of barley using crop height derived from UAV imagery [24], while Anthony et al. presented a micro-UAV system mounted with a laser scanner to measure crop heights [25]. Notably, Geipel et al. used both vegetation indices and crop height based on UAV-RGB imagery for predicting corn yields [26].
Crop growth models require use of a wide range of biophysical parameters, including biomass, leaf area index, and plant height, which are all closely related to future yield [27,28]. The biophysical properties of a crop measured at different locations in a field may further deliver vital information about specific disease situations, enabling field-specific decisions on plant protection [29]. In addition, yield maps generated using crop growth models can provide information about the spatial and temporal variability of yields in previous years [30]. However, these maps have limitations in explaining current growing conditions. To address this issue, several studies have demonstrated the feasibility of using crop growth models to predict yield using linear regression models built with additional information on crop management [31] or weather and soil attributes [32,33,34].
Since plant height is a critical indicator of crop evapotranspiration [35], crop yield [36], crop biomass [24], and crop health [25], 3D image-based plant phenotyping techniques have been utilized to obtain plant architecture, such as height, size, and shape [14,37,38]. In particular, in combination with a non-vegetation ground model, plant height can be obtained using crop surface models (CSMs) [39,40]. Bendig et al. [24,39] defined the CSM as the absolute height of crop canopies, and Geipel et al. [26] defined the CSM as the difference between a digital terrain model (DTM) and a digital surface model (DSM). Multi-temporal CSMs derived from 3D point clouds can deliver resolution at the centimeter level [39,41]. Such CSMs have been applied to various crops such as sugar beet, rice, and summer barley [24,39,40,41]. Since light detection and ranging (LiDAR) sensors determine the distance from the sensor to target objects based on discrete return or continuous wave signals, LiDAR measurements have been successfully used for constructing 3D canopy structure with satisfactory point densities, in spite of the sensor's relatively high cost [42,43,44]. The emergence of structure from motion (SfM)-based software has enabled efficient creation of 3D point clouds and highly detailed ortho-photos without LiDAR sensors [45,46]. SfM photogrammetry is a computer vision method that offers high-resolution 3D topographic or structural reconstruction from overlapping imagery [47]. In principle, SfM performs a bundle adjustment among UAV images based on matching features between the overlapped images to estimate the interior and exterior orientation of the onboard sensor. The first step of SfM algorithms is to extract features in each image that can be matched to corresponding features in other images, establishing the relative location and parameters of the sensor. The key to SfM methods is the ability to calculate camera position, orientation, and scene geometry purely from the set of overlapping images provided, offering a simple processing workflow compared to alternative photogrammetry techniques [48,49]. This workflow for generating 3D digital reconstructions of landscapes or scenes makes SfM applicable in a variety of research fields, including the modeling of urban and vegetation features. However, the SfM approach has proven more difficult with vegetation than with urban and other features because of the more complex and inconsistent structures resulting from leaf gaps, repeating structures of the same color, and random geometries. Nevertheless, satisfactory results of vegetation modeling that estimates canopy height with SfM have been reported when using colored field markers and increasing the number of acquired photographs.
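To make the feature-extraction and matching step concrete, the following is a minimal Python sketch, not the Pix4D implementation used in later sections, that finds candidate correspondences between two overlapping UAV frames with OpenCV; the image file names are hypothetical, and a full SfM pipeline would add bundle adjustment and dense reconstruction on top of this step.

```python
import cv2

# Load two overlapping UAV frames (hypothetical file names) as grayscale.
img1 = cv2.imread("uav_frame_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("uav_frame_002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect keypoints and compute binary descriptors in each image.
orb = cv2.ORB_create(nfeatures=5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force Hamming matching with cross-check keeps reciprocal matches only;
# these correspondences are what a full SfM bundle adjustment would consume.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} candidate correspondences between the two frames")
```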
Chinese cabbage (Brassica rapa subsp. pekinensis) and white radish (Raphanus sativus) are commonly cultivated in Korea because they are the main ingredients in Kimchi [50,51]. On-site monitoring of their growth status in the field using UAVs with SfM can allow the identification of spatial variation in various biophysical factors, such as canopy coverage, leaf area, and plant height, thereby helping to efficiently regulate the application of fertilizers and water as well as accurately estimate yields prior to harvest. Although previous studies have evaluated the effectiveness of the UAV system for agricultural purposes, yield estimation of Chinese cabbage and white radish using a UAV with only an RGB camera has not yet been studied.
The overall goal of this study was to develop UAV-RGB imagery-based crop growth estimation models that can quantify various biophysical parameters of Chinese cabbage and white radish over the entire growing season, as a means of assessing growth status and estimating potential yields before harvest. Specific objectives were (i) to develop regression models consisting of an RGB-based vegetation index and SfM-estimated plant height that can quantify four different biophysical parameters of Chinese cabbage and white radish crops, i.e., leaf length, leaf width, leaf count, and fresh weight, and (ii) to investigate the applicability of the developed regression models to a separate dataset of UAV-RGB images obtained in a different year for quantitative analysis of the growth status of Chinese cabbage and white radish during the growing season.

2. Materials and Methods

2.1. Test Plots

A two-year field experiment was conducted during the 2015 and 2016 growing seasons (from September to November each year) in four different vegetable fields (denoted Field W15, C15, W16, and C16) of the Bioenergy Crop Research Institute (35°03′N, 126°22′E, altitude 12 m), located in Muan, Jeollanam-do, Republic of Korea (Figure 1). Different areas were used in 2015 and 2016 to evaluate whether regression models developed from the 2015 data could be applied to data sets obtained in a different year and in different fields. Three different sets of 21-day-old Chinese cabbage seedlings and white radish seeds were planted at 7–10-day intervals (denoted A, B, and C) in each of two separate fields to provide a wide range of growth rates under conventional tillage practices with a sprinkler irrigation system. Chinese cabbage and white radish were first planted on 7 and 6 September 2015 and 5 and 2 September 2016, respectively. A split-plot arrangement of treatments for each crop was used with three replications (denoted 1, 2, and 3) in a randomized complete block design. Individual sub-plot dimensions were 3 by 9 m, consisting of four 0.5 m rows. However, only three sets of Chinese cabbage data without replication were obtained in Field C16 since the growth quality of the other data sets was inadequate for analysis due to damage resulting from inappropriate application of herbicide prior to planting. Data obtained from Fields C15 and W15 were used to build statistical models that could quantify various biophysical parameters of the cabbage and radish crops. Testing was performed using data obtained from Fields C16 and W16 to investigate the predictive validity of the developed regression models in estimating the growth status of Chinese cabbage and white radish grown in different fields. Granular fertilizer was hand-applied in the furrow at planting. White plastic mulch films were used to suppress weed growth before planting.

2.2. Unmanned Aerial Vehicles Flight and Image Acquisition

The unmanned aerial vehicle (UAV) platform used in this study was a DJI F550 hexa-rotor airframe (DJI Innovations, Shenzhen, China) equipped with a Canon PowerShot S110 RGB digital camera (Canon, Tokyo, Japan); it had a total weight of 1.8 kg including batteries and an additional payload capacity of up to 0.6 kg. The platform could be connected to a PC (personal computer) ground station via a 433 MHz datalink to monitor the UAV's flight status and send flight path mission instructions. Details of the UAV platform specifications are described in Table 1. The UAV was set to fly automatically over the experimental field using a Pixhawk automatic flight controller (3D Robotics, Berkeley, CA, USA) while tracking waypoints according to the pre-programmed flight path generated using the open-source Mission Planner program (ArduPilot Development Team and Community). A sequence of overlapped images was collected on each flight mission to cover the entire experimental field. The flight path was designed to ensure image overlap of at least 70% (side) and 85% (forward). Pix4Dmapper Pro 3.0.17 (Pix4D SA, Lausanne, Switzerland), which allows image mosaicking, was used to generate a complete crop map of the total study area.
The RGB camera used in this study acquired 12-megapixel images using a 1/1.7″ CMOS sensor and a 24–120 mm zoom lens. The fields of view (FOVs) of the camera were 72.3° and 57.5° in the horizontal and vertical directions, respectively. The images were acquired using a time-lapse function that took one image every two seconds. From a preliminary test to determine appropriate camera parameters for decreasing blurriness in images, the shutter speed and F-stop (aperture) were set at 1/2000 s and 4.0, respectively, with the focus distance set at infinity. The internal camera parameters, such as principal point and radial distortion, were auto-compensated through the bundle block adjustment in Pix4Dmapper Pro 3.0.17. As shown in Table 2, remotely sensed digital imagery data were collected for the test fields on several dates at approximately one-week intervals during both growing seasons, beginning in late September and ending in early November, with the UAV flying at 2 m/s at 20 m above ground level (AGL). To obtain ground-truth data on the biophysical properties of each crop, plants located along a randomly chosen row were removed from the field within two days after every UAV flight. In the laboratory, fresh weights of plant samples were measured with an electronic balance, and leaf lengths and widths were measured using a 1 m ruler. In Fields C15 and W15, plants sampled in each field were used for building regression models relating their biophysical properties to the corresponding UAV-RGB images. In Fields C16 and W16, a total of 62 Chinese cabbages and 42 white radishes were used to validate the regression models developed using the data from Fields C15 and W15. Plant growth stages were determined according to the 10 principal growth stages and 10 secondary growth stages of the “Biologische Bundesanstalt, Bundessortenamt und CHemische Industrie” (BBCH) scale [52].
For geo-referencing the UAV images, similar to a previous study [53], a set of ground control points (GCPs) consisting of five 15 by 25 cm paper sheets was placed in the corners and center of the research plot in each of the four fields, i.e., Fields W15, W16, C15, and C16 (Figure 1). The GCP locations were measured with a Novatel OEM 615 virtual reference station (VRS)-based real-time kinematic global positioning system (RTK-GPS) providing sub-decimeter positioning accuracy, within 2 and 5 cm in the horizontal and vertical directions, respectively.

2.3. Image Processing

Figure 2 shows the flow chart of the image processing and analysis steps, including image acquisition, image preprocessing, calculation of vegetation fraction and plant height, and data analysis. Pix4Dmapper Pro 3.0.17 performed both image alignment and 3D reconstruction. To accurately geo-reference the UAV images, the GCPs measured with the RTK-GPS were imported into the Pix4D program, producing geo-referenced images with a real-world coordinate system [14], which corresponded to both geometric calibration of the image sensor and lens distortion correction [54]. As mentioned by An et al. [38], since it was important to evaluate the generation of the mosaicked images reconstructed from 3D meshes, the geo-referencing accuracy was assessed by root mean square errors (RMSEs) of the horizontal (X and Y) and vertical (Z) coordinates at the five GCP locations installed in each of the four fields (W15, W16, C15, and C16). After the geo-referencing process, conversion of the individual images into an image orthomosaic and generation of the digital surface model (DSM) and digital terrain model (DTM) were performed through the SfM processing built into Pix4Dmapper. Before image analysis for calculating vegetation fraction, radiometric calibration was conducted on the orthomosaic images. The ExG index was then selected as the vegetation index, and the Otsu method was used to extract crop images [12,55]. Once the ExG-based crop images were segmented, vegetation fractions (VFs) were calculated to represent the area of vegetation [15]. The DSM and DTM were generated by performing a bundle adjustment based on matching features between the images, from which plant heights (PHs) were calculated [24]. Finally, crop growth estimation models were built using the two predictor variables, VF and PH.
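As an aside on the accuracy assessment, the RMSE computation at the GCPs is straightforward; the minimal sketch below uses hypothetical coordinates, not the values of this study, to show how the per-axis RMSEs between surveyed GCP positions and their positions in the mosaic can be obtained.

```python
import numpy as np

# Hypothetical coordinates (m): rows are the five GCPs, columns are X, Y, Z.
surveyed = np.array([[0.00,  0.00, 0.00],
                     [30.0,  0.00, 0.10],
                     [30.0, 45.00, 0.05],
                     [0.00, 45.00, 0.08],
                     [15.0, 22.50, 0.04]])
# Positions of the same points recovered from the mosaic (assumed residuals).
mosaic = surveyed + np.array([[ 0.12, -0.08,  0.02],
                              [-0.10,  0.15,  0.03],
                              [ 0.09, -0.11, -0.02],
                              [-0.14,  0.07,  0.03],
                              [ 0.05, -0.06, -0.01]])

residuals = mosaic - surveyed
rmse_xyz = np.sqrt(np.mean(residuals ** 2, axis=0))  # per-axis RMSE
print(f"RMSE X: {rmse_xyz[0]:.3f} m, Y: {rmse_xyz[1]:.3f} m, Z: {rmse_xyz[2]:.3f} m")
```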

2.4. Radiometric Calibration and Region of Interest

To minimize the effects of ever-changing light and atmospheric conditions on UAV images taken at different times, radiometric calibration was conducted on every flight by placing 1.2 by 1.2 m Group 8 Technology Type 822 ground calibration panels for airborne sensors, with seven gray scales (3%, 5%, 11%, 22%, 33%, 44%, and 55%), at a location within the flight path of the UAV platform (Figure 3a). The mean reflectance values of the calibration targets for each of the R, G, and B bands of the RGB camera were determined using Equation (1). For this, as shown in Figure 3b, the standard reference reflectance spectrum of the calibration targets in the 400–800 nm range was measured with an ASD FieldSpec4 (Analytical Spectral Devices, Inc., Longmont, CO, USA). The spectral response of the RGB camera was obtained from the sensor specification provided by the manufacturer (Figure 3c).
$$\bar{r}_{x,k} = \frac{\int_{400}^{800} R_x(\lambda)\, C_k(\lambda)\, d\lambda}{\int_{400}^{800} C_k(\lambda)\, d\lambda} \quad (1)$$

where $\bar{r}_{x,k}$ represents the calculated mean reflectance value of the calibration targets, $R_x(\lambda)$ represents the standard reflectance spectrum of the targets measured with the field spectrometer, $C_k(\lambda)$ represents the spectral response of the image sensor, $x$ is the calibration target, and $k$ is one of the R, G, and B bands.
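For illustration, a minimal numerical version of Equation (1) might look as follows, with a hypothetical flat 22% panel spectrum and an assumed Gaussian band response standing in for the measured curves of Figure 3b,c.

```python
import numpy as np

# Wavelength grid over the measured 400-800 nm range (1 nm sampling assumed).
wavelengths = np.arange(400, 801, 1)

# Hypothetical inputs: a flat 22% gray panel spectrum and a Gaussian-shaped
# band response centered at 550 nm (the real response came from the camera
# manufacturer, Figure 3c).
R_x = np.full(wavelengths.shape, 0.22)
C_k = np.exp(-0.5 * ((wavelengths - 550.0) / 40.0) ** 2)

# Equation (1): response-weighted mean reflectance via the trapezoidal rule.
r_bar = np.trapz(R_x * C_k, wavelengths) / np.trapz(C_k, wavelengths)
print(f"Mean band reflectance of the target: {r_bar:.3f}")  # -> 0.220
```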
Assuming, based on the empirical line method [23,56], that the reflectance values of the calibration targets were exponentially related to the RGB band digital numbers (DNs), the coefficients of Equation (2) were derived by fitting the DNs of the images to the reflectance spectra of the calibration targets for each of the R, G, and B bands. As a result, to combine all the UAV images obtained on different dates, the DNs of each of the RGB bands measured on every UAV flight were converted into normalized reflectance values.
$$r_k = A_k e^{B_k \cdot DN} \quad (2)$$

where $r_k$ represents the reflectance values of the acquired images, $DN$ represents the digital numbers of the images, and $A_k$ and $B_k$ are the coefficients of the exponential relationship.
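A minimal sketch of this fitting step, assuming SciPy is available and using hypothetical panel readings for one band, could look as follows.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical DN/reflectance pairs standing in for the seven gray-scale
# panels (3%, 5%, 11%, 22%, 33%, 44%, 55%) in one band.
dn = np.array([35.0, 52.0, 90.0, 130.0, 158.0, 178.0, 192.0])
refl = np.array([0.03, 0.05, 0.11, 0.22, 0.33, 0.44, 0.55])

def empirical_line(dn, a, b):
    """Equation (2): r_k = A_k * exp(B_k * DN)."""
    return a * np.exp(b * dn)

(a_k, b_k), _ = curve_fit(empirical_line, dn, refl, p0=(0.01, 0.02))
print(f"A = {a_k:.4f}, B = {b_k:.4f}")

# The fitted coefficients then convert a whole image band to reflectance:
# reflectance_band = empirical_line(dn_band, a_k, b_k)
```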
To effectively perform a bivariate analysis between the aerial images and ground-truth data for Chinese cabbage and white radish crops having different biophysical characteristics, different regions of interest (ROIs) representing the area of each grid were used, i.e., 60 × 60 cm for Chinese cabbage and 80 × 150 cm for white radish (Figure 4). The dimensions of the ROIs were determined based on geometric characteristics of the two crops, such as maximum size and inter-row spacing. As shown in Figure 4, it was possible to extract images of individual Chinese cabbage plants because they stand independently at a constant spacing of 50 cm, providing an individual crop grid along the transplanting rows. The growth pattern of white radish plants, whose leaves intermingle at a planting spacing of 30 cm, did not allow extraction of individual crop grids; instead, a bulk extraction was performed in which the average image value of 10 plants across two crop rows represented the ROI.

2.5. Quantification of Vegetation Fraction and Plant Height

A vegetation index of ExG was used for quantifying the vegetation fractions of Chinese cabbage and white radish in each ROI because it was reported that the ExG values could effectively assess canopy variation in green crop biomass based on RGB ortho-images [15]. The ExG (Equation (3)) was calculated using the radiometrically calibrated RGB reflectance values, instead of the RGB digital numbers [12].
$$\mathrm{ExG} = 2g - r - b \quad (3)$$

where $r = \frac{R}{R+G+B}$, $g = \frac{G}{R+G+B}$, $b = \frac{B}{R+G+B}$, and $R$, $G$, and $B$ represent the reflectance values of the R, G, and B bands in the original images, respectively.
Crop segmentation, a process for extracting only crops from a background that includes a mixture of soil and other interfering objects in an image, is an important step performed prior to the calculation of vegetation fractions in each ROI. In this study, since plastic mulch was used to suppress weed growth along with herbicide application to soil prior to crop planting, the main interfering objects were soil and plastic mulch. As shown in Figure 5, in a histogram analysis of UAV-RGB images in terms of ExG, it was possible to effectively separate the images of the cabbage and radish plants from the background using the Otsu method, which automatically calculates the optimal threshold value by maximizing the inter-class variance (equivalently, minimizing the intra-class variance) [57]. The ExG-based RGB images segmented with the Otsu method were converted to binary images classified into two groups, i.e., 1 or 0. That is, if the ExG value was equal to or higher than the threshold, the pixel was recognized as vegetation (1); otherwise, it was considered non-vegetation (0). The underlying assumption was that dense green vegetation produces a high ExG value while soil produces a low value, creating a contrast between vegetation and soil. Finally, the vegetation fraction (VF) in each ROI was calculated as the ratio of the number of pixels segmented as crop to the total number of pixels, following Equation (4) [15]. The accuracy of the Otsu method-based crop segmentation applied in the study was assessed by comparison with manually identified actual crop images.
$$\mathrm{Vegetation\ Fraction} = \frac{\text{Number of pixels classified as 1 in an ROI}}{\text{Total number of pixels in the ROI}} \quad (4)$$

where ROI represents a region of interest.
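Putting Equations (3) and (4) together, a minimal sketch of the segmentation chain, assuming NumPy and scikit-image and using a synthetic ROI array for the demo, might look as follows.

```python
import numpy as np
from skimage.filters import threshold_otsu

def vegetation_fraction(roi: np.ndarray) -> float:
    """ROI is an H x W x 3 array of calibrated R, G, B reflectance values."""
    r, g, b = roi[..., 0], roi[..., 1], roi[..., 2]
    total = r + g + b + 1e-9                            # guard against divide-by-zero
    exg = 2 * (g / total) - (r / total) - (b / total)   # Equation (3)
    crop_mask = exg >= threshold_otsu(exg)              # Otsu split: crop vs. background
    return float(crop_mask.mean())                      # Equation (4)

# Demo on synthetic data: a "green" patch embedded in a "soil" background.
roi = np.full((60, 60, 3), 0.25)
roi[20:40, 20:40] = [0.10, 0.45, 0.08]                  # vegetation-like pixels
print(f"VF = {vegetation_fraction(roi):.3f}")           # -> about 0.111 (400/3600)
```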
As described in previous studies [58,59], plant height in this study was defined as the shortest distance between the upper boundary of the main photosynthetic tissues on a plant and ground level. The 3D points of the DTM and the DSM were created for calculating plant height using Pix4Dmapper Pro 3.0.17 (Figure 6). That is, the DTM was defined as a model of the underlying field topography without crop features, corresponding to the state with no crop grown on the ground, and the DSM was defined as a combined model of the underlying topography and field features such as crops, corresponding to the state with crops grown [60]. The DTM was acquired on the first UAV flight, within 7 days after sowing and before the crops had germinated, and the DSMs were acquired on each of the UAV flight dates shown in Table 2. Finally, as shown in Figure 6, the plant height, defined as a model of the field features only, was calculated by subtracting the DTM from the DSM (Equation (5)):
$$\mathrm{Plant\ Height} = \mathrm{DSM} - \mathrm{DTM} \quad (5)$$

where DSM represents the model of the underlying topography with crops, and DTM represents the underlying field topography without crops.
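A minimal sketch of Equation (5), assuming the DSM and DTM were exported from the SfM software as co-registered GeoTIFFs on the same grid (the file names are hypothetical; rasterio is used for reading), might look as follows.

```python
import rasterio

# Hypothetical file names; both rasters are assumed to share the same grid,
# as when exported from the same Pix4D project.
with rasterio.open("field_dsm.tif") as dsm_src, rasterio.open("field_dtm.tif") as dtm_src:
    dsm = dsm_src.read(1)       # terrain + crop canopy surface
    dtm = dtm_src.read(1)       # bare terrain from the pre-emergence flight
    plant_height = dsm - dtm    # Equation (5), per-pixel height in meters

print(f"Maximum plant height in the scene: {plant_height.max():.2f} m")
```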

2.6. Statistical Analysis

Multiple linear regression models were developed to quantify the growth status of Chinese cabbage and white radish from UAV-based imagery using VF and PH as predictor variables and biophysical data as response variables. Since highly significant interaction effects between VF and PH were found from a preliminary correlation analysis, an interaction term of VF × PH was added to the predictor variables as shown in the following equation (Equation (6)):
$$Y = A \cdot X_{VF} + B \cdot X_{PH} + C \cdot X_{VF} \cdot X_{PH} + D \quad (6)$$

where $Y$ represents a biophysical parameter, i.e., leaf length, leaf width, leaf count, or fresh weight; $X_{VF}$ represents the VF variable; $X_{PH}$ represents the PH variable; and $A$, $B$, $C$, and $D$ are the estimated coefficients of the predictor terms.
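Although the study fitted Equation (6) in SAS 9.4 (next paragraph), the same ordinary least squares estimates can be sketched with NumPy; the per-ROI values below are synthetic stand-ins, not the study's data.

```python
import numpy as np

# Hypothetical per-ROI data: vegetation fraction, plant height (m), and a
# measured response such as fresh weight (g).
rng = np.random.default_rng(1)
vf = rng.uniform(0.1, 0.9, 40)
ph = rng.uniform(0.10, 0.60, 40)
y = 900 * vf + 1200 * ph + 2500 * vf * ph + 50 + rng.normal(0, 30, 40)

# Design matrix [X_VF, X_PH, X_VF * X_PH, 1] and ordinary least squares.
X = np.column_stack([vf, ph, vf * ph, np.ones_like(vf)])
(A, B, C, D), *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"A = {A:.1f}, B = {B:.1f}, C = {C:.1f}, D = {D:.1f}")
```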
SAS 9.4 software (SAS Institute, Cary, NC, USA) was used to determine the four coefficient estimates of Equation (6) by fitting the VF and PH data acquired from the UAV imagery to the equation. Validation of the developed regression models was conducted by comparing UAV-measured biophysical values with actual 2016 data measured by standard methods. Finally, to investigate the ability of the UAV-RGB system to estimate spatial variations in the biophysical parameters of vegetables in a field, fresh weight maps of Chinese cabbage and white radish were generated using ArcGIS 10.1 (Esri, Redlands, CA, USA).

3. Results

3.1. Geo-Referencing, Radiometric Calibration and Crop Segmentation

Table 3 shows the RMSEs of the GCP coordinates in all four fields. Horizontal (X and Y) RMSEs ranged from 0.10 to 0.20 m, whereas vertical (Z) RMSEs ranged from 0.025 to 0.034 m, indicating that the geo-referencing in this study achieved decimeter-level horizontal and centimeter-level vertical positioning accuracy.
Figure 7 shows an example of the calibration curves relating the digital numbers (DNs) obtained with each of the three RGB bands to the corresponding reflectance values calculated using Equation (2), from a flight on 7 October 2016. Table 4 shows the coefficients derived by fitting the DNs of the images to the reflectance spectra of the calibration targets for each of the R, G, and B bands on all flight dates. The results indicate that the DNs could be successfully converted into reflectance spectra, showing strong exponential relationships with coefficients of determination (R2) ranging from 0.93 to 0.99, and that the reflectance data could be normalized to minimize the effect of varying sunlight conditions on the UAV-RGB images.
Figure 8 shows the visual steps of the crop segmentation using the Otsu method based on the ExG vegetation index to calculate the vegetation fractions of Chinese cabbage and white radish. Original RGB images (Figure 8a,e) extracted for each ROI were converted into ExG-based images (Figure 8b,f). Binary images (Figure 8c,g) were then automatically segmented using the Otsu threshold method, in which white pixels (value 1) are classified as crop and black pixels (value 0) as background. Figure 8d,h shows crop images in red, manually segmented as Chinese cabbage and white radish, respectively, using the ENVI 5.4 software, for comparison with the automatically segmented images (Figure 8c,g).
The Otsu method-based segmented images were compared with the manually cropped images in terms of the crop area calculated from the number of pixels. As shown in Table 5, which reports the results for 10 samples of each of the two crops randomly selected from the early to the late stage of growth, the segmentation errors for Chinese cabbage ranged from −8.72% to 6.01%, whereas those for white radish ranged from −14.9% to 17.1%. Relatively higher errors were found in the early growth stage of white radish because the boundary between white radish and the background was blurred, owing to the growth characteristics of white radish, whose leaves overlap (Figure 8g).

3.2. Validation of Plant Height Estimation Based on the SfM Algorithm

In this study, accuracy in 3D measurement was evaluated to validate plant height estimation based on our 3D extraction method. Figure 9 shows examples of the DSMs of Chinese cabbage (Figure 9a) and white radish (Figure 9b) and the DTM (Figure 9c) created using the SfM algorithm, implying that it was possible to obtain 3D images similar to the actual shapes. Similar to a previous validation of the SfM method based on subtraction of the DTM from the DSM [61], the maximum standing heights of five plants in each ROI were manually measured with a 1 m ruler and averaged for each ROI to serve as ground-truth data. As shown in Figure 10, which compares the plant heights of Chinese cabbage and white radish estimated using the SfM method with the ground-truth data, there were strong linear relationships between the two methods, with R2 > 0.9 and regression slopes near unity, implying that the SfM method would be effective in estimating the heights of Chinese cabbage and white radish in the ranges of 10 to 48 cm and 10 to 60 cm, respectively. However, the height estimates retained offsets between estimated and actual heights of −6.59 and −1.86 cm for Chinese cabbage and white radish, respectively.

3.3. Temporal Variability in Vegetation Fraction and Plant Height

Figure 11 shows changes in the ExG-based VF and SfM-estimated plant heights (PHs) of Chinese cabbage (Figure 11a,b) and white radish (Figure 11c,d), obtained during growing periods ranging from 18 to 58 days after transplanting (DAT) and from 19 to 59 days after sowing (DAS), respectively. As expected, both VFs and PHs were linearly proportional to DAT and DAS, owing to increases in canopy greenness over time. In particular, until 46 DAT for Chinese cabbage and 47 DAS for white radish, the change rates of VF and PH with respect to time were almost constant, and there were significant differences in VF and PH between the UAV images obtained on different dates. In addition, since the biophysical parameters of Chinese cabbage and white radish, i.e., fresh weight, leaf length, leaf width, and leaf count, were highly correlated with the VF and PH (Table 6), it seemed plausible that VF and PH could be used as predictor variables in linear modeling to quantify the growth status of the two crops. However, growth was observed to stop after approximately 46 DAT and 47 DAS for Chinese cabbage and white radish, respectively, showing no significant differences in VF and PH between the UAV images. Since this was related to saturation of the VF and PH, the UAV data measured at 58 DAT for Chinese cabbage and 59 DAS for white radish were not included in building the multiple linear regression models in this study.

3.4. Biophysical Parameter Modeling

Results of the SAS regression (REG) analysis to model the growth status of the two crops based on UAV-RGB images are shown in Table 7. The multiple regression equations for Chinese cabbage and white radish, using three predictor variables (VF, PH, and VF × PH) and four different response variables (fresh weight, leaf length, leaf width, and leaf count), provided good fits with coefficients of determination R2 > 0.8, except for relatively low estimations of leaf width and leaf count for white radish (R2 = 0.68 and 0.76, respectively). In particular, the developed models were expected to measure the fresh weights of Chinese cabbage and white radish with an acceptable level of performance and could thus be used to predict the potential yields of the two vegetables prior to harvest. In addition, since a correlation between root weight and above-ground weight exists [24], estimating above-ground fresh weight should also be feasible for predicting the potential root yield of white radish during the growing season.

3.5. Validation of Biophysical Parameter Estimation Models

Validation of the developed growth estimation models for Chinese cabbage and white radish was conducted using a dataset of UAV images with known biophysical data obtained from the second-year experiment, conducted from September to November 2016. A total of 62 and 42 ROIs of Chinese cabbage and white radish, respectively, were used to quantify their biophysical properties during the growing season by converting the RGB images into the two predictor variables, VF and PH, of the developed regression models. Figure 12 and Figure 13 compare the biophysical values of Chinese cabbage and white radish, respectively, determined by the developed UAV image-based prediction models with those obtained by standard methods, using simple linear regression analysis.
As shown in Figure 12a–c, the developed models performed well in measuring leaf length, leaf width, and fresh weight of Chinese cabbage, showing strong linear relationships with slopes of 0.89 to 1.12 and coefficients of determination >0.76, even though the estimates retained offsets of 3.55 to 4.54 cm for the leaf dimensions and 178.88 g for fresh weight. However, leaf counts of Chinese cabbage were highly underestimated (by 57%) by the UAV image-based estimation model. As shown in Figure 13, which quantifies the growth status of white radish, all estimates obtained with the developed regression models were lower than those measured with standard methods, showing slopes of 0.44 to 0.85. In particular, the UAV method measured 29% less fresh weight than the standard method using an electronic balance.

3.6. Application to Spatial Mapping of Potential Yield

To investigate the feasibility of using the UAV method for potential yield mapping of Chinese cabbage and white radish during the growing season, the UAV-RGB images (Figure 14a and Figure 15a) collected on 27 October 2016 were converted into fresh weight maps (Figure 14b and Figure 15b) estimated with the developed regression models. As shown in Figure 14a and Figure 15a, the UAV-RGB images revealed high variation in the vegetation fraction of both crops in the two test fields, implying that different fresh weights would be predicted depending on location.
Georeferenced data on individual Chinese cabbages and a portion of white radishes extracted from each ROI were collected by sequentially locating the center points of each 60 cm × 60 cm and 80 cm × 150 cm plot, respectively, along the transplanting and planting rows on the two orthomosaic UAV-RGB images (Figure 14a and Figure 15a) using the ArcGIS 10.1. As a result, almost 2700 and 160 center points corresponding to each of the ROIs with coordinate information were determined in order to calculate the VFs and PHs for use as predictor variables for determining the fresh weights of Chinese cabbage and white radish in each ROI, respectively. Finally, maps of each crop were generated in ArcGIS 10.1 to visually show spatial variability in fresh weight representative of each ROI, ranging from 0 to 12,000 g/m2 and 0 to 4500 g/m2 for Chinese cabbage and white radish, respectively. This reveals that the fresh weight maps generated using the UAV-RGB images in conjunction with use of the developed prediction models could be used for evaluating the potential yields of the two crops prior to harvesting.

4. Discussion

In analyzing multi-temporal UAV images, radiometric calibration is required to minimize the effects of ever-changing light and atmospheric conditions on images taken at different times. Yun et al. conducted radiometric calibration based on the empirical line method [56] using color-scale calibration targets [23]; their results showed linear relationships between DNs and reflectance spectra with R2 ranging from 0.85 to 0.99. In general, however, a linear relationship may not be suitable when saturation effects are observed at high DN values. In this study, the RGB band DNs were found to be exponentially related to the reflectance values obtained with gray-scale calibration targets, showing significant relationships with R2 ranging from 0.93 to 0.99 (Table 4). Therefore, the reflectance data could be normalized to minimize the effect of varying sunlight conditions on the UAV-RGB images.
Excess green (ExG) is an efficient vegetation index that can separate crops from a background containing a mixture of soil and other interfering objects using only the RGB bands. In a study by Torres-Sánchez et al., ExG was used for multi-temporal mapping of the vegetation fraction from UAV images [15]. In our study, ExG was applied to automatic crop segmentation with the Otsu threshold (Figure 8). The segmentation results obtained for 10 randomly selected samples of each crop showed errors ranging from −8.72% to 6.01% for Chinese cabbage and from −14.9% to 17.1% for white radish (Table 5). This indicates that the Otsu threshold method based on ExG would be satisfactory for segmenting crop images from a background consisting of soil and plastic mulch, with accuracies >80%. As mentioned earlier, relatively higher errors were found in the early growth stage of white radish because the boundary between white radish and the background was blurred, owing to the overlapping leaves of white radish (Figure 8g). A more robust image processing method is therefore required to improve segmentation performance for white radishes up to 30 days old with overlapping leaves.
Plant height can be obtained using CSMs [39,40]. In previous studies, CSMs have been applied to various crops such as sugar beet, rice, and summer barley [24,39,40,41]. At the same time, the emergence of SfM-based software has enabled efficient creation of 3D point clouds and highly detailed ortho-photos [45,46]. In this study, the SfM algorithm estimated the plant heights of Chinese cabbage and white radish with approximately 1:1 relationships and coefficients of determination (R2) >0.9 between the heights determined by the SfM method and those measured manually with a 1 m ruler (Figure 10). However, the height estimates retained offsets between estimated and actual heights of −6.59 and −1.86 cm for Chinese cabbage and white radish, respectively. As mentioned in previous studies by Bendig et al. [24,39] and Ruiz et al. [62], this might be related to an inherent error in the vertical locations measured with the RTK-GPS, which affects the accuracy of the GCPs used to correct the georeferenced UAV images. Another possible source of the height measurement error is the use of a ruler to measure the maximum heights of the crops. Nevertheless, the overall results showed an improvement in accuracy compared to similar studies [24,39,61,63].
Several studies have applied UAV images to modeling crop growth over the growing season. Bendig et al. applied crop surface models and various vegetation indices to estimate the biomass of barley [64]. Brocks and Bareth also estimated barley biomass using plant height from crop surface models, reporting R2 between 0.55 and 0.79 for dry biomass [65]. Despite these many studies, yield estimation of Chinese cabbage and white radish using a UAV with an RGB camera had not yet been studied. In this study, prior to building growth estimation models for Chinese cabbage and white radish based on UAV images, a time-series analysis of the VF and PH was conducted to characterize their changes with respect to time (Figure 11). As expected from some previous studies [15,23,61], both VF and PH were linearly proportional to DAT and DAS due to an increase in canopy greenness over time. However, saturation of both the VF and PH was observed beginning at 46 DAT for Chinese cabbage and 47 DAS for white radish, similar to the general growth patterns of crops [66]. Therefore, the saturated VF and PH data were not used to build the multiple linear regression models in this study. As shown in Table 6, on average, relatively high correlation coefficients existed between the biophysical parameters and the VF compared to the PH data. A possible cause is the growth habit of Chinese cabbage and white radish: their PH range is narrow (below 60 cm) compared with taller crops such as maize and sorghum (above 1 m), whereas they grow relatively fast, producing a larger change in VF over roughly two months.
The multiple regression equations for Chinese cabbage and white radish developed in this study, using three predictor variables (VF, PH, and VF × PH) and four different response variables (fresh weight, leaf length, leaf width, and leaf count), provided good fits (R2 > 0.8) except for relatively low estimations of leaf width and leaf count for white radish (R2 = 0.68 and 0.76, respectively). The validation testing showed that the strong linear relationships (R2 > 0.76) between the developed models and standard methods make it possible to use UAV-RGB images for predicting the biophysical properties of the two crops in a quantitative manner. However, on average, the prediction accuracies for white radish were worse than those for Chinese cabbage. The lower estimates of white radish growth status are likely related to the higher irregularity and narrower leaves of white radish, which reduce the performance of the image segmentation by making it difficult to extract white radish from the background (Table 5). An improvement in image segmentation would therefore enhance the ability of the UAV system to estimate the biophysical parameters of white radish.
In addition, the non-unity slopes and non-zero offsets observed in the validation testing remained an issue to address. For example, the leaf count of Chinese cabbage and most biophysical parameters of white radish were highly underestimated by the UAV image-based estimation models (Figure 12 and Figure 13). The causes of the lower leaf-count estimates are difficult to pinpoint, but they might be related to the limited spatial resolution of the RGB images obtained with the UAV flying at 20 m, which could not separate individual leaves with an acceptable level of accuracy. The slopes and offsets can be adjusted using a two-point normalization method, an algorithm that compensates for differences in slope and offset between model estimates and actual values prior to analysis using two known samples of different values [67]. When applying two-point normalization, it is necessary to select two known samples with as large a difference in growth status as possible, so that the normalization covers a wide range of data. The slope is compensated directly by comparing the actual value obtained by destructive sampling with the value predicted by the UAV-RGB system. As shown in Figure 16, the estimated leaf count can be adjusted in this way, improving the slope from 0.43 to 0.93 and the offset from 17.56 to 5.05 after two-point normalization. In addition, when accuracy was assessed using the RMSE [68], the RMSE of the leaf count decreased from 13.31 to 7.23 with the two-point normalization. Future studies will include applying the developed UAV-based models, in conjunction with the two-point normalization method, to commercial fields growing Chinese cabbage and white radish.
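A minimal sketch of this two-point normalization, using hypothetical reference values rather than those reported in this study, might look as follows.

```python
import numpy as np

# Two reference ROIs chosen to span the growth range as widely as possible:
# (UAV estimate, destructively sampled actual value). Numbers are hypothetical
# leaf counts, not the values from Figure 16.
est_low, act_low = 12.0, 20.0
est_high, act_high = 30.0, 62.0

slope = (act_high - act_low) / (est_high - est_low)  # sensitivity compensation
offset = act_low - slope * est_low                   # offset adjustment

uav_estimates = np.array([14.0, 21.0, 27.0])
corrected = slope * uav_estimates + offset
print(corrected)   # -> approximately [24.67, 41.0, 55.0]
```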

5. Conclusions

In this study, crop growth estimation models based on UAV-RGB imagery were developed and validated for quantifying various biophysical parameters of field-grown Chinese cabbage and white radish. This study differs from previous studies in that it (i) used the combination of an RGB-based vegetation index (VI) and SfM-estimated PH to build the growth estimation models and (ii) conducted a field experiment in a different year to investigate the applicability of the developed regression models to a separate dataset of UAV-RGB images. Analysis of the modeling and validation results indicates that the two physical parameters (VI and PH) obtained with the UAV-RGB camera can be used as viable predictor variables for quantifying the growth status of Korean Chinese cabbage and white radish, given the strong linear relationships between the UAV-RGB and standard methods for fresh weight, leaf length, leaf width, and leaf count. Additionally, since a UAV-RGB system makes it possible to obtain measurements at a finer spatial resolution than is feasible with sample collection and laboratory analysis, we believe this approach will be able to map crop growth status with greater accuracy than current methods. However, one drawback of this UAV-RGB system is the non-unity slopes and non-zero offsets found in the validation testing. To address this issue, a two-point normalization method, consisting of a sensitivity compensation followed by an offset adjustment using two known samples of different values, will need to be implemented in the UAV-RGB system. Future studies will include applying the developed UAV-based models, in conjunction with the two-point normalization method, to commercial fields growing Chinese cabbage and white radish to confirm their suitability for estimating in-season biophysical properties.

Acknowledgments

This research was supported by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture, Forestry, and Fisheries (Project number: 315011-03-3-SB010), Korea.

Author Contributions

H.S.Y. and H.-J.K. conceived and designed the experiments; D.-W.K. and H.S.Y. performed the experiments; D.-W.K. and S.-J.J. analyzed the data; Y.-S.K. and S.-G.K. contributed materials; W.S.L. and H.-J.K. provided feedback; D.-W.K. and H.-J.K. wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cloutis, E.; Connery, D.; Major, D.; Dover, F. Airborne multi-spectral monitoring of agricultural crop status: Effect of time of year, crop type and crop condition parameter. Remote Sens. 1996, 17, 2579–2601. [Google Scholar] [CrossRef]
  2. Hunt, E.R.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.; McCarty, G.W. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2010, 2, 290–305. [Google Scholar] [CrossRef]
  3. Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371. [Google Scholar] [CrossRef]
  4. Poenaru, V.; Badea, A.; Cimpeanu, S.M.; Irimescu, A. Multi-temporal multi-spectral and radar remote sensing for agricultural monitoring in the Braila Plain. Agric. Agric. Sci. Procedia 2015, 6, 506–516. [Google Scholar] [CrossRef]
  5. Borchard, N.; Schirrmann, M.; von Hebel, C.; Schmidt, M.; Baatz, R.; Firbank, L.; Vereecken, H.; Herbst, M. Spatio-temporal drivers of soil and ecosystem carbon fluxes at field scale in an upland grassland in Germany. Agric. Ecosyst. Environ. 2015, 211, 84–93. [Google Scholar] [CrossRef]
  6. Dammer, K.-H.; Thöle, H.; Volk, T.; Hau, B. Variable-rate fungicide spraying in real time by combining a plant cover sensor and a decision support system. Precis. Agric. 2009, 10, 431–442. [Google Scholar] [CrossRef]
  7. Thorp, K.; Wang, G.; West, A.; Moran, M.; Bronson, K.; White, J.; Mon, J. Estimating crop biophysical properties from remote sensing data by inverting linked radiative transfer and ecophysiological models. Remote Sens. Environ. 2012, 124, 224–233. [Google Scholar] [CrossRef]
  8. Chang, A.; Eo, Y.; Kim, S.; Kim, Y.; Kim, Y. Canopy-cover thematic-map generation for military map products using remote sensing data in inaccessible areas. Landsc. Ecol. Eng. 2011, 7, 263–274. [Google Scholar] [CrossRef]
  9. Hollinger, S.E. Field monitoring of crop photosynthesis and respiration. Better Crops Plant Food 1997, 81, 23–24. [Google Scholar]
  10. Migdall, S.; Bach, H.; Bobert, J.; Wehrhan, M.; Mauser, W. Inversion of a canopy reflectance model using hyperspectral imagery for monitoring wheat growth and estimating yield. Precis. Agric. 2009, 10, 508–524. [Google Scholar] [CrossRef]
  11. Araus, J.L.; Cairns, J.E. Field high-throughput phenotyping: The new crop breeding frontier. Trends Plant Sci. 2014, 19, 52–61. [Google Scholar] [CrossRef] [PubMed]
  12. Woebbecke, D.; Meyer, G.; Von Bargen, K.; Mortensen, D. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
  13. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115. [Google Scholar] [CrossRef]
  14. Holman, F.H.; Riche, A.B.; Michalski, A.; Castle, M.; Wooster, M.J.; Hawkesford, M.J. High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing. Remote Sens. 2016, 8, 1031. [Google Scholar] [CrossRef]
  15. Torres-Sánchez, J.; Peña, J.; De Castro, A.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar] [CrossRef]
  16. Salamí, E.; Barrado, C.; Pastor, E. UAV flight experiments applied to the remote sensing of vegetated areas. Remote Sens. 2014, 6, 11051–11081. [Google Scholar] [CrossRef]
  17. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  18. Berni, J.; Zarco-Tejada, P.; González-Dugo, V.; Fereres, E. Remote Sensing of Thermal Water Stress Indicators in Peach. In Proceedings of the 7th International Peach Symposium 962, Lleida, Spain, 8–11 June 2009; Girona, J., Marsal, J., Eds.; ISHS: Leuven, Belgium, 2009; pp. 325–331. [Google Scholar]
  19. Panda, S.S.; Ames, D.P.; Panigrahi, S. Application of vegetation indices for agricultural crop yield prediction using neural network techniques. Remote Sens. 2010, 2, 673–696. [Google Scholar] [CrossRef]
  20. Zarco-Tejada, P.J.; Suárez, L.; González-Dugo, V. Spatial resolution effects on chlorophyll fluorescence retrieval in a heterogeneous canopy using hyperspectral imagery and radiative transfer simulation. IEEE Geosci. Remote Sens. Lett. 2013, 10, 937–941. [Google Scholar] [CrossRef]
  21. Baluja, J.; Diago, M.P.; Balda, P.; Zorer, R.; Meggio, F.; Morales, F.; Tardaguila, J. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrig. Sci. 2012, 30, 511–522. [Google Scholar] [CrossRef]
  22. Torres-Sánchez, J.; López-Granados, F.; Peña, J.M. An automatic object-based method for optimal thresholding in UAV images: Application for vegetation detection in herbaceous crops. Comput. Electron. Agric. 2015, 114, 43–52. [Google Scholar] [CrossRef]
  23. Yun, H.S.; Park, S.H.; Kim, H.-J.; Lee, W.D.; Do Lee, K.; Hong, S.Y.; Jung, G.H. Use of unmanned aerial vehicle for multi-temporal monitoring of soybean vegetation fraction. J. Biosyst. Eng. 2016, 41, 126–137. [Google Scholar] [CrossRef]
  24. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef]
  25. Anthony, D.; Elbaum, S.; Lorenz, A.; Detweiler, C. On crop height estimation with UAVs. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 4805–4812. [Google Scholar]
  26. Geipel, J.; Link, J.; Claupein, W. Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system. Remote Sens. 2014, 6, 10335–10355. [Google Scholar] [CrossRef]
  27. Ewert, F.; van Ittersum, M.K.; Heckelei, T.; Therond, O.; Bezlepkina, I.; Andersen, E. Scale changes and model linking methods for integrated assessment of agri-environmental systems. Agric. Ecosyst. Environ. 2011, 142, 6–17. [Google Scholar] [CrossRef]
  28. Mirschel, W.; Schultz, A.; Wenkel, K.; Wieland, R.; Poluektov, R. Crop growth modelling on different spatial scales—A wide spectrum of approaches. Arch. Agron. Soil Sci. 2004, 50, 329–343. [Google Scholar] [CrossRef]
  29. Newe, M.; Meier, H.; Johnen, A.; Volk, T. Proplant expert.com—An online consultation system on crop protection in cereals, rape, potatoes and sugarbeet. EPPO Bull. 2003, 33, 443–449. [Google Scholar] [CrossRef]
  30. Blackmore, S. The interpretation of trends from multiple yield maps. Comput. Electron. Agric. 2000, 26, 37–51. [Google Scholar] [CrossRef]
  31. Mourtzinis, S.; Arriaga, F.J.; Balkcom, K.S.; Ortiz, B.V. Corn grain and stover yield prediction at R1 growth stage. Agron. J. 2013, 105, 1045–1050. [Google Scholar] [CrossRef]
  32. Batchelor, W.D.; Basso, B.; Paz, J.O. Examples of strategies to analyze spatial and temporal yield variability using crop models. Eur. J. Agron. 2002, 18, 141–158. [Google Scholar] [CrossRef]
  33. Rodrigues, M.S.; Corá, J.E.; Castrignanò, A.; Mueller, T.G.; Rienzi, E. A spatial and temporal prediction model of corn grain yield as a function of soil attributes. Agron. J. 2013, 105, 1878–1887. [Google Scholar] [CrossRef]
  34. Thorp, K.R.; DeJonge, K.C.; Kaleita, A.L.; Batchelor, W.D.; Paz, J.O. Methodology for the use of dssat models for precision agriculture decision support. Comput. Electron. Agric. 2008, 64, 276–285. [Google Scholar] [CrossRef]
  35. Allen, R.G.; Pereira, L.S.; Raes, D.; Smith, M. Crop evapotranspiration-guidelines for computing crop water requirements-FAO irrigation and drainage paper 56. FAO Rome 1998, 300, D05109. [Google Scholar]
  36. Lazcano, C.; Domínguez, J. The use of vermicompost in sustainable agriculture: Impact on plant growth and soil fertility. Soil Nutr. 2011, 10, 1–23. [Google Scholar]
  37. Li, D.; Xu, L.; Tang, X.-S.; Sun, S.; Cai, X.; Zhang, P. 3D imaging of greenhouse plants with an inexpensive binocular stereo vision system. Remote Sens. 2017, 9, 508. [Google Scholar] [CrossRef]
  38. An, N.; Welch, S.M.; Markelz, R.C.; Baker, R.L.; Palmer, C.M.; Ta, J.; Maloof, J.N.; Weinig, C. Quantifying time-series of leaf morphology using 2D and 3D photogrammetry methods for high-throughput plant phenotyping. Comput. Electron. Agric. 2017, 135, 222–232. [Google Scholar] [CrossRef]
  39. Bendig, J.; Bolten, A.; Bareth, G. Uav-based imaging for multi-temporal, very high resolution crop surface models to monitor crop growth variability. J. Photogramm., Remote Sens. Geoinf. Process. 2013, 2013, 551–562. [Google Scholar]
  40. Hoffmeister, D.; Bolten, A.; Curdt, C.; Waldhoff, G.; Bareth, G. High-resolution crop surface models (CSM) and crop volume models (CVM) on field level by terrestrial laser scanning. In Proceedings of the 6th International Symposium on Digital Earth: Models, Algorithms, and Virtual Reality, Beijing, China, 9–12 September 2009; Proc. SPIE: Bellingham, WA, USA, 2010; p. 78400E. [Google Scholar]
  41. Tilly, N.; Hoffmeister, D.; Cao, Q.; Huang, S.; Lenz-Wiedemann, V.; Miao, Y.; Bareth, G. Multitemporal crop surface models: Accurate plant height measurement and biomass estimation with terrestrial laser scanning in paddy rice. J. Appl. Remote Sens. 2014, 8, 083671. [Google Scholar] [CrossRef]
  42. Hyyppä, J.; Yu, X.; Hyyppä, H.; Vastaranta, M.; Holopainen, M.; Kukko, A.; Kaartinen, H.; Jaakkola, A.; Vaaja, M.; Koskinen, J. Advances in forest inventory using airborne laser scanning. Remote Sens. 2012, 4, 1190–1207. [Google Scholar] [CrossRef]
  43. Kane, V.R.; McGaughey, R.J.; Bakker, J.D.; Gersonde, R.F.; Lutz, J.A.; Franklin, J.F. Comparisons between field-and lidar-based measures of stand structural complexity. Can. J. For. Res. 2010, 40, 761–773. [Google Scholar] [CrossRef]
  44. Wulder, M.A.; Coops, N.C.; Hudak, A.T.; Morsdorf, F.; Nelson, R.; Newnham, G.; Vastaranta, M. Status and prospects for lidar remote sensing of forested ecosystems. Can. J. Remote Sens. 2013, 39, S1–S5. [Google Scholar] [CrossRef]
  45. Dandois, J.P.; Ellis, E.C. Remote sensing of vegetation structure using computer vision. Remote Sens. 2010, 2, 1157–1176. [Google Scholar] [CrossRef]
  46. Verhoeven, G. Taking computer vision aloft–archaeological three-dimensional reconstructions from aerial photographs with photoscan. Archaeol. Prospect. 2011, 18, 67–73. [Google Scholar] [CrossRef]
  47. Westoby, M.; Brasington, J.; Glasser, N.; Hambrey, M.; Reynolds, J. ‘Structure-from-motion’photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology 2012, 179, 300–314. [Google Scholar] [CrossRef] [Green Version]
  48. James, M.R.; Robson, S. Mitigating systematic error in topographic models derived from uav and ground-based image networks. Earth Surf. Process. Landf. 2014, 39, 1413–1420. [Google Scholar] [CrossRef]
  49. Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat. 2014, 6, 1–15. [Google Scholar] [CrossRef]
  50. Qiu, N.; Liu, Q.; Li, J.; Zhang, Y.; Wang, F.; Gao, J. Physiological and transcriptomic responses of Chinese Cabbage (Brassica rapa L. ssp. Pekinensis) to salt stress. Int. J. Mol. Sci. 2017, 18, 1953. [Google Scholar] [CrossRef] [PubMed]
  51. Zhang, G.; Wang, F.; Li, J.; Ding, Q.; Zhang, Y.; Li, H.; Zhang, J.; Gao, J. Genome-wide identification and analysis of the vq motif-containing protein family in chinese cabbage (Brassica rapa L. ssp. Pekinensis). Int. J. Mol. Sci. 2015, 16, 28683–28704. [Google Scholar] [CrossRef] [PubMed]
  52. Lancashire, P.D.; Bleiholder, H.; Boom, T.; Langelüddeke, P.; Stauss, R.; Weber, E.; Witzenberger, A. A uniform decimal code for growth stages of crops and weeds. Ann. Appl. Biol. 1991, 119, 561–601. [Google Scholar] [CrossRef]
  53. Turner, D.; Lucieer, A.; Watson, C. An automated technique for generating georectified mosaics from ultra-high resolution unmanned aerial vehicle (UAV) imagery, based on structure from motion (SFM) point clouds. Remote Sens. 2012, 4, 1392–1410. [Google Scholar] [CrossRef]
  54. An, N.; Palmer, C.M.; Baker, R.L.; Markelz, R.C.; Ta, J.; Covington, M.F.; Maloof, J.N.; Welch, S.M.; Weinig, C. Plant high-throughput phenotyping using photogrammetry and imaging techniques to measure leaf length and rosette area. Comput. Electron. Agric. 2016, 127, 376–394. [Google Scholar] [CrossRef]
  55. Meyer, G.; Mehta, T.; Kocher, M.; Mortensen, D.; Samal, A. Textural imaging and discriminant analysis for distinguishingweeds for spot spraying. Trans. ASAE 1998, 41, 1189–1197. [Google Scholar] [CrossRef]
  56. Smith, G.M.; Milton, E.J. The use of the empirical line method to calibrate remotely sensed data to reflectance. Int. J. Remote Sens. 1999, 20, 2653–2662. [Google Scholar] [CrossRef]
  57. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. Syst. 1979, 9, 62–66. [Google Scholar] [CrossRef]
  58. Cornelissen, J.H.C.; Lavorel, S.; Garnier, E.; Díaz, S.; Buchmann, N.; Gurvich, D.E.; Reich, P.B.; Steege, H.T.; Morgan, H.D.; van der Heijden, M.G.A.; et al. A handbook of protocols for standardised and easy measurement of plant functional traits worldwide. Aust. J. Bot. 2003, 51, 335–380. [Google Scholar] [CrossRef]
  59. Perez-Harguindeguy, N.; Diaz, S.; Garnier, E.; Lavorel, S.; Poorter, H.; Jaureguiberry, P.; Bret-Harte, M.; Cornwell, W.K.; Craine, J.M.; Gurvich, D.E. New handbook for standardised measurement of plant functional traits worldwide. Aust. J. Bot. 2013, 61, 167–234. [Google Scholar] [CrossRef]
  60. Granshaw, S.I. Photogrammetric terminology. Photogramm. Rec. 2016, 31, 210–252. [Google Scholar] [CrossRef]
  61. Chang, A.; Jung, J.; Maeda, M.M.; Landivar, J. Crop height monitoring with digital imagery from unmanned aerial system (UAS). Comput. Electron. Agric. 2017, 141, 232–237. [Google Scholar] [CrossRef]
  62. Ruiz, J.; Diaz-Mas, L.; Perez, F.; Viguria, A. Evaluating the accuracy of dem generation algorithms from uav imagery. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci 2013, 40, 333–337. [Google Scholar] [CrossRef]
  63. Bareth, G.; Bendig, J.; Tilly, N.; Hoffmeister, D.; Aasen, H.; Bolten, A. A comparison of UAV-and TLS-derived plant height for crop monitoring: Using polygon grids for the analysis of crop surface models (CSMS). J. Photogramm. Remote Sens. Geoinf. Process. 2016, 2016, 85–94. [Google Scholar] [CrossRef]
  64. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  65. Brocks, S.; Bareth, G. Estimating barley biomass with crop surface models from oblique rgb imagery. Remote Sens. 2018, 10, 268. [Google Scholar] [CrossRef]
  66. Dimokas, G.; Tchamitchian, M.; Kittas, C. Calibration and validation of a biological model to simulate the development and production of tomatoes in mediterranean greenhouses during winter period. Biosyst. Eng. 2009, 103, 217–227. [Google Scholar] [CrossRef]
  67. Kim, H.-J.; Hummel, J.W.; Sudduth, K.A.; Motavalli, P.P. Simultaneous analysis of soil macronutrients using ion-selective electrodes. Soil Sci. Soc. Am. J. 2007, 71, 1867–1877. [Google Scholar] [CrossRef]
  68. Kim, H.; Sudduth, K.; Hummel, J.W.; Drummond, S. Validation testing of a soil macronutrient sensing system. Trans. ASABE 2013, 56, 23–31. [Google Scholar] [CrossRef]
Figure 1. Test site: 2 years of Chinese cabbage and white radish experiments.
Figure 2. Flow chart of image processing and crop growth modeling procedures for building regression models based on vegetation fraction and plant height.
Figure 3. (a) View of the calibration targets placed in the test field; (b) standard reflectance spectrum of the calibration targets measured with the field spectrometer; (c) spectral response of the red, green and blue (RGB) camera.
Figure 4. Views of regions of interest (ROIs) selected for extracting the plants of (a) Chinese cabbage and (b) white radish.
Figure 5. Excess green (ExG) histogram for vegetation classification with the Otsu threshold value.
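For readers implementing the segmentation shown in Figure 5 (and Figure 8), the following minimal Python sketch reproduces the ExG-plus-Otsu workflow. It assumes the common chromatic-coordinate definition of ExG (ExG = 2g − r − b with r, g, b normalized by the channel sum); the function name and array handling are illustrative, not the authors' code.

```python
import numpy as np
from skimage.filters import threshold_otsu

def vegetation_fraction(rgb):
    """Estimate the vegetation fraction (VF) of an RGB region of interest.

    A minimal sketch of the ExG + Otsu workflow, assuming the common
    chromatic-coordinate form of excess green: ExG = 2g - r - b,
    where r = R/(R+G+B), g = G/(R+G+B), b = B/(R+G+B).
    """
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2) + 1e-9            # avoid division by zero on dark pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b                     # excess green index image
    t = threshold_otsu(exg)                   # Otsu threshold on the ExG histogram
    crop_mask = exg > t                       # pixels classified as crop
    return crop_mask.mean()                   # VF = crop pixels / total pixels
```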
Figure 6. Schematic of plant height calculation by subtracting the digital terrain model (DTM) from the digital surface model (DSM).
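A minimal sketch of the height calculation in Figure 6, assuming two co-registered GeoTIFF rasters on the same grid; the file names are placeholders, and the per-ROI summary statistic (here the maximum) is one possible choice, not necessarily the one used in the paper.

```python
import numpy as np
import rasterio

# "dsm.tif" and "dtm.tif" are placeholder file names; the two rasters are
# assumed to be co-registered outputs of the SfM processing.
with rasterio.open("dsm.tif") as src:
    dsm = src.read(1).astype(np.float64)
with rasterio.open("dtm.tif") as src:
    dtm = src.read(1).astype(np.float64)

height = dsm - dtm                         # per-pixel canopy height (m)
plant_height = float(np.nanmax(height))    # illustrative per-ROI summary statistic
```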
Figure 7. Sample calibration curves of the red, green and blue (RGB) band sensors mounted on the unmanned aerial vehicle (UAV) (data were obtained from a flight performed on 7 October 2016).
Figure 8. Crop segmentation steps: original red, green and blue (RGB) images of (a) Chinese cabbage and (e) white radish; excess green (ExG)-converted images of (b) Chinese cabbage and (f) white radish; segmented binary images obtained using the Otsu threshold based on ExG of (c) Chinese cabbage and (g) white radish; manually segmented images of (d) Chinese cabbage and (h) white radish.
Figure 9. 3D views of digital surface models (DSMs) of (a) Chinese cabbage and (b) white radish, and (c) a digital terrain model (DTM) created using structure from motion (SfM) processing.
Figure 10. Relationships between the heights of Chinese cabbage and white radish determined by the structure from motion (SfM) method and by use of a 1 m ruler.
Figure 11. Temporal changes in growth status of (a,b) Chinese cabbage and (c,d) white radish in terms of (a,c) excess green (ExG)-based vegetation fraction and (b,d) structure from motion (SfM)-estimated plant height. For ease of visualization, data are presented in boxplot format. Mean vegetation fractions and plant heights followed by the same letter within days after transplanting (DAT) or days after seeding (DAS) are not significantly different at the 5% level, based on the F-test.
Figure 12. Relationships of biophysical properties ((a) leaf length; (b) leaf width; (c) leaf count; and (d) fresh weight) of Chinese cabbage estimated by the derived unmanned aerial vehicle (UAV)-based regression models and determined by standard methods.
Figure 13. Relationships of biophysical properties ((a) leaf length; (b) leaf width; (c) leaf count; and (d) above-ground fresh weight) of white radish estimated by the derived unmanned aerial vehicle (UAV)-based regression models and determined by standard methods.
Figure 14. (a) Unmanned aerial vehicle-red, green and blue (UAV-RGB) orthomosaic image of Chinese cabbage collected on 27 October 2016 and (b) fresh weight map of Chinese cabbage generated using the developed regression models applied to the UAV-RGB image. The capital letters A, B, and C represent the subplots showing the growth of Chinese cabbage transplanted on different dates.
Figure 15. (a) Unmanned aerial vehicle-red, green and blue (UAV-RGB) orthomosaic image of white radish collected on 27 October 2016 and (b) above-ground fresh weight map of white radish generated using the developed regression models applied to the UAV-RGB image. The capital letters A, B, and C represent the subplots showing the growth of white radish planted on three different dates.
Figure 16. Examples of the improvement in the relationship between actual and estimated leaf count of Chinese cabbage (a) before and (b) after two-point normalization.
Table 1. Specifications of the unmanned aerial vehicle (UAV) platform.

| Item | Specification |
|---|---|
| UAV frame | DJI F550 hexa-rotor |
| Flight controller | Pixhawk |
| Propeller | DJI 9450 |
| Battery | 4S Li-Po, 6000 mAh, 30 C |
| Motor | KV: 920 rpm V−1 |
| Electronic speed controller | 30 A OPTO, signal frequency: 30–450 Hz |
| Maximum takeoff weight | 2400 g |
| Maximum flight time | 15 min |
Table 2. UAV imaging details and Biologische Bundesanstalt, Bundessortenamt und CHemische Industrie (BBCH) codes of Chinese cabbage and white radish used in the 2015 and 2016 field experiments.

| Date (dd/mm/yyyy) | BBCH Code (Chinese Cabbage) | BBCH Code (White Radish) | Flight Altitude (m) | Number of Images | Ground Resolution (cm pixel−1) | Flight Time | Illumination | Wind (m s−1) |
|---|---|---|---|---|---|---|---|---|
| 25/09/2015 | 19 | 19 | 20 | 213 | 0.64 | 10–11 a.m. | Cloudy | 2.2 |
| 03/10/2015 | 41 | 42 | 20 | 222 | 0.64 | 11–12 a.m. | Cloudy | 1.4 |
| 09/10/2015 | 42 | 43 | 20 | 211 | 0.64 | 12–1 p.m. | Cloudy | 1.8 |
| 17/10/2015 | 45 | 45 | 20 | 214 | 0.64 | 11–12 a.m. | Clear sky | 1.9 |
| 23/10/2015 | 46 | 46 | 20 | 227 | 0.64 | 10–11 a.m. | Cloudy | 1.7 |
| 04/11/2015 | 48 | 48 | 20 | 218 | 0.64 | 11–12 a.m. | Cloudy | 1.3 |
| 23/09/2016 | 19 | 19 | 20 | 244 | 0.64 | 1–2 p.m. | Cloudy | 1.8 |
| 07/10/2016 | 41 | 42 | 20 | 248 | 0.64 | 12–1 p.m. | Clear sky | 4.2 |
| 14/10/2016 | 42 | 44 | 20 | 246 | 0.64 | 11–12 a.m. | Cloudy | 1.2 |
| 21/10/2016 | 44 | 45 | 20 | 231 | 0.64 | 11–12 a.m. | Cloudy | 1.1 |
| 27/10/2016 | 46 | 47 | 20 | 252 | 0.64 | 12–1 p.m. | Cloudy | 1.4 |
| 09/11/2016 | 48 | 48 | 20 | 251 | 0.64 | 11–12 a.m. | Cloudy | 3.2 |
Table 3. Root mean square error (RMSE) at ground control point (GCP) locations for GCP-based geo-referenced imagery for all fields (W15, W16, C15, and C16 in Figure 1).

| Field | RMSE in X Coordinates (m) | RMSE in Y Coordinates (m) | RMSE in Z Coordinates (m) |
|---|---|---|---|
| W15 | 0.019 | 0.017 | 0.032 |
| C15 | 0.020 | 0.019 | 0.034 |
| W16 | 0.014 | 0.010 | 0.025 |
| C16 | 0.013 | 0.013 | 0.027 |
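The RMSE values in Table 3 follow the standard definition of root mean square error between geo-referenced and surveyed GCP coordinates. A short sketch, with hypothetical coordinate values for illustration:

```python
import numpy as np

def rmse(estimated, surveyed):
    """Root mean square error between geo-referenced and surveyed coordinates."""
    diff = np.asarray(estimated) - np.asarray(surveyed)
    return float(np.sqrt(np.mean(diff ** 2)))

# Hypothetical example: X coordinates of three GCPs in one field.
x_img = np.array([310012.531, 310045.118, 310078.004])   # from the orthomosaic
x_gcp = np.array([310012.550, 310045.100, 310078.020])   # surveyed reference
print(rmse(x_img, x_gcp))   # per-axis RMSE, as reported in Table 3
```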
Table 4. Coefficients of the calibration curves of the red, green and blue (RGB) sensors obtained on all flight dates: Ak and Bk are the coefficients of the exponential relationship described in Equation (2).

| Dates | Band | Ak | Bk | R² | Dates | Band | Ak | Bk | R² |
|---|---|---|---|---|---|---|---|---|---|
| 25/09/2015 | R | 0.021 | 0.018 | 0.95 | 23/09/2016 | R | 0.020 | 0.019 | 0.96 |
| | G | 0.022 | 0.017 | 0.94 | | G | 0.019 | 0.018 | 0.97 |
| | B | 0.023 | 0.016 | 0.96 | | B | 0.021 | 0.017 | 0.98 |
| 03/10/2015 | R | 0.020 | 0.018 | 0.95 | 07/10/2016 | R | 0.017 | 0.018 | 0.96 |
| | G | 0.020 | 0.016 | 0.94 | | G | 0.016 | 0.018 | 0.97 |
| | B | 0.022 | 0.017 | 0.95 | | B | 0.018 | 0.017 | 0.98 |
| 09/10/2015 | R | 0.020 | 0.017 | 0.95 | 14/10/2016 | R | 0.016 | 0.017 | 0.98 |
| | G | 0.020 | 0.016 | 0.93 | | G | 0.014 | 0.017 | 0.99 |
| | B | 0.021 | 0.015 | 0.94 | | B | 0.012 | 0.017 | 0.99 |
| 17/10/2015 | R | 0.022 | 0.018 | 0.94 | 21/10/2016 | R | 0.020 | 0.020 | 0.97 |
| | G | 0.021 | 0.017 | 0.95 | | G | 0.019 | 0.018 | 0.98 |
| | B | 0.021 | 0.016 | 0.96 | | B | 0.021 | 0.017 | 0.98 |
| 23/10/2015 | R | 0.020 | 0.018 | 0.93 | 27/10/2016 | R | 0.016 | 0.022 | 0.97 |
| | G | 0.021 | 0.017 | 0.94 | | G | 0.016 | 0.020 | 0.97 |
| | B | 0.021 | 0.016 | 0.94 | | B | 0.019 | 0.018 | 0.97 |
| 04/11/2015 | R | 0.021 | 0.017 | 0.94 | 09/11/2016 | R | 0.018 | 0.018 | 0.96 |
| | G | 0.021 | 0.016 | 0.93 | | G | 0.023 | 0.016 | 0.95 |
| | B | 0.020 | 0.015 | 0.94 | | B | 0.017 | 0.016 | 0.98 |
Table 5. Crop segmentation performance obtained using the Otsu threshold and the excess green (ExG) vegetation index.

| Crop | Sample | Dates | Number of Pixels Manually Segmented | Number of Pixels Automatically Estimated | Error (%) |
|---|---|---|---|---|---|
| Chinese cabbage | Sample 1 | 06/10/2016 | 309 | 320 | 3.56 |
| | Sample 2 | 06/10/2016 | 1,371 | 1,359 | −0.88 |
| | Sample 3 | 14/10/2016 | 1,489 | 1,526 | 2.48 |
| | Sample 4 | 06/10/2016 | 2,471 | 2,364 | −4.33 |
| | Sample 5 | 20/10/2016 | 3,034 | 2,875 | −5.24 |
| | Sample 6 | 14/10/2016 | 3,656 | 3,876 | 6.01 |
| | Sample 7 | 14/10/2016 | 5,026 | 4,944 | −1.63 |
| | Sample 8 | 20/10/2016 | 6,722 | 6,332 | 5.8 |
| | Sample 9 | 27/10/2016 | 8,490 | 7,750 | −8.72 |
| | Sample 10 | 27/10/2016 | 11,925 | 11,601 | −2.71 |
| White radish | Sample 1 | 06/10/2016 | 1,885 | 2,207 | 17.08 |
| | Sample 2 | 06/10/2016 | 4,554 | 5,321 | 16.84 |
| | Sample 3 | 14/10/2016 | 12,849 | 15,031 | 16.98 |
| | Sample 4 | 06/10/2016 | 13,563 | 11,534 | −14.96 |
| | Sample 5 | 27/10/2016 | 21,897 | 25,063 | 14.46 |
| | Sample 6 | 14/10/2016 | 30,477 | 26,402 | −13.37 |
| | Sample 7 | 27/10/2016 | 33,799 | 38,089 | 12.69 |
| | Sample 8 | 20/10/2016 | 40,805 | 43,842 | 7.44 |
| | Sample 9 | 27/10/2016 | 48,152 | 51,215 | 6.36 |
| | Sample 10 | 27/10/2016 | 57,758 | 60,026 | 3.93 |
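The Error (%) column in Table 5 is the relative difference between the automatically estimated and manually segmented pixel counts, which can be verified directly against the tabulated values:

```python
def segmentation_error(manual_px, estimated_px):
    """Relative error (%) of the automatic pixel count against the manual count."""
    return (estimated_px - manual_px) / manual_px * 100.0

# Chinese cabbage, Sample 1 in Table 5: 309 manual vs. 320 automatic pixels.
print(round(segmentation_error(309, 320), 2))   # 3.56, matching the table
```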
Table 6. Correlation analysis among biophysical parameters, vegetation fraction (VF), and plant height (PH) for Chinese cabbage (top) and white radish (bottom).

| Chinese Cabbage | VF | PH | Leaf Length | Leaf Width | Leaf Count | Fresh Weight |
|---|---|---|---|---|---|---|
| VF | 1.00 | | | | | |
| PH | 0.87 | 1.00 | | | | |
| Leaf Length | 0.96 | 0.88 | 1.00 | | | |
| Leaf Width | 0.90 | 0.75 | 0.90 | 1.00 | | |
| Leaf Count | 0.89 | 0.82 | 0.89 | 0.80 | 1.00 | |
| Fresh Weight | 0.89 | 0.81 | 0.89 | 0.78 | 0.95 | 1.00 |

| White Radish | VF | PH | Leaf Length | Leaf Width | Leaf Count | Fresh Weight |
|---|---|---|---|---|---|---|
| VF | 1.00 | | | | | |
| PH | 0.76 | 1.00 | | | | |
| Leaf Length | 0.90 | 0.70 | 1.00 | | | |
| Leaf Width | 0.77 | 0.63 | 0.79 | 1.00 | | |
| Leaf Count | 0.88 | 0.69 | 0.88 | 0.65 | 1.00 | |
| Fresh Weight | 0.83 | 0.65 | 0.85 | 0.53 | 0.94 | 1.00 |
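A minimal sketch of the correlation analysis summarized in Table 6, assuming Pearson correlation computed over one row per sampled plant; the file name and column names are placeholders:

```python
import pandas as pd

# "samples.csv" is a placeholder for the per-plant dataset of VF, PH,
# and the four measured biophysical parameters.
df = pd.read_csv("samples.csv")
cols = ["VF", "PH", "LeafLength", "LeafWidth", "LeafCount", "FreshWeight"]
corr = df[cols].corr(method="pearson")
print(corr.round(2))   # the lower triangle corresponds to Table 6
```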
Table 7. Multiple linear regression equations for estimating the biophysical parameters of Chinese cabbage and white radish from their vegetation fractions (VFs) and plant heights (PHs) obtained from unmanned aerial vehicle-red, green and blue (UAV-RGB) images: Y = biophysical parameter; XVF = vegetation fraction value; XPH = plant height value; n = number of samples; R² = coefficient of determination; SE = standard error.

| Crop | Biophysical Parameter | Multiple Regression Model | n | R² | SE |
|---|---|---|---|---|---|
| Chinese cabbage | Leaf length (cm) | Y = 23.66 × XVF + 11.38 × XPH + 2.48 × XVF × XPH + 18.14 | 36 | 0.94 | 2.27 |
| | Leaf width (cm) | Y = 26.17 × XVF + 0.85 × XPH − 11.27 × XVF × XPH + 11.17 | 36 | 0.83 | 2.79 |
| | Leaf count | Y = 19.01 × XVF − 76.15 × XPH + 100.78 × XVF × XPH + 23.66 | 36 | 0.90 | 6.18 |
| | Fresh weight (g) | Y = 701.70 × XVF − 6280.40 × XPH + 7528.03 × XVF × XPH + 508.40 | 36 | 0.94 | 273.75 |
| White radish | Leaf length (cm) | Y = 11.48 × XVF − 8.48 × XPH + 21.95 × XVF × XPH + 26.21 | 36 | 0.82 | 2.56 |
| | Leaf width (cm) | Y = 13.14 × XVF + 8.64 × XPH − 17.29 × XVF × XPH + 7.45 | 36 | 0.66 | 1.10 |
| | Leaf count | Y = 15.80 × XVF − 5.50 × XPH + 18.24 × XVF × XPH + 11.02 | 36 | 0.78 | 3.32 |
| | Above-ground fresh weight (g) | Y = 117.5 × XVF − 355.6 × XPH + 869.2 × XVF × XPH + 77.04 | 36 | 0.85 | 53.77 |
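The fitted models in Table 7 can be applied directly to VF and PH values extracted from new UAV-RGB imagery. A sketch for the two fresh-weight models, with coefficients taken verbatim from the table; VF is dimensionless, and PH is assumed here to be in meters:

```python
def cabbage_fresh_weight(vf, ph):
    """Fresh weight (g) of Chinese cabbage; coefficients from Table 7."""
    return 701.70 * vf - 6280.40 * ph + 7528.03 * vf * ph + 508.40

def radish_fresh_weight(vf, ph):
    """Above-ground fresh weight (g) of white radish; coefficients from Table 7."""
    return 117.5 * vf - 355.6 * ph + 869.2 * vf * ph + 77.04

# Hypothetical inputs: 80% vegetation cover and a 0.25 m canopy height.
print(cabbage_fresh_weight(0.80, 0.25))   # about 1005 g
```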
