A UAV, or un-crewed aerial vehicle, commonly known as a drone, is an aircraft without a human pilot on board and a type of unmanned vehicle [11]. The civilian use of UAVs, especially for mapping, began around 2006 as reported by [12], and included thematic mapping for agriculture, forestry, archaeology and architecture, the environment, emergency management, and traffic monitoring [4], covering the aspects of projects, regulations, classifications, and UAV applications in the mapping domain [13]. On the application side, the use of UAVs for thematic mapping has continued to develop since 2017, with applications in ecology [14,15,16,17], crop productivity [18,19,20], LULC mapping [1,3,5,21], plant disease detection [22,23,24], and plant/tree detection and counting [25,26].
In the field of ecology, Cruzan et al. [15] used drone imagery to create a composite image (orthomosaic) and a 3D DSM. Vegetation classification was conducted both manually and with an automated routine, and a comparison between manual and automated habitat classification confirmed that the mapping methods were accurate; species with high contrast to the background matrix allowed an adequate estimate of their coverage. Villanueva et al. [17] investigated the contribution of LiDAR elevation data to DEM generation based on fixed-wing UAV imaging for flood estimates, including volume and area. In the previous year, Ruwaimana et al. [16] compared the utility of drone images and satellite images for mangrove mapping, and Casella et al. [14] presented a review of the state of the art in drones and photogrammetry for beach surveys and the measurement quality achieved.
Sola Guirado et al. [20] estimated the actual yield (AY) of the crop across years, linking it to the geometric parameters of the tree canopy using orthophotos acquired with a UAV. Actual yield was forecast using both manual canopy volume and individual tree crown as the main factors for olive productivity. Sola Guirado et al. [20] concluded that a thematic map describing spatial AY variability may be a powerful tool for farmers, insurance systems, and market forecasts to detect agronomic problems. Huang et al. [18] compared satellite imagery, piloted-aircraft remote sensing, and UAV data for capturing high-spatial-resolution imagery to generate an accurate weed cover map over a rice field. With a broadly similar goal, Nuijten et al. [19] used high-resolution optical imagery from a UAV to provide insight into the potential of drone data for determining crop productivity at a large scale.
LULC mapping with drones after 2016 was initiated by Kalantar et al. [1], who presented a novel method that integrates the fuzzy unordered rule induction algorithm (FURIA) into OBIA to achieve accurate land cover extraction from UAV images, with a study area located on the Universiti Putra Malaysia (UPM) campus in the state of Selangor. Lizuka et al. [3] used drone image data to collect ground survey and microscale information, implementing a structure from motion (SfM) technique to develop mosaicked orthorectified images of the sites; the orthophoto and DSM derived from the drone-based data had resolutions of 0.05 m and 0.1 m, respectively. In the same year, Natesan et al. [21] investigated the use of a UAV equipped with an RGB camera and a compact spectrometer for land cover classification. Yao et al. [5] conducted a comparison between LULC mapping and change detection using ultra-high-resolution UAV data.
Plant disease detection applications have been developed by [22,23,24]. Sandino et al. [23] proposed a methodology for effective detection and mapping of indicators of poor health in forest and plantation trees by integrating drone technology and artificial intelligence approaches; the approach is illustrated with an experimental case of myrtle rust (Austropuccinia psidii) on paperbark tea trees (Melaleuca quinquenervia) in New South Wales (NSW). Di Girolamo-Neto et al. [22] explored the potential of texture features derived from images acquired by an optical sensor on board a UAV to detect Bermudagrass in sugarcane, and, using the same type of vehicle, [24] evaluated the potential of three-band multispectral imagery from a multi-rotor UAV platform for detecting Ramularia leaf blight at different flight heights in an experimental field. That study found that increasing infection caused progressive degradation of spectral vegetation signals, but this was not enough to distinguish disease severity at a finer scale.
In the application of tree detection and counting, Koh et al. [25] and Grznárová et al. [26] proposed different methods for the same goal. Koh et al. [25] estimated plant density (or counts) in field experiments; the challenges posed to digital plant counting models are heterogeneous germination and the mixed growth stages present in field experiments with diverse genotypes. A method for safflower seedling counting at early, mixed growth stages using UAV data was developed and validated, and the model performed well across heterogeneous growth stages, with the potential to be used for plant density estimation across various crop species. Grznárová et al. [26] identified the influence of tree species on the accuracy of crown diameter estimation using data obtained from the drone.
In terms of the technology used in the previous studies, most researchers use data from multi-rotor drones combined with data from other remote sensing platforms such as satellite imagery and aerial photographs. Most researchers derive 3D and orthomosaic information from the drone data to obtain better analysis results, as in Cruzan et al. [15], Kalantar et al. [1], Lizuka et al. [3], Nuijten et al. [19], Yao et al. [5], and Casella et al. [14]. Cruzan et al. [15] created a vegetation map for the entire region from the orthomosaic and DSM and mapped species density using a set of overlapping images with the corresponding referencing information; the following year, [1] created a DEM and orthomosaics to achieve accurate land cover extraction from UAV images using the fuzzy unordered rule induction algorithm (FURIA). Lizuka et al. [3] produced an orthophoto and DSM from drone-based data at resolutions of 0.05 m and 0.1 m for a case study of post-mining sites in Indonesia to generate LULC maps with high accuracy; the drone-based LULC map built from the orthophoto and DSM showed an average accuracy of 92.3%, compared with an overall accuracy of 78.1% for the SAR-based LULC map. Improved accuracy from using the orthophoto and DSM in data processing was also shown by [1], with an overall accuracy of up to 91.23%, and by [19], who reconstructed a DSM and terrain model for determining crop productivity at a large scale and reported an increase in accuracy in detecting individual crops of up to 99.8% by processing the data with an image segmentation approach. Yao et al. [5] fused 3D information such as height, geometric, and oblique information for remote sensing analysis with two project goals, LULC mapping and change detection, noting that contextual information and deep learning methods are essential for accuracy improvement. Casella et al. [14] produced DEMs of a beach using a GNSS-RTK system; DEMs acquired at different times were used to calculate the before and after sediment budget following a storm that hit a sandy coast on Sylt Island at the German North Sea coast. In the end, the average RMSE of the digital elevation models was ~5 cm, with a survey efficiency of about 3 m² min⁻¹ of coverage area.
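To illustrate the typical SfM processing chain referred to above (overlapping drone images in, orthomosaic and DSM out), the minimal sketch below drives the open-source OpenDroneMap stack through its pyodm client. It is a generic example rather than the exact pipeline of any cited study, and it assumes a NodeODM instance running locally, an image folder named uav_images/, and illustrative option values.

```python
# Minimal sketch of an SfM workflow: overlapping UAV images -> orthomosaic + DSM.
# Assumes a NodeODM instance is running locally on port 3000 and that the folder
# 'uav_images/' holds overlapping, geotagged survey photos (illustrative names).
from glob import glob

from pyodm import Node

# Connect to the (assumed) local OpenDroneMap processing node.
node = Node("localhost", 3000)

# Collect the overlapping survey images captured by the drone.
images = sorted(glob("uav_images/*.JPG"))

# Request a DSM and an orthophoto at ~5 cm/pixel, comparable to the
# 0.05 m orthophoto / 0.1 m DSM resolutions reported in the reviewed studies.
task = node.create_task(
    images,
    {
        "dsm": True,                  # also export a digital surface model
        "orthophoto-resolution": 5,   # target resolution in cm/pixel (illustrative)
    },
)

# Block until photogrammetric processing (SfM, dense matching, mosaicking) finishes,
# then download the orthophoto and DSM GeoTIFFs for further analysis.
task.wait_for_completion()
task.download_assets("./odm_results")
```

The downloaded orthophoto and DSM can then serve as inputs to the classification workflows discussed below.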
In processing high-resolution data from drones, most researchers use artificial intelligence approaches in their implementations, such as [1], which implements the fuzzy FURIA, and [18], which proposes the fully convolutional network (FCN) method for weed mapping of the collected imagery. A maximum-likelihood algorithm was proposed by Ruwaimana et al. [16] and Sandino et al. [23] to process hyperspectral data. ALOS-2 Phased Array L-band Synthetic Aperture Radar-2 (PALSAR-2) L-band backscattering data were processed with a multilayer perceptron (MLP) supervised classification to generate a categorical map by Lizuka et al. [3]. Nuijten et al. [19] used multi-resolution image segmentation (MIRS) and template matching algorithms in an integrated workflow to detect individual crops and delineate C. endivia crop-covered areas, and [22] used random forest algorithms to perform automatic classification. Figure 2 below illustrates the principle of mapping with drones.
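The supervised classifiers cited above (random forest in [22], MLP in [3], maximum likelihood in [16,23]) share a common pattern: extract per-pixel features from the orthomosaic, fit a classifier on labelled samples, and predict a categorical map. The sketch below is a minimal, generic version of that pattern using scikit-learn and rasterio; the file names, band set, and training-label raster are assumptions for illustration and do not reproduce the exact workflow of any cited study.

```python
# Minimal sketch of a pixel-based random forest LULC classification of a UAV
# orthophoto. File names and the training-label raster are illustrative assumptions.
import numpy as np
import rasterio
from sklearn.ensemble import RandomForestClassifier

# Read the orthomosaic (bands, rows, cols) and a raster of labelled training
# pixels (0 = unlabelled, 1..N = LULC class codes digitised by the analyst).
with rasterio.open("odm_results/odm_orthophoto/odm_orthophoto.tif") as src:
    ortho = src.read()                     # shape: (bands, rows, cols)
    profile = src.profile
with rasterio.open("training_labels.tif") as lbl:
    labels = lbl.read(1)                   # shape: (rows, cols)

bands, rows, cols = ortho.shape
features = ortho.reshape(bands, -1).T      # shape: (pixels, bands)
flat_labels = labels.reshape(-1)

# Fit the classifier on the labelled pixels only.
train_mask = flat_labels > 0
clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
clf.fit(features[train_mask], flat_labels[train_mask])

# Predict a class for every pixel and write the categorical LULC map.
lulc = clf.predict(features).reshape(rows, cols).astype(np.uint8)
profile.update(count=1, dtype="uint8")
with rasterio.open("lulc_map.tif", "w", **profile) as dst:
    dst.write(lulc, 1)
```

The same skeleton accommodates the other classifiers mentioned above by swapping the estimator, for example an MLP or a maximum-likelihood (Gaussian) classifier, while the feature extraction and map-writing steps stay unchanged.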