AskIFAS Powered by EDIS

Types of Unmanned Aerial Vehicles (UAVs), Sensing Technologies, and Software for Agricultural Applications

Sri Charan Kakarla and Yiannis Ampatzidis


The purpose of this document is to provide guidance and information about various sensors and technologies that can be integrated with unmanned aerial vehicles (UAVs) for agricultural applications. Various sensing technologies are available commercially (starting at about $200) that can help growers collect data across large fields rapidly and at low cost. For example, optical sensors, such as color (RGB), multispectral, hyperspectral, and thermal cameras, can be mounted on UAVs. UAVs equipped with these sensors can be used with the appropriate software (e.g., Pix4D, DroneDeploy, Agroview, Aerobotics) to develop plant inventories, estimate plant height and canopy volume, determine plant leaf density, plant health (or stress), and plant nutrient concentrations, and create plant fertility maps. UAV-collected data can also be used to compute various vegetation indices (VIs) and produce VI maps. Several post-processing software applications can be utilized to process, analyze, and visualize data collected from UAVs, converting the raw data into practical, useful information. This document provides detailed information about the most commonly used UAVs, sensing technologies, and software for agricultural applications to growers and allied industry, consultants, researchers, Extension personnel, and students.


UAVs have recently become a valuable tool for scouting for abiotic and biotic stresses. Compared to manual scouting, they take less time and effort, and they can cover larger areas. With spectral sensors integrated into them, UAVs can capture information beyond what the human eye can see. UAVs equipped with multispectral and hyperspectral sensing systems are being used to detect diseases early and to differentiate diseases that have similar visual symptoms (Abdulridha et al. 2020a; Abdulridha et al. 2020b; Abdulridha et al. 2020c). Ampatzidis and Partel (2019) and Ampatzidis et al. (2019) developed novel artificial intelligence (AI) models to evaluate HLB-affected citrus rootstocks utilizing spectral data collected from UAV imagery. Abdulridha et al. (2019) utilized UAV hyperspectral imagery and machine learning, an application of AI, to detect citrus canker and achieved 100% classification accuracy for identifying healthy and canker-infected trees. Unmanned aerial vehicles equipped with light detection and ranging (LiDAR) sensors are also used to monitor landscape and terrain changes in forests. LiDAR sensors can also be used to generate weed intensity maps, soil maps, and yield prediction maps. There are several practical applications of UAVs in agriculture, and this publication presents information about the different types of UAVs, sensors, and processing software currently available on the market.

Types of UAVs

Unmanned aerial vehicles can be classified into different types based on their aerodynamic features. The four major types are multi-rotor, fixed-wing, single-rotor, and hybrid vertical take-off and landing (VTOL).

  1. Multi-rotor UAVs (Figure 1): These are the most commonly used UAVs by professionals and hobbyists. They are widely used for aerial photography, aerial mapping, and recreational sports. These types of UAVs are currently the cheapest option available on the market. They can be further classified into different types based on the number of rotors on the UAV: tricopter (three rotors); quadcopter (four rotors); hexacopter (six rotors); and octocopter (eight rotors). Generally, the payload a UAV can carry increases with its number of rotors. Even though multi-rotor UAVs are the cheapest option available, they have their disadvantages. They have limited flight time and endurance compared to the other types of UAVs because multi-rotor UAVs need a lot of energy to remain stable in the air against gravity and wind. The average flight time for multi-rotor UAVs ranges from approximately 20–40 minutes.
Figure 1. Example of a multi-rotor UAV (Matrice 210 from DJI, Shenzhen, China).
Credit: UF/IFAS

2. Fixed-wing UAVs (Figure 2): These types of UAVs are built similarly to regular airplanes. In contrast to the multi-rotor UAVs, they do not require a lot of energy to stay in the air because they make use of the aerodynamic lift provided by their structure. Due to this, they can fly for a longer time compared to the multi-rotor UAVs. The average flight time for fixed-wing UAVs can range from 1–2 hours. A disadvantage of these types of UAVs is that they usually need a lot of space for takeoff and landing. They also lack the ability to hover and are considered to be more complex and difficult to fly; they usually require a lot of training. These types of UAVs also typically cost more than multi-rotor UAVs.

Figure 2. Example of a fixed-wing UAV (created by UF researchers, Gainesville, FL, US).
Credit: UF/IFAS

3. Single-rotor UAVs: These types of UAVs look very similar to helicopters in their design and structure. They are usually equipped with a large rotor on top and a small rotor on the tail to control direction. These UAVs are usually powered by gas engines and therefore can fly for a longer time than multi-rotor UAVs. The downside of these types of UAVs is the operational danger that comes with their large rotors. They are also harder to fly, and significant training is required. They usually cost even more than fixed-wing UAVs, but they offer a heavier payload capability than fixed-wing UAVs.

4. Hybrid VTOL UAVs (Figure 3): These types of UAVs combine the abilities of fixed-wing and multi-rotor UAVs with vertical takeoff and landing capabilities. These UAVs are still in early development and there are very few options available on the market currently. They have a long flight time and can carry larger payloads, but their efficiency needs to be tested and evaluated.

Figure 3. Example of a hybrid VTOL UAV.
Credit: UF/IFAS


The aforementioned UAVs can be used for data collection in agriculture depending on the user’s familiarity with the UAV and the application. Users must also check the compatibility or availability of sensors that can be mounted on the UAVs before purchasing them.

Types of Sensing Technologies

Color (Red, Green, and Blue—RGB) Camera

The RGB sensors (color cameras) are commonly referred to as visual cameras. They are widely used in everyday devices such as cellphones, tablets, digital cameras, etc. These sensors measure reflectance in the red, green, and blue bands of the visible spectrum and provide the user with an image. When UAVs equipped with sensors or cameras (RGB in this case) are flown over large areas, they collect thousands of images, which are then stitched together using photogrammetry software to produce a map of the entire field. These maps can be used for several agricultural applications (e.g., to develop plant inventories, or to estimate plant leaf density and plant canopy volume) (Ampatzidis and Partel 2019; Ampatzidis et al. 2019).
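Even without near-infrared data, RGB imagery can be used to compute simple visible-band vegetation indices. As a minimal sketch (the Excess Green index, ExG = 2g − r − b, is a common choice in the literature, though it is not named in this document), the following shows how such an index could be computed per pixel with NumPy:

```python
import numpy as np

def excess_green(rgb):
    """Excess Green index (ExG = 2g - r - b) per pixel, where r, g, b
    are channel values normalized so that r + g + b = 1."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2 * g - r - b

# A 1x2 test image: a strongly green (vegetation-like) pixel vs. a gray pixel
img = np.array([[[10, 200, 10], [120, 120, 120]]], dtype=np.uint8)
exg = excess_green(img)
print(exg.round(2))  # green pixel scores high; gray pixel is near 0
```

Higher ExG values indicate greener pixels, which is why such indices are often used to separate canopy from soil background in RGB orthomosaics.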


Multispectral Camera

Multispectral sensors (Figure 4) are an advanced version of RGB sensors. They provide data beyond what the human eye can see and what RGB sensors can capture. They usually provide reflectance data from the near-infrared (NIR) spectrum in addition to the red, green, and blue bands captured by RGB sensors. These data can be used to calculate several vegetation indices (VIs), including the most widely used VI, the normalized difference vegetation index (NDVI). NDVI values range from −1 to 1; over vegetation they typically fall between 0 and 1, with values near 0 indicating a stressed plant and values near 1 indicating a healthy plant. NDVI is widely used by researchers across the world to identify plant stress, predict crop yield, etc. (Costa et al. 2020) (Table 1).

Figure 4. UAV equipped with a multispectral sensor (Altum, RedEdge-M, AgEagle Aerial Systems Inc., Wichita, KS, US).
Credit: UF/IFAS
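NDVI is computed per pixel from the NIR and red reflectance bands as (NIR − Red) / (NIR + Red). A minimal sketch with NumPy (the example reflectance values are illustrative, not from this document):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    denom[denom == 0] = 1.0  # avoid division by zero on empty pixels
    return (nir - red) / denom

# Reflectance values (0-1): healthy leaves reflect strongly in NIR
nir = np.array([[0.60, 0.30]])
red = np.array([[0.05, 0.25]])
print(ndvi(nir, red).round(2))  # [[0.85 0.09]] - healthy vs. stressed pixel
```

In practice, the NIR and red arrays would come from the band-aligned multispectral orthomosaic produced by the photogrammetry software discussed later in this document.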


Hyperspectral Camera

The hyperspectral sensor (Figure 5) is one of the most complex spectral sensing technologies in use for agricultural applications. Currently, it is not as widely used as the other spectral sensors due to very high equipment cost, high payload capability requirement, and complex operating procedures. In contrast to RGB and multispectral sensors, hyperspectral sensors collect reflectance data in continuous scans along a spectrum, usually ranging from 400–2400 nm. While multispectral sensors collect reflectance data over discrete broader bands (e.g., 4–10 bands), hyperspectral sensors collect reflectance data from much narrower bands (e.g., 100–200 bands). Researchers have been using hyperspectral sensors combined with machine learning algorithms to correlate the collected reflectance data with various agricultural parameters. For example, hyperspectral sensors are being used to detect, identify, and distinguish plant diseases with similar visual symptoms, which can be a very complex task (Abdulridha et al. 2020a; Hariharan et al. 2019).

Figure 5. UAV equipped with a hyperspectral sensor.
Credit: UF/IFAS
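A hyperspectral image is typically handled as a three-dimensional "cube" (rows × columns × bands), with each band centered on a known wavelength. As a minimal sketch (the cube here is random toy data; the 670 nm and 800 nm targets are common red and NIR wavelengths, chosen for illustration), the following shows how the band nearest a desired wavelength could be extracted:

```python
import numpy as np

def band_at(cube, wavelengths, target_nm):
    """Return the image plane of the band whose center wavelength
    is closest to target_nm. cube has shape (rows, cols, bands)."""
    idx = int(np.argmin(np.abs(np.asarray(wavelengths) - target_nm)))
    return cube[:, :, idx]

# Toy cube: 2x2 pixels, 201 bands spanning 400-2400 nm in 10 nm steps
wavelengths = np.arange(400, 2401, 10)
cube = np.random.rand(2, 2, wavelengths.size)

red = band_at(cube, wavelengths, 670)  # band nearest 670 nm (red)
nir = band_at(cube, wavelengths, 800)  # band nearest 800 nm (NIR)
```

Extracting narrow bands this way is often the first step before feeding reflectance features into the machine learning models referenced above.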


Thermal Camera

Thermal sensors measure the thermal energy emitted by an object at wavelengths corresponding to its surface temperature. They can provide the user with the surface temperature of various objects (e.g., tree canopies) present in a field. Thermal cameras can measure plant canopy temperature, which can be used to determine canopy water stress for precision irrigation applications (Zhou et al. 2021). They can also be used with machine learning to detect leaf wetness (Swarup et al. 2020) and to count fruit on trees (Gan et al. 2020).
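Canopy water stress from thermal imagery is commonly quantified with the Crop Water Stress Index (CWSI), which normalizes the measured canopy temperature between wet (fully transpiring) and dry (non-transpiring) reference temperatures. A minimal sketch of this standard formula (the temperature values are illustrative, not from this document):

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: 0 = well-watered, 1 = fully stressed.
    t_wet and t_dry are the reference temperatures (deg C) of a fully
    transpiring and a non-transpiring canopy under the same conditions."""
    if t_dry <= t_wet:
        raise ValueError("t_dry must exceed t_wet")
    return (t_canopy - t_wet) / (t_dry - t_wet)

print(cwsi(28.0, 24.0, 34.0))  # 0.4 -> moderate water stress
```

Maps of CWSI values computed pixel by pixel from a thermal orthomosaic are one way such data feed into precision irrigation decisions.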

Light Detection and Ranging (LiDAR)

Light detection and ranging (LiDAR) sensors measure the distance to objects around them by illuminating the target with laser light and calculating the time required for the light to return to the sensor. LiDAR sensors have been used historically to map digital elevation and surface models of the Earth’s surface. In agriculture, LiDAR sensors are being used for 3D modeling of farms and farm buildings. They can also be used to measure various parameters such as crop height, crop density, canopy size, etc. Garcia et al. (2018) used LiDAR samples to model forest canopy height, and Sankey et al. (2017) used LiDAR and hyperspectral fusion for topography modeling in southwestern forests of the United States, which can help in monitoring the landscape changes. LiDAR sensors can be used both on ground- and air-based platforms (Figure 6) for various applications.

Figure 6. UAV equipped with a LiDAR mapping system.
Credit: UF/IFAS
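Two calculations underlie most LiDAR products mentioned above: range from the pulse's round-trip travel time (distance = speed of light × time / 2), and canopy height as the difference between a digital surface model (DSM) and a bare-earth digital elevation model (DEM). A minimal sketch of both (the 400 ns example time is illustrative):

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def lidar_range(round_trip_s):
    """Distance to target from a LiDAR pulse's round-trip time (seconds)."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def canopy_height(dsm, dem):
    """Canopy height model: surface elevation minus bare-earth elevation,
    clipped at zero so bare ground never reports negative height."""
    return np.clip(np.asarray(dsm, dtype=float) - np.asarray(dem, dtype=float), 0, None)

print(round(lidar_range(400e-9), 2))  # ~59.96 m for a 400 ns round trip
```

The canopy height model is the basis for the crop height and canopy size measurements described in this section.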

Table 1 summarizes and compares the most commonly used sensing technologies mounted on UAVs for agricultural applications.

Table 1. Comparison of commonly used sensing technologies mounted on UAVs.


Flight Mission Planning

Flight mission planning is a very important step that helps users fly a UAV with minimal effort and successfully collect the required data. Various computer and mobile applications are available for flight mission planning. These apps automatically program the UAV's flight path once the user selects the area to be covered on a map. They allow the user to set various parameters such as overlap percentage, flight speed, picture trigger interval, etc.
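The overlap percentage selected in these apps translates directly into the spacing between flight lines: the wider the image footprint on the ground and the lower the requested overlap, the farther apart the lines can be. A minimal sketch of that geometry (the camera parameters here are hypothetical, chosen only for illustration):

```python
def footprint_m(sensor_mm, focal_mm, altitude_m):
    """Ground footprint of one image dimension (meters), from the
    pinhole-camera ratio: footprint = sensor size / focal length * altitude."""
    return sensor_mm / focal_mm * altitude_m

def line_spacing_m(footprint_width_m, side_overlap):
    """Distance between adjacent flight lines for a given side overlap
    (e.g., 0.70 for 70% overlap between neighboring flight lines)."""
    return footprint_width_m * (1.0 - side_overlap)

# Hypothetical camera: 13.2 mm wide sensor, 8.8 mm lens, flown at 100 m
width = footprint_m(13.2, 8.8, 100.0)  # 150.0 m across-track footprint
print(line_spacing_m(width, 0.70))     # 45.0 m between flight lines
```

Mission planning apps perform this kind of calculation internally; seeing it explicitly helps explain why higher overlap settings produce longer flights and more images to process.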

There are several flight mission planning mobile applications and software available.

DroneDeploy: This mobile application is widely used across the world by professionals and hobbyists. It helps users create flight paths by letting them select the area to be covered. The user can set various parameters such as flight height, flight speed, overlap percentage, etc. It can also create real-time maps for immediate analysis. This application offers various pricing models according to how the platform is used.

Similar free apps that provide such functionality include DJI GS Pro, Pix4Dcapture, and Precision Flight. There are also other options, such as Litchi, that require a one-time payment.


Photogrammetry

Photogrammetry is a very important technique involved in all UAV agricultural applications. This technique involves obtaining geometric information in three dimensions from multiple images. It is used to convert thousands of individual images into a single map, also known as an orthomosaic. This orthomosaic, containing different kinds of information, is then used by several applications (software) to extract, analyze, and visualize data relevant to the grower (i.e., convert data to information).

There are various photogrammetry applications (software) available to convert user-collected images or videos into an orthomosaic. The most commonly used are listed below.

Pix4Dmapper: This is a subscription-based application that is widely used in the industry for processing images collected from RGB, multispectral, and thermal sensors and converting them into orthomosaics. This application also offers visualization of orthomosaics based on customized VI calculations. The user can either select the existing VIs offered by the application or create a new index by inputting the formula to calculate it. It can also create 3D models of large-scale objects using RGB images. The subscription costs around $350 per month (2021 data), but users can also buy a perpetual license for around $4,990 (one device) or an educational license for $1,990 (two devices), as well as choose from several other plans.

OpenDroneMap: This is an open-source photogrammetry toolkit that processes UAV-collected images into orthomosaics and 3D models. It is available free of cost to users. The disadvantage of this software is that it can be challenging for an average user because of its command-line interface.

Similar applications that offer these features are the ArcGIS Drone2Map, 3DF Zephyr, Autodesk Recap, Agisoft Photoscan, etc. There are also various cloud-based photogrammetry tools provided as a third-party service by various UAV companies that deal with flight mission planning or providing UAV flight services (e.g., DroneDeploy, Measure, etc.).

Data Analysis and Visualization

Several web- or cloud-based platforms have been developed to analyze and visualize the collected data from UAVs. These platforms/apps develop orthomosaic maps, and then use algorithms for extracting, computing, analyzing, and visualizing information for precision agricultural applications. For example, apps such as Agroview, Aerobotics, and DroneDeploy can be used to predict yield, detect plant stress, and develop maps for precision irrigation applications.

Agroview: Agroview (Ampatzidis et al. 2020) is a cloud- and artificial intelligence (AI)-based application developed to analyze and visualize UAV-collected data. This interactive and user-friendly application can process images to create an orthomosaic, and then run AI-based algorithms to develop tree inventory maps, which include information for individual plants, such as plant height and canopy volume, plant health/stress, plant leaf density, plant nutrient concentration, etc. This application currently supports tree crops, but expansion to other crops is also in development.


Summary

Unmanned aerial vehicles can be used for mapping and scouting large farm areas in a short time using minimal personnel. They provide growers with practical information that can be used to increase the efficiency and productivity of a farm. There are several UAV options available to purchase, ranging from $1,000 to $40,000, and they can be equipped with a wide range of sensors. The most commonly used sensors in agriculture are RGB, multispectral, hyperspectral, and thermal cameras, and LiDAR, ranging from $200 to $25,000. These sensors and software (apps) can be used to detect plant stresses including pests and diseases, develop plant inventory, and predict yield. Three main categories of apps are needed for UAV applications in agriculture: flight mission planning apps, data processing apps, and data analysis and visualization apps.

Step-by-step instructions on how to use a drone for agricultural applications can be found at Kakarla and Ampatzidis (2018; 2019) and Kakarla et al. (2019a).


References

Abdulridha, J., Y. Ampatzidis, J. Qureshi, and P. Roberts. 2020a. “Laboratory and UAV-Based Identification and Classification of Tomato Yellow Leaf Curl, Bacterial Spot, and Target Spot Diseases in Tomato Utilizing Hyperspectral Imaging and Machine Learning.” Remote Sensing 12(17): 2732. doi:10.3390/rs12172732.

Abdulridha, J., Y. Ampatzidis, P. Roberts, and S. C. Kakarla. 2020b. “Detecting Powdery Mildew Disease in Squash at Different Stages Using UAV-Based Hyperspectral Imaging and Artificial Intelligence.” Biosystems Engineering 197:135–148.

Abdulridha, J., Y. Ampatzidis, S. C. Kakarla, and P. Roberts. 2020c. “Detection of Target Spot and Bacterial Spot Diseases in Tomato Using UAV-Based and Benchtop-Based Hyperspectral Imaging Techniques.” Precision Agriculture 21:955–978.

Abdulridha, J., O. Batuman, and Y. Ampatzidis. 2019. “UAV-Based Remote Sensing Technique to Detect Citrus Canker Disease Utilizing Hyperspectral Imaging and Machine Learning.” Remote Sensing 11(11): 1373.

Ampatzidis, Y., V. Partel, and L. Costa. 2020. “Agroview: Cloud-Based Application to Process, Analyze and Visualize UAV-Collected Data for Precision Agriculture Applications Utilizing Artificial Intelligence.” Computers and Electronics in Agriculture 174:105157.

Ampatzidis, Y., V. Partel, B. Meyering, and U. Albrecht. 2019. “Citrus Rootstock Evaluation Utilizing UAV-Based Remote Sensing and Artificial Intelligence.” Computers and Electronics in Agriculture 164:104900.

Ampatzidis, Y., and V. Partel. 2019. “UAV-Based High Throughput Phenotyping in Citrus Utilizing Multispectral Imaging and Artificial Intelligence.” Remote Sensing 11(4): 410. doi: 10.3390/rs11040410.

Costa, L., L. Nunes, and Y. Ampatzidis. 2020. “A New Visible Band Index (vNDVI) for Estimating NDVI Values on RGB Images Utilizing Genetic Algorithms.” Computers and Electronics in Agriculture 172:105334.

Gan, H., W. S. Lee, V. Alchanatis, and A. Abd-Elrahman. 2020. “Active Thermal Imaging for Immature Citrus Fruit Detection.” Biosystems Engineering 198:291–303. doi:10.1016/j.biosystemseng.2020.08.015.

García, M., S. Saatchi, S. Ustin, and H. Balzter. 2018. “Modelling Forest Canopy Height by Integrating Airborne LiDAR Samples with Satellite Radar and Multispectral Imagery.” International Journal of Applied Earth Observation and Geoinformation 66. doi:10.1016/j.jag.2017.11.017.

Hariharan, J., J. Fuller, Y. Ampatzidis, J. Abdulridha, and A. Lerwill. 2019. “Finite Difference Analysis and Bivariate Correlation of Hyperspectral Data for Detecting Laurel Wilt Disease and Nutritional Deficiency in Avocado.” Remote Sensing 11(15): 1748.

Kakarla, S. C., and Y. Ampatzidis. 2018. Instructions on the Use of Unmanned Aerial Vehicles (UAVs). AE527. Gainesville: University of Florida Institute of Food and Agricultural Sciences.

Kakarla, S. C., and Y. Ampatzidis. 2019. Postflight Data Processing Instructions on the Use of Unmanned Aerial Vehicles (UAVs) for Agricultural Applications. AE533. Gainesville: University of Florida Institute of Food and Agricultural Sciences.

Kakarla, S. C., L. De Morais, and Y. Ampatzidis. 2019. Preflight and Flight Instructions on the Use of Unmanned Aerial Vehicles (UAVs) for Agricultural Applications. AE535. Gainesville: University of Florida Institute of Food and Agricultural Sciences.

Sankey, T., J. Donager, J. McVay, and J. Sankey. 2017. “UAV LiDAR and Hyperspectral Fusion for Forest Monitoring in the Southwestern USA.” Remote Sensing of Environment 195:30–43. doi:10.1016/j.rse.2017.04.007.

Swarup, A., W. S. Lee, N. Peres, and C. Fraisse. 2020. “Strawberry Plant Wetness Detection Using Color and Thermal Imaging.” Biosystems Engineering 45:409–421.

Zhou, Z., Y. Majeed, G. D. Naranjo, and E. M. T. Gambacorta. 2021. “Assessment for Crop Water Stress with Infrared Thermal Imagery in Precision Agriculture: A Review and Future Prospects for Deep Learning Applications.” Computers and Electronics in Agriculture 182:106019.

Peer Reviewed

Publication #AE565

Date: 10/27/2021

Fact Sheet

About this Publication

This document is AE565, one of a series of the Department of Agricultural and Biological Engineering, UF/IFAS Extension. Original publication date October 2021. Visit the EDIS website for the currently supported version of this publication.

About the Authors

Sri Charan Kakarla, engineering and research technologist, M.S.; and Yiannis Ampatzidis, associate professor, precision agriculture/automation, smart machines, Department of Agricultural and Biological Engineering; UF/IFAS Southwest Florida Research and Education Center, Immokalee, FL 34142.
