The camera, or payload, that a drone carries directly determines the imagery it can capture and therefore its potential applications. Here's a quick guide to the most commonly used aerial imaging sensors in this field:
RGB cameras (left image above)
Like the camera in your smartphone or SLR, RGB sensors acquire data in the visible spectrum (specifically the Red, Green & Blue bands). The images they produce can be transformed into 2D orthomosaics (A.K.A. orthophotos) and 3D digital surface models. Such sensors have been used, for example, to create terrain models of glacial features, monitor coastal erosion and perform volume measurements.
Near-infrared (NIR), red-edge (RED) & multispectral cameras (centre image above)
These cameras acquire data across bands in both the visible and non-visible parts of the spectrum. This type of data enables users to compute vegetation indices in order to create reflectance maps for assessing plant health, estimating biomass and more.
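To illustrate, one widely used vegetation index is NDVI (Normalized Difference Vegetation Index), computed per pixel from the near-infrared and red bands. Here is a minimal sketch using NumPy; the toy reflectance values are invented for demonstration, not taken from any particular sensor:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Healthy vegetation reflects strongly in NIR and absorbs red light,
    so values close to 1 suggest dense, healthy plant cover.
    The small eps avoids division by zero on dark pixels.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance arrays standing in for NIR and red band images.
nir_band = np.array([[0.50, 0.40],
                     [0.30, 0.10]])
red_band = np.array([[0.10, 0.10],
                     [0.20, 0.10]])

index_map = ndvi(nir_band, red_band)
print(index_map)
```

In practice the band arrays would be read from the multispectral camera's output files (e.g. with a geospatial raster library), and the resulting index map would be rendered as a colour-coded reflectance map.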
Thermal cameras (right image above)
Temperature-measuring thermal cameras, such as the senseFly thermoMAP, assign a temperature value to each pixel and have already proved highly useful in the field, with applications ranging from counting treetop orangutan nests and seals to assessing the spread of wildfires.
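Because each pixel carries a temperature value, a common first processing step is simple thresholding to flag warm targets (such as animals) against a cooler background. A minimal sketch, assuming the thermal image has already been converted to a per-pixel temperature array in degrees Celsius; the threshold value here is purely illustrative:

```python
import numpy as np

def warm_pixel_mask(temps_c, threshold_c=30.0):
    """Return a boolean mask of pixels warmer than threshold_c.

    temps_c: 2D array of per-pixel temperatures in degrees Celsius,
    as produced by a radiometric thermal camera.
    """
    return temps_c > threshold_c

# Toy 2x3 temperature frame: cool background with two warm spots.
frame = np.array([[12.0, 31.5, 14.2],
                  [29.9, 34.2, 13.8]])

mask = warm_pixel_mask(frame)
print(mask.sum())  # count of candidate warm pixels
```

Real survey pipelines would follow this with connected-component labelling or similar grouping so that clusters of warm pixels can be counted as individual animals rather than raw pixels.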