Browsing by Subject "Drone"

Now showing items 1-4 of 4
  • Suomalainen, Juha; Oliveira, Raquel A.; Hakala, Teemu; Koivumäki, Niko; Markelin, Lauri; Näsi, Roope; Honkavaara, Eija (Elsevier, 2021)
    Remote Sensing of Environment
    Multi- and hyperspectral cameras on drones can be valuable tools in environmental monitoring. A significant shortcoming complicating their use in quantitative remote sensing applications is the lack of sufficiently robust radiometric calibration methods. In the direct reflectance transformation method, the drone is equipped with a camera and an irradiance sensor, allowing image pixel values to be transformed to reflectance factors without ground reference data. This method requires the sensors to be calibrated with higher accuracy than is usually required by the empirical line method (ELM), but in return it offers benefits in robustness, ease of operation, and the ability to be used on Beyond Visual Line of Sight (BVLOS) flights. The objective of this study was to develop and assess a drone-based workflow for direct reflectance transformation and to implement it on our hyperspectral remote sensing system. A novel atmospheric correction method is also introduced, using two reference panels; unlike in the ELM, the correction is not directly affected by changes in the illumination. The sensor system consists of a hyperspectral camera (Rikola HSI, by Senop) and an onboard irradiance spectrometer (FGI AIRS), both of which were given thorough radiometric calibrations. In laboratory tests and in a flight experiment, the FGI AIRS tilt-corrected irradiances had an accuracy better than 1.9% at solar zenith angles up to 70°. The system's low-altitude reflectance factor accuracy was assessed in a flight experiment using reflectance reference panels, where the normalized root mean square errors (NRMSE) were less than ±2% for the light panels (25% and 50%) and less than ±4% for the dark panels (5% and 10%). In the high-altitude images, taken at 100–150 m altitude, the NRMSEs without atmospheric correction were within 1.4%–8.7% for VIS bands and 2.0%–18.5% for NIR bands. Significant atmospheric effects appeared already at 50 m flight altitude. The proposed atmospheric correction was found to be practical, and it decreased the high-altitude NRMSEs to 1.3%–2.6% for VIS bands and to 2.3%–5.3% for NIR bands. Overall, the workflow was found to be efficient and to provide accuracies similar to the ELM, while offering operational advantages in challenging scenarios such as forest monitoring, large-scale autonomous mapping tasks, and real-time applications. Tests in varying illumination conditions showed that the reflectance factors of the gravel and vegetation targets varied by up to 8% between sunny and cloudy conditions due to reflectance anisotropy effects, while the direct reflectance workflow retained better accuracy. This suggests that varying illumination conditions have to be further accounted for in drone-based quantitative remote sensing applications. (A minimal sketch of the core reflectance transformation appears after this list.)
  • Pöysä, Hannu; Kotilainen, Juho; Väänänen, Veli-Matti; Kunnasranta, Mervi (2018)
    We tested the use of unmanned aerial systems (UAS) in duck brood surveys in boreal wetlands in Finland. We performed brood surveys at the same wetlands concurrently with ground-based point counts and with a UAS (multicopter; drone counts) equipped with a camera that produced high-quality images for identification of broods and ducklings. The number of broods did not differ between point counts and drone counts in three duck species: the mallard (Anas platyrhynchos), common teal (Anas crecca), and common goldeneye (Bucephala clangula). The number of ducklings was higher in drone counts than in point counts in the common teal, but no such difference was found in the mallard or common goldeneye. UAS-based images appear useful for estimating the numbers of both broods and ducklings across duck species, although the manual processing of the images is labor intensive. (A sketch of such a paired count comparison appears after this list.)
  • Änäkkälä, Mikael (Helsingin yliopisto, 2020)
    The number of drones has increased in both the private and corporate sectors. There is also interest in the use of drones in agriculture, since they allow large fields to be monitored easily. The automatic flight systems of drones are simple to use, and a more accurate overview of a field can be obtained with drones than by making observations from the edge of the field. Aerial photographs can be used to plan further field operations; for example, pesticide spraying or fertilizer spreading can be planned based on the photos. Drones can also be used to estimate crop biomass, and they make it possible to observe crop development as a time series over the growing season. The aim of this study was to explore the use of multispectral images and 3D models in crop monitoring. Crop leaf area index (LAI), biomass, and chlorophyll content were measured. The study included eight different crop/fertilization treatments. A multispectral camera and an RGB camera were used to estimate crop characteristics. The multispectral camera was used to determine vegetation reflectance values, which describe how much of the incoming solar radiation is reflected back from the vegetation. The camera had five spectral bands (blue, green, red, red edge, and NIR), from which the NDVI vegetation index was calculated. The reflectance values and vegetation indices were compared with the dry matter mass, LAI, and chlorophyll content determined for the vegetation. From the RGB camera images, 3D models were created to calculate crop volumes, and the calculated volumes were compared with crop dry matter mass and LAI measurements. Linear regression analysis was used to examine the relationships between the variables calculated from the images and the parameters measured from the crops in the field. According to the results, the variables determined from the multispectral images explained the dry matter mass and leaf area index of the crops slightly less well than the 3D models derived from the RGB images. The strongest relationship in the multispectral camera data was between faba bean LAI and NDVI (R² = 0.85). Overall, the relationship between the multispectral reflectance/index data and the crop parameters was weak: the average coefficient of determination was 0.15 for crop dry matter mass, 0.14 for chlorophyll content, and 0.21 for LAI. The highest coefficient of determination for 3D model crop volume was with the dry matter mass of oats (R² = 0.91). The mean coefficient of determination was 0.69 for the relationship between plant dry matter mass and 3D model volume, and 0.57 for the relationship between plant leaf area index and 3D model volume. Based on these results, among the multispectral camera variables the NDVI index was best suited for estimating crop dry matter mass, leaf area index, and chlorophyll content, although the strength of the relationships between individual spectral bands or the NDVI index and plant properties varied between crops. The 3D models produced stronger relationships for estimating crop dry matter mass and leaf area index than the quantities derived from the multispectral images.
    Analyzing the data with more sophisticated methods that utilize the values of several spectral bands and indices at the same time would probably have been more effective than the linear regression used in this study. Removing errors caused by external factors from the multispectral images proved very difficult; in particular, the reflectance values of dry soil differed clearly from those of vegetation. Further studies are needed to develop vegetation indices that can reduce errors caused by external factors, and image data processing should be developed to utilize multiple spectral bands and vegetation indices when determining the relationships between crop characteristics and image-derived variables. In addition, imaging techniques for different plant species should be investigated, as different plants have different reflectance values. (A sketch of the NDVI calculation and the regression fit appears after this list.)
  • Oliveira, R.A.; Khoramshahi, E.; Suomalainen, J.; Hakala, T.; Viljanen, N.; Honkavaara, E. (2018)
    The use of drones and photogrammetric technologies is increasing rapidly in different applications. Currently, the drone processing workflow is in most cases based on sequential image acquisition and post-processing, but there is great interest in real-time solutions. Fast and reliable real-time drone data processing can benefit, for instance, environmental monitoring tasks in precision agriculture and in forestry. Recent developments in miniaturized and low-cost inertial measurement units, GNSS sensors, and real-time kinematic (RTK) positioning are offering new perspectives for comprehensive remote sensing applications. Combining these sensors with lightweight and low-cost multi- or hyperspectral frame sensors on drones provides the opportunity to create near real-time or real-time remote sensing data of target objects. We have developed a system with onboard direct georeferencing to be used with hyperspectral frame cameras in real-time remote sensing applications. The objective of this study is to evaluate the real-time georeferencing against post-processed solutions. Experimental data sets were captured at agricultural and forested test sites using the system. The accuracy of the onboard georeferencing data was better than 0.5 m. The results show that real-time remote sensing is promising and feasible at both test sites. © Authors 2018. CC BY 4.0 License. (A sketch of the accuracy comparison appears after this list.)
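
The Suomalainen et al. abstract centers on the direct reflectance transformation, in which calibrated at-sensor radiance is ratioed against the onboard-measured downwelling irradiance, R = π·L/E, and accuracy is reported as NRMSE against reference panels. The Python sketch below shows only these two core relations; the function and variable names are assumptions for illustration, and the actual workflow adds tilt correction and the two-panel atmospheric correction not shown here.

```python
import numpy as np

def reflectance_factor(radiance, irradiance):
    """Direct reflectance transformation: R = pi * L / E.

    radiance:   calibrated at-sensor spectral radiance per band
                [W m^-2 sr^-1 nm^-1]
    irradiance: downwelling spectral irradiance from the onboard
                sensor, matched to the same bands [W m^-2 nm^-1]
    """
    return np.pi * radiance / irradiance

def nrmse(estimated, reference):
    """Normalized RMSE of estimated reflectance factors against a
    panel's nominal reflectance (e.g. 0.25 or 0.50)."""
    rmse = np.sqrt(np.mean((estimated - reference) ** 2))
    return rmse / np.mean(reference)
```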
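For the Pöysä et al. brood survey comparison, the abstract does not state which statistical test was used; the sketch below shows one conventional way to compare paired point counts and drone counts from the same wetlands. The counts are invented purely for illustration.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired brood counts for the same wetlands,
# surveyed concurrently by the two methods.
point_counts = np.array([3, 5, 2, 4, 6, 1, 2])
drone_counts = np.array([4, 6, 2, 5, 7, 2, 2])

# Paired nonparametric test of whether the methods differ.
stat, p = wilcoxon(point_counts, drone_counts)
print(f"Wilcoxon statistic = {stat}, p = {p:.3f}")
```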
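The Änäkkälä thesis relates NDVI, computed from the red and NIR band reflectances, to field-measured crop parameters via linear regression and the coefficient of determination R². A minimal sketch of that calculation, with hypothetical per-plot values standing in for the thesis data:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from band reflectances."""
    return (nir - red) / (nir + red)

def fit_line_r2(x, y):
    """Ordinary least squares fit y = a*x + b and its R^2."""
    a, b = np.polyfit(x, y, 1)
    ss_res = np.sum((y - (a * x + b)) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical per-plot NDVI against field-measured LAI.
nir = np.array([0.52, 0.61, 0.70, 0.66])
red = np.array([0.08, 0.06, 0.04, 0.05])
lai = np.array([1.8, 2.9, 4.1, 3.5])
a, b, r2 = fit_line_r2(ndvi(nir, red), lai)
print(f"LAI = {a:.2f} * NDVI + {b:.2f}, R^2 = {r2:.2f}")
```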
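The Oliveira et al. abstract reports onboard georeferencing accuracy better than 0.5 m relative to post-processed solutions. One plausible way such a figure is obtained is the horizontal RMSE between real-time and post-processed camera positions per exposure; the coordinates below are invented for illustration and do not come from the study.

```python
import numpy as np

def horizontal_rmse(realtime_xy, reference_xy):
    """RMSE (metres) of real-time positions against the
    post-processed reference solution."""
    d = realtime_xy - reference_xy
    return np.sqrt(np.mean(np.sum(d ** 2, axis=1)))

# Hypothetical easting/northing pairs per image exposure [m].
rt = np.array([[384100.42, 6671200.15],
               [384102.10, 6671205.33],
               [384104.01, 6671210.48]])
pp = np.array([[384100.30, 6671200.02],
               [384101.95, 6671205.51],
               [384104.20, 6671210.30]])
print(f"Horizontal RMSE = {horizontal_rmse(rt, pp):.2f} m")
```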