Benchmarking of monocular camera UAV-based localization and mapping methods in vineyards

UAVs equipped with various sensors offer a promising approach for enhancing orchard management efficiency. Up-close sensing enables precise crop localization and mapping, providing valuable a priori information for informed decision-making. Current research on localization and mapping methods can be broadly classified into SfM, traditional feature-based SLAM, and deep learning-integrated SLAM. While previous studies have evaluated these methods on public datasets, real-world agricultural environments, particularly vineyards, present unique challenges due to their complexity, dynamism, and unstructured nature. To bridge this gap, we conducted a comprehensive study in vineyards, collecting data under diverse conditions (flight modes, illumination, and shooting angles) using a UAV equipped with a high-resolution camera. To assess the performance of the different methods, we proposed five evaluation metrics: efficiency, point cloud completeness, localization accuracy, parameter sensitivity, and plant-level spatial accuracy. We compared two SLAM approaches against SfM as a benchmark. Our findings reveal that deep learning-based SLAM outperforms SfM and feature-based SLAM in position accuracy and point cloud resolution, reducing average position error by 87% and increasing point cloud resolution by 571%. However, feature-based SLAM demonstrated superior efficiency, making it the more suitable choice for real-time applications. These results offer valuable insights for selecting appropriate methods, accounting for illumination conditions, and optimizing parameters to balance accuracy and computational efficiency in orchard management activities.

Bibliographic Details
Main Authors: Wang, Kaiwen, Kooistra, Lammert, Wang, Yaowu, Vélez, Sergio, Wang, Wensheng, Valente, João
Format: Article/Letter to editor
Language: English
Subjects: Precision Agriculture, SLAM, Structure from Motion, Up-close Sensing
Online Access: https://research.wur.nl/en/publications/benchmarking-of-monocular-camera-uav-based-localization-and-mappi