A novel UAV-integrated deep network detection and relative position estimation approach for weeds
Abdulsalam, M., Ahiska, K. & Aouf, N. ORCID: 0000-0001-9291-4077 A novel UAV-integrated deep network detection and relative position estimation approach for weeds. Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering, 237(10), pp. 2211-2227. doi: 10.1177/09544100221150284
Abstract
This paper presents a novel monocular vision–based approach for drones to detect multiple types of weeds and estimate their positions autonomously for precision agriculture applications. The methodology classifies and detects weeds using a proposed deep neural network architecture, named fused-YOLO, applied to images acquired from a monocular camera mounted on an unmanned aerial vehicle (UAV) following a predefined elliptical trajectory. The detection/classification is complemented by a new estimation scheme adopting an unscented Kalman filter (UKF) to estimate the exact location of the weeds. Bounding boxes are assigned to the detected targets (weeds) such that the centre pixel of each bounding box represents the centre of the target. The centre pixels are extracted and converted into world coordinates, forming azimuth and elevation angles from the UAV to the target, and the proposed estimation scheme is used to extract the positions of the weeds. Experiments were conducted both indoors and outdoors to validate this integrated detection/classification/estimation approach. Misclassification and position-estimation errors were minimal, and the position estimates converged quickly, which is notable given the affordable platform with low-cost sensors used in the experiments.
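The conversion from a bounding-box centre pixel to azimuth and elevation angles described in the abstract can be sketched with an ideal pinhole camera model. The paper does not publish its camera intrinsics, so the function below, its name, and the field-of-view parameters are illustrative assumptions, not the authors' implementation:

```python
import math

def pixel_to_bearing(u, v, width, height, hfov_deg, vfov_deg):
    """Map a bounding-box centre pixel (u, v) to azimuth/elevation angles
    relative to the camera's optical axis, assuming an ideal pinhole
    camera with the principal point at the image centre (illustrative
    sketch; not the paper's actual calibration)."""
    # Focal lengths in pixels, derived from the horizontal/vertical fields of view
    fx = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    fy = (height / 2.0) / math.tan(math.radians(vfov_deg) / 2.0)
    # Pixel offsets from the principal point
    dx = u - width / 2.0
    dy = v - height / 2.0
    azimuth = math.atan2(dx, fx)     # positive to the right of the optical axis
    elevation = math.atan2(-dy, fy)  # positive upward (image y-axis points down)
    return azimuth, elevation

# A target detected at the exact image centre lies on the optical axis,
# so both bearing angles are zero.
az, el = pixel_to_bearing(320, 240, 640, 480, 90.0, 60.0)
```

These bearing angles, rotated into the world frame using the UAV's pose, would then serve as the measurement inputs to the UKF position-estimation scheme.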
Publication Type: Article

Additional Information: This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (https://creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page (https://us.sagepub.com/en-us/nam/open-access-at-sage).

Publisher Keywords: deep neural networks, artificial intelligence, position estimation, robotic vision, weed detection, precision agriculture, unmanned aerial vehicles

Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science; S Agriculture > S Agriculture (General); T Technology > T Technology (General)

Departments: School of Science & Technology > Engineering