City Research Online

Robust deep learning LiDAR-based pose estimation for autonomous space landers

Chekakta, Z. ORCID: 0000-0002-4664-6283, Zenati, A., Aouf, N. ORCID: 0000-0001-9291-4077 & Dubois-Matra, O. (2022). Robust deep learning LiDAR-based pose estimation for autonomous space landers. Acta Astronautica, 201, pp. 59-74. doi: 10.1016/j.actaastro.2022.08.049

Abstract

Accurate relative pose estimation of a spacecraft during space landing operations is critical to ensure a safe and successful landing. This paper presents a 3D Light Detection and Ranging (LiDAR)-based AI relative navigation architecture for autonomous space landing. The proposed architecture is a hybrid Deep Recurrent Convolutional Neural Network (DRCNN) combining a Convolutional Neural Network (CNN) with a Recurrent Neural Network (RNN) based on a Long Short-Term Memory (LSTM) network. The acquired 3D LiDAR data are converted into multi-projected images, and the DRCNN is fed with depth and other multi-projected imagery. The CNN module of the architecture provides an efficient representation of features, and the RNN module, implemented as an LSTM, provides robust navigation motion estimates. A variety of landing scenarios are considered, simulated, and tested experimentally to evaluate the efficiency of the proposed architecture. LiDAR-based imagery data (range, slope, and elevation) are initially created using PANGU (Planet and Asteroid Natural Scene Generation Utility) software, and the proposed solution is evaluated on these data. Tests are then conducted using an instrumented aerial robot in the Gazebo simulator to reproduce landing scenarios on a synthetic but representative lunar terrain (a 3D digital elevation model). Finally, real experiments are conducted using a flying drone equipped with a Velodyne VLP-16 3D LiDAR sensor to generate real 3D scene point clouds while landing on a purpose-built, down-scaled lunar landing surface. All test results show that the proposed architecture is capable of delivering good 6 Degree of Freedom (DoF) pose precision at a reasonable computational cost.
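To make the abstract's CNN + LSTM pose-regression idea concrete, the snippet below is a minimal sketch of such a hybrid network in PyTorch. The channel counts, layer sizes, hidden dimension, and the stacking of the three LiDAR projections (range, slope, elevation) into a 3-channel input are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a CNN + LSTM (DRCNN-style) 6-DoF pose regressor.
# Architecture details here are assumptions for illustration only.
import torch
import torch.nn as nn

class DRCNNPoseSketch(nn.Module):
    def __init__(self, hidden_size=256):
        super().__init__()
        # CNN module: extracts features from each multi-projected LiDAR image
        # (assumed 3 channels: range, slope, elevation).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # RNN module (LSTM): integrates per-frame features across the landing sequence.
        self.lstm = nn.LSTM(input_size=128, hidden_size=hidden_size, batch_first=True)
        # Regression head: 6-DoF relative pose (3 translation + 3 rotation parameters).
        self.fc = nn.Linear(hidden_size, 6)

    def forward(self, x):
        # x: (batch, time, 3, H, W) -- sequence of projected LiDAR images
        b, t, c, h, w = x.shape
        feats = self.cnn(x.view(b * t, c, h, w)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return self.fc(out)  # per-timestep 6-DoF pose estimates

# Example: 2 sequences of 5 frames, each frame a 64x64 three-channel projection
poses = DRCNNPoseSketch()(torch.randn(2, 5, 3, 64, 64))
print(poses.shape)  # torch.Size([2, 5, 6])
```

In this kind of design, the CNN compresses each projected frame into a fixed-length feature vector, and the LSTM exploits the temporal continuity of the descent trajectory, which is what makes the estimates robust to per-frame noise in the LiDAR projections.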

Publication Type: Article
Additional Information: © 2022. This manuscript version is made available under the CC-BY-NC-ND 4.0 license https://creativecommons.org/licenses/by-nc-nd/4.0/
Publisher Keywords: Space landing operations, Robotics, Deep Neural Network, Relative pose estimation, LiDAR navigation
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
T Technology > TJ Mechanical engineering and machinery
T Technology > TK Electrical engineering. Electronics Nuclear engineering
T Technology > TL Motor vehicles. Aeronautics. Astronautics
Departments: School of Science & Technology
School of Science & Technology > Engineering