Fault Resistant Odometry Estimation using Message Passing Neural Network

Dahal, P.; Mentasti, S.; Paparusso, L.; Arrigoni, S.; Braghin, F.
2023-01-01

Abstract

Multi-modal sensor fusion is an essential ingredient for safe autonomous navigation. In recent years, many works have improved the accuracy of deep-learning-based odometry estimators. However, the robustness of these algorithms to sensor failure or measurement degradation, both of which are likely to occur during navigation, has been studied less extensively. Furthermore, works studying the robustness of fusion modules do not model the correlation between sensor features, which is crucial for filtering out features derived from noisy measurements and for handling sensor-fault scenarios. To bridge this gap, we propose a fault-resistant odometry estimator that produces robust estimates even when sensors fail completely or measurements progressively degrade. Our framework models the correlation between sensor embeddings using a Message Passing Neural Network (MPNN), a particular type of Graph Neural Network (GNN). A mask is then computed from the updated node features of the graph to weigh the multi-modal features computed from the different sensors. We evaluate the proposed fusion strategy on the raw KITTI dataset, modified with sensor degradation scenarios, and compare against state-of-the-art baselines based on trivial feature concatenation and soft fusion to demonstrate our method's superiority in accuracy and robustness to sensor degradation and failures.
2023
IEEE Intelligent Vehicles Symposium, Proceedings
ISBN: 979-8-3503-4691-6
Keywords: Visual Inertial Odometry; Robustness; Sensor Fusion; Graph Neural Network; Message Passing Neural Network
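The abstract describes a graph-based fusion scheme: sensor embeddings become graph nodes, one or more message-passing rounds update each node from its neighbours, and a gate computed from the updated node features weighs each sensor's contribution before fusion. The paper does not give the exact architecture here, so the following is only a minimal NumPy sketch of that idea, with two hypothetical sensor nodes (visual and inertial), randomly initialized weights standing in for learned parameters, and concatenation as the fusion step.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical setup: two sensor nodes with d-dimensional embeddings.
d = 8
h = {"visual": rng.standard_normal(d), "inertial": rng.standard_normal(d)}

# Stand-ins for learned parameters (message, update, and gating weights).
W_msg = rng.standard_normal((d, d)) * 0.1
W_upd = rng.standard_normal((d, 2 * d)) * 0.1
w_mask = rng.standard_normal(d) * 0.1

def mpnn_step(h):
    # One message-passing round: each node aggregates messages from the
    # other node(s), then updates from [own feature, aggregated message].
    new_h = {}
    for node, feat in h.items():
        neighbours = [v for k, v in h.items() if k != node]
        m = np.tanh(W_msg @ np.mean(neighbours, axis=0))
        new_h[node] = np.tanh(W_upd @ np.concatenate([feat, m]))
    return new_h

h = mpnn_step(h)

# A scalar gate per sensor, computed from its updated node feature,
# down-weights that sensor's embedding before fusion. A degraded sensor
# would ideally receive a gate near zero.
mask = {k: sigmoid(w_mask @ v) for k, v in h.items()}
fused = np.concatenate([mask[k] * h[k] for k in ("visual", "inertial")])
print(fused.shape)  # (16,)
```

Because the gate is computed after message passing, each sensor's weight can depend on the other sensors' features, which is the correlation modelling the abstract contrasts with plain concatenation or per-sensor soft fusion.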
Files in this record:
Fault_Resistant_Odometry_Estimation_using_Message_Passing_Neural_Network.pdf (Adobe PDF, 2.71 MB, restricted access)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1250157
Citations
  • PMC: n/a
  • Scopus: 1
  • Web of Science: 0