Robust Monocular Pose Initialization via Visual and Thermal Image Fusion

Bechini, M.; Civardi, G. L.; Quirino, M.; Lavagna, M.
2022-01-01

Abstract

The monocular-based relative guidance, navigation and control chain plays a crucial role in new missions involving proximity operations in orbit. To provide high-quality images in the widest possible range of scenarios, without overconstraining mission analysis or mission planning, this work proposes feeding the image processing and pose estimation algorithms with multispectral images obtained by fusing, at pixel level, two images from two different monocular cameras operating in the visible and infrared ranges of the spectrum, respectively. Since fused multispectral images have never been used for this purpose, the main objective of the work is to verify whether the information content they carry is sufficient to assess the relative pose between a chaser and an uncooperative but known target, and whether they can safely be used as the primary input. The tool used to synthetically generate the images is described, as well as the pose estimation algorithm applied. During the reimplementation of the baseline algorithm, modifications and improvements were introduced to make the edge detection less sensitive to image variations and, through the proposed mathematical formulation, to better estimate a priori the size of the match matrix needed for pose estimation in a more general framework. Tests of the pose estimation algorithm on the multispectral images showed that they can be adopted as the primary source of measurements, since their information content is higher than that of the individual visible or infrared images used alone, while avoiding the problems that characterize each of these spectral bands. This result is further confirmed by the outcomes of the relative pose estimation algorithm, which shows remarkable accuracy for a feature-based algorithm.
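The abstract does not detail the fusion rule adopted in the paper. As a minimal illustrative sketch only, the Python snippet below shows one common way to fuse a visible and a thermal-infrared frame at pixel level via a per-pixel weighted average. The function name fuse_pixel_level, the input file names, and the weight alpha are hypothetical placeholders, not taken from the paper; a real pipeline would co-register the two cameras with a calibrated transform rather than a plain resize.

    import cv2
    import numpy as np

    def fuse_pixel_level(vis_path: str, tir_path: str, alpha: float = 0.5) -> np.ndarray:
        # Load both frames as single-channel grayscale images.
        vis = cv2.imread(vis_path, cv2.IMREAD_GRAYSCALE)
        tir = cv2.imread(tir_path, cv2.IMREAD_GRAYSCALE)
        if vis is None or tir is None:
            raise FileNotFoundError("could not read one of the input images")

        # Bring the thermal frame onto the visible frame's pixel grid
        # (illustrative resize; a calibrated homography would be used in practice).
        tir = cv2.resize(tir, (vis.shape[1], vis.shape[0]), interpolation=cv2.INTER_LINEAR)

        # Pixel-level fusion: weighted average computed in float, cast back to 8-bit.
        fused = alpha * vis.astype(np.float32) + (1.0 - alpha) * tir.astype(np.float32)
        return np.clip(fused, 0.0, 255.0).astype(np.uint8)

    # Example usage with placeholder file names:
    # fused = fuse_pixel_level("target_vis.png", "target_tir.png", alpha=0.6)
    # cv2.imwrite("target_fused.png", fused)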
2022
73rd International Astronautical Congress (IAC 2022)
Files in this record:
BECHM02-22.pdf — Publisher's version, open access, 675.54 kB, Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1221785
Citations
  • Scopus: 4