
Dataset generation and validation for spacecraft pose estimation via monocular images processing

Bechini, Michele; Lavagna, Michèle; Lunghi, Paolo
2023-01-01

Abstract

Recent studies demonstrate the possibility of navigating in close proximity to uncooperative space-resident objects using only monocular images. Despite the results achieved, the development and testing of new algorithms are strongly constrained by the limited availability of spaceborne image datasets. To overcome this, a new algorithm, embedded in a tool to generate synthetic high-fidelity spaceborne image datasets, is presented here. The architecture can be tailored to a wide range of scenarios and is based on open-source ray-tracing software. All assumptions and simplifications adopted are discussed in detail for the different models considered, including a trade-off between accuracy and rendering time. The method is then applied to a baseline scenario in which the optical properties of a reference spacecraft model are tuned. Both qualitative and quantitative validations are detailed and successfully carried out for the baseline case, demonstrating the high photo-realism achievable with the proposed method. Building on this main outcome, the paper details the generation of publicly available labeled spaceborne image datasets and reports the analyses confirming their high level of representativeness, which makes them suitable for training and testing image-based navigation algorithms. As a further outcome, the most comprehensive multi-purpose labeled dataset of validated spaceborne synthetic images currently publicly available is presented.
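The datasets described in the abstract pair each rendered image with a ground-truth relative pose label. As a minimal illustration of what such a label involves (not code from the paper), the sketch below converts a relative attitude matrix into a unit quaternion, assuming a scalar-first (w, x, y, z) convention; the function name and conventions are illustrative assumptions.

```python
import numpy as np

def dcm_to_quat(R):
    """Illustrative sketch: convert a 3x3 rotation matrix (direction cosine
    matrix) into a unit quaternion (w, x, y, z), scalar-first convention.
    Uses Shepperd-style branching to stay numerically stable for any R."""
    tr = np.trace(R)
    if tr > 0:
        s = 2.0 * np.sqrt(1.0 + tr)
        w = 0.25 * s
        x = (R[2, 1] - R[1, 2]) / s
        y = (R[0, 2] - R[2, 0]) / s
        z = (R[1, 0] - R[0, 1]) / s
    elif R[0, 0] > R[1, 1] and R[0, 0] > R[2, 2]:
        s = 2.0 * np.sqrt(1.0 + R[0, 0] - R[1, 1] - R[2, 2])
        w = (R[2, 1] - R[1, 2]) / s
        x = 0.25 * s
        y = (R[0, 1] + R[1, 0]) / s
        z = (R[0, 2] + R[2, 0]) / s
    elif R[1, 1] > R[2, 2]:
        s = 2.0 * np.sqrt(1.0 + R[1, 1] - R[0, 0] - R[2, 2])
        w = (R[0, 2] - R[2, 0]) / s
        x = (R[0, 1] + R[1, 0]) / s
        y = 0.25 * s
        z = (R[1, 2] + R[2, 1]) / s
    else:
        s = 2.0 * np.sqrt(1.0 + R[2, 2] - R[0, 0] - R[1, 1])
        w = (R[1, 0] - R[0, 1]) / s
        x = (R[0, 2] + R[2, 0]) / s
        y = (R[1, 2] + R[2, 1]) / s
        z = 0.25 * s
    q = np.array([w, x, y, z])
    return q / np.linalg.norm(q)

# A full label for one image would then pair this quaternion with the
# relative position vector of the target in the camera frame.
```

A quaternion label of this kind, together with the camera-frame translation, is the typical ground truth consumed by monocular pose-estimation networks during training and testing.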
Image generation, Relative pose, Synthetic image datasets, Vision-based navigation
Files in this record:
  • BECHM01-23.pdf — Publisher's version, Adobe PDF, 3.09 MB (open access)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1227780
Citations
  • Scopus: 13
  • Web of Science (ISI): 4