
N-ROD: A Neuromorphic Dataset for Synthetic-to-Real Domain Adaptation

Marco Cannici, Chiara Plizzari, Marco Ciccone, Matteo Matteucci
2021-01-01

Abstract

Event cameras are novel neuromorphic sensors that asynchronously capture pixel-level intensity changes in the form of "events". Event simulation from existing RGB datasets is commonly used to overcome the need for large amounts of annotated data, which are scarce due to the novelty of event sensors. In this context, the possibility of using event simulation in synthetic scenarios, where data generation is not limited to pre-existing datasets, remains unexplored. In this work, we analyze the synth-to-real domain shift in event data, i.e., the gap between simulated events obtained from synthetic renderings and those captured with a real camera on real images. To this end, we extend the popular RGB-D Object Dataset (ROD), which already comes with a synthetic version (SynROD), to the event modality. The resulting Neuromorphic ROD dataset (N-ROD) is the first to enable a synth-to-real analysis on event data, and our experiments show the effectiveness of Domain Adaptation techniques in reducing the synth-to-real shift. Moreover, through extensive experiments on multi-modal RGB-E data, we show that events can be effectively combined with conventional visual information, encouraging further research in this area. The N-ROD dataset is available at https://N-ROD-dataset.github.io/home/.
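The abstract's notion of "events" as asynchronous pixel-level intensity changes can be illustrated with a toy sketch. The function below is a deliberately simplified illustration, not the event simulator used in the paper: it emits an event (x, y, t, polarity) whenever the log-intensity of a pixel changes by more than a contrast threshold between two consecutive frames. All names and the threshold value are assumptions for illustration.

```python
import numpy as np

def simulate_events(frame_prev, frame_curr, t_curr, threshold=0.2):
    """Toy frame-to-event conversion (illustrative only).

    Emits an event at every pixel whose log-intensity change between
    the two frames exceeds `threshold`. Polarity is +1 for brightening,
    -1 for darkening.
    """
    eps = 1e-6  # avoid log(0) on black pixels
    log_prev = np.log(frame_prev.astype(np.float64) + eps)
    log_curr = np.log(frame_curr.astype(np.float64) + eps)
    diff = log_curr - log_prev
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    polarities = np.sign(diff[ys, xs]).astype(np.int8)
    # Coarse approximation: all events get the later frame's timestamp.
    # Real simulators interpolate intensities to obtain sub-frame timestamps.
    ts = np.full(len(xs), t_curr)
    return xs, ys, ts, polarities
```

A real event simulator additionally models per-pixel noise, refractory periods, and sub-frame timing, which is why simulated and real events still differ and why the synth-to-real gap studied in the paper arises.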
Proceedings of the 2021 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
Files in this record:
  • Cannici_N-ROD_A_Neuromorphic_Dataset_for_Synthetic-to-Real_Domain_Adaptation_CVPRW_2021_paper.pdf — Publisher's version, Adobe PDF, 1.16 MB, open access

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1181298
Citations
  • PMC: not available
  • Scopus: 9
  • Web of Science: 8