Navigation-aided Automotive SAR for High-resolution Imaging of Driving Environments

Tagliaferri, D.; Rizzi, M.; Nicoli, M.; Tebaldini, S.; Monti-Guarnieri, A. V.; Prati, C. M.; Spagnolini, U.
2021-01-01

Abstract

The evolution of Advanced Driver Assistance Systems (ADAS) towards the ultimate goal of autonomous driving relies on a large number of sensors to perform a wide range of operations, from parking assistance to emergency braking and environment mapping for target recognition and classification. Low-cost Mass-Market Radars (MMRs) are today widely used for object detection at various ranges (up to 250 meters), but they might not be suited for high-precision environment mapping. In this context, vehicular Synthetic Aperture Radar (SAR) is emerging as a promising technique to augment radar imaging capability by exploiting the vehicle motion to provide two-dimensional (2D), or even three-dimensional (3D), images of the surroundings. SAR achieves a higher resolution than standard automotive radars, provided that the vehicle motion is precisely known. In this regard, one of the most attractive solutions to increase the positioning accuracy is to fuse the information from multiple on-board sensors, such as Global Navigation Satellite System (GNSS) receivers, Inertial Measurement Units (IMUs), odometers and steering angle sensors. This paper proposes a multi-sensor fusion technique to support automotive SAR systems, experimentally validating the approach and demonstrating its advantages compared to standard navigation solutions. The results show that multi-sensor-aided SAR images the surroundings with centimeter-level accuracy over typical urban trajectories, confirming its potential for practical applications and leaving room for further improvements.
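To illustrate the kind of multi-sensor fusion the abstract refers to, the sketch below fuses noisy GNSS position fixes with odometer velocity through a minimal linear Kalman filter to estimate the vehicle's along-track position, the quantity SAR focusing depends on. This is not the authors' algorithm; all names, rates and noise levels (`gnss_std`, `odo_std`, the process-noise matrix `Q`) are illustrative assumptions.

```python
import numpy as np

def fuse_gnss_odometer(gnss_pos, odo_vel, dt=0.1, gnss_std=0.5, odo_std=0.05):
    """Fuse GNSS positions [m] and odometer velocities [m/s] into a
    smoothed position track via a constant-velocity Kalman filter.
    Noise parameters are assumed values, not from the paper."""
    x = np.array([gnss_pos[0], odo_vel[0]])   # state: [position, velocity]
    P = np.eye(2)                             # state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity motion model
    Q = np.diag([1e-4, 1e-3])                 # process noise (assumed)
    H = np.eye(2)                             # both states are measured
    R = np.diag([gnss_std**2, odo_std**2])    # measurement noise
    fused = []
    for z in zip(gnss_pos, odo_vel):
        # Predict step: propagate state and covariance through the motion model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: correct with the stacked [GNSS position, odometer velocity].
        y = np.array(z) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        fused.append(x[0])
    return np.array(fused)

# Synthetic example: vehicle at a constant 10 m/s, GNSS noise 0.5 m (1-sigma).
rng = np.random.default_rng(0)
t = np.arange(0.0, 5.0, 0.1)
truth = 10.0 * t
gnss = truth + rng.normal(0.0, 0.5, t.size)
odo = 10.0 + rng.normal(0.0, 0.05, t.size)
fused = fuse_gnss_odometer(gnss, odo)
```

The accurate but drift-free odometer velocity constrains the position increments between GNSS fixes, so the fused track is substantially smoother than GNSS alone; the paper's actual pipeline integrates more sensors (IMU, steering angle) and targets centimeter-level accuracy.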
Keywords
ADAS
Automotive SAR
Environment Mapping
IMU/GNSS Integration
In-car Navigation
Sensor Fusion
Files in this item:
File: RV_2021_IEEEaccess.pdf
Description: Full text (publisher's version)
Access: open access
Size: 7.3 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1164280
Citations
  • Scopus: 38
  • Web of Science (ISI): 25