Multi-layer occupancy grid mapping for autonomous vehicles navigation
Mentasti, S.; Matteucci, M.
2019-01-01
Abstract
Perception of the surrounding environment is a crucial task in most autonomous driving scenarios. For this reason, most vehicles are equipped with a broad range of sensors, such as lidar, radar, cameras, and ultrasound, to sense the space around the car. On the other hand, planning algorithms need a simple and usable representation of the surrounding obstacles. One of the biggest drawbacks of such a wide range of sensors is the need to resolve conflicting information and identify false positives. In this paper, we propose an effective framework for sensor fusion and occupancy grid creation, capable of providing a uniform representation of the environment around the vehicle and of handling conflicting information from different sensors.

File | Size | Format
---|---|---
08804556.pdf (open access, Publisher's version) | 15.88 MB | Adobe PDF
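The abstract describes fusing conflicting sensor evidence into a single occupancy grid. A minimal sketch of the standard way such conflicts are reconciled (not the paper's specific framework) is a Bayesian log-odds update, where each sensor's occupancy probability for a cell is converted to log-odds and accumulated; the measurement probabilities below are hypothetical:

```python
import numpy as np

def prob_to_logodds(p):
    # Convert an occupancy probability to log-odds.
    return np.log(p / (1.0 - p))

def logodds_to_prob(l):
    # Convert log-odds back to an occupancy probability.
    return 1.0 - 1.0 / (1.0 + np.exp(l))

# Log-odds occupancy grid; 0 means unknown (probability 0.5).
grid = np.zeros((4, 4))

# Hypothetical conflicting measurements for cell (1, 2):
# lidar reports occupied with p = 0.9, radar reports free with p = 0.3.
for p_meas in (0.9, 0.3):
    grid[1, 2] += prob_to_logodds(p_meas)

# The fused belief leans toward "occupied" but is tempered by the radar.
fused = logodds_to_prob(grid[1, 2])
```

Because updates are additive in log-odds space, sensors can be fused in any order, and a disagreeing sensor simply pulls the cell's belief back toward 0.5 rather than overwriting it.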
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.