Vision-Augmented Inertial Navigation by Sensor Fusion for an Autonomous Rotorcraft Vehicle
Bottasso, Carlo Luigi; Leonello, Domenico
2009-01-01
Abstract
We describe a novel inertial navigation system based on measurement fusion which includes stereo vision among its sensors. The vision-augmented system provides enhanced accuracy in the estimation of the vehicle states when flying in proximity of obstacles, and can operate without a GPS signal, for example when flying under vegetation cover, indoors, or in complex urban environments. Scene feature points are tracked between the left and right images and across time steps, yielding vision-based information on the state of motion of the vehicle, which is fused with the measurements of the other, non-vision-based sensors. The proposed approach is demonstrated in simulation for an autonomous helicopter flying in an urban environment.
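As a rough illustration of the kind of processing the abstract describes, the sketch below triangulates a tracked feature point from a rectified stereo pair and fuses the resulting vision-based observation with an inertially propagated prediction through a standard Kalman measurement update. This is not the authors' implementation; the focal length, baseline, state definition, and noise covariances are illustrative assumptions only.

```python
# Minimal sketch (assumed parameters, not the paper's implementation):
# stereo triangulation of a tracked feature plus a Kalman measurement update
# fusing the vision-based observation with an inertial prediction.
import numpy as np

def triangulate(u_left, u_right, v, f=500.0, baseline=0.12):
    """Position of a feature in the camera frame from a rectified stereo pair.

    f        -- focal length in pixels (assumed value)
    baseline -- distance between the camera centres in metres (assumed value)
    """
    disparity = u_left - u_right          # horizontal pixel disparity
    z = f * baseline / disparity          # depth along the optical axis
    x = z * u_left / f                    # lateral offset
    y = z * v / f                         # vertical offset
    return np.array([x, y, z])

def kalman_update(x_pred, P_pred, z_meas, H, R):
    """Standard Kalman measurement update for a vision-based observation."""
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_upd = x_pred + K @ (z_meas - H @ x_pred)
    P_upd = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_upd, P_upd

if __name__ == "__main__":
    # Feature position in the camera frame, as predicted by propagating the
    # previous estimate with the inertial measurements (assumed state/values).
    x_pred = np.array([0.50, 0.10, 5.80])
    P_pred = np.diag([0.20, 0.20, 0.40])

    # Vision measurement: the same feature triangulated from its projections
    # in the current left and right images (pixel coordinates are made up).
    z_meas = triangulate(u_left=40.0, u_right=30.0, v=10.0)
    H = np.eye(3)                         # the measurement observes the state directly
    R = np.diag([0.05, 0.05, 0.10])       # assumed vision measurement noise

    x_upd, P_upd = kalman_update(x_pred, P_pred, z_meas, H, R)
    print("fused feature position estimate:", x_upd)
```

In the actual system many such features would be tracked simultaneously, both between the left and right images and across time steps, and the resulting observations would be fused with the non-vision sensors within the full navigation filter.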
File | Description | Size | Format | Access
---|---|---|---|---
BOTTC17-09.pdf | Other attached material | 2.78 MB | Adobe PDF | Restricted access
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.