Monocular vision-based range estimation supported by proprioceptive motion
- Authors: P. Davidson, J.-P. Raunio, R. Piché
- Affiliation: Tampere University of Technology
- Issue: Volume 8, No. 2 (2017)
- Pages: 150-158
- Section: Article
- URL: https://journals.rcsi.science/2075-1087/article/view/204088
- DOI: https://doi.org/10.1134/S2075108717020043
- ID: 204088
Abstract
This paper describes an approach for fusing monocular vision measurements with camera motion, odometer, and inertial rate sensor measurements. The motion of the camera between successive images provides a baseline for range computation by triangulation. The recursive estimation algorithm is based on extended Kalman filtering. Depth estimation accuracy is strongly affected by the mutual geometry of observer and feature point, the measurement accuracy of the observer motion parameters, and the line of sight to the feature point. The simulation study investigates how estimation accuracy depends on linear and angular velocity measurement errors, camera noise, and observer path. These results impose requirements on the instrumentation and observation scenarios. It was found that under favorable conditions the error in distance estimation does not exceed 2% of the distance to the feature point.
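The core idea of the abstract (triangulating range from bearings taken along a known camera path, estimated recursively with an extended Kalman filter) can be sketched as follows. This is a hypothetical, simplified 2-D bearing-only EKF, not the authors' full filter: it assumes the camera positions are known exactly, whereas the paper fuses odometer and inertial rate measurements to obtain the motion baseline.

```python
import numpy as np

def ekf_range_from_bearings(cam_positions, bearings, x0, P0, R):
    """Estimate a static 2-D feature point from bearing-only measurements
    taken at known camera positions (the moving baseline).

    cam_positions : list of camera (x, y) positions along the path
    bearings      : measured bearing (rad) to the feature at each position
    x0, P0        : initial feature-position estimate and covariance
    R             : 1x1 bearing measurement noise covariance (rad^2)
    """
    x, P = x0.copy(), P0.copy()
    for c, z in zip(cam_positions, bearings):
        # The landmark is static, so the prediction step is the identity
        # (no process noise in this simplified sketch).
        dx, dy = x[0] - c[0], x[1] - c[1]
        r2 = dx**2 + dy**2
        h = np.arctan2(dy, dx)                 # predicted bearing
        H = np.array([[-dy / r2, dx / r2]])    # Jacobian of h wrt landmark
        # Wrap the innovation into (-pi, pi] to avoid angle discontinuities.
        y = np.arctan2(np.sin(z - h), np.cos(z - h))
        S = H @ P @ H.T + R                    # innovation covariance (1x1)
        K = P @ H.T / S                        # Kalman gain (2x1)
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P
```

With noiseless bearings and a camera translating past the feature, the estimate converges toward the true point, and the estimated range emerges from the triangulation geometry; accuracy degrades as the baseline shrinks relative to the range, consistent with the geometry dependence noted in the abstract.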
About the authors
P. Davidson
Tampere University of Technology
Corresponding author
Email: pavel.davidson@tut.fi
Tampere, Finland
J.-P. Raunio
Tampere University of Technology
Email: pavel.davidson@tut.fi
Tampere, Finland
R. Piché
Tampere University of Technology
Email: pavel.davidson@tut.fi
Tampere, Finland