Monocular vision-based range estimation supported by proprioceptive motion
- Authors: Davidson P.¹, Raunio J.¹, Piché R.¹
- Affiliations:
  - ¹ Tampere University of Technology
- Issue: Vol. 8, No. 2 (2017)
- Pages: 150-158
- Section: Article
- URL: https://journals.rcsi.science/2075-1087/article/view/204088
- DOI: https://doi.org/10.1134/S2075108717020043
- ID: 204088
Abstract
This paper describes an approach for fusing monocular vision measurements with camera motion, odometer, and inertial rate sensor measurements. The motion of the camera between successive images generates a baseline for range computation by triangulation. The recursive estimation algorithm is based on extended Kalman filtering. The depth estimation accuracy is strongly affected by the mutual observer and feature point geometry, the measurement accuracy of the observer motion parameters, and the line of sight to a feature point. The simulation study investigates how the estimation accuracy is affected by the following parameters: linear and angular velocity measurement errors, camera noise, and observer path. These results impose requirements on the instrumentation and observation scenarios. It was found that under favorable conditions the error in distance estimation does not exceed 2% of the distance to a feature point.
About the authors
P. Davidson
Tampere University of Technology
Corresponding author.
Email: pavel.davidson@tut.fi
Finland, Tampere
J.-P. Raunio
Tampere University of Technology
Email: pavel.davidson@tut.fi
Finland, Tampere
R. Piché
Tampere University of Technology
Email: pavel.davidson@tut.fi
Finland, Tampere