Monocular vision-based range estimation supported by proprioceptive motion



Abstract

This paper describes an approach for fusing monocular vision measurements with camera motion, odometer, and inertial rate sensor measurements. The motion of the camera between successive images provides a baseline for range computation by triangulation. The recursive estimation algorithm is based on extended Kalman filtering. Depth estimation accuracy is strongly affected by the relative geometry of the observer and the feature point, by the measurement accuracy of the observer's motion parameters, and by the accuracy of the line of sight to the feature point. A simulation study investigates how the estimation accuracy is affected by linear and angular velocity measurement errors, camera noise, and the observer path. These results impose requirements on the instrumentation and observation scenarios. Under favorable conditions, the distance estimation error was found not to exceed 2% of the distance to the feature point.
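The sketch below illustrates the general idea described in the abstract, not the authors' implementation: a small extended Kalman filter that estimates the position (and hence range) of a stationary feature point from bearing-only camera measurements taken along an observer path known from proprioceptive motion (e.g., integrated odometry). All numerical values, function names, and the 2-D simplification are assumptions made for illustration.

```python
import numpy as np

def ekf_bearing_only(observer_path, bearings, x0, P0, R, Q=None):
    """Recursively estimate a static 2-D feature position from bearings.

    observer_path : (N, 2) camera positions from integrated odometry
    bearings      : (N,) measured line-of-sight angles [rad]
    x0, P0        : initial state estimate and covariance
    R             : bearing measurement variance [rad^2]
    Q             : small process noise keeping the filter responsive
    """
    x, P = x0.copy(), P0.copy()
    Q = np.zeros((2, 2)) if Q is None else Q
    for p, z in zip(observer_path, bearings):
        # Prediction: the feature is stationary, only the covariance inflates.
        P = P + Q
        # Measurement model: z = atan2(y_f - p_y, x_f - p_x) + noise
        dx, dy = x[0] - p[0], x[1] - p[1]
        r2 = dx**2 + dy**2
        z_pred = np.arctan2(dy, dx)
        H = np.array([[-dy / r2, dx / r2]])                          # Jacobian
        innov = np.arctan2(np.sin(z - z_pred), np.cos(z - z_pred))   # angle wrap
        S = H @ P @ H.T + R
        K = P @ H.T / S
        x = x + (K * innov).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Illustrative run: the observer moves along the x-axis (creating a baseline),
# while the true feature sits at (10, 20); camera noise is roughly 0.1 degree.
rng = np.random.default_rng(0)
feature = np.array([10.0, 20.0])
path = np.column_stack([np.linspace(0.0, 5.0, 50), np.zeros(50)])
true_bearings = np.arctan2(feature[1] - path[:, 1], feature[0] - path[:, 0])
meas = true_bearings + rng.normal(0.0, 0.002, size=50)
est, cov = ekf_bearing_only(path, meas, x0=np.array([5.0, 10.0]),
                            P0=np.diag([25.0, 25.0]), R=0.002**2)
print("estimated feature:", est)
print("range error [m]:",
      np.linalg.norm(est - path[-1]) - np.linalg.norm(feature - path[-1]))
```

In this toy setup the estimation error shrinks as the baseline grows and degrades as bearing noise or odometry error increases, which mirrors the sensitivity analysis summarized in the abstract.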

About the authors

P. Davidson

Tampere University of Technology

Corresponding author.
Email: pavel.davidson@tut.fi
Tampere, Finland

J.-P. Raunio

Tampere University of Technology

Email: pavel.davidson@tut.fi
Tampere, Finland

R. Piché

Tampere University of Technology

Email: pavel.davidson@tut.fi
Tampere, Finland


© Pleiades Publishing, Ltd., 2017
