Depth Estimation with Ego-Motion Assisted Monocular Camera


Abstract

We propose a method to estimate the distance to objects based on the complementary nature of monocular image sequences and camera kinematic parameters. The camera measurements are fused with the kinematic parameters measured by an IMU and an odometer using an extended Kalman filter. Results of field experiments with a wheeled robot corroborated the results of the simulation study in terms of depth estimation accuracy. The performance of the approach is strongly affected by the mutual geometry of the observer and the feature point, the measurement accuracy of the observer's motion parameters, and the distance covered by the observer. It was found that under favorable conditions the error in distance estimation can be as small as 1% of the distance to a feature point. The approach can be used to estimate the distance to objects located hundreds of meters away from the camera.
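
To illustrate the kind of fusion described above, here is a minimal sketch, in Python, of bearing-only depth estimation with an extended Kalman filter. It is not the authors' implementation: a static feature position on a 2-D ground plane is estimated from noisy bearing measurements taken along a camera trajectory assumed known from IMU/odometer dead reckoning, and all function names and parameter values are illustrative assumptions.

import numpy as np

def ekf_depth_from_bearings(cam_positions, bearings, x0, P0, R=1e-4):
    # EKF estimating a static 2-D feature position (lateral x, depth z)
    # from bearing measurements taken along a known camera trajectory.
    #   cam_positions : (N, 2) camera positions from dead reckoning
    #   bearings      : (N,) bearing angles to the feature, radians
    #   x0, P0        : initial feature estimate and covariance
    #   R             : bearing measurement noise variance, rad^2
    x, P = x0.astype(float), P0.astype(float)
    for c, z in zip(cam_positions, bearings):
        dx, dz = x[0] - c[0], x[1] - c[1]
        r2 = dx**2 + dz**2
        h = np.arctan2(dx, dz)                  # predicted bearing
        H = np.array([[dz / r2, -dx / r2]])     # Jacobian of h w.r.t. (x, z)
        S = H @ P @ H.T + R                     # innovation covariance (1x1)
        K = P @ H.T / S                         # Kalman gain
        innov = np.arctan2(np.sin(z - h), np.cos(z - h))  # wrapped residual
        x = x + (K * innov).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Illustrative run: feature 50 m ahead, 10 m lateral baseline,
# ~0.6 deg bearing noise (all values are assumptions).
rng = np.random.default_rng(0)
feat = np.array([5.0, 50.0])
cams = np.column_stack([np.linspace(0.0, 10.0, 50), np.zeros(50)])
meas = np.arctan2(feat[0] - cams[:, 0], feat[1] - cams[:, 1])
meas += rng.normal(0.0, 0.01, size=meas.shape)
x_hat, P_hat = ekf_depth_from_bearings(
    cams, meas, x0=np.array([0.0, 30.0]), P0=np.diag([100.0, 1000.0]))
print(x_hat)  # expected to approach (5, 50)

In this sketch the depth component is only weakly observable until the camera has covered some baseline, which is consistent with the abstract's remark that accuracy depends on the mutual observer-feature geometry and the distance covered by the observer.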

About the authors

M. Mansour

Faculty of Information Technology and Communication Sciences, Tampere University; Department of Information and Navigation Systems, ITMO University

Corresponding author.
Email: mostafa.mansour@tuni.fi
Tampere, Finland; St. Petersburg, Russia

P. Davidson

Faculty of Information Technology and Communication Sciences, Tampere University

Tampere, Finland

O. Stepanov

Department of Information and Navigation Systems, ITMO University

St. Petersburg, Russia

J.-P. Raunio

AAC Technologies

Tampere, Finland

M. Aref

Faculty of Engineering and Natural Sciences, Tampere University

Tampere, Finland

R. Piché

Faculty of Information Technology and Communication Sciences, Tampere University

Tampere, Finland


© Pleiades Publishing, Ltd., 2019
