Depth Estimation with Ego-Motion Assisted Monocular Camera



Abstract

We propose a method to estimate the distance to objects based on the complementary nature of monocular image sequences and camera kinematic parameters. The camera measurements are fused with the kinematic parameters measured by an IMU and an odometer using an extended Kalman filter. Field experiments with a wheeled robot corroborated the simulation results in terms of depth estimation accuracy. The performance of the approach is strongly affected by the relative geometry between the observer and the feature point, the measurement accuracy of the observer's motion parameters, and the distance covered by the observer. It was found that, under favorable conditions, the error in distance estimation can be as small as 1% of the distance to a feature point. This approach can be used to estimate the distance to objects located hundreds of meters away from the camera.
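To illustrate the kind of fusion described in the abstract, the sketch below shows a planar, bearing-only extended Kalman filter that refines the position of a static feature point as the observer moves, with the observer poses assumed known from IMU/odometer dead reckoning. This is a minimal illustrative example, not the authors' implementation: the 2-D simplification, the noise values, and all function and variable names are assumptions made for the sketch.

```python
# Minimal sketch (assumed, not the paper's code): EKF estimation of a static
# feature point's 2-D position from bearing-only camera measurements, with the
# observer poses taken as known from IMU/odometer ego-motion.
import numpy as np

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def ekf_bearing_update(x, P, pose, z, R):
    """One EKF measurement update.
    x, P  : feature position estimate [px, py] and its covariance
    pose  : observer pose (cx, cy, heading), assumed known from ego-motion sensors
    z     : measured bearing to the feature in the camera frame (rad)
    R     : bearing measurement noise variance (rad^2)
    """
    cx, cy, th = pose
    dx, dy = x[0] - cx, x[1] - cy
    r2 = dx * dx + dy * dy
    z_pred = wrap(np.arctan2(dy, dx) - th)      # predicted bearing
    H = np.array([[-dy / r2, dx / r2]])         # Jacobian of the bearing w.r.t. [px, py]
    S = H @ P @ H.T + R                         # innovation covariance
    K = P @ H.T / S                             # Kalman gain (scalar innovation)
    x = x + (K * wrap(z - z_pred)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Toy run: the observer translates along the x-axis and observes a point at (30, 10) m.
rng = np.random.default_rng(0)
true_pt = np.array([30.0, 10.0])
x, P = np.array([20.0, 0.0]), np.diag([100.0, 100.0])   # crude prior on the feature point
R = np.deg2rad(0.2) ** 2                                 # ~0.2 deg bearing noise (assumed)
for k in range(50):
    pose = (0.5 * k, 0.0, 0.0)                           # pose from IMU/odometer (noise-free here)
    z = np.arctan2(true_pt[1] - pose[1], true_pt[0] - pose[0]) - pose[2]
    z += rng.normal(0.0, np.sqrt(R))
    x, P = ekf_bearing_update(x, P, pose, z, R)
print("estimated point:", x, " position error [m]:", np.linalg.norm(x - true_pt))
```

As in the paper's findings, the depth estimate in such a filter only becomes observable once the observer's motion provides sufficient baseline relative to the feature point, which is why the accuracy depends on the traversed distance and the observer-point geometry.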

About the authors

M. Mansour

Faculty of Information Technology and Communication Sciences, Tampere University; Department of Information and Navigation Systems, ITMO University

Author for correspondence.
Email: mostafa.mansour@tuni.fi
Finland, Tampere; St. Petersburg

P. Davidson

Faculty of Information Technology and Communication Sciences, Tampere University

Email: mostafa.mansour@tuni.fi
Finland, Tampere

O. Stepanov

Department of Information and Navigation Systems, ITMO University

Email: mostafa.mansour@tuni.fi
Russian Federation, St. Petersburg

J.-P. Raunio

AAC Technologies

Email: mostafa.mansour@tuni.fi
Finland, Tampere

M. M. Aref

Faculty of Engineering and Natural Sciences, Tampere University

Email: mostafa.mansour@tuni.fi
Finland, Tampere

R. Piché

Faculty of Information Technology and Communication Sciences, Tampere University

Email: mostafa.mansour@tuni.fi
Finland, Tampere


Copyright (c) 2019 Pleiades Publishing, Ltd.
