An Obstacle Detection Method for Visually Impaired Persons by Ground Plane Removal Using Speeded-Up Robust Features and Gray Level Co-Occurrence Matrix



Abstract

The rapid growth in the density of pedestrians and vehicles on the roads has made life very difficult for visually impaired people. In this direction, we present the design of a cost-effective, smartphone-based system that guides visually impaired people to walk safely on the roads by detecting obstacles in real-time scenarios. A monocular vision-based method is used to capture the video, and frames are extracted from it after removing the blurriness caused by camera motion. For each frame, a computationally simple approach is proposed for detecting and removing the ground plane. After the ground plane is removed, Speeded-Up Robust Features (SURF) of the non-ground area are computed and compared with the features of known obstacles. An active contour model is used to segment the area of the non-ground image whose SURF features match the obstacle features; this area is referred to as the Region of Interest (ROI). To check whether the ROI belongs to an obstacle, Gray Level Co-occurrence Matrix (GLCM) features are calculated and passed to a classification model. Classification results show that the system is able to efficiently detect obstacles known to the system in near real-time.
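For illustration, the following minimal Python sketch shows how the SURF matching and GLCM texture steps of such a pipeline could be realized with OpenCV (contrib build) and scikit-image. The function names, thresholds, and parameter values are assumptions made for this sketch, not the authors' implementation.

import cv2
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def surf_match_count(frame_gray, obstacle_gray, hessian=400, ratio=0.75):
    # Detect and describe SURF keypoints in the non-ground region and an obstacle template.
    # (hessian threshold and ratio are illustrative values.)
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian)
    kp1, des1 = surf.detectAndCompute(frame_gray, None)
    kp2, des2 = surf.detectAndCompute(obstacle_gray, None)
    if des1 is None or des2 is None:
        return 0
    # Brute-force matching with Lowe's ratio test to keep only distinctive matches.
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [p[0] for p in matches if len(p) == 2 and p[0].distance < ratio * p[1].distance]
    return len(good)

def glcm_features(roi_gray):
    # Texture descriptors of a candidate ROI (8-bit grayscale), to be fed to a classifier.
    glcm = graycomatrix(roi_gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return [graycoprops(glcm, p).mean() for p in props]

In this sketch, a non-ground frame region would first be compared against stored obstacle templates with surf_match_count, and the GLCM feature vector of the segmented ROI would then be passed to a trained classification model to decide whether it is an obstacle.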

About the authors

A. Jindal

Computer Science and Engineering Department, University Institute of Engineering and Technology

Primary contact for editorial correspondence.
Email: anishjindal90@gmail.com
Chandigarh, 160014, India

N. Aggarwal

Computer Science and Engineering Department, University Institute of Engineering and Technology

Chandigarh, 160014, India

S. Gupta

Computer Science and Engineering Department, University Institute of Engineering and Technology

Chandigarh, 160014, India


版权所有 © Pleiades Publishing, Ltd., 2018