Multi-cue based moving hand segmentation for gesture recognition



Abstract

This paper proposes a novel moving hand segmentation approach for gesture recognition that uses skin color, grayscale, depth, and motion cues. The approach does not rely on unrealistic restrictions and can handle hand-over-face occlusion. First, an online updated skin color histogram (OUSCH) model is built to robustly represent skin color; second, a motion region of interest (MRoI) is adaptively extracted from the variance of the grayscale and depth optical flow to locate the moving body part (MBP) and reduce the impact of noise; then, Harris-Affine corners that satisfy skin color and adaptive motion constraints are adopted as skin seed points within the MRoI; next, the seed points are grown into a candidate hand region using skin color, depth, and motion criteria; finally, boundary depth gradient, skeleton extraction, and shortest path search are employed to segment the moving hand region from the candidate hand region. Experimental results demonstrate that the proposed approach accurately segments moving hand regions in a variety of situations, especially when the face is occluded by a hand, and that it achieves higher segmentation accuracy than other state-of-the-art approaches.
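As a rough illustration of the pipeline above, the sketch below implements simplified versions of its first four stages in Python with OpenCV. All function names, thresholds, and the exponential histogram update rule are assumptions made here for illustration, not the paper's formulations; plain Harris corners obtained via cv2.goodFeaturesToTrack stand in for the Harris-Affine detector (which OpenCV does not provide), only grayscale optical flow is used for the MRoI, and the final boundary depth gradient / skeleton extraction / shortest path stage is omitted.

```python
from collections import deque

import cv2
import numpy as np


def update_skin_histogram(hsv, skin_mask, hist=None, alpha=0.1):
    """Online-updated skin color histogram (OUSCH) over the H-S plane.

    A running histogram is blended with the histogram of the newest skin
    samples; the exponential blend with rate `alpha` is an assumed update
    rule, not the paper's.
    """
    new_hist = cv2.calcHist([hsv], [0, 1], skin_mask, [30, 32],
                            [0, 180, 0, 256])
    cv2.normalize(new_hist, new_hist, 0, 255, cv2.NORM_MINMAX)
    if hist is None:
        return new_hist
    return ((1.0 - alpha) * hist + alpha * new_hist).astype(np.float32)


def skin_probability(hsv, hist):
    """Backproject the OUSCH model to get a per-pixel skin likelihood (0-255)."""
    return cv2.calcBackProject([hsv], [0, 1], hist, [0, 180, 0, 256], 1)


def motion_roi(prev_gray, gray, k=1.0):
    """Adaptive motion region of interest (MRoI).

    The paper uses variance information from grayscale *and* depth optical
    flow; as a simplification, this version marks pixels whose grayscale flow
    magnitude exceeds mean + k * std and returns their bounding box together
    with the magnitude map.
    """
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)
    ys, xs = np.nonzero(mag > mag.mean() + k * mag.std())
    if len(xs) == 0:
        return None, mag
    return (xs.min(), ys.min(), xs.max(), ys.max()), mag


def skin_seed_points(gray, skin_prob, mag, roi, skin_thr=128, motion_thr=1.0):
    """Corners inside the MRoI satisfying skin color and motion constraints.

    Plain Harris corners stand in for the paper's Harris-Affine seeds.
    """
    mask = np.zeros_like(gray, dtype=np.uint8)
    x0, y0, x1, y1 = roi
    mask[y0:y1 + 1, x0:x1 + 1] = 255
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=200, qualityLevel=0.01,
                                      minDistance=5, mask=mask,
                                      useHarrisDetector=True)
    seeds = []
    if corners is not None:
        for x, y in corners.reshape(-1, 2).astype(int):
            if skin_prob[y, x] >= skin_thr and mag[y, x] >= motion_thr:
                seeds.append((y, x))
    return seeds


def grow_candidate_region(seeds, skin_prob, depth, mag,
                          skin_thr=96, depth_tol=60.0, motion_thr=0.5):
    """Grow the seeds into a candidate hand region (4-connected flood).

    A pixel joins the region if it is skin-like, moving, and within
    `depth_tol` of its already-accepted neighbor; all thresholds here are
    assumed values.
    """
    h, w = skin_prob.shape
    region = np.zeros((h, w), dtype=bool)
    queue = deque(seeds)
    for y, x in seeds:
        region[y, x] = True
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not region[ny, nx]
                    and skin_prob[ny, nx] >= skin_thr
                    and abs(float(depth[ny, nx]) - float(depth[y, x])) <= depth_tol
                    and mag[ny, nx] >= motion_thr):
                region[ny, nx] = True
                queue.append((ny, nx))
    return region
```

In use, a caller would refresh update_skin_histogram each frame from detected skin samples, backproject it onto the current frame, and pass the result together with the flow magnitude and depth map through the remaining stages.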

About the authors

Jia Lin

Faculty of Information Technology

Corresponding author
Email: linjia.bjut@gmail.com
Beijing 100124, People's Republic of China

Xiaogang Ruan

Faculty of Information Technology

Beijing 100124, People's Republic of China

Naigong Yu

Faculty of Information Technology

Beijing 100124, People's Republic of China

Jianxian Cai

Faculty of Information Technology

Beijing 100124, People's Republic of China

Supplementary files

1. JATS XML

Copyright © Allerton Press, Inc., 2017