Application of Superpixels to Segment Several Landmarks in Running Rodents



Abstract

Examining locomotion has improved our basic understanding of motor control and aided in treating motor impairment. Mice and rats are the model systems of choice for basic neuroscience studies of human disease. Because of their high stride frequency, high frame rates are needed to quantify the kinematics of running rodents, and manual tracking, especially of multiple body landmarks, becomes extremely time-consuming. To overcome these limitations, we proposed superpixel-based image segmentation, as superpixels exploit both spatial and color information. We segmented several parts of the body and evaluated segmentation success as a function of color space and SLIC segment size. A simple merging function connected segmented regions that were neighbors and fell within the same intensity range. In addition, 28 features were extracted, and t-SNE was used to demonstrate how well the methods differentiate the regions. Finally, we compared the segmented regions to manually outlined regions. For segmentation, the RGB image performed slightly better than the hue channel; for merging and classification, however, the hue representation was better, as it captures the relevant color information in a single channel.
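
The abstract does not include code; as a rough illustration of the pipeline it describes (SLIC over-segmentation, merging of neighboring regions with similar hue, and a t-SNE embedding of per-region features), a minimal sketch in Python with scikit-image and scikit-learn might look like the following. The file name, parameter values (n_segments, compactness, the hue threshold), and the stand-in features are assumptions for illustration, not the authors' implementation or their 28-feature set.

```python
import numpy as np
from skimage import color, graph, io
from skimage.measure import regionprops
from skimage.segmentation import slic
from sklearn.manifold import TSNE

# Hypothetical high-speed video frame of a running rodent (RGB).
frame = io.imread("frame.png")

# 1. Over-segment into SLIC superpixels. Segment size is controlled
#    indirectly via n_segments; the paper sweeps this parameter, and
#    400 is an arbitrary illustrative value.
labels = slic(frame, n_segments=400, compactness=10, start_label=0)

# 2. The abstract reports that hue captures the relevant color
#    information in a single channel, so describe each superpixel by
#    its mean hue.
hue = color.rgb2hsv(frame)[..., 0]

# 3. Merge neighboring superpixels with similar mean hue via a region
#    adjacency graph (skimage >= 0.20; earlier versions expose this
#    under skimage.future.graph). The 0.05 threshold is assumed.
rag = graph.rag_mean_color(color.gray2rgb(hue), labels)
merged = graph.cut_threshold(labels, rag, thresh=0.05)

# 4. Extract per-region features and embed them with t-SNE to
#    visualize how separable the regions are. Centroid plus mean hue
#    are stand-ins for the paper's 28 features, which the abstract
#    does not list.
feats = np.array([
    [r.centroid[0], r.centroid[1],
     hue[r.coords[:, 0], r.coords[:, 1]].mean()]
    for r in regionprops(merged + 1)  # +1: regionprops ignores label 0
])
embedding = TSNE(n_components=2, perplexity=5.0).fit_transform(feats)
```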

About the authors

O. Maghsoudi

Spence Laboratory, Bioengineering, College of Engineering

Corresponding author.
Email: o.maghsoudi@temple.edu
Philadelphia, PA, 19122, USA

A. Vahedipour

Spence Laboratory, Bioengineering, College of Engineering

Email: o.maghsoudi@temple.edu
Philadelphia, PA, 19122, USA

B. Robertson

Spence Laboratory, Bioengineering, College of Engineering

Email: o.maghsoudi@temple.edu
Philadelphia, PA, 19122, USA

A. Spence

Spence Laboratory, Bioengineering, College of Engineering

Email: o.maghsoudi@temple.edu
Philadelphia, PA, 19122, USA


版权所有 © Pleiades Publishing, Ltd., 2018