Efficiency analysis of information theoretic measures in image registration



An efficiency analysis of information-theoretic measures that can be used as objective functions in image registration is carried out. Shannon mutual information and the Rényi and Tsallis entropies are examined on synthesized images whose correlation function, intensity, and noise distributions are close to Gaussian. The results show that Rényi entropy potentially provides a faster convergence rate and a lower variance of parameter estimates when recurrent image registration algorithms are used. By these criteria, Tsallis entropy gives slightly worse results but has a larger effective range. Shannon mutual information is inferior to both entropy measures and is more sensitive to noise; nevertheless, it is more efficient in terms of computational complexity.
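As an illustration only (not taken from the paper), the sketch below shows one common way to estimate these measures from a joint intensity histogram of a reference and a deformed image. The bin count and the entropy orders alpha and q are arbitrary illustrative choices; the exact objective-function forms and their use within the recurrent (stochastic) registration procedures are specified in the full text.

```python
import numpy as np

def joint_histogram(img_a, img_b, bins=64):
    """Joint intensity histogram of two equally sized images, normalized to a pmf."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    return hist / hist.sum()

def shannon_mutual_information(p_ab):
    """Shannon MI: I(A;B) = H(A) + H(B) - H(A,B), computed from the joint pmf."""
    p_a = p_ab.sum(axis=1)
    p_b = p_ab.sum(axis=0)
    h = lambda p: -np.sum(p[p > 0] * np.log(p[p > 0]))
    return h(p_a) + h(p_b) - h(p_ab)

def renyi_entropy(p_ab, alpha=2.0):
    """Rényi entropy of order alpha (alpha != 1) of the joint pmf."""
    return np.log(np.sum(p_ab ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p_ab, q=2.0):
    """Tsallis entropy of order q (q != 1) of the joint pmf."""
    return (1.0 - np.sum(p_ab ** q)) / (q - 1.0)
```

In a registration loop these quantities would be recomputed at each parameter update: mutual information is maximized, while the joint Rényi or Tsallis entropy is typically minimized as the images come into alignment.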

About the authors

S. Voronov

Ulyanovsk State Technical University

Author for correspondence.
Email: s.voronov@ulstu.ru
Russian Federation, ul. Severnyi Venets 32, Ulyanovsk, 432027

A. Tashlinskii

Ulyanovsk State Technical University

Email: s.voronov@ulstu.ru
Russian Federation, ul. Severnyi Venets 32, Ulyanovsk, 432027


Copyright © Pleiades Publishing, Ltd., 2016