Efficiency analysis of information theoretic measures in image registration



Abstract

An efficiency analysis is carried out for several information theoretic measures that can be used as objective functions in image registration. Shannon mutual information and the Rényi and Tsallis entropies are examined on synthesized images whose correlation function, intensity distribution, and noise distribution are close to Gaussian. The results show that Rényi entropy potentially provides a faster convergence rate and a lower variance of parameter estimates when used with recurrent image registration algorithms. By these criteria, Tsallis entropy gives slightly worse results but has a larger effective range. Shannon mutual information is inferior to both entropy measures and is also more sensitive to noise; nevertheless, it is the most efficient in terms of computational complexity.
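
As a rough illustration of the measures compared in the paper, the sketch below estimates Shannon mutual information and Rényi/Tsallis joint entropies from a joint intensity histogram of two images. This is a minimal sketch, not the authors' implementation: the histogram binning, the entropy orders (alpha = q = 2), and the synthetic Gaussian test images are illustrative assumptions, and a full registration procedure would optimize such a measure over geometric transformation parameters, which is omitted here.

```python
import numpy as np

def joint_histogram(img1, img2, bins=64):
    """Joint intensity histogram of two same-size images, normalized to a joint pdf."""
    hist, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
    return hist / hist.sum()

def shannon_mutual_information(p_xy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), computed from the joint pdf."""
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)
    def h(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    return h(p_x) + h(p_y) - h(p_xy)

def renyi_joint_entropy(p_xy, alpha=2.0):
    """Rényi entropy of order alpha (alpha != 1) of the joint distribution."""
    return np.log(np.sum(p_xy ** alpha)) / (1.0 - alpha)

def tsallis_joint_entropy(p_xy, q=2.0):
    """Tsallis entropy of order q (q != 1) of the joint distribution."""
    return (1.0 - np.sum(p_xy ** q)) / (q - 1.0)

# Illustrative check on synthetic Gaussian images: a base image and a noisy copy
# (stand-ins for the reference and deformed images of a registration problem).
rng = np.random.default_rng(0)
base = rng.normal(size=(128, 128))
observed = base + 0.3 * rng.normal(size=(128, 128))
p = joint_histogram(base, observed)
print("Shannon MI:", shannon_mutual_information(p))
print("Rényi joint entropy (alpha=2):", renyi_joint_entropy(p))
print("Tsallis joint entropy (q=2):", tsallis_joint_entropy(p))
```

In a registration setting, mutual information would typically be maximized and the joint entropies minimized as the transformation parameters are adjusted; the relative convergence behavior of these criteria is what the paper analyzes.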

About the authors

S. V. Voronov

Ulyanovsk State Technical University

Author for correspondence.
Email: s.voronov@ulstu.ru
Russian Federation, ul. Severnyi Venets 32, Ulyanovsk, 432027

A. G. Tashlinskii

Ulyanovsk State Technical University

Email: s.voronov@ulstu.ru
Russian Federation, ul. Severnyi Venets 32, Ulyanovsk, 432027


Copyright (c) 2016 Pleiades Publishing, Ltd.