Efficiency analysis of information theoretic measures in image registration
- Authors: Voronov S.V., Tashlinskii A.G.
- Affiliations: Ulyanovsk State Technical University
- Issue: Vol 26, No 3 (2016)
- Pages: 502-505
- Section: Representation, Processing, Analysis, and Understanding of Images
- URL: https://journals.rcsi.science/1054-6618/article/view/194795
- DOI: https://doi.org/10.1134/S1054661816030226
- ID: 194795
Abstract
An efficiency analysis of information-theoretic measures that can be used as objective functions in image registration is carried out. Shannon mutual information and the Renyi and Tsallis entropies are examined on synthesized images whose correlation function, intensity distribution, and noise distribution are close to Gaussian. The results show that the Renyi entropy potentially provides a faster convergence rate and a lower variance of the parameter estimates when recurrent image registration algorithms are used. By these criteria, the Tsallis entropy gives slightly worse results but has a larger effective range. Shannon mutual information is inferior to both entropy measures and is more sensitive to noise; nevertheless, it is the most efficient of the three in terms of computational complexity.
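As an illustration of the measures compared in the abstract, the sketch below estimates Shannon mutual information and the Renyi and Tsallis entropies from the joint intensity histogram of two images. The histogram-based estimators, the bin count, the orders alpha and q, and the synthesized test images are assumptions made only for this example; they are not the specific objective functions or parameter settings used in the paper.

```python
import numpy as np

def joint_histogram(img1, img2, bins=64):
    """Normalized joint intensity histogram of two equally sized images."""
    hist, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
    return hist / hist.sum()

def shannon_mutual_information(p_xy, eps=1e-12):
    """Shannon mutual information I(X;Y) computed from a joint distribution."""
    p_x = p_xy.sum(axis=1, keepdims=True)          # marginal of the first image
    p_y = p_xy.sum(axis=0, keepdims=True)          # marginal of the second image
    ratio = p_xy / (p_x @ p_y + eps)
    return float(np.sum(p_xy * np.log(ratio + eps)))

def renyi_entropy(p, alpha=2.0, eps=1e-12):
    """Renyi entropy of order alpha (alpha != 1) of a discrete distribution."""
    p = p[p > eps]
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

def tsallis_entropy(p, q=2.0, eps=1e-12):
    """Tsallis entropy of order q (q != 1) of a discrete distribution."""
    p = p[p > eps]
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

# Hypothetical usage: two correlated images with near-Gaussian noise.
rng = np.random.default_rng(0)
base = rng.normal(size=(128, 128))
shifted = np.roll(base, shift=2, axis=1) + 0.1 * rng.normal(size=(128, 128))

p_xy = joint_histogram(base, shifted)
print("Shannon MI:     ", shannon_mutual_information(p_xy))
print("Renyi joint H:  ", renyi_entropy(p_xy.ravel(), alpha=2.0))
print("Tsallis joint H:", tsallis_entropy(p_xy.ravel(), q=2.0))
```

In a registration loop, one of these quantities would be evaluated (or its gradient estimated) at each iteration as the geometric transformation parameters are updated; the example only shows how the measures themselves can be computed from a pair of images.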
About the authors
S. V. Voronov
Ulyanovsk State Technical University
Author for correspondence.
Email: s.voronov@ulstu.ru
Russian Federation, ul. Severnyi Venets 32, Ulyanovsk, 432027
A. G. Tashlinskii
Ulyanovsk State Technical University
Email: s.voronov@ulstu.ru
Russian Federation, ul. Severnyi Venets 32, Ulyanovsk, 432027