On some extremal problems for mutual information and entropy


Abstract

The problem of determining the maximum mutual information I(X; Y) and the minimum entropy H(X, Y) of a pair of discrete random variables X and Y is considered under the condition that the probability distribution of X is fixed and the error probability Pr{Y ≠ X} takes a given value ε, 0 ≤ ε ≤ 1. Precise values of these extrema are found, which in several cases allows explicit formulas to be obtained for both the maximum mutual information and the minimum entropy in terms of the probability distribution of X and the parameter ε.
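For small alphabets, the extremal problem stated in the abstract can be explored numerically. The sketch below (Python with NumPy/SciPy) maximizes I(X; Y) over joint distributions p(x, y) with the marginal of X fixed and Pr{Y ≠ X} = ε; the alphabet size, the distribution p_X, and the value of ε are illustrative assumptions, not values from the paper, which derives these extrema in closed form rather than numerically.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative assumptions (not values from the paper):
n = 3                              # alphabet size of X and Y
p_x = np.array([0.5, 0.3, 0.2])    # fixed probability distribution of X
eps = 0.2                          # prescribed error probability Pr{Y != X}

def entropy(p):
    """Shannon entropy in bits, with the convention 0 * log 0 = 0."""
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint matrix joint[x, y]."""
    return (entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0))
            - entropy(joint.ravel()))

off_diag = ~np.eye(n, dtype=bool)

constraints = [
    # Marginal of X is fixed: sum_y p(x, y) = p_X(x) for every x.
    {"type": "eq", "fun": lambda v: v.reshape(n, n).sum(axis=1) - p_x},
    # Error probability: sum over x != y of p(x, y) equals eps.
    {"type": "eq", "fun": lambda v: v.reshape(n, n)[off_diag].sum() - eps},
]

# Feasible starting point: mass 1 - eps on the diagonal, with eps spread
# over the off-diagonal entries of each row proportionally to p_X.
p0 = ((1 - eps) * np.diag(p_x)
      + (eps / (n - 1)) * np.outer(p_x, np.ones(n)) * off_diag)

res = minimize(lambda v: -mutual_information(v.reshape(n, n)),
               p0.ravel(), method="SLSQP",
               bounds=[(0.0, 1.0)] * (n * n), constraints=constraints)

joint = res.x.reshape(n, n)
print("max I(X;Y) ~ %.4f bits" % -res.fun)
print("H(X,Y) at this maximizer ~ %.4f bits" % entropy(joint.ravel()))
```

The minimum-entropy problem from the abstract is the analogous program with the objective replaced by the joint entropy H(X, Y), i.e., minimizing entropy(v) over the same constraint set.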

About the authors

V. Prelov

Kharkevich Institute for Information Transmission Problems

Corresponding author
Email: prelov@iitp.ru
Moscow, Russia

Supplementary files

1. JATS XML

Copyright © Pleiades Publishing, Inc., 2016