Wasserstein and weighted metrics for multidimensional Gaussian distributions

Abstract

We present a number of lower and upper bounds for the Lévy–Prokhorov, Wasserstein, Fréchet, and Hellinger distances between probability distributions of the same or different dimensions. The weighted (or context-sensitive) total variation and Hellinger distances are introduced, and upper and lower bounds for these weighted metrics are proved. Lower bounds for the minimum of different errors in context-sensitive hypothesis testing are also established.
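
For illustration only (not part of the article): a minimal Python sketch of the classical closed-form expressions for two of the unweighted metrics discussed above, namely the 2-Wasserstein (Fréchet) distance between Gaussian laws, as in Dowson – Landau [5] and Givens – Shortt [3], and the Hellinger distance between Gaussian laws. The function names and the NumPy/SciPy dependency are assumptions of this sketch.

import numpy as np
from scipy.linalg import sqrtm

def wasserstein2_gaussian(m1, S1, m2, S2):
    # Squared 2-Wasserstein (Frechet) distance between N(m1, S1) and N(m2, S2):
    # ||m1 - m2||^2 + tr(S1 + S2 - 2 (S1^{1/2} S2 S1^{1/2})^{1/2}).
    s1h = sqrtm(S1)
    root = np.real(sqrtm(s1h @ S2 @ s1h))  # discard tiny imaginary round-off
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * root))

def hellinger2_gaussian(m1, S1, m2, S2):
    # Squared Hellinger distance between N(m1, S1) and N(m2, S2):
    # 1 - det(S1)^{1/4} det(S2)^{1/4} det((S1+S2)/2)^{-1/2}
    #     * exp(-(m1-m2)^T ((S1+S2)/2)^{-1} (m1-m2) / 8).
    S = (S1 + S2) / 2.0
    coef = (np.linalg.det(S1) ** 0.25) * (np.linalg.det(S2) ** 0.25) / np.sqrt(np.linalg.det(S))
    d = m1 - m2
    return float(1.0 - coef * np.exp(-d @ np.linalg.solve(S, d) / 8.0))

# Example: two bivariate Gaussians.
m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.ones(2), 2.0 * np.eye(2)
print(wasserstein2_gaussian(m1, S1, m2, S2))  # approx. 2.343
print(hellinger2_gaussian(m1, S1, m2, S2))    # approx. 0.202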

About the authors

Mark Yakovlevich Kelbert

Higher School of Economics – National Research University

ORCID iD: 0000-0002-3952-2012
Scopus Author ID: 55884702600
20 Myasnitskaya St., Moscow 101000, Russia

Yurii M. Suhov

Penn State University; DPMMS, University of Cambridge

Scopus Author ID: 35582648200
201 Old Main, State College, PA 16802, USA

References

  1. Vallander S. S. Calculation of the Wasserstein distance between probability distributions on the line. Theory of Probability & Its Applications, 1974, vol. 18, iss. 4, pp. 784–786. https://doi.org/10.1137/1118101
  2. Rachev S. T. The Monge – Kantorovich mass transference problem and its stochastic applications. Theory of Probability & Its Applications, 1985, vol. 29, iss. 4, pp. 647–676. https://doi.org/10.1137/1129093
  3. Givens C. R., Shortt R. M. A class of Wasserstein metrics for probability distributions. The Michigan Mathematical Journal, 1984, vol. 31, iss. 2, pp. 231–240. https://doi.org/10.1307/mmj/1029003026
  4. Olkin I., Pukelsheim F. The distances between two random vectors with given dispersion matrices. Linear Algebra and its Applications, 1982, vol. 48, pp. 257–263. https://doi.org/10.1016/0024-3795(82)90112-4
  5. Dowson D. C., Landau B. V. The Fréchet distance between multivariate normal distributions. Journal of Multivariate Analysis, 1982, vol. 12, iss. 3, pp. 450–455. https://doi.org/10.1016/0047-259X(82)90077-X
  6. Cai Y., Lim L.-H. Distances between probability distributions of different dimensions. IEEE Transactions on Information Theory, 2022, vol. 68, iss. 6, pp. 4020–4031. https://doi.org/10.1109/TIT.2022.3148923
  7. Dwivedi A., Wang S., Tajer A. Discriminant analysis under f-divergence measures. Entropy, 2022, vol. 24, iss. 2, art. 188, 26 p. https://doi.org/10.3390/e24020188
  8. Devroye L., Mehrabian A., Reddad T. The total variation distance between high-dimensional Gaussians. arXiv, 2020, arXiv:1810.08693v5, pp. 1–12.
  9. Endres D. M., Schindelin J. E. A new metric for probability distributions. IEEE Transactions on Information Theory, 2003, vol. 49, iss. 7, pp. 1858–1860. https://doi.org/10.1109/TIT.2003.813506
  10. Stuhl I., Suhov Y., Yasaei Sekeh S., Kelbert M. Basic inequalities for weighted entropies. Aequationes Mathematicae, 2016, vol. 90, iss. 4, pp. 817–848. https://doi.org/10.1007/s00010-015-0396-5
  11. Stuhl I., Kelbert M., Suhov Y., Yasaei Sekeh S. Weighted Gaussian entropy and determinant inequalities. Aequationes Mathematicae, 2022, vol. 96, iss. 1, pp. 85–114. https://doi.org/10.1007/s00010-021-00861-3


This work is licensed under a Creative Commons Attribution 4.0 International License.
