
 Probl. Peredachi Inf., 2007, Volume 43, Issue 1, Pages 15–27 (Mi ppi2)

Information Theory

On Inequalities between Information and Variation

V. V. Prelov

A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Abstract: We continue the study of the relationship between mutual information and variational distance initiated in Pinsker's paper [1], where an upper bound for the mutual information via variational distance was obtained. We present a simple lower bound, which in some cases is optimal or asymptotically optimal. A uniform upper bound for the mutual information via variational distance is also derived for random variables with a finite number of values. For such random variables, the asymptotic behaviour of the maximum of mutual information is also investigated in the cases where the variational distance tends either to zero or to its maximum value.
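The two quantities related by the bounds in this paper can be illustrated numerically. The sketch below (an illustration, not the paper's own bounds; the joint distribution is a made-up example) computes the mutual information I(X;Y) and the variational distance v(X;Y) = Σ|p(x,y) − p(x)p(y)| for a pair of binary random variables, and checks the classical Pinsker-type lower bound I ≥ v²/2 (in nats):

```python
import numpy as np

# Hypothetical joint pmf of two binary random variables X and Y.
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

p_x = p_xy.sum(axis=1)      # marginal distribution of X
p_y = p_xy.sum(axis=0)      # marginal distribution of Y
q_xy = np.outer(p_x, p_y)   # product of the marginals

# Mutual information I(X;Y) = D(P_XY || P_X x P_Y), in nats.
I = float(np.sum(p_xy * np.log(p_xy / q_xy)))

# Variational distance v(X;Y) = sum_{x,y} |p(x,y) - p(x)p(y)|.
v = float(np.sum(np.abs(p_xy - q_xy)))

# Classical Pinsker-type lower bound: I >= v^2 / 2 (nats).
print(f"I = {I:.4f} nats, v = {v:.4f}, v^2/2 = {v**2 / 2:.4f}")
```

For this example v = 0.4, so the lower bound v²/2 = 0.08 nats, while I ≈ 0.086 nats; the bound holds and is fairly tight here.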


English version:
Problems of Information Transmission, 2007, 43:1, 12–22


UDC: 621.391.1

Citation: V. V. Prelov, “On Inequalities between Information and Variation”, Probl. Peredachi Inf., 43:1 (2007), 15–27; Problems Inform. Transmission, 43:1 (2007), 12–22

Citation in format AMSBIB
\Bibitem{Pre07}
\by V.~V.~Prelov
\paper On Inequalities between Information and Variation
\jour Probl. Peredachi Inf.
\yr 2007
\vol 43
\issue 1
\pages 15--27
\mathnet{http://mi.mathnet.ru/ppi2}
\mathscinet{http://www.ams.org/mathscinet-getitem?mr=2304060}
\transl
\jour Problems Inform. Transmission
\yr 2007
\vol 43
\issue 1
\pages 12--22
\crossref{https://doi.org/10.1134/S0032946007010024}
\isi{http://gateway.isiknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&DestLinkType=FullRecord&DestApp=ALL_WOS&KeyUT=000255299000002}
\scopus{http://www.scopus.com/record/display.url?origin=inward&eid=2-s2.0-34247558098}

• http://mi.mathnet.ru/eng/ppi2
• http://mi.mathnet.ru/eng/ppi/v43/i1/p15


This publication is cited in the following articles:
1. V. Prelov, “On relationship between mutual information and variation”, 2007 IEEE International Symposium on Information Theory Proceedings, 2007, 51–55
2. V. V. Prelov, E. C. van der Meulen, “Mutual Information, Variation, and Fano's Inequality”, Problems Inform. Transmission, 44:3 (2008), 185–197
3. V. V. Prelov, “Mutual information of several random variables and its estimation via variation”, Problems Inform. Transmission, 45:4 (2009), 295–308
4. V. V. Prelov, “On computation of information via variation and inequalities for the entropy function”, Problems Inform. Transmission, 46:2 (2010), 122–126
5. V. V. Prelov, “Generalization of a Pinsker problem”, Problems Inform. Transmission, 47:2 (2011), 98–116
6. I. Sason, “Entropy Bounds for Discrete Random Variables via Maximal Coupling”, IEEE Trans. Inf. Theory, 59:11 (2013), 7118–7131
7. V. V. Prelov, “On some extremal problems for mutual information and entropy”, Problems Inform. Transmission, 52:4 (2016), 319–328
8. Zh. Wang, R. F. Schaefer, M. Skoglund, M. Xiao, H. V. Poor, “Strong Secrecy for Interference Channels Based on Channel Resolvability”, IEEE Trans. Inf. Theory, 64:7 (2018), 5110–5130