
Probl. Peredachi Inf., 2009, Volume 45, Issue 4, Pages 3–17 (Mi ppi1995)  

This article is cited in 4 scientific papers (listed below)

Information Theory

Mutual information of several random variables and its estimation via variation

V. V. Prelov

A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Abstract: We obtain upper and lower bounds on the maximum of the mutual information of several random variables in terms of the variational distance between the joint distribution of these random variables and the product of its marginal distributions. In this connection, some properties of the variational distance between probability distributions of this type are derived. We show that in some special cases the estimates of the maximum of mutual information obtained here are optimal or asymptotically optimal. Some results of this paper generalize the corresponding results of [1–3] to the multivariate case.
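As a rough illustration of the quantities the abstract relates (a sketch, not taken from the paper), the Python code below computes, for a toy discrete joint distribution, the mutual information of several random variables, assumed here to be the standard multi-information D(P_{X1...Xn} || P_{X1} × ... × P_{Xn}), and the variational distance between the joint distribution and the product of its marginals; the paper's exact definitions and normalizations (e.g., a possible factor 1/2 in the variation) may differ.

import numpy as np

def multi_information_and_variation(joint):
    """Toy illustration with assumed (standard) definitions, not the paper's notation.

    joint: an n-dimensional NumPy array of probabilities summing to 1,
    one axis per random variable.
    Returns (I, V) where
      I = D(P_{X1..Xn} || P_{X1} x ... x P_{Xn})   (in nats),
      V = sum_x |P_{X1..Xn}(x) - P_{X1}(x1)...P_{Xn}(xn)|.
    """
    joint = np.asarray(joint, dtype=float)
    n = joint.ndim
    # Marginal of X_i: sum the joint over all other axes.
    marginals = [joint.sum(axis=tuple(j for j in range(n) if j != i))
                 for i in range(n)]
    # Product of the marginals, broadcast back to the joint's shape.
    prod = np.ones_like(joint)
    for i, m in enumerate(marginals):
        shape = [1] * n
        shape[i] = m.size
        prod = prod * m.reshape(shape)
    mask = joint > 0                      # convention: 0 log 0 = 0
    mi = float(np.sum(joint[mask] * np.log(joint[mask] / prod[mask])))
    variation = float(np.abs(joint - prod).sum())
    return mi, variation

# Example: X1, X2 fair coins, X3 = X1 XOR X2 (pairwise independent,
# jointly dependent).  Expected output: I = log 2 ≈ 0.693, V = 1.0.
p = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, (x1 + x2) % 2] = 0.25
print(multi_information_and_variation(p))

For n = 2 this multi-information reduces to the ordinary mutual information I(X1; X2), which is the bivariate setting of the earlier results [1–3] that the paper generalizes.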

Full text: PDF file (302 kB)
References: PDF file / HTML file

English version:
Problems of Information Transmission, 2009, 45:4, 295–308


UDC: 621.391.1+519.2
Received: 12.05.2009

Citation: V. V. Prelov, “Mutual information of several random variables and its estimation via variation”, Probl. Peredachi Inf., 45:4 (2009), 3–17; Problems Inform. Transmission, 45:4 (2009), 295–308

Citation in format AMSBIB
\Bibitem{Pre09}
\by V.~V.~Prelov
\paper Mutual information of several random variables and its estimation via variation
\jour Probl. Peredachi Inf.
\yr 2009
\vol 45
\issue 4
\pages 3--17
\mathnet{http://mi.mathnet.ru/ppi1995}
\mathscinet{http://www.ams.org/mathscinet-getitem?mr=2641322}
\zmath{https://zbmath.org/?q=an:1190.94021}
\transl
\jour Problems Inform. Transmission
\yr 2009
\vol 45
\issue 4
\pages 295--308
\crossref{https://doi.org/10.1134/S0032946009040012}
\isi{http://gateway.isiknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&DestLinkType=FullRecord&DestApp=ALL_WOS&KeyUT=000273795900001}
\scopus{http://www.scopus.com/record/display.url?origin=inward&eid=2-s2.0-75649136907}


Linking options:
  • http://mi.mathnet.ru/eng/ppi1995
  • http://mi.mathnet.ru/eng/ppi/v45/i4/p3




    This publication is cited in the following articles:
    1. V. V. Prelov, “On computation of information via variation and inequalities for the entropy function”, Problems Inform. Transmission, 46:2 (2010), 122–126
    2. V. V. Prelov, “Generalization of a Pinsker problem”, Problems Inform. Transmission, 47:2 (2011), 98–116
    3. V. V. Ayuev, “A method for dynamic reconfiguration and training of radial basis function networks” [in Russian], Prikladnaya informatika, 2011, no. 5, 118–126
    4. I. Sason, “Entropy bounds for discrete random variables via maximal coupling”, IEEE Trans. Inf. Theory, 59:11 (2013), 7118–7131