BOUNDS FOR KULLBACK-LEIBLER DIVERGENCE

Authors: Popescu Pantelimon G*; Dragomir Sever S; Slusanschi Emil I; Stanasila Octavian N
Source: Electronic Journal of Differential Equations, 2016, 237.

Abstract

Entropy, conditional entropy, and mutual information for discrete-valued random variables play important roles in information theory. The purpose of this paper is to present new bounds for the relative entropy D(p ‖ q) of two probability distributions and then to apply them to simple entropy and mutual information. The relative entropy upper bound obtained is a refinement of a bound previously presented in the literature.
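As a reference for the quantity the abstract discusses, the relative entropy D(p ‖ q) of two discrete distributions can be computed directly from its standard definition. This is a minimal sketch of the definition only; it does not reproduce the bounds derived in the paper, and the function name and example distributions are illustrative.

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p || q) = sum_i p_i * log(p_i / q_i), natural log.

    Assumes p and q are probability distributions over the same support,
    with q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute 0
    (by the convention 0 * log 0 = 0).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# D(p || p) = 0 for any distribution p, and D(p || q) >= 0 in general
# (Gibbs' inequality).
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.5, 0.25, 0.125, 0.125]
print(kl_divergence(uniform, uniform))       # 0.0
print(kl_divergence(skewed, uniform) >= 0.0) # True
```

Note that D(p ‖ q) is not symmetric in p and q, which is why bounds on it cannot be obtained by simply swapping the arguments.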

  • Publication date: 2016-08-30