Abstract

Some natural phenomena deviate from standard statistical behavior, and their study has increased interest in new definitions of information measures. However, no general procedure is known for deriving the most appropriate definition of entropy for a given dynamical system. In this paper, we introduce parametric extended divergences that combine the Jeffreys divergence and the Tsallis entropy defined via generalized logarithmic functions, and these lead to new inequalities. In addition, we give lower bounds for one-parameter extended Fermi-Dirac and Bose-Einstein divergences. Finally, we establish some inequalities for the Tsallis entropy, the Tsallis relative entropy, and some divergences by use of Young's inequality.
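For reference, the Tsallis quantities mentioned above are conventionally built from a one-parameter deformed logarithm; the following is a sketch of the standard definitions (the paper's exact normalization may differ):

$$\ln_q x := \frac{x^{1-q}-1}{1-q} \quad (q \neq 1,\; x > 0), \qquad \ln_q x \to \ln x \ \text{ as } q \to 1,$$

$$H_q(p) := -\sum_j p_j^{\,q} \ln_q p_j, \qquad D_q(p \,\|\, r) := \sum_j \frac{p_j - p_j^{\,q}\, r_j^{\,1-q}}{1-q},$$

which recover the Shannon entropy and the Kullback-Leibler divergence, respectively, as $q \to 1$. The Jeffreys divergence is the symmetrized relative entropy, $J(p, r) := D(p\|r) + D(r\|p)$.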

  • Publication date: 2012-1-1