Abstract
In this paper, we present a generalization of the Kullback-Leibler (KL) divergence in the form of the Tsallis statistics. In parallel with the classical KL divergence, several important properties of this new generalization, including pseudo-additivity, positivity, and monotonicity, are established. Moreover, we obtain strengthened estimates on the positivity of the new divergence and on the information loss incurred under transformations.
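The paper's exact definition is not reproduced in this abstract; as an illustration, the following is a minimal sketch of the standard Tsallis (q-deformed) relative entropy, which is the usual Tsallis-statistics generalization of the KL divergence and recovers the classical KL divergence in the limit q → 1. The function name and the choice of q are illustrative, not taken from the paper.

```python
import numpy as np

def tsallis_divergence(p, r, q=1.5):
    """Tsallis relative entropy D_q(p || r) for discrete distributions.

    D_q(p || r) = (sum_i p_i^q * r_i^(1-q) - 1) / (q - 1),
    which tends to the KL divergence sum_i p_i * log(p_i / r_i) as q -> 1.
    Assumes p and r are strictly positive and sum to 1.
    """
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    if np.isclose(q, 1.0):
        # Limiting case: classical KL divergence.
        return float(np.sum(p * np.log(p / r)))
    return float((np.sum(p**q * r**(1.0 - q)) - 1.0) / (q - 1.0))
```

For identical distributions the divergence vanishes (the sum telescopes to 1), and for q > 0 it is non-negative, mirroring the positivity property stated in the abstract.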
- Publication date: 2016-04-01
- Affiliation: Tsinghua University