Abstract

Bayesian predictive densities are investigated, within the framework of information geometry, for the setting in which the observed data x and the target variable y to be predicted have different distributions. The performance of predictive densities is evaluated by the Kullback-Leibler divergence, and the parametric models are formulated as Riemannian manifolds. In the conventional setting, in which x and y have the same distribution, the Fisher-Rao metric and the Jeffreys prior play essential roles. In the present setting, in which x and y have different distributions, the corresponding roles are played by a new metric, which we call the predictive metric and which is constructed from the Fisher information matrices of x and y, and by the volume element prior based on the predictive metric. It is shown that Bayesian predictive densities based on priors constructed from non-constant positive superharmonic functions with respect to the predictive metric asymptotically dominate the predictive density based on the volume element prior of the predictive metric.
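
To fix ideas, the following LaTeX sketch spells out the standard objects the abstract refers to. The notation (p, q, \hat{q}_\pi, g_X, g_Y, \tilde{g}, \psi) is ours, not quoted from the paper; the explicit construction of the predictive metric from the two Fisher information matrices is given in the paper itself and is not reproduced here, and the prior form \pi_\psi \propto \psi \sqrt{\det \tilde{g}} is an assumption modeled on the superharmonic-prior construction familiar from the conventional (same-distribution) setting.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Setup: x ~ p(x | theta) is observed, y ~ q(y | theta) is to be predicted.
% Bayesian predictive density under a prior pi(theta):
\[
  \hat{q}_{\pi}(y \mid x)
    = \int q(y \mid \theta)\, \pi(\theta \mid x)\, \mathrm{d}\theta .
\]

% Performance is measured by the Kullback--Leibler risk:
\[
  R(\theta, \hat{q}_{\pi})
    = \int p(x \mid \theta) \int q(y \mid \theta)
      \log \frac{q(y \mid \theta)}{\hat{q}_{\pi}(y \mid x)}
      \,\mathrm{d}y \,\mathrm{d}x .
\]

% Conventional case (q = p): the Jeffreys prior is the volume element of
% the Fisher--Rao metric g:
\[
  \pi_{\mathrm{J}}(\theta) \propto \sqrt{\det g(\theta)} .
\]

% Present case: with Fisher information matrices g_X and g_Y of x and y,
% the paper constructs a predictive metric \tilde{g} from the pair
% (g_X, g_Y); its volume element prior plays the role of the Jeffreys prior:
\[
  \pi_{\tilde{g}}(\theta) \propto \sqrt{\det \tilde{g}(\theta)} .
\]

% Domination result, as we read the abstract: for a non-constant positive
% function \psi that is superharmonic with respect to \tilde{g}, i.e.
% \tilde{\Delta}\psi \le 0 for the Laplace--Beltrami operator of \tilde{g},
% a prior of the form
\[
  \pi_{\psi}(\theta) \propto \psi(\theta)\, \sqrt{\det \tilde{g}(\theta)}
\]
% yields a predictive density that asymptotically dominates the one based
% on \pi_{\tilde{g}} under Kullback--Leibler risk.

\end{document}
```

This mirrors the logic of the abstract: the volume element prior of the predictive metric takes over the role of the Jeffreys prior, and superharmonicity with respect to the predictive metric is the condition under which a prior improves on it.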

  • Publication date: 2015-03
  • Affiliation: RIKEN