Abstract

We investigate the prior selection problem for predicting an input-output relation by a generalized Bayesian method, α-Bayes prediction. The α-Bayes predictive distribution is obtained by minimizing the Bayes risk corresponding to the α-divergence, a generalization of the Kullback-Leibler divergence. It is known that the effect of the prior on the performance of the usual Bayesian predictive distribution, measured by the Kullback-Leibler divergence from the true distribution, is asymptotically characterized by the Laplacian. We show that the α-divergence between the α-Bayes predictive distribution for the next outputs and the true output distribution has a similar asymptotic characterization, even when the α-divergence is not the Kullback-Leibler divergence. We also investigate how the performance of the generalized Bayesian prediction behaves when the test and training input distributions are different.
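For concreteness, one common way to write the objects named above is sketched below; the parameterization of the α-divergence (Amari's convention) and the symbols p, q, π, θ, and x^n are assumptions of this sketch, since the abstract does not fix notation and the paper may use a different but equivalent convention.

```latex
% A sketch under Amari's alpha-divergence convention (requires amsmath);
% the paper's own parameterization and notation may differ.
\begin{align}
  % alpha-divergence between densities p and q
  D_\alpha(p \,\|\, q)
    &= \frac{4}{1-\alpha^2}
       \left( 1 - \int p(y)^{\frac{1-\alpha}{2}}\, q(y)^{\frac{1+\alpha}{2}}\, dy \right),
       \qquad \alpha \neq \pm 1, \\
  % the Kullback-Leibler divergence is recovered in the limit
  \lim_{\alpha \to -1} D_\alpha(p \,\|\, q)
    &= \int p(y) \log \frac{p(y)}{q(y)}\, dy , \\
  % pointwise form of the alpha-Bayes predictive distribution:
  % the minimizer of the Bayes risk under D_alpha, given data x^n,
  % is the minimizer of the posterior expected alpha-divergence
  \hat{q}_\alpha(\cdot \mid x^n)
    &= \operatorname*{arg\,min}_{q}
       \int D_\alpha\bigl( p(\cdot \mid \theta) \,\|\, q \bigr)\,
       \pi(\theta \mid x^n)\, d\theta .
\end{align}
```

Setting α to the Kullback-Leibler limit recovers the usual Bayesian predictive distribution, i.e., the posterior mean of p(· | θ), which is the baseline case whose prior dependence the abstract describes as characterized by the Laplacian.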

  • Publication date: 2010