Abstract

We investigate the asymptotic construction of constant-risk Bayesian predictive densities under the Kullback-Leibler risk when the distributions of the data and the target variables differ but share a common unknown parameter. It is known that the Kullback-Leibler risk is asymptotically equal to the trace of the product of two matrices: the inverse of the Fisher information matrix for the data and the Fisher information matrix for the target variables. We assume that this trace has a unique maximizer with respect to the parameter. We construct asymptotically constant-risk Bayesian predictive densities using a prior that depends on the sample size. Further, we apply the theory to the subminimax estimator problem and to prediction based on a binary regression model.
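Read literally, the asymptotic identity quoted in the abstract takes the following form. The notation here (sample size $N$, data $x^N$, target variable $y$, predictive density $\hat{p}$, and Fisher information matrices $I_x$, $I_y$) is ours for illustration, not fixed by the abstract, and standard regularity conditions are assumed:

```latex
% A sketch of the asymptotic risk identity stated in the abstract.
% x^N is a sample of size N, y the target variable, q(y | theta) the
% target density, and \hat{p}(y | x^N) a Bayesian predictive density;
% I_x(theta) and I_y(theta) are the Fisher information matrices of the
% data and target distributions, which share the unknown parameter theta.
\[
  E_\theta \left[ \int q(y \mid \theta)
    \log \frac{q(y \mid \theta)}{\hat{p}(y \mid x^N)} \, dy \right]
  = \frac{1}{2N} \, \mathrm{tr}\bigl( I_x(\theta)^{-1} I_y(\theta) \bigr)
  + o(N^{-1}).
\]
```

Because this leading term does not involve the prior, a fixed prior can influence only lower-order terms; flattening the risk over $\theta$ therefore plausibly requires the prior itself to vary with $N$, consistent with the sample-size-dependent priors the abstract describes.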

  • Publication date: 2014-06

Full text