Abstract

Estimating normalizing constants is a common and often difficult problem in statistics, and path sampling (PS) is among the most powerful methods that have been put forward to this end. Using an identity that arises in the formulation of PS, we derive expressions for the Kullback-Leibler (KL) and J divergences between two distributions from possibly different parametric families. These expressions naturally stem from PS when the geometric path is used to link the two extreme densities. We examine the use of the KL and J divergence measures in PS in a variety of model selection examples. In this context, one challenging aspect of PS is that of selecting an appropriate auxiliary density that will yield a high-quality estimate of the marginal likelihood without incurring excessive computational effort. The J divergence is shown to be useful for selecting such auxiliary densities.
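
To fix ideas, the following is a minimal sketch of how such expressions arise under the geometric path; the notation ($\tilde p_0$, $\tilde p_1$, $z_0$, $z_1$, $U$) is introduced here for illustration and is not taken from the abstract itself. Write $p_i(\theta) = \tilde p_i(\theta)/z_i$ for the two extreme densities with unknown normalizing constants $z_i$, link them by the geometric path $q_t(\theta) \propto \tilde p_0(\theta)^{1-t}\,\tilde p_1(\theta)^{t}$ for $t \in [0,1]$, and set $U(\theta) = \log\{\tilde p_1(\theta)/\tilde p_0(\theta)\}$. The PS identity then reads
\[
\log\frac{z_1}{z_0} \;=\; \int_0^1 E_{q_t}\!\left[\,U(\theta)\,\right] dt,
\]
and since $E_{p_0}[U] = \log(z_1/z_0) - \mathrm{KL}(p_0 \,\|\, p_1)$ and $E_{p_1}[U] = \log(z_1/z_0) + \mathrm{KL}(p_1 \,\|\, p_0)$, the endpoint expectations combine with the identity to give
\[
\mathrm{KL}(p_0 \,\|\, p_1) \;=\; \int_0^1 E_{q_t}[U]\,dt \;-\; E_{p_0}[U],
\qquad
\mathrm{KL}(p_1 \,\|\, p_0) \;=\; E_{p_1}[U] \;-\; \int_0^1 E_{q_t}[U]\,dt,
\]
so that $J(p_0, p_1) = \mathrm{KL}(p_0 \,\|\, p_1) + \mathrm{KL}(p_1 \,\|\, p_0) = E_{p_1}[U] - E_{p_0}[U]$, with every term estimable from the same Monte Carlo draws already required by PS.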