Abstract

Bayesian networks (BNs) have become an essential tool for reasoning under uncertainty in complex models. In particular, Gaussian Bayesian networks (GBNs), a subclass of BNs, can be used to model continuous variables with Gaussian distributions. In this work, we propose a divergence measure between two Gaussian distributions, the Jeffreys-Riemannian (JR) divergence, which allows one to evaluate the global effects of both small and large changes in the network parameters. Like the well-known Jeffreys divergence, the proposed divergence reduces the difference between two Gaussians to the difference between their parameters. We discuss theoretical properties concerning its ability to bound parameter changes and show that it is a metric under a few conditions. We then apply it to sensitivity analysis in GBNs. The sensitivity analysis considers different sets of parameters, depending on which kinds of variables are perturbed and whether the uncertainties lie in their mean vector or covariance matrix. This makes it possible to identify which set of variables most strongly disturbs the network output after evidence propagation. Finally, a practical example is provided to illustrate the concepts and methods presented. The results obtained with the JR divergence are almost completely consistent with those obtained with the Jeffreys divergence.
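
For context, the classical Jeffreys divergence referred to above has a well-known closed form for two multivariate Gaussians $\mathcal{N}(\mu_1,\Sigma_1)$ and $\mathcal{N}(\mu_2,\Sigma_2)$ in $d$ dimensions. The expression below is that standard form, not the JR divergence proposed here; it is included only to illustrate how a divergence between two Gaussians reduces to a function of their means and covariance matrices:
\[
J\bigl(\mathcal{N}(\mu_1,\Sigma_1),\mathcal{N}(\mu_2,\Sigma_2)\bigr)
= \tfrac{1}{2}\Bigl[\operatorname{tr}\bigl(\Sigma_2^{-1}\Sigma_1\bigr)
+ \operatorname{tr}\bigl(\Sigma_1^{-1}\Sigma_2\bigr) - 2d
+ (\mu_1-\mu_2)^{\top}\bigl(\Sigma_1^{-1}+\Sigma_2^{-1}\bigr)(\mu_1-\mu_2)\Bigr].
\]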