Abstract

This work explores learning LCGM (lattice-connected Gaussian mixture) models by annealed Kullback-Leibler (KL) divergence minimization for a hybrid of topological and statistical pattern analysis. The KL divergence serves as the general criterion for learning an LCGM model, which is composed of a lattice of multivariate Gaussian units. A planar lattice emulates the topological order of cortex-like neighboring relations, and the built-in parameters of the connected Gaussian units represent statistical features of unsupervised data. Learning an LCGM model involves the collateral optimization tasks of resolving mixture combinatorics and extracting geometric features from high-dimensional patterns. Under the assumption that the mixture combinatorics encoded by Potts variables obey the Boltzmann distribution, the approximation of their joint probability by the product of individual probabilities is qualified by the KL divergence, whose minimization under physical-like deterministic annealing faithfully optimizes the involved mixture combinatorics and geometric features. Numerical simulations show that the proposed annealed KL divergence minimization is effective and reliable for solving the generalized traveling salesman problem (TSP), spot identification, and the self-organization, visualization, and sorting of yeast gene expressions.
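
As a minimal sketch of the criterion described above (the notation is assumed for illustration, since the abstract fixes no symbols): let $V$ denote the collection of Potts membership variables with energy $E(V)$, Boltzmann distribution $P(V)=e^{-E(V)/T}/Z$ at temperature $T$, and factorized approximation $Q(V)=\prod_i q_i(v_i)$. The qualifying KL divergence is then

\[
D_{\mathrm{KL}}(Q\,\|\,P)=\sum_{V}Q(V)\ln\frac{Q(V)}{P(V)}=\frac{\langle E\rangle_{Q}}{T}-H(Q)+\ln Z,
\]

where $H(Q)$ is the entropy of the factorized distribution. Minimizing this quantity over the factors $q_i$ at a gradually lowered temperature $T$ constitutes the deterministic-annealing step, the $\ln Z$ term being constant with respect to $Q$.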

Full text