Abstract

In this paper, we present a statistical parts-based model (PBM) of appearance, applied to the problem of modeling intersubject anatomical variability in magnetic resonance (MR) brain images. In contrast to global image models such as the active appearance model (AAM), the PBM consists of a collection of localized image regions, referred to as parts, whose appearance, geometry, and occurrence frequency are quantified statistically. The parts-based approach explicitly addresses the case where one-to-one correspondence does not exist between all subjects in a population due to anatomical differences, as model parts are not required to appear in all subjects. The model is constructed through a fully automatic machine learning algorithm, identifying image patterns that appear with statistical regularity in a large collection of subject images. Parts are represented by generic scale-invariant features, and the model can, therefore, be applied to a wide variety of image domains. Experimentation based on 2-D MR slices shows that a PBM learned from a set of 102 subjects can be robustly fit to 50 new subjects with accuracy comparable to that of three human raters. Additionally, it is shown that, unlike global models such as the AAM, PBM fitting is stable in the presence of unexpected, local perturbation.
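The sketch below is a minimal, illustrative Python outline of the parts-based idea described in the abstract, not the authors' algorithm: it detects scale-invariant (difference-of-Gaussian) keypoints in each 2-D slice, groups nearby detections across subjects into candidate parts, and reports each part's occurrence frequency. The feature detector, the nearest-cluster grouping rule, and all parameter values are assumptions chosen for illustration, and the appearance and geometry statistics of the full PBM are omitted.

```python
# Illustrative sketch only (assumed detector, clustering rule, and parameters);
# not the published PBM learning algorithm.
import numpy as np
from scipy.ndimage import gaussian_filter


def dog_keypoints(image, sigmas=(1.0, 2.0, 4.0, 8.0), per_level=20):
    """Return (row, col, scale) tuples for the strongest difference-of-Gaussian responses."""
    image = image.astype(float)
    blurred = [gaussian_filter(image, s) for s in sigmas]
    keypoints = []
    for i in range(len(sigmas) - 1):
        dog = np.abs(blurred[i] - blurred[i + 1])
        # Keep only the per_level strongest responses at this scale.
        flat = np.argpartition(dog.ravel(), -per_level)[-per_level:]
        rows, cols = np.unravel_index(flat, dog.shape)
        keypoints.extend((int(r), int(c), sigmas[i]) for r, c in zip(rows, cols))
    return keypoints


def learn_parts(subject_images, cluster_radius=5.0):
    """Group keypoints across subjects into candidate 'parts'.

    A part's occurrence frequency is the fraction of subjects contributing
    at least one keypoint within cluster_radius of the part centre.
    """
    clusters = []  # each: {"center": (r, c), "scale": s, "subjects": set of indices}
    for idx, image in enumerate(subject_images):
        for r, c, s in dog_keypoints(image):
            for cl in clusters:
                if np.hypot(cl["center"][0] - r, cl["center"][1] - c) < cluster_radius:
                    cl["subjects"].add(idx)
                    break
            else:
                clusters.append({"center": (r, c), "scale": s, "subjects": {idx}})
    n_subjects = len(subject_images)
    return [
        {"center": cl["center"], "scale": cl["scale"],
         "occurrence_frequency": len(cl["subjects"]) / n_subjects}
        for cl in clusters
    ]


if __name__ == "__main__":
    # Synthetic stand-ins for roughly aligned 2-D MR slices from several subjects.
    rng = np.random.default_rng(0)
    subjects = [rng.random((64, 64)) for _ in range(5)]
    parts = learn_parts(subjects)
    print(f"learned {len(parts)} candidate parts")
```

Because parts are only required to recur with some frequency rather than in every subject, parts with low occurrence frequency in such a scheme would simply receive low statistical weight instead of breaking a global one-to-one correspondence.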

  • Publication date: 2007-04
