Abstract

Although the Hausdorff distance (HD) has been widely used for object identification between same-modality images, object identification between different-modality images is challenging because of the poor edge correspondence arising from heterogeneous image characteristics. This paper proposes a robust Hausdorff distance similarity (accurate M-HD: AMHD) for multi-modal sensor data. To improve robustness against outliers when comparing pairs of multi-modal images, the AMHD utilizes the orientation information of each point, in addition to the distance transform (DT) map, as a similarity criterion. In the AMHD scheme, the DT map is generated by applying the dead-reckoning signed DT, and the distance orientation (DO) map is constructed by applying the Kirsch compass kernel to the DT map. Using the additional DO information, the proposed similarity measure can precisely identify outliers, including non-corresponding edges and noise, and efficiently discard false correspondence distances. Computer simulations show that the proposed AMHD outperforms conventional robust schemes at aligning multi-modal sensor data (visible and thermal-IR face images) in terms of the position error between the ground truth and the computed position.
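The pipeline the abstract describes (a DT map, a DO map from Kirsch compass masks applied to the DT, and an orientation check that discards outlier correspondences) can be sketched as follows. This is only an illustrative approximation, not the authors' exact formulation: it substitutes SciPy's Euclidean distance transform for the dead-reckoning signed DT named in the paper, and the function names, the 8-direction index representation, and the `max_do_diff` discard threshold are assumptions introduced here.

```python
import numpy as np
from scipy import ndimage

# Base Kirsch compass mask; the other seven are 45-degree ring rotations.
_BASE = np.array([[5, 5, 5], [-3, 0, -3], [-3, -3, -3]])

def kirsch_kernels():
    """Return the eight 3x3 Kirsch compass kernels."""
    k, kernels = _BASE, []
    for _ in range(8):
        kernels.append(k)
        # Rotate the outer ring of the 3x3 mask one step clockwise.
        k = np.array([[k[1, 0], k[0, 0], k[0, 1]],
                      [k[2, 0], 0,       k[0, 2]],
                      [k[2, 1], k[2, 2], k[1, 2]]])
    return kernels

def dt_and_orientation(edge_map):
    """DT map of a binary edge map, plus a per-pixel orientation index
    (0-7) taken from the strongest Kirsch response on the DT map."""
    # NOTE: Euclidean DT stands in for the paper's dead-reckoning signed DT.
    dt = ndimage.distance_transform_edt(~edge_map.astype(bool))
    responses = np.stack([ndimage.convolve(dt, k, mode='nearest')
                          for k in kirsch_kernels()])
    do = np.argmax(responses, axis=0)  # dominant direction per pixel
    return dt, do

def orientation_aware_hd(edges_a, edges_b, max_do_diff=1):
    """Directed HD-style score: average DT value of A at B's edge points,
    discarding points whose DO indices disagree (treated as outliers)."""
    dt_a, do_a = dt_and_orientation(edges_a)
    _, do_b = dt_and_orientation(edges_b)
    ys, xs = np.nonzero(edges_b)
    diff = np.abs(do_a[ys, xs] - do_b[ys, xs])
    diff = np.minimum(diff, 8 - diff)  # circular difference over 8 bins
    keep = diff <= max_do_diff
    if not keep.any():
        return np.inf  # every correspondence rejected as an outlier
    return float(dt_a[ys[keep], xs[keep]].mean())
```

In an alignment loop, one image's edge map would be transformed over candidate positions and the position minimizing this score retained; identical edge maps score 0, while non-corresponding edges are excluded from the average rather than inflating it.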

  • Publication date: 2011-05-01