Abstract

Biometric discretization derives a binary string for each user from an ordered set of real-valued biometric features. The false acceptance rate (FAR) and the false rejection rate (FRR) of a binary biometric-based system rely significantly on a Hamming-distance threshold, which decides whether the errors in a query bit string can be rectified with reference to the template bit string. Kelkboom et al. recently modeled a basic framework for estimating the FAR and the FRR of one-bit biometric discretization. However, as the demand for bit strings with higher entropy (informative length) rises, single-bit discretization is becoming less useful because it cannot produce a bit string longer than the total number of extracted feature dimensions, which restricts the applicability of Kelkboom's model. In this paper, we extend the analytical framework to multibit discretization for estimating the performance and the decision threshold required to achieve a specified FAR/FRR, based on equal-probable quantization and linearly separable subcode (LSSC) encoding. Promising estimation results on a synthetic data set with independent feature components and Gaussian measurements vindicate the analytical expressions of our framework. For experiments on two popular face data sets, however, deviations in the estimation results were observed, mainly because the independence assumption of our framework does not hold. We therefore fit the analytical probability mass functions (pmfs) to the experimental pmfs by estimating the mean and variance parameters from the difference between the corresponding analytical and experimental curves, thereby alleviating the estimation inaccuracies on these data sets.
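
To make the discretization pipeline concrete, the following is a minimal, illustrative sketch of multibit discretization with equal-probable quantization and LSSC encoding, followed by an empirical FAR/FRR sweep over the Hamming-distance threshold. This is not the paper's analytical framework; all parameter values (D, S, sigma_b, sigma_w, the numbers of users and queries) are hypothetical, and the data are synthetic Gaussian measurements with independent feature components, matching the independence assumption discussed above.

```python
# Illustrative sketch only: equal-probable quantization of Gaussian features,
# linearly separable subcode (LSSC) encoding, and empirical FAR/FRR estimation
# at a Hamming-distance threshold. Parameter values below are hypothetical.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
D, S = 16, 4                    # feature dimensions, intervals per dimension
sigma_b, sigma_w = 1.0, 0.3     # between-user / within-user standard deviations

# Equal-probable quantization: interval boundaries are quantiles of the
# background (population) distribution, here N(0, sigma_b^2 + sigma_w^2).
bg_std = np.sqrt(sigma_b**2 + sigma_w**2)
cuts = norm.ppf(np.linspace(0, 1, S + 1)[1:-1], scale=bg_std)  # S-1 inner cut points

def quantize(x):
    """Map each real-valued feature to its interval index in {0, ..., S-1}."""
    return np.searchsorted(cuts, x)

def lssc(indices):
    """LSSC (unary) encoding: index i -> (S-1)-bit string with i leading ones,
    so the Hamming distance between two codewords equals |i - j|."""
    return (np.arange(S - 1) < indices[..., None]).astype(np.uint8)

def hamming(a, b):
    return int(np.sum(a != b))

# Synthetic users: the template comes from the user mean; genuine and impostor
# queries add within-user noise to the same / a different user's mean.
n_users, n_queries = 200, 5
means = rng.normal(0.0, sigma_b, size=(n_users, D))
templates = lssc(quantize(means)).reshape(n_users, -1)

gen_dist, imp_dist = [], []
for u in range(n_users):
    for _ in range(n_queries):
        q_gen = lssc(quantize(means[u] + rng.normal(0, sigma_w, D))).reshape(-1)
        gen_dist.append(hamming(q_gen, templates[u]))
        v = rng.integers(n_users - 1)
        v += (v >= u)  # pick a different user for the impostor comparison
        q_imp = lssc(quantize(means[v] + rng.normal(0, sigma_w, D))).reshape(-1)
        imp_dist.append(hamming(q_imp, templates[u]))

gen_dist, imp_dist = np.array(gen_dist), np.array(imp_dist)
for t in range(0, D * (S - 1) + 1, 4):   # sweep the Hamming-distance threshold
    frr = np.mean(gen_dist > t)          # genuine pairs rejected
    far = np.mean(imp_dist <= t)         # impostor pairs accepted
    print(f"threshold={t:2d}  FRR={frr:.3f}  FAR={far:.3f}")
```

In this sketch the threshold sweep plays the role of the decision threshold discussed in the abstract: the analytical framework predicts the genuine and impostor Hamming-distance pmfs (and hence FRR and FAR at each threshold) without such Monte Carlo simulation.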

  • Publication date: 2012-08