Abstract

In this paper, we design a robust and friendly human-robot interface (HRI) system for our intelligent mobile robot based only on natural human gestures. It consists of a triple-face detection method and a fuzzy logic controller (FLC)-Kalman filter tracking system that detects users and predicts their current positions in a dynamic, cluttered working environment. In addition, a combined classifier based on principal component analysis (PCA) and a back-propagation artificial neural network (BPANN) identifies single and successive commands defined by facial positions and hand gestures, with dynamic programming (DP) applied for real-time command recognition. Users can therefore instruct the HRI system to perform member recognition or expression recognition in response to their gesture commands, based on linear discriminant analysis (LDA) and the BPANN, respectively. The experimental results show that the proposed HRI system performs accurate real-time face detection and tracking and reacts robustly to the corresponding gesture commands at eight frames per second (fps).
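The abstract does not detail the FLC-Kalman filter tracker, so the following is only a minimal sketch of a constant-velocity Kalman filter over a 2-D face centre, written in Python with NumPy. The class name, the dt value derived from the reported 8 fps, and the residual-based scaling of the measurement noise (standing in for the paper's fuzzy logic controller) are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

class FaceTracker:
    """Constant-velocity Kalman filter over a 2-D face-centre position.

    The paper's fuzzy logic controller is not reproduced here; a simple
    residual-based inflation of the measurement noise stands in for it.
    """

    def __init__(self, dt=1.0 / 8.0):            # assumed ~8 fps frame interval
        self.x = np.zeros(4)                      # state: [px, py, vx, vy]
        self.P = np.eye(4) * 100.0                # state covariance
        self.F = np.eye(4)                        # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))                 # measure position only
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * 0.01                 # process noise (assumed)
        self.R0 = np.eye(2) * 4.0                 # base measurement noise (assumed)

    def predict(self):
        """Predict the face centre for the next frame."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        """Correct the prediction with a detected face centre (px, py)."""
        z = np.asarray(z, dtype=float)
        y = z - self.H @ self.x                   # innovation (residual)
        # Placeholder for the FLC: trust large-residual detections less.
        R = self.R0 * (1.0 + np.linalg.norm(y) / 20.0)
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

A typical per-frame loop would call `tracker.predict()` to obtain the search region for the triple-face detector and then `tracker.update((cx, cy))` with the detected face centre.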

  • Publication date: 2010-09