Abstract

In this paper, a framework for implicit human-centered tagging is presented. The proposed framework draws its inspiration from the psychologically established process of attribution, which explains affect-related changes observed during an individual's participation in an emotional episode by bestowing the corresponding affect-changing properties on a selected perceived stimulus. Our framework attempts to reverse-engineer this attribution process: by monitoring the annotator's focus of attention through gaze tracking, we identify the stimulus attributed as the cause of the observed change in core affect, which is in turn inferred from the user's facial expressions. Experimental results obtained with a lightweight, cost-efficient application built on the proposed framework show promising accuracy in both topical-relevance assessment and direct annotation scenarios. These results are especially encouraging given that the behavioral analyzers used to obtain the user's affective response and eye gaze lack the sophistication and high cost usually encountered in the related literature.
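To make the attribution idea concrete, the sketch below illustrates one plausible way the two behavioral streams could be combined: a change in core affect (valence-arousal estimates derived from facial-expression analysis) is detected, and the implicit tag is assigned to the stimulus that the gaze tracker reports as fixated just before that change. All names, data structures, and thresholds here (`GazeSample`, `AffectSample`, `detect_affect_change`, `max_lag`, etc.) are illustrative assumptions, not the paper's actual analyzers or attribution model.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical data records; field names are illustrative, not from the paper.
@dataclass
class GazeSample:
    t: float          # timestamp in seconds
    stimulus_id: str  # identifier of the on-screen item being fixated

@dataclass
class AffectSample:
    t: float          # timestamp in seconds
    valence: float    # core-affect valence in [-1, 1], e.g. from facial expression analysis
    arousal: float    # core-affect arousal in [-1, 1]

def detect_affect_change(affect: List[AffectSample],
                         threshold: float = 0.3) -> Optional[AffectSample]:
    """Return the first sample at which core affect shifts by more than
    `threshold` (Euclidean distance in valence-arousal space) relative to
    the previous sample; None if no such shift occurs."""
    for prev, curr in zip(affect, affect[1:]):
        dv = curr.valence - prev.valence
        da = curr.arousal - prev.arousal
        if (dv * dv + da * da) ** 0.5 > threshold:
            return curr
    return None

def attribute_stimulus(gaze: List[GazeSample], change_time: float,
                       max_lag: float = 1.0) -> Optional[str]:
    """Pick the stimulus fixated most recently before (or at) the affect
    change, within `max_lag` seconds, as the attributed cause."""
    candidates = [g for g in gaze
                  if g.t <= change_time and change_time - g.t <= max_lag]
    return candidates[-1].stimulus_id if candidates else None

def implicit_tag(gaze: List[GazeSample],
                 affect: List[AffectSample]) -> Optional[Tuple[str, float, float]]:
    """Combine the two streams into an implicit tag
    (stimulus_id, valence, arousal), or None if no attributable change."""
    change = detect_affect_change(affect)
    if change is None:
        return None
    stimulus = attribute_stimulus(gaze, change.t)
    if stimulus is None:
        return None
    return (stimulus, change.valence, change.arousal)

if __name__ == "__main__":
    gaze = [GazeSample(0.0, "img_01"), GazeSample(0.8, "img_02"),
            GazeSample(1.6, "img_03")]
    affect = [AffectSample(0.0, 0.0, 0.0), AffectSample(0.9, 0.05, 0.0),
              AffectSample(1.8, 0.6, 0.4)]  # clear positive shift while viewing img_03
    print(implicit_tag(gaze, affect))  # -> ('img_03', 0.6, 0.4)
```

In a topical-relevance scenario, the resulting tag could be read as "the user's affect shifted while attending to `img_03`", with the sign of the valence change serving as a lightweight relevance cue; the paper's actual pipeline may differ substantially from this simplified attribution rule.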

  • Publication date: 2014-10
