Abstract

This paper reports on a series of user experiments evaluating the design of a multimodal test platform capable of rendering visual, audio, vibrotactile, and directional skin-stretch stimuli. The test platform is a handheld, wirelessly controlled device intended to facilitate experiments with mobile users in realistic environments. Stimuli rendered by the device are fully characterized and exhibit little variance in stimulus onset timing. A series of user experiments employing navigational cues validates the function of the device and investigates users' responses to all stimulus modes; the tests included both stationary (seated) and mobile (walking a simple obstacle course) tasks. Results show that users can interpret all stimuli with high accuracy and can use the directional cues for mobile navigation, with similar accuracy and response-time patterns in the seated and mobile conditions. The device provides a means of designing and evaluating multimodal communication methods for handheld devices and will facilitate experiments investigating the effects of stimulus mode on device usability and situation awareness.

  • Publication date: March 2012