Abstract

A recent study by Guzman-Martinez, Ortega, Grabowecky, Mossbridge, and Suzuki (2012) showed that participants match the frequency of an amplitude-modulated auditory stimulus to visual spatial frequency with a linear relationship and suggested that this crossmodal matching guides attention to specific spatial frequencies. Here, we replicated this matching relationship and used a visual search paradigm to investigate whether auditory signals guide attention to matched visual spatial frequencies. Participants were presented with a search display of Gabor patches, each with a different spatial frequency. When the auditory signal was informative, search efficiency improved for some spatial frequencies; when it was uninformative, a matched auditory signal had no effect on visual search performance. Moreover, search benefits were also observed when the auditory signal was informative but did not match the target spatial frequency. Together, these findings suggest that an amplitude-modulated auditory signal can influence visual selection of a matched spatial frequency, but that the effect is due to top-down knowledge rather than to automatic attentional capture derived from low-level crossmodal mapping.

  • Publication date: 2013