- Title
Cross-modal associations between vision, touch, and audition influence visual search through top-down attention, not bottom-up capture.
- Authors
Orchard-Mills, Emily; Alais, David; Van der Burg, Erik
- Abstract
Recently, Guzman-Martinez, Ortega, Grabowecky, Mossbridge, and Suzuki (Current Biology: CB, 22(5), 383–388, 2012) reported that observers could systematically match auditory amplitude modulations and tactile amplitude modulations to visual spatial frequencies, proposing that these cross-modal matches produced automatic attentional effects. Using a series of visual search tasks, we investigated whether informative auditory, tactile, or bimodal cues can guide attention toward a visual Gabor of matched spatial frequency (among others with different spatial frequencies). These cues improved visual search for some but not all frequencies. Auditory cues improved search only for the lowest and highest spatial frequencies, whereas tactile cues were more effective and frequency specific, although less effective than visual cues. Importantly, although tactile cues could produce efficient search when informative, they had no effect when uninformative. This suggests that cross-modal frequency matching occurs at a cognitive rather than sensory level and, therefore, influences visual search through voluntary, goal-directed behavior, rather than automatic attentional capture.
- Subjects
VISUAL perception; INFLUENCE; ATTENTION; COGNITIVE ability; ACTION theory (Psychology); AUDITORY selective attention
- Publication
Attention, Perception & Psychophysics, 2013, Vol. 75, Issue 8, p. 1892
- ISSN
1943-3921
- Publication type
Article
- DOI
10.3758/s13414-013-0535-9