Publications

Anticipating multisensory environments: Evidence for a supra-modal predictive system

Authors:

Abstract

Our perceptual experience generally unfolds in multisensory environments abundant in predictive information. Previous research on statistical learning has shown that humans can learn regularities in different sensory modalities in parallel, but it has not yet been determined whether multisensory predictions are generated by modality-specific predictive mechanisms or instead rely on a supra-modal predictive system. Here, across two experiments, we tested these hypotheses by presenting participants with concurrent pairs of predictable low-level auditory and visual stimuli (i.e., tones and gratings). In different experimental blocks, participants had to attend to the stimuli in one modality while ignoring stimuli from the other sensory modality (distractors), and perform a perceptual discrimination task on the second stimulus of the attended modality (targets). Orthogonally to the task goal, both the attended and unattended pairs followed transitional probabilities, so that targets and distractors could be expected or unexpected. We found that participants performed better for expected than for unexpected targets. This effect generalized to the distractors, but only when the relevant targets were expected. Such interactive effects suggest that predictions may be gated by a supra-modal system whose shared resources are distributed across sensory modalities according to their respective behavioural relevance.