
Alexis Pérez – Brainvitge seminars

22/03/2021 · 12:00 - 13:30

Alexis Pérez, from the Cognition and Brain Plasticity Unit (Dynamics of memory formation group), will present the talk titled:

Seeing sounds: Brain mechanisms underlying auditory contributions to visual detection

Abstract

How auditory information interacts with visual detection is a recurrent question in visual neuroscience. Whereas some studies propose that sounds interact automatically with incoming visual input, others instead claim that audiovisual interactions depend on top-down controlled processes such as attention. In this study, we recorded magnetoencephalography (MEG) data while participants performed a visual detection task (where the audiovisual events were task-relevant) or a working memory task (where the audiovisual events were task-irrelevant). We trained multivariate pattern analysis classifiers and tested them at different time points to characterize how auditory information shaped visual stimulus representations over time in each task. Our results showed that sounds interact with visual detection via two different mechanisms. First, a mechanism by which observers actively used the auditory stimulus to orient their attention to the target onset, maintaining a stable representation of the visual stimulus throughout the trial. This mechanism allowed participants to improve their visual sensitivity, and it was not automatic, as it required participants to attend to the audiovisual signals. Second, a mechanism by which sounds elicited a neural response pattern akin to the one evoked by an actual visual stimulus. This latter mechanism was associated with an increase in false alarms and is automatic, since it was independent of participants' attention to the audiovisual signals.
This work sheds light on a classic debate regarding the automaticity of auditory-dependent modulations of visual detection by showing that 1) sounds improve visual detection sensitivity via a top-down controlled mechanism; and 2) changes in criterion (i.e., the signal detection theory parameter) due to sound presentation in visual detection tasks might not merely reflect decisional biases. Instead, our results suggest that sounds automatically evoke neural activity patterns that the brain could interpret as a veridical visual stimulus.
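The decoding approach described above, training classifiers at one time point and testing them at others, is commonly known as temporal generalization. As a rough illustration (not the authors' actual pipeline), the sketch below runs it on simulated MEG-like data with scikit-learn; the data shapes, the injected signal window, and the use of logistic regression are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated MEG-like data: trials x sensors x time points (hypothetical sizes).
n_trials, n_sensors, n_times = 200, 30, 20
X = rng.normal(size=(n_trials, n_sensors, n_times))
y = rng.integers(0, 2, size=n_trials)  # e.g. stimulus present vs. absent

# Inject a class-dependent signal on a few sensors in a mid-trial time
# window so the classifier has something to decode (purely illustrative).
X[y == 1, :5, 8:14] += 1.0

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0, stratify=y)

# Temporal generalization: fit a classifier at each training time point,
# then score it at every test time point.
scores = np.zeros((n_times, n_times))
for t_train in range(n_times):
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_train[:, :, t_train], y_train)
    for t_test in range(n_times):
        scores[t_train, t_test] = clf.score(X_test[:, :, t_test], y_test)

# scores[i, j] is decoding accuracy when training at time i, testing at
# time j; off-diagonal structure indicates a stable (generalizing) code.
```

A sustained square of above-chance accuracy in `scores` would correspond to the stable stimulus representation described in the abstract, whereas a narrow diagonal would indicate a rapidly changing neural code.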

Location: Online (Microsoft Teams).
