Perceptual learning shapes multisensory causal inference via two distinct mechanisms

Crossmodal Multisensory Integration
DOI: 10.1038/srep24673
Publication Date: 2016-04-19
ABSTRACT
To accurately represent the environment, our brains must integrate sensory signals from a common source while segregating those from independent sources. A reasonable strategy for performing this task is to restrict integration to cues that coincide in space and time. However, because multisensory signals are subject to differential transmission and processing delays, the brain must retain a degree of tolerance for temporal discrepancies. Recent research suggests that the width of this 'temporal binding window' can be reduced through perceptual learning; however, little is known about the mechanisms underlying these experience-dependent effects. Here, in separate experiments, we measure the temporal and spatial binding windows of human participants before and after training on an audiovisual temporal discrimination task. We show that training leads to two distinct effects in the form of (i) a specific narrowing of the temporal binding window that does not transfer to spatial binding and (ii) a general reduction in the magnitude of crossmodal interactions across all spatiotemporal disparities. These effects arise naturally from a Bayesian model of causal inference in which learning improves the precision of timing estimation, whilst concomitantly decreasing the prior expectation that stimuli emanate from a common source.
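The two learning effects described above can be illustrated with a minimal Bayesian causal-inference sketch. This is not the authors' model or code; it is a toy Python example, with all parameter values chosen for illustration, showing how a common-cause posterior depends on both sensory precision (sigma, the timing-estimation noise) and the prior probability of a common source. Sharpening the precision narrows the window over disparities; lowering the prior reduces the inferred interaction at every disparity.

```python
import math

def p_common_cause(delta_ms, sigma_ms, prior_common, uniform_range_ms=1000.0):
    """Posterior probability that auditory and visual signals share a common
    source, given a measured audiovisual asynchrony delta_ms.

    Common cause (C=1): asynchrony ~ zero-mean Gaussian with width sigma_ms,
    reflecting timing-estimation noise.
    Independent causes (C=2): asynchrony ~ uniform over +/- uniform_range_ms/2.
    (Illustrative generative assumptions, not the paper's fitted model.)
    """
    like_common = math.exp(-0.5 * (delta_ms / sigma_ms) ** 2) / (
        sigma_ms * math.sqrt(2.0 * math.pi))
    like_indep = 1.0 / uniform_range_ms
    numerator = like_common * prior_common
    return numerator / (numerator + like_indep * (1.0 - prior_common))

disparities = (0.0, 100.0, 300.0)  # audiovisual asynchronies in ms

# Pre-training: noisy timing estimates, strong common-source prior.
pre = [p_common_cause(d, sigma_ms=120.0, prior_common=0.8) for d in disparities]

# Post-training: (i) sharper timing precision narrows the binding window;
# (ii) a weaker prior lowers crossmodal interaction at all disparities.
post = [p_common_cause(d, sigma_ms=60.0, prior_common=0.6) for d in disparities]
```

With these illustrative numbers, the post-training posterior falls off much more steeply with asynchrony (the narrowed window) and is lower than the pre-training posterior even at zero disparity (the overall reduction driven by the weaker prior).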