Description: Jakub Limanowski and Karl Friston
Abstract: To control our actions efficiently, our brain represents our body based on a combination of visual and proprioceptive cues, weighted according to how (un)reliable, i.e., how precise, each respective modality is in a given context. However, perceptual experiments in other modalities suggest that the weights assigned to sensory cues are also modulated 'top-down' by attention. Here, we asked whether attention can modulate the weights (i.e., the precision) assigned to visual vs proprioceptive information about body position during action. Participants controlled a virtual hand via a data glove, matching either the virtual or their (unseen) real hand movements to a target, under varying levels of visuo-proprioceptive congruence and visibility. Functional magnetic resonance imaging (fMRI) revealed increased activity in the superior parietal lobe (SPL) during the virtual hand task and increased activity in the secondary somatosensory cortex (S2) during the real hand task. Dynamic causal modelling (DCM) showed that these activity changes resulted from selective gain modulations in the primary visual cortex (V1) and S2. These results imply that attentional set diametrically changed the gain of visual vs proprioceptive brain areas, thus contextualising their influence on multisensory areas representing the body for action.
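For readers unfamiliar with the precision-weighting idea referred to in the abstract, the following is a minimal, illustrative sketch of inverse-variance (precision-weighted) combination of a visual and a proprioceptive position cue, with attention modelled as a gain on one modality's precision. The function name, cue values, and gain factor are assumptions made for illustration only; they are not parameters, models, or code from the study.

def combine_cues(x_vis, var_vis, x_prop, var_prop):
    """Precision-weighted (inverse-variance) combination of a visual and a
    proprioceptive estimate of hand position."""
    prec_vis, prec_prop = 1.0 / var_vis, 1.0 / var_prop
    w_vis = prec_vis / (prec_vis + prec_prop)        # weight on the visual cue
    x_hat = w_vis * x_vis + (1.0 - w_vis) * x_prop   # combined position estimate
    var_hat = 1.0 / (prec_vis + prec_prop)           # combined (reduced) variance
    return x_hat, var_hat

# Hypothetical hand-position cues (arbitrary units) and noise levels.
x_vis, x_prop = 10.0, 12.0
var_vis, var_prop = 1.0, 4.0

# Baseline: vision is more precise, so the combined estimate lies closer
# to the visual cue (here 10.4).
print(combine_cues(x_vis, var_vis, x_prop, var_prop))

# Attending to the (unseen) real hand is modelled here, purely for
# illustration, as a gain that boosts proprioceptive precision, which
# shifts the combined estimate toward the proprioceptive cue (here 11.0).
attention_gain = 4.0
print(combine_cues(x_vis, var_vis, x_prop, var_prop / attention_gain))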
Related article: http://doi.org/10.1093/cercor/bhz192
If you use the data from this collection, please include the following persistent identifier in the text of your manuscript:
https://identifiers.org/neurovault.collection:4868
This will help to track the use of these data in the literature. Please also consider citing the paper related to this collection.