Whole brain activation clusters during olfactory-visual integration (odor identification)

Contributed by schumannk on May 30, 2023

Collection: Elevated insulin levels engage the salience network during multisensory perception

Description: Subjects performed an odor identification task. Odor stimuli were applied birhinally for 5.2 s with continuous airflow (3.0 L/min) using a computer-controlled olfactometer. The odors were applied with the respiration-triggered method RETROS (Hoffmann-Hensel et al., 2016). Aqua conservata served as the baseline to control for sniffing artifacts. Two food odors, ‘apple’ and ‘strawberry’, and two nonfood odors, ‘wood’ and ‘grass’, were presented along with four odor-matching pictures. During stimulus presentation, participants pressed response buttons with the right hand to identify the odor category (food, nonfood, control). The stimuli were arranged in a 3x3 factorial design by category (food, nonfood, control), yielding unimodal-visual, unimodal-olfactory, bimodal-congruent (matching semantic object information, e.g., apple odor with apple picture), bimodal-incongruent (non-matching), and baseline conditions. Incongruent pairings within the same category (e.g., apple and strawberry) were not used. During the baseline condition, participants received neither odor nor picture stimulation: they viewed a blank screen and received aqua conservata through the nasal tubes. We collapsed the congruent and incongruent conditions into a single bimodal condition, as we found no difference between them. This contrast shows olfactory-visual integration [bimodal - (unimodal olfactory + unimodal visual + baseline)].
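
As a point of orientation, the stated contrast can be written as a weight vector over the four collapsed conditions. The sketch below (Python/NumPy) uses hypothetical regressor names and ordering; the authors' actual first-level design matrix is not part of this record.

```python
import numpy as np

# Hypothetical regressor names/ordering; not taken from the authors' design matrix.
conditions = ["bimodal", "unimodal_olfactory", "unimodal_visual", "baseline"]

# Literal reading of the stated contrast:
# bimodal - (unimodal olfactory + unimodal visual + baseline)
weights_literal = np.array([1, -1, -1, -1])

# Balanced (zero-sum) variant commonly used in practice,
# testing bimodal > mean of the other three conditions.
weights_balanced = np.array([3, -1, -1, -1])

for name, w in zip(conditions, weights_balanced):
    print(f"{name}: {w:+d}")
```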

Tags: odor identification, olfactory-visual integration

Citation guidelines

If you use these data, please include the following persistent identifier in the text of your manuscript:

https://identifiers.org/neurovault.image:797876

This will help to track the use of these data in the literature.
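
If you want to work with the map programmatically, it can be retrieved by its NeuroVault image ID using nilearn. A minimal sketch (the ID 797876 is taken from the identifier above; the plot title is illustrative):

```python
from nilearn.datasets import fetch_neurovault_ids
from nilearn import plotting

# Download NeuroVault image 797876 (the statistical map described above).
data = fetch_neurovault_ids(image_ids=[797876])
img_path = data.images[0]

# Quick glass-brain rendering of the downloaded map.
plotting.plot_glass_brain(img_path, title="Olfactory-visual integration")
plotting.show()
```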