Previous studies have shown that the conceptual representation of food involves brain regions associated with taste perception. The specificity of this response, however, is unknown. Does viewing pictures of food produce a general, non-specific response in taste-sensitive regions of the brain, or is the response specific to how a particular food tastes? Building on recent findings that specific tastes can be decoded from taste-sensitive regions of insular cortex, we asked whether viewing pictures of foods associated with a specific taste (e.g., sweet, salty, sour) can also be decoded from these same regions and, if so, whether the patterns of neural activity elicited by the pictures and their associated tastes are similar. Using ultra-high-resolution functional magnetic resonance imaging at high magnetic field strength (7 Tesla), we were able to decode specific tastes delivered during scanning, as well as the specific taste category associated with food pictures, within the dorsal mid-insula, a primary taste-responsive region of the brain. Thus, merely viewing food pictures triggers an automatic retrieval, within gustatory cortex, of specific taste-quality information associated with the depicted foods. However, the patterns of activity elicited by the pictures and their associated tastes were unrelated, suggesting a clear neural distinction between inferred and directly experienced sensory events. These data show how higher-order inferences derived from stimuli in one modality (i.e., vision) can be represented in brain regions typically thought to represent only low-level information about a different modality (i.e., taste).

### Competing Interest Statement

The authors have declared no competing interest.