Evidence for a deep, distributed and dynamic semantic code in human ventral anterior temporal cortex

By Timothy T. Rogers, Christopher Cox, Qihong Lu, Akihiro Shimotake, Takayuki Kikuchi, Takeharu Kunieda, Susumu Miyamoto, Ryosuke Takahashi, Akio Ikeda, Riki Matsumoto, Matthew A. Lambon Ralph

Posted 09 Jul 2019
bioRxiv DOI: 10.1101/695049

How does the human brain encode semantic information about objects? This paper reconciles two seemingly contradictory views. The first proposes that local neural populations independently encode semantic features; the second, that semantic representations arise as a dynamic distributed code that changes radically with stimulus processing. Combining simulations with a well-known neural network model of semantic memory, multivariate pattern classification, and human electrocorticography, we find that both views are partially correct: semantic information is distributed across ventral temporal cortex in a dynamic code that possesses stable, feature-like elements in posterior regions but elements that change rapidly and nonlinearly in anterior regions. This pattern is consistent with the view that the anterior temporal lobes serve as a deep cross-modal "hub" in an interactive semantic network, and more generally suggests that tertiary association cortices may adopt dynamic distributed codes that are difficult to detect with common brain imaging methods.
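
The multivariate pattern classification mentioned above can be illustrated with a minimal, hypothetical sketch: a time-resolved decoder applied to simulated multichannel recordings. The array shapes, labels, classifier, and window size below are illustrative assumptions, not the authors' actual pipeline.

    # Hypothetical sketch of time-resolved multivariate pattern classification
    # (MVPA) on ECoG-like data. All shapes, labels, and parameters are
    # illustrative assumptions, not the authors' pipeline.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Simulated recordings: n_trials x n_channels x n_timepoints.
    n_trials, n_channels, n_timepoints = 120, 32, 200
    X = rng.standard_normal((n_trials, n_channels, n_timepoints))
    y = rng.integers(0, 2, size=n_trials)  # e.g., living vs. nonliving stimuli

    # Decode the category separately in each time window to see when semantic
    # information becomes linearly readable from the voltage patterns.
    window = 10  # samples per sliding window
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    scores = []
    for start in range(0, n_timepoints - window + 1, window):
        # Average within the window, so features are one value per channel.
        features = X[:, :, start:start + window].mean(axis=2)
        scores.append(cross_val_score(clf, features, y, cv=5).mean())

    print(np.round(scores, 2))  # chance is ~0.5 for the random data above

With real recordings, windows in which accuracy reliably exceeds chance indicate when category information can be read out from the electrode patterns; the random data above should hover around chance.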

Download data

  • Downloaded 696 times
  • Download rankings:
    • All-time, site-wide: 35,255
    • All-time, in neuroscience: 4,889
    • Year to date, site-wide: 21,194
    • Since beginning of last month, site-wide: 23,161
