
Machine translation of cortical activity to text with an encoder-decoder framework

By Joseph G. Makin, David A. Moses, Edward F. Chang

Posted 22 Jul 2019
bioRxiv DOI: 10.1101/708206 (published DOI: 10.1038/s41593-020-0608-8)

A decade after the first successful attempt to decode speech directly from human brain signals, accuracy and speed remain far below those of natural speech or typing. Here we show how to achieve high accuracy from the electrocorticogram at natural-speech rates, even with little data (on the order of half an hour of spoken speech). Taking a cue from recent advances in machine translation and automatic speech recognition, we train a recurrent neural network to map neural signals directly to word sequences (sentences). In particular, the network first encodes a sentence-length sequence of neural activity into an abstract representation, and then decodes this representation, word by word, into an English sentence. For each participant, training data consist of several spoken repetitions of a set of some 30-50 sentences, along with the corresponding neural signals at each of about 250 electrodes distributed over peri-Sylvian speech cortices. Average word error rates across a validation (held-out) sentence set are as low as 7% for some participants, compared to the previous state of the art of greater than 60%. Finally, we show how to use transfer learning to overcome limitations on data availability: training certain components of the network on multiple participants' data, while keeping other components (e.g., the first hidden layer) "proprietary," can improve decoding performance, despite very different electrode coverage across participants.
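The encoder-decoder idea in the abstract can be sketched in a few lines: a recurrent encoder folds a sentence-length sequence of per-electrode neural features into a single state vector, and a recurrent decoder unrolls that state, one word at a time, until an end-of-sentence token. Everything below is an illustrative assumption (random untrained weights, a plain tanh RNN cell, a toy vocabulary, made-up sizes), not the authors' actual architecture or data.

```python
# Hypothetical sketch of a sequence-to-sequence decoder for neural signals.
# Untrained random weights; the real system trains such a network on ECoG.
import math
import random

random.seed(0)

def rand_matrix(rows, cols, scale=0.1):
    return [[random.uniform(-scale, scale) for _ in range(cols)]
            for _ in range(rows)]

def matvec(m, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def tanh_vec(v):
    return [math.tanh(x) for x in v]

class Seq2SeqSketch:
    def __init__(self, n_electrodes, n_hidden, vocab):
        self.vocab = vocab
        self.W_in = rand_matrix(n_hidden, n_electrodes)  # encoder input weights
        self.W_enc = rand_matrix(n_hidden, n_hidden)     # encoder recurrence
        self.W_dec = rand_matrix(n_hidden, n_hidden)     # decoder recurrence
        self.W_out = rand_matrix(len(vocab), n_hidden)   # hidden -> vocab scores

    def encode(self, neural_seq):
        """Fold a (time x electrodes) sequence into one abstract state vector."""
        h = [0.0] * len(self.W_enc)
        for frame in neural_seq:
            h = tanh_vec(add(matvec(self.W_in, frame), matvec(self.W_enc, h)))
        return h

    def decode(self, h, max_words=10):
        """Greedily unroll the encoded state into words, stopping at <eos>."""
        words = []
        for _ in range(max_words):
            scores = matvec(self.W_out, h)
            word = self.vocab[scores.index(max(scores))]
            if word == "<eos>":
                break
            words.append(word)
            h = tanh_vec(matvec(self.W_dec, h))
        return words

# Toy vocabulary and fake "neural activity": 5 time frames over 4 electrodes.
vocab = ["<eos>", "the", "cat", "sat"]
model = Seq2SeqSketch(n_electrodes=4, n_hidden=8, vocab=vocab)
neural_seq = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(5)]
sentence = model.decode(model.encode(neural_seq))
print(sentence)  # an (untrained, hence arbitrary) word sequence
```

The paper's transfer-learning result maps naturally onto this split: the input layer (here `W_in`) stays participant-specific because electrode coverage differs, while downstream components can be shared across participants.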

