Towards standard practices for sharing computer code and programs in neuroscience

By Stephen J. Eglen, Ben Marwick, Yaroslav O. Halchenko, Michael Hanke, Shoaib Sufi, Padraig Gleeson, R. Angus Silver, Andrew P. Davison, Linda Lanyon, Mathew Abrams, Thomas Wachtler, David J. Willshaw, Christophe Pouzat, Jean-Baptiste Poline

Posted 24 Mar 2016
bioRxiv DOI: 10.1101/045104 (published DOI: 10.1038/nn.4550)

Many areas of neuroscience are now critically dependent on computational tools to help understand the large volumes of data being created. Furthermore, computer models are increasingly being used to help predict and understand the function of the nervous system. Many of these computations are complex and often cannot be concisely reported in the methods section of a scientific article. In a few areas there are widely used software packages for analysis (e.g., SPM, FSL, AFNI, BrainVoyager, FreeSurfer in neuroimaging) or simulation (e.g., NEURON, NEST, Brian). However, we often write new computer programs to solve specific problems in the course of our research. Some of these programs may be relatively small scripts that help analyse our data, and these rarely get described in papers. As authors, how best can we maximise the chances that other scientists can reproduce our computations or reuse our methods on their data? Is our research reproducible? Our article lists practical suggestions to maximise the reproducibility of our work.

Download data

  • Downloaded 2,591 times
  • Download rankings, all-time:
    • Site-wide: 8,098
    • In neuroscience: 674
  • Download rankings, year to date:
    • Site-wide: 110,221
  • Download rankings, since beginning of last month:
    • Site-wide: 129,012
