
Adding Extra Knowledge in Scalable Learning of Sparse Differential Gaussian Graphical Models

By Arshdeep Sekhon, Beilun Wang, Yanjun Qi

Posted 28 Jul 2019
bioRxiv DOI: 10.1101/716852

We focus on integrating different types of extra knowledge (beyond the observed samples) for estimating the sparse structure change between two p-dimensional Gaussian Graphical Models (i.e., differential GGMs). Previous differential GGM estimators either fail to include additional knowledge or cannot scale up to high-dimensional (large p) situations. This paper proposes a novel method, KDiffNet, that incorporates Additional Knowledge in identifying Differential Networks via an Elementary Estimator. We design a novel hybrid norm as a superposition of two structured norms, one guided by extra edge information and the other by additional node-group knowledge. KDiffNet is solved through a fast parallel proximal algorithm, enabling it to work in large-scale settings. KDiffNet can incorporate various combinations of existing knowledge without re-designing the optimization. Through rigorous statistical analysis, we show that, while considering more evidence, KDiffNet achieves the same convergence rate as the state-of-the-art. Empirically, on multiple synthetic datasets and one real-world fMRI brain dataset, KDiffNet significantly outperforms the cutting-edge baselines in prediction performance, while achieving the same time cost or less.
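The hybrid norm described above superposes two structured penalties: a weighted edge penalty from extra edge knowledge and a group penalty from node-group knowledge. A parallel proximal algorithm handles each penalty through its own proximal operator. The sketch below illustrates those two operators on a toy difference-of-precision matrix; it is a minimal illustration, not the paper's actual objective or solver, and the weight matrix `W` and the `groups` index sets are hypothetical stand-ins for the extra knowledge.

```python
import numpy as np

def prox_weighted_l1(Delta, W, step):
    """Weighted soft-thresholding: prox of step * sum_ij W_ij * |Delta_ij|.
    W encodes hypothetical edge knowledge (larger weight -> stronger shrinkage)."""
    return np.sign(Delta) * np.maximum(np.abs(Delta) - step * W, 0.0)

def prox_group_l2(Delta, groups, step):
    """Group soft-thresholding: prox of step * sum_g ||Delta_g||_2, where each
    group g is a set of (i, j) entries taken from node-group knowledge."""
    out = Delta.copy()
    for idx in groups:
        rows, cols = zip(*idx)
        block = Delta[rows, cols]
        norm = np.linalg.norm(block)
        # Shrink the whole group's l2 norm; zero it out if below the threshold
        scale = max(1.0 - step / norm, 0.0) if norm > 0 else 0.0
        out[rows, cols] = scale * block
    return out

# Toy 4x4 symmetric candidate for the difference of two precision matrices
rng = np.random.default_rng(0)
Delta = rng.normal(size=(4, 4))
Delta = (Delta + Delta.T) / 2.0

W = np.ones((4, 4))                              # uniform edge weights (no prior preference)
groups = [[(0, 1), (1, 0)], [(2, 3), (3, 2)]]    # hypothetical node-group edge sets

Delta_e = prox_weighted_l1(Delta, W, step=0.1)   # edge-knowledge component
Delta_g = prox_group_l2(Delta, groups, step=0.1) # group-knowledge component
```

Because both operators are entrywise or blockwise, they can be evaluated independently and in parallel, which is what makes a proximal scheme of this kind attractive at large p.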

Download data

  • Downloaded 197 times
  • Download rankings, all-time:
    • Site-wide: 69,583 out of 89,036
    • In bioinformatics: 7,104 out of 8,413
  • Year to date:
    • Site-wide: 67,308 out of 89,036
  • Since beginning of last month:
    • Site-wide: 70,277 out of 89,036

