- Title
Data leakage inflates prediction performance in connectome-based machine learning models.
- Authors
Rosenblatt, Matthew; Tejavibulya, Link; Jiang, Rongtao; Noble, Stephanie; Scheinost, Dustin
- Abstract
Predictive modeling is a central technique in neuroimaging to identify brain-behavior relationships and test their generalizability to unseen data. However, data leakage undermines the validity of predictive models by breaching the separation between training and test data. Leakage is always an incorrect practice, yet it remains pervasive in machine learning. Understanding its effects on neuroimaging predictive models can inform how leakage affects the existing literature. Here, we investigate the effects of five forms of leakage (involving feature selection, covariate correction, and dependence between subjects) on functional and structural connectome-based machine learning models across four datasets and three phenotypes. Leakage via feature selection and repeated subjects drastically inflates prediction performance, whereas other forms of leakage have minor effects. Furthermore, small datasets exacerbate the effects of leakage. Overall, our results illustrate the variable effects of leakage and underscore the importance of avoiding data leakage to improve the validity and reproducibility of predictive modeling.

The effects of data leakage on predictive models in neuroimaging studies are not well understood. Here, the authors show that data leakage via feature selection and repeated subjects drastically inflates prediction performance, whereas other forms of leakage have more minor effects.
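The feature-selection leakage described in the abstract can be illustrated with a minimal NumPy sketch (not from the paper; the simulation, sample sizes, and ridge penalty here are illustrative assumptions). On null data with no true brain-behavior signal, selecting features by their correlation with the phenotype across *all* subjects lets information from the test set leak into the model, whereas selecting on the training subjects alone does not:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 5000, 50          # subjects, connectome edges, features kept (illustrative)
X = rng.standard_normal((n, p))  # null data: no true brain-behavior signal
y = rng.standard_normal(n)       # phenotype

train, test = np.arange(60), np.arange(60, 100)

def top_k_corr(Xs, ys, k):
    # rank features by |correlation| with the phenotype
    Xc, yc = Xs - Xs.mean(0), ys - ys.mean()
    r = Xc.T @ yc / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12)
    return np.argsort(-np.abs(r))[:k]

def fit_eval(feats):
    # ridge regression fit on training subjects, scored on held-out subjects
    Xtr, Xte = X[train][:, feats], X[test][:, feats]
    w = np.linalg.solve(Xtr.T @ Xtr + 10.0 * np.eye(len(feats)), Xtr.T @ y[train])
    return np.corrcoef(Xte @ w, y[test])[0, 1]

r_leaky = fit_eval(top_k_corr(X, y, k))                  # selection saw the test subjects
r_correct = fit_eval(top_k_corr(X[train], y[train], k))  # selection on training only

print(f"leaky r = {r_leaky:.2f}, correct r = {r_correct:.2f}")
```

Because the leaky selection step has already seen the test subjects' phenotypes, the leaky model reports a spuriously positive test correlation even though the data contain no signal, while the correct pipeline hovers near zero.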
- Publication
Nature Communications, 2024, Vol 15, Issue 1, p1
- ISSN
2041-1723
- Publication type
Article
- DOI
10.1038/s41467-024-46150-w