- Title
Uncertainty-aware deep learning in healthcare: A scoping review.
- Authors
Loftus, Tyler J.; Shickel, Benjamin; Ruppert, Matthew M.; Balch, Jeremy A.; Ozrazgat-Baslanti, Tezcan; Tighe, Patrick J.; Efron, Philip A.; Hogan, William R.; Rashidi, Parisa; Upchurch Jr., Gilbert R.; Bihorac, Azra
- Abstract
Mistrust is a major barrier to implementing deep learning in healthcare settings. Entrustment could be earned by conveying model certainty, or the probability that a given model output is accurate, but the use of uncertainty estimation for deep learning entrustment is largely unexplored, and there is no consensus regarding optimal methods for quantifying uncertainty. Our purpose is to critically evaluate methods for quantifying uncertainty in deep learning for healthcare applications and propose a conceptual framework for specifying certainty of deep learning predictions. We searched Embase, MEDLINE, and PubMed databases for articles relevant to study objectives, complying with PRISMA guidelines, rated study quality using validated tools, and extracted data according to modified CHARMS criteria. Among 30 included studies, 24 described medical imaging applications. All imaging model architectures used convolutional neural networks or a variation thereof. The predominant method for quantifying uncertainty was Monte Carlo dropout, producing predictions from multiple networks for which different neurons have dropped out and measuring variance across the distribution of resulting predictions. Conformal prediction offered similar strong performance in estimating uncertainty, along with ease of interpretation and application not only to deep learning but also to other machine learning approaches. Among the six articles describing non-imaging applications, model architectures and uncertainty estimation methods were heterogeneous, but predictive performance was generally strong, and uncertainty estimation was effective in comparing modeling methods. Overall, the use of model learning curves to quantify epistemic uncertainty (attributable to model parameters) was sparse. Heterogeneity in reporting methods precluded the performance of a meta-analysis. 
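The abstract's predominant method, Monte Carlo dropout, can be illustrated with a minimal NumPy sketch: dropout is kept active at inference time, the same input is passed through the network many times with different neurons randomly dropped, and the variance of the resulting predictions serves as an uncertainty estimate. The toy network, fixed random weights, dropout rate, and number of passes below are illustrative assumptions, not parameters drawn from the reviewed studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer regression network with fixed random weights
# (illustration only; real applications would use a trained model).
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 1))

def forward(x, drop_rate=0.5):
    """One stochastic forward pass with inverted dropout on the hidden layer."""
    h = np.maximum(x @ W1, 0.0)                 # ReLU hidden layer
    mask = rng.random(h.shape) > drop_rate      # randomly drop neurons
    h = h * mask / (1.0 - drop_rate)            # inverted-dropout scaling
    return h @ W2

x = rng.normal(size=(1, 4))

# Monte Carlo dropout: repeat T stochastic passes over the same input.
T = 200
samples = np.stack([forward(x) for _ in range(T)])

mean_pred = samples.mean(axis=0)   # point prediction
epistemic = samples.var(axis=0)    # spread across passes = uncertainty estimate
```

A wide spread across the T passes flags a prediction the model is uncertain about, which is how the reviewed imaging studies used this measure to surface potential misclassifications.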
Uncertainty estimation methods have the potential to identify rare but important misclassifications made by deep learning models and compare modeling methods, which could build patient and clinician trust in deep learning applications in healthcare. Efficient maturation of this field will require standardized guidelines for reporting performance and uncertainty metrics. Author summary: Deep learning prediction models perform better than traditional prediction models for several healthcare applications. For deep learning to achieve its greatest impact on healthcare delivery, patients and providers must trust deep learning models and their outputs. This article describes the potential for deep learning to earn trust by conveying model certainty, the probability that a given model output is accurate. If a model could convey not only its prediction but also its level of certainty that the prediction is correct, patients and providers could make an informed decision to incorporate or ignore the prediction. The use of uncertainty estimation for deep learning entrustment is largely unexplored, and there is no consensus regarding optimal methods for quantifying uncertainty. Our purpose is to critically evaluate methods for quantifying uncertainty in deep learning for healthcare applications and propose a conceptual framework for specifying certainty of deep learning predictions. We systematically reviewed published scientific literature and summarized results from 30 studies, and found that uncertainty estimation methods have the potential to identify rare but important misclassifications made by deep learning models and compare modeling methods, which could build patient and clinician trust in deep learning applications in healthcare.
- Subjects
DEEP learning; ONLINE information services; MEDICAL information storage & retrieval systems; SYSTEMATIC reviews; UNCERTAINTY; MEDICAL care; LITERATURE reviews; MEDLINE
- Publication
PLoS Digital Health, 2022, Vol 1, Issue 8, p1
- ISSN
2767-3170
- Publication type
Article
- DOI
10.1371/journal.pdig.0000085