Deep learning (DL) has revolutionized the fields of computer vision and image processing. In medical imaging, DL-based algorithms have been shown to achieve high performance on tasks that previously required medical experts. However, DL-based solutions for disease detection have typically been proposed without methods to quantify and control the uncertainty of their decisions. A physician, in contrast, knows whether she is uncertain about a case and will consult more experienced colleagues if needed. Here we evaluate dropout-based Bayesian uncertainty measures for DL in diagnosing diabetic retinopathy (DR) from fundus images and show that they capture uncertainty better than straightforward alternatives. Furthermore, we show that uncertainty-informed decision referral can improve diagnostic performance. Experiments across different networks, tasks and datasets show robust generalization. We analyse causes of uncertainty by relating intuitions from 2D visualizations to the high-dimensional image space. While uncertainty is sensitive to clinically relevant cases, its sensitivity to unfamiliar data samples is task dependent, but can be made more robust. The opportunities and failure modes identified here will be put into a broader context by relating them to recent developments.
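For readers unfamiliar with the dropout-based uncertainty measures the abstract refers to, the core idea can be sketched in a few lines: keep dropout active at test time, run several stochastic forward passes, and use the spread of the predictions as an uncertainty signal that can drive decision referral. The following is a minimal, self-contained illustration with a toy logistic "network" standing in for a trained CNN; all names and parameters are hypothetical, not the talk's actual implementation.

```python
# Sketch of Monte Carlo dropout uncertainty and uncertainty-informed referral.
# A toy logistic model with dropout stands in for a trained fundus-image CNN.
import numpy as np

rng = np.random.default_rng(0)

def stochastic_forward(x, w, p_drop=0.5):
    """One forward pass with dropout kept active at test time."""
    mask = rng.random(w.shape) > p_drop          # Bernoulli dropout mask
    logits = x @ (w * mask) / (1.0 - p_drop)     # inverted-dropout scaling
    return 1.0 / (1.0 + np.exp(-logits))         # sigmoid "disease" probability

def mc_dropout_predict(x, w, T=100):
    """T stochastic passes -> predictive mean and uncertainty (std. dev.)."""
    samples = np.stack([stochastic_forward(x, w) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

# toy "image feature" vectors and weights (hypothetical stand-ins)
x = rng.normal(size=(5, 16))
w = rng.normal(size=(16,))
mean, sigma = mc_dropout_predict(x, w)

# uncertainty-informed referral: hand the most uncertain cases to an expert
refer = sigma > np.quantile(sigma, 0.8)
print(mean.round(2), sigma.round(2), refer)
```

Cases flagged by `refer` would be forwarded to an ophthalmologist, while confident predictions are handled automatically; this is the mechanism behind the improved diagnostic performance mentioned above.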
After finishing his PhD at the University of Tuebingen, Christian Leibig joined Merantix, Germany's leading AI healthcare startup, based in Berlin. His work on uncertainty in deep learning was published in Nature Scientific Reports. His talk will be based on a keynote he held at the Bayesian Deep Learning Workshop at NeurIPS 2018.
The event will take place on Thursday, 21 March, 2019 at 7pm at the DKFZ Communication Center (K1+K2), Im Neuenheimer Feld 280. Drinks and snacks will be provided, courtesy of the Division of Medical Image Computing at DKFZ. Kindly help us plan ahead by registering for the event on our meetup page.