Publication:
Learning Optimal Summaries of Clinical Time-series with Concept Bottleneck Models

Date

2022-05-23

The Harvard community has made this article openly available.

Citation

Wu, Carissa. 2022. Learning Optimal Summaries of Clinical Time-series with Concept Bottleneck Models. Bachelor's thesis, Harvard College.

Abstract

Despite machine learning models' state-of-the-art performance on numerous clinical prediction and intervention tasks, their complex black-box processes pose a great barrier to real-world deployment. Clinical experts must be able to understand the reasons behind a model's recommendation before taking action, as it is crucial to assess criteria beyond accuracy, such as trust, safety, fairness, and robustness. In this work, we improve the interpretability of clinical time-series prediction models, while maintaining prediction quality, by introducing one more stage into the prediction pipeline: we learn concepts that correspond to semantically meaningful clinical ideas, e.g. illness severity or kidney function. We also propose an optimization method that selects the most important features within each concept, learning sparse concept definitions that allow for organized inspection of the model. On a real-world task of predicting vasopressor onset in the ICU, our algorithm achieves predictive performance comparable to state-of-the-art models while learning concise groupings conducive to clinical inspection.
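The two-stage pipeline the abstract describes can be illustrated with a minimal sketch: raw features pass through an interpretable concept layer before prediction, and sparsity in the concept weights keeps each concept's definition concise. This is not the thesis's actual method; the dimensions, the soft-thresholding stand-in for an L1 penalty, and all names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 20 input features, 3 concepts
# (e.g. illness severity, kidney function), binary outcome.
n_features, n_concepts = 20, 3

# Stage 1: a linear bottleneck maps time-series features to concept scores.
W_concept = rng.normal(size=(n_features, n_concepts))

def soft_threshold(w, lam):
    """Zero out small weights; a stand-in for the sparsity an
    L1-penalized optimizer would induce during training."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

# Sparse concept definitions: each concept depends on few features,
# which is what permits organized clinical inspection.
W_sparse = soft_threshold(W_concept, 1.0)

# Stage 2: a simple predictor operating only on the concepts.
w_out = rng.normal(size=n_concepts)

def predict(x):
    concepts = x @ W_sparse          # interpretable intermediate values
    logit = concepts @ w_out
    return 1.0 / (1.0 + np.exp(-logit))  # probability of onset

x = rng.normal(size=n_features)      # one patient's feature vector
p = predict(x)
```

Because `W_sparse` has many exact zeros, a clinician could read off which features define each concept, while the final prediction still uses only the concept scores.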

Keywords

bottleneck models, concept learning, interpretability, interpretable machine learning, Computer science, Statistics

Terms of Use

This article is made available under the terms and conditions applicable to Other Posted Material (LAA), as set forth at Terms of Service
