Shickel, Benjamin and Silva, Brandon and Ozrazgat-Baslanti, Tezcan and Ren, Yuanfang and Khezeli, Kia and Guan, Ziyuan and Tighe, Patrick J. and Bihorac, Azra and Rashidi, Parisa (2022) Multi-dimensional patient acuity estimation with longitudinal EHR tokenization and flexible transformer networks. Frontiers in Digital Health, 4. ISSN 2673-253X
Abstract
Transformer model architectures have revolutionized the natural language processing (NLP) domain and continue to produce state-of-the-art results in text-based applications. Prior to the emergence of transformers, traditional NLP models such as recurrent and convolutional neural networks demonstrated promising utility for patient-level predictions and health forecasting from longitudinal datasets. However, to our knowledge only a few studies have explored transformers for predicting clinical outcomes from electronic health record (EHR) data, and in our estimation, none have adequately derived a health-specific tokenization scheme to fully capture the heterogeneity of EHR systems. In this study, we propose a dynamic method for tokenizing both discrete and continuous patient data, and present a transformer-based classifier utilizing a joint embedding space for integrating disparate temporal patient measurements. We demonstrate the feasibility of our clinical AI framework through multi-task ICU patient acuity estimation, where we simultaneously predict six mortality and readmission outcomes. Our longitudinal EHR tokenization and transformer modeling approaches produced more accurate predictions than baseline machine learning models, suggesting opportunities for future multimodal data integration and algorithmic support tools built on clinical transformer networks.
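The abstract describes a joint embedding space in which discrete EHR tokens and continuous measurements feed a single transformer with multi-task outcome heads. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration of that general idea, with all layer names, dimensions, and pooling choices assumed for demonstration.

```python
# Minimal sketch (not the paper's code): discrete and continuous EHR inputs
# mapped into one embedding space, encoded by a transformer, and scored by
# six outcome heads. All sizes and names are illustrative assumptions.
import torch
import torch.nn as nn

class AcuityTransformerSketch(nn.Module):
    def __init__(self, vocab_size=5000, d_model=128, n_heads=4, n_layers=2, n_outcomes=6):
        super().__init__()
        # Discrete tokens (e.g., diagnosis or medication codes) -> learned embeddings
        self.discrete_embed = nn.Embedding(vocab_size, d_model)
        # Continuous measurements (e.g., vitals, labs) -> projection into the same space
        self.continuous_proj = nn.Linear(1, d_model)
        encoder_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, n_layers)
        # One head per outcome (e.g., six mortality/readmission targets)
        self.heads = nn.Linear(d_model, n_outcomes)

    def forward(self, discrete_ids, continuous_vals):
        # discrete_ids: (batch, T_d) integer codes; continuous_vals: (batch, T_c) floats
        disc = self.discrete_embed(discrete_ids)                    # (batch, T_d, d_model)
        cont = self.continuous_proj(continuous_vals.unsqueeze(-1))  # (batch, T_c, d_model)
        tokens = torch.cat([disc, cont], dim=1)                     # joint token sequence
        encoded = self.encoder(tokens)
        pooled = encoded.mean(dim=1)                                # simple mean pooling
        return torch.sigmoid(self.heads(pooled))                    # outcome probabilities

# Example forward pass with random inputs
model = AcuityTransformerSketch()
probs = model(torch.randint(0, 5000, (2, 10)), torch.randn(2, 8))
print(probs.shape)  # torch.Size([2, 6])
```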
| Item Type: | Article |
|---|---|
| Subjects: | Institute Archives > Multidisciplinary |
| Depositing User: | Managing Editor |
| Date Deposited: | 13 Jan 2023 06:01 |
| Last Modified: | 04 Mar 2024 03:39 |
| URI: | http://eprint.subtopublish.com/id/eprint/1070 |