TOO-BERT: A Trajectory Order Objective BERT for self-supervised representation learning of temporal healthcare data

Abstract: The growing availability of Electronic Health Records (EHRs) presents an opportunity to enhance patient care by uncovering hidden health risks and supporting informed decisions through advanced deep learning methods. However, modeling sequential EHR data, referred to as patient trajectories, is complex because the relationships between diagnoses and treatments evolve over time: medical conditions and interventions alter the likelihood of future health outcomes. While BERT-inspired models have shown promise in modeling EHR sequences by pretraining on the masked language modeling (MLM) objective, they struggle to fully capture the intricate temporal dynamics of disease progression and medical interventions. In this study, we introduce TOO-BERT, a novel adaptation that enhances MLM-pretrained transformers by explicitly incorporating temporal information from patient trajectories. TOO-BERT encourages the model to learn complex causal relationships between diagnoses and treatments through a new self-supervised learning task, the Temporal Order Objective (TOO), realized by two proposed methods: Conditional Code Swapping (CCS) and Conditional Visit Swapping (CVS). We evaluate TOO-BERT on two datasets, MIMIC-IV hospitalization records and the Malmö Diet cohort, comprising approximately 10 million and 8 million medical codes, respectively. TOO-BERT outperforms standard MLM-pretrained transformers in predicting Heart Failure (HF), Alzheimer's Disease (AD), and Prolonged Length of Stay (PLS), and notably excels in HF prediction even with limited fine-tuning data. Our results underscore the effectiveness of integrating temporal ordering objectives into MLM-pretrained models, enabling deeper insights into the complex relationships in EHR data. Attention analysis further reveals TOO-BERT's ability to capture and represent sophisticated structural patterns within patient trajectories.
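The abstract describes the Temporal Order Objective only at a high level. As a rough illustration, a visit-level order-perturbation task of this kind can be sketched as below, assuming a trajectory is a chronologically ordered list of visits and each visit is a list of medical codes. The function name, swap probability, and ICD-10-style codes are illustrative assumptions, not the paper's actual implementation of CVS.

    import random
    from typing import List, Tuple

    Visit = List[str]          # one visit = a list of medical codes
    Trajectory = List[Visit]   # a trajectory = chronologically ordered visits

    def visit_order_swap(trajectory: Trajectory,
                         swap_prob: float = 0.5) -> Tuple[Trajectory, int]:
        """With probability swap_prob, exchange two adjacent visits and label
        the result 1 ("temporal order violated"); otherwise return the
        trajectory unchanged with label 0. (Hypothetical sketch, not the
        paper's CVS.)"""
        if len(trajectory) < 2 or random.random() >= swap_prob:
            return trajectory, 0
        i = random.randrange(len(trajectory) - 1)  # pick an adjacent pair
        perturbed = list(trajectory)               # shallow copy of the visit list
        perturbed[i], perturbed[i + 1] = perturbed[i + 1], perturbed[i]
        return perturbed, 1

    # Toy example: three visits with ICD-10-style codes (purely illustrative).
    patient = [["I10", "E11"], ["I50.9"], ["N18.3"]]
    perturbed, label = visit_order_swap(patient, swap_prob=1.0)
    print(perturbed, label)  # [['I50.9'], ['I10', 'E11'], ['N18.3']] 1

In joint pretraining, the binary order label would typically be predicted from a pooled sequence representation (for example, the [CLS] token) and its loss added to the MLM loss.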

Related Results

Perceptions of Telemedicine and Rural Healthcare Access in a Developing Country: A Case Study of Bayelsa State, Nigeria
Abstract: Introduction: Telemedicine is the remote delivery of healthcare services using information and communication technologies and has gained global recognition as a solution to...
Role of the Frontal Lobes in the Propagation of Mesial Temporal Lobe Seizures
Summary: The depth ictal electroencephalographic (EEG) propagation sequence accompanying 78 complex partial seizures of mesial temporal origin was reviewed in 24 patients (15 from...
Control-Oriented Real-Time Trajectory Planning for Heterogeneous UAV Formations
To address the trajectory planning problem for heterogeneous UAV formations in complex environments, a trajectory prediction model combining Convolutional Neural Networks (CNNs) and...
A Trajectory Similarity Computation Method based on GAT-based Transformer and CNN model
Trajectory similarity computation is very important for trajectory data mining. It is applied in many trajectory mining tasks, including trajectory clustering, trajectory classif...
Self-Supervised Contrastive Representation Learning in Computer Vision
Although its origins date back a few decades, contrastive learning has recently gained popularity due to its achievements in self-supervised learning, especially in computer vision...
Enhancing Non-Formal Learning Certificate Classification with Text Augmentation: A Comparison of Character, Token, and Semantic Approaches
Aim/Purpose: The purpose of this paper is to address the gap in the recognition of prior learning (RPL) by automating the classification of non-formal learning certificates using d...
A Pre-Training Technique to Localize Medical BERT and to Enhance Biomedical BERT
Abstract: Background: Pre-training large-scale neural language models on raw texts has been shown to make a significant contribution to a strategy for transfer learning in n...
