ACM Transactions on the Web

Towards Effective Time-Aware Language Representation: Exploring Enhanced Temporal Understanding in Language Models

Publication type: Journal Article
Publication date: 2025-03-13
Scimago: Q2
WoS: Q2
SJR: 0.787
CiteScore: 4.9
Impact factor: 2.6
ISSN: 1559-1131, 1559-114X
Abstract

In the evolving field of Natural Language Processing (NLP), understanding the temporal context of text is increasingly critical for applications requiring advanced temporal reasoning. Traditional pre-trained language models like BERT, which rely on synchronic document collections such as BookCorpus and Wikipedia, often fall short in effectively capturing and leveraging temporal information. To address this limitation, we introduce BiTimeBERT 2.0, a novel time-aware language model pre-trained on a temporal news article collection. BiTimeBERT 2.0 incorporates temporal information through three innovative pre-training objectives: Extended Time-Aware Masked Language Modeling (ETAMLM), Document Dating (DD), and Time-Sensitive Entity Replacement (TSER). Each objective is specifically designed to target a distinct dimension of temporal information: ETAMLM enhances the model’s understanding of temporal contexts and relations, DD integrates document timestamps as explicit chronological markers, and TSER focuses on the temporal dynamics of “Person” entities. Moreover, our refined corpus preprocessing strategy reduces training time by nearly 53%, making BiTimeBERT 2.0 significantly more efficient while maintaining high performance. Experimental results show that BiTimeBERT 2.0 achieves substantial improvements across a broad range of time-related tasks and excels on datasets spanning extensive temporal ranges. These findings underscore BiTimeBERT 2.0’s potential as a powerful tool for advancing temporal reasoning in NLP.
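The abstract does not give implementation details, but the three objectives can be pictured as a multi-task pre-training loss computed on top of a shared encoder. The sketch below is a hypothetical PyTorch illustration only: the module name, head dimensions, timestamp bucketing, and equal loss weighting are assumptions for clarity, not the authors' released code.

```python
import torch
import torch.nn as nn

class TimeAwarePretrainingHeads(nn.Module):
    """Hypothetical multi-objective heads in the spirit of BiTimeBERT 2.0:
    ETAMLM (masked prediction of temporal tokens), DD (document dating as
    timestamp-bucket classification), and TSER (detecting replaced "Person"
    entities). Loss weights and head shapes are illustrative assumptions."""

    def __init__(self, hidden_size=768, vocab_size=30522, num_time_buckets=50):
        super().__init__()
        self.mlm_head = nn.Linear(hidden_size, vocab_size)          # ETAMLM: recover masked (temporal) tokens
        self.dating_head = nn.Linear(hidden_size, num_time_buckets) # DD: classify the document's timestamp bucket
        self.tser_head = nn.Linear(hidden_size, 2)                  # TSER: was the person entity replaced?
        self.ce = nn.CrossEntropyLoss(ignore_index=-100)            # -100 marks unmasked positions to ignore

    def forward(self, token_states, cls_state, mlm_labels, date_labels, tser_labels):
        # token_states: (batch, seq_len, hidden) from the encoder; cls_state: (batch, hidden) pooled [CLS]
        loss_etamlm = self.ce(self.mlm_head(token_states).transpose(1, 2), mlm_labels)
        loss_dd = self.ce(self.dating_head(cls_state), date_labels)
        loss_tser = self.ce(self.tser_head(cls_state), tser_labels)
        return loss_etamlm + loss_dd + loss_tser  # equal weighting assumed for illustration

# Minimal usage with random tensors standing in for encoder outputs.
if __name__ == "__main__":
    heads = TimeAwarePretrainingHeads()
    B, L, H = 2, 16, 768
    token_states, cls_state = torch.randn(B, L, H), torch.randn(B, H)
    mlm_labels = torch.full((B, L), -100)
    mlm_labels[:, 3] = 42                          # pretend one temporal token per sequence is masked
    date_labels = torch.randint(0, 50, (B,))       # e.g. publication-year bucket
    tser_labels = torch.randint(0, 2, (B,))        # 1 if the "Person" entity was swapped
    print(heads(token_states, cls_state, mlm_labels, date_labels, tser_labels))
```

In practice these heads would sit on a BERT-style encoder pre-trained over the temporal news collection; summing the three losses is simply one plausible way to combine the objectives described above.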
