A survey of text classification based on pre-trained language model
Yujia Wu¹
¹ School of Information Science and Technology, Sanda University, Shanghai, 201209, China
Publication type: Journal Article
Publication date: 2025-02-01
Scimago quartile: Q1
WoS quartile: Q1
SJR: 1.471
CiteScore: 13.6
Impact factor: 6.5
ISSN: 0925-2312, 1872-8286
Abstract
Text classification is used widely across the domain of Natural Language Processing (NLP). In recent years, pre-trained language models (PLMs) based on the Transformer architecture have made significant strides across various artificial intelligence tasks, and text classification employing PLMs has since emerged as a prominent research focus within NLP. While several review papers examine text classification and Transformer models, there is a notable lack of comprehensive surveys specifically addressing text classification grounded in PLMs. To address this gap, the present survey provides an extensive overview of text classification techniques that leverage PLMs. The primary components of this review include: (1) an introduction, (2) a systematic examination of PLMs, (3) deep learning-based text classification methodologies, (4) text classification approaches utilizing pre-trained models, (5) commonly used datasets and evaluation metrics in text classification, (6) prevalent challenges and emerging trends in the field, and (7) a conclusion.
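To illustrate the kind of PLM-based text classification the survey covers, the following is a minimal sketch using the Hugging Face transformers library. The checkpoint name, example texts, and labels below are illustrative assumptions for demonstration purposes, not methods prescribed by the survey.

# Minimal sketch of text classification with a pre-trained language model.
# Assumes `transformers` and `torch` are installed; the checkpoint below is
# an assumed, publicly available sentiment model chosen for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "distilbert-base-uncased-finetuned-sst-2-english"  # assumed example checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

texts = [
    "This survey is thorough and well organized.",
    "The baseline results were disappointing.",
]
# Tokenize the batch and run a forward pass without gradient tracking.
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch_size, num_labels)

# Map the highest-scoring logit to its human-readable class label.
pred_ids = logits.argmax(dim=-1)
labels = [model.config.id2label[i.item()] for i in pred_ids]
print(labels)  # e.g., ['POSITIVE', 'NEGATIVE']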
Top-30 journals citing this article
- Information (Switzerland): 1 publication, 12.5%
- IEEE Transactions on Geoscience and Remote Sensing: 1 publication, 12.5%
- Journal of Marine Science and Engineering: 1 publication, 12.5%
- Applied Intelligence: 1 publication, 12.5%
- Natural Resources Research: 1 publication, 12.5%
- Mathematics: 1 publication, 12.5%
- Applied Soft Computing Journal: 1 publication, 12.5%
Publishers of citing publications
- MDPI: 3 publications, 37.5%
- Institute of Electrical and Electronics Engineers (IEEE): 2 publications, 25%
- Springer Nature: 2 publications, 25%
- Elsevier: 1 publication, 12.5%
- We do not take into account publications without a DOI.
- Statistics recalculated weekly.
Metrics
Total citations: 8
Citations from 2024: 8 (100%)
Cite this (RIS)
TY - JOUR
DO - 10.1016/j.neucom.2024.128921
UR - https://linkinghub.elsevier.com/retrieve/pii/S0925231224016928
TI - A survey of text classification based on pre-trained language model
T2 - Neurocomputing
AU - Wu, Yujia
PY - 2025
DA - 2025/02/01
PB - Elsevier
SP - 128921
VL - 616
SN - 0925-2312
SN - 1872-8286
ER -
Cite this (BibTeX)
@article{2025_Wu,
author = {Yujia Wu},
title = {A survey of text classification based on pre-trained language model},
journal = {Neurocomputing},
year = {2025},
volume = {616},
publisher = {Elsevier},
month = {feb},
url = {https://linkinghub.elsevier.com/retrieve/pii/S0925231224016928},
pages = {128921},
doi = {10.1016/j.neucom.2024.128921}
}