MixPro: Simple yet Effective Data Augmentation for Prompt-based Learning
Bohan Li¹, Longxu Dou², Yutai Hou¹, Yunlong Feng¹, Honglin Mu¹, Enbo Wang¹, Qingfu Zhu¹, Qinghua Sun³,⁴, Wanxiang Che¹
² Individual Researcher, Beijing, China
³ Jilin Kexun Information Technology Co., Ltd., Beijing, China
⁴ iFLYTEK Research, Beijing, China
Publication type: Journal Article
Publication date: 2025-02-27
Quartiles: Scimago Q2, WoS Q3
SJR: 0.694
CiteScore: 6.6
Impact factor: 2.7
ISSN: 1868-8071, 1868-808X
Abstract
Prompt-based learning has shown considerable promise in reformulating downstream tasks as cloze problems by combining the original input with a predetermined template. The approach is especially effective in few-shot learning scenarios, where the model is trained on scarce data. Despite these successes, the limited templates and text available in few-shot prompt-based learning leave significant room for performance improvement. Moreover, existing methods sometimes resort to model ensembles, which, while effective, can hamper efficiency due to increased computational demands [1]. To address these issues, we introduce MixPro, an augmentation method designed to augment both the vanilla input text and the templates, implemented through token-level, sentence-level, and template-level Mixup strategies. Experiments on five few-shot datasets show that MixPro achieves an average performance improvement of 5.08% over the backbone model before augmentation and outperforms other augmentation baselines, demonstrating its effectiveness.
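The Mixup strategies mentioned in the abstract interpolate pairs of training examples. As a rough illustration only (the function name, the Beta(α, α) mixing-coefficient sampling, and the embedding shapes are generic Mixup conventions, not details taken from the paper), a token-level Mixup over embedding sequences might be sketched as:

```python
import numpy as np

def mixup(emb_a, emb_b, label_a, label_b, alpha=0.5, rng=None):
    """Generic Mixup sketch: linearly interpolate two embedding
    sequences and their label distributions with lam ~ Beta(alpha, alpha)."""
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)  # mixing coefficient in [0, 1]
    mixed_emb = lam * emb_a + (1.0 - lam) * emb_b
    mixed_label = lam * label_a + (1.0 - lam) * label_b
    return mixed_emb, mixed_label, lam
```

Sentence- and template-level variants would apply the same interpolation at coarser granularity (whole-sentence representations or template embeddings); the paper itself should be consulted for the exact formulation.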
Top-30
Journals: Pattern Recognition (1 publication, 100%)
Publishers: Elsevier (1 publication, 100%)
Metrics
Total citations: 1
Citations from 2024: 0
Cite this
GOST
Li B. et al. MixPro: Simple yet Effective Data Augmentation for Prompt-based Learning // International Journal of Machine Learning and Cybernetics. 2025.
GOST (all authors)
Li B., Dou L., Hou Y., Feng Y., Mu H., WANG E., Zhu Q., Sun Q., Che W. MixPro: Simple yet Effective Data Augmentation for Prompt-based Learning // International Journal of Machine Learning and Cybernetics. 2025.
RIS
TY - JOUR
DO - 10.1007/s13042-025-02548-6
UR - https://link.springer.com/10.1007/s13042-025-02548-6
TI - MixPro: Simple yet Effective Data Augmentation for Prompt-based Learning
T2 - International Journal of Machine Learning and Cybernetics
AU - Li, Bohan
AU - Dou, Longxu
AU - Hou, Yutai
AU - Feng, Yunlong
AU - Mu, Honglin
AU - WANG, ENBO
AU - Zhu, Qingfu
AU - Sun, Qinghua
AU - Che, Wanxiang
PY - 2025
DA - 2025/02/27
PB - Springer Nature
SN - 1868-8071
SN - 1868-808X
ER -
BibTeX
@article{2025_Li,
author = {Bohan Li and Longxu Dou and Yutai Hou and Yunlong Feng and Honglin Mu and ENBO WANG and Qingfu Zhu and Qinghua Sun and Wanxiang Che},
title = {MixPro: Simple yet Effective Data Augmentation for Prompt-based Learning},
journal = {International Journal of Machine Learning and Cybernetics},
year = {2025},
publisher = {Springer Nature},
month = {feb},
url = {https://link.springer.com/10.1007/s13042-025-02548-6},
doi = {10.1007/s13042-025-02548-6}
}