Global memory transformer for processing long documents
Pages: 343-352
Publication type: Book Chapter
Publication date: 2022-10-19
Scimago quartile: Q4
SJR: 0.189
CiteScore: 2.3
Impact factor: —
ISSN: 1860-949X, 1860-9503
Abstract
Transformer variants dominate the state of the art in natural language processing tasks such as translation, reading comprehension, and summarization. Our paper focuses on adding general memory slots to the inputs and studying the effect of these slots; it is a follow-up study of the role of the general memory slots that were added to the input of the model proposed in previous work [1]. We address two main tasks: 1) a pretraining task using masked language modeling, and 2) a fine-tuning task using HotpotQA. The study aims to verify the ability of the proposed model to handle multiple chunks as if they were one chunk, compared with the base model. As a baseline we used the T5 transformer. We studied the role of the memory slots augmenting each input chunk and examined the model's performance without a selector. We found that adding memory to input chunks helped the proposed model overcome the baseline on the masked language modeling task under specific training parameters. An ablation study shows that compressed input chunks can be used, at the cost of some degradation in performance.
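As a rough illustration of the mechanism the abstract describes, the sketch below prepends a set of learnable global memory slots to each input chunk before transformer encoding. It is a minimal sketch in PyTorch, assuming hypothetical names (MemoryAugmentedChunkEncoder, num_memory_slots) and generic hyperparameters; the paper builds on T5, whereas this uses a plain nn.TransformerEncoder, so it is not the authors' implementation.

```python
import torch
import torch.nn as nn

class MemoryAugmentedChunkEncoder(nn.Module):
    """Illustrative sketch (not the paper's code): prepend learnable global
    memory slots to each input chunk before encoding. All names and
    hyperparameters here are assumptions."""

    def __init__(self, vocab_size=32128, d_model=512, num_memory_slots=16,
                 num_layers=6, num_heads=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # One set of learnable memory-slot embeddings, shared across all chunks.
        self.memory_slots = nn.Parameter(torch.randn(num_memory_slots, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model, num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, chunk_ids):
        # chunk_ids: (batch, chunk_len) token ids for one chunk of a long document.
        tokens = self.embed(chunk_ids)                               # (B, L, D)
        mem = self.memory_slots.unsqueeze(0).expand(chunk_ids.size(0), -1, -1)
        x = torch.cat([mem, tokens], dim=1)                          # (B, M+L, D)
        h = self.encoder(x)
        # Split the output back into updated memory states and token states.
        mem_out, tok_out = h[:, :mem.size(1)], h[:, mem.size(1):]
        return mem_out, tok_out

# Usage sketch: split a long document into chunks, encode each with the same
# shared memory slots; the updated memory states can then be pooled or
# exchanged across chunks.
# enc = MemoryAugmentedChunkEncoder()
# mem, tok = enc(torch.randint(0, 32128, (2, 128)))
```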
Metrics
Total citations: 0
Cite this
GOST
Al Adel A. Global memory transformer for processing long documents // Studies in Computational Intelligence. 2022. pp. 343-352.
RIS
TY - CHAP
DO - 10.1007/978-3-031-19032-2_36
UR - https://doi.org/10.1007/978-3-031-19032-2_36
TI - Global memory transformer for processing long documents
T2 - Studies in Computational Intelligence
AU - Al Adel, Arij
PY - 2022
DA - 2022/10/19
PB - Springer Nature
SP - 343-352
SN - 1860-949X
SN - 1860-9503
ER -
BibTeX
@incollection{AlAdel2022,
  author = {Arij Al Adel},
  title = {Global memory transformer for processing long documents},
  booktitle = {Studies in Computational Intelligence},
  publisher = {Springer Nature},
  year = {2022},
  month = {oct},
  pages = {343--352},
  doi = {10.1007/978-3-031-19032-2_36}
}