Cognitive Systems Research, volume 75, pages 16-24

Complexity of symbolic representation in working memory of Transformer correlates with the complexity of a task

Publication type: Journal Article
Publication date: 2022-09-01
Quartile (SCImago): Q1
Quartile (WOS): Q1
Impact factor: 3.9
ISSN: 1389-0417
Subject areas: Artificial Intelligence; Software; Experimental and Cognitive Psychology; Cognitive Neuroscience
Abstract
Even though Transformers are extensively used for Natural Language Processing tasks, especially for machine translation, they lack an explicit memory to store key concepts of the processed texts. This paper explores the properties of the content of a symbolic working memory added to the Transformer model decoder. Such working memory enhances the quality of model predictions in the machine translation task and serves as a neural-symbolic representation of information that is important for the model to make correct translations. The study of the memory content revealed that keywords of the translated text are stored in the working memory, pointing to the relevance of the memory content to the processed text. Moreover, the diversity of tokens and parts of speech stored in memory correlates with the complexity of the corpora for the machine translation task.

Highlights
• Working memory in the Transformer helps to improve machine translation quality.
• Keywords of the translated text are stored in the working memory.
• Working memory diversity correlates with corpus complexity.
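Illustration only: the abstract describes attaching a working memory to the Transformer decoder but gives no implementation details here. The sketch below shows one possible way to append extra memory slots to a standard PyTorch decoder; all names (MemoryAugmentedDecoder, n_mem) are hypothetical, the slots are continuous learnable vectors rather than the symbolic token memory studied in the paper, and causal masking is omitted for brevity.

# Minimal sketch (not the authors' code): a decoder whose input is extended
# with extra learnable "memory" slots, so self-attention can read from and
# write to these positions alongside the target sequence.
import torch
import torch.nn as nn

class MemoryAugmentedDecoder(nn.Module):
    def __init__(self, d_model=512, n_heads=8, n_layers=6, n_mem=10):
        super().__init__()
        # Hypothetical memory slots, initialized as small random vectors.
        self.mem = nn.Parameter(torch.randn(n_mem, d_model) * 0.02)
        layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, n_layers)

    def forward(self, tgt_emb, enc_out):
        # Append memory slots to the target embeddings; both attend to the
        # encoder output and to each other through the usual decoder layers.
        bsz, tgt_len, _ = tgt_emb.shape
        mem = self.mem.unsqueeze(0).expand(bsz, -1, -1)
        x = torch.cat([tgt_emb, mem], dim=1)
        out = self.decoder(x, enc_out)
        # Split hidden states for target positions from the final memory state.
        return out[:, :tgt_len], out[:, tgt_len:]

# Usage: batch of 2, target length 7, source length 9, d_model 512.
dec = MemoryAugmentedDecoder()
tgt_states, mem_states = dec(torch.randn(2, 7, 512), torch.randn(2, 9, 512))
print(tgt_states.shape, mem_states.shape)  # (2, 7, 512) and (2, 10, 512)

In the setting described by the abstract, the memory holds tokens rather than continuous vectors, which is what makes its content inspectable as keywords and parts of speech of the translated text.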

Citations by journals
Computers in Biology and Medicine: 1 publication (100%)

Citations by publishers
Elsevier: 1 publication (100%)
  • Publications without a DOI are not taken into account.
  • Statistics are recalculated only for publications connected to researchers, organizations, and labs registered on the platform.
  • Statistics are recalculated weekly.
Cite this

GOST
Sagirova A., Burtsev M. Complexity of symbolic representation in working memory of Transformer correlates with the complexity of a task // Cognitive Systems Research. 2022. Vol. 75. pp. 16-24.
RIS
TY - JOUR
DO - 10.1016/j.cogsys.2022.05.002
UR - https://doi.org/10.1016%2Fj.cogsys.2022.05.002
TI - Complexity of symbolic representation in working memory of Transformer correlates with the complexity of a task
T2 - Cognitive Systems Research
AU - Sagirova, Alsu
AU - Burtsev, Mikhail
PY - 2022
DA - 2022/09/01
PB - Elsevier
SP - 16
EP - 24
VL - 75
SN - 1389-0417
ER -
BibTeX
@article{2022_Sagirova,
author = {Alsu Sagirova and Mikhail Burtsev},
title = {Complexity of symbolic representation in working memory of Transformer correlates with the complexity of a task},
journal = {Cognitive Systems Research},
year = {2022},
volume = {75},
publisher = {Elsevier},
month = {sep},
url = {https://doi.org/10.1016%2Fj.cogsys.2022.05.002},
pages = {16--24},
doi = {10.1016/j.cogsys.2022.05.002}
}