Sanjib Narzary¹, Maharaj Brahma¹, Bobita Singha¹, Rangjali Brahma¹, Bonali Dibragede¹, Sunita Barman¹, Sukumar Nandi², Bidisha Som²
¹ Department of Computer Science and Engineering, Central Institute of Technology Kokrajhar, Kokrajhar, India
Publication type: Proceedings Article
Publication date: 2019-03-01
Abstract
Bodo is a relatively low-resource language. Other than textbooks, novels, and some print newspaper publications, very few resources appear to be available in the public domain. As technology becomes affordable, the number of active Bodo internet users is growing, which calls for technology that can bring information to them in their own language. Machine translation appears to be a promising solution for that purpose. In this work we build an English-Bodo Neural Machine Translation system by adopting two-layered bidirectional Long Short-Term Memory (LSTM) cells that can capture long-term dependencies. As very little work has been done on English-Bodo NMT, we first build a baseline model, which produced a BLEU score of 11.8. We then gradually improve on the baseline model by introducing several attention mechanisms. We achieved a BLEU score of 16.71 using the approach presented by Bahdanau et al. Furthermore, we obtained a better BLEU score of 17.9 when we introduced beam search with a beam width of 5. We found that the model performs very well despite the limited dataset available.
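The abstract describes a sequence-to-sequence model with a two-layer bidirectional LSTM encoder, Bahdanau (additive) attention, and beam search decoding with a beam width of 5. The sketch below is only an illustrative PyTorch rendering of the encoder and attention components, not the authors' implementation; all layer sizes (emb_dim, hid_dim, attn_dim) and vocabulary sizes are assumed values for demonstration.

```python
# Illustrative sketch only: a 2-layer bidirectional LSTM encoder with Bahdanau
# (additive) attention, mirroring the architecture named in the abstract.
# Hyperparameters are hypothetical, not taken from the paper.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Two stacked bidirectional LSTM layers read the source (English)
        # sentence in both directions to capture long-term dependencies.
        self.lstm = nn.LSTM(emb_dim, hid_dim, num_layers=2,
                            bidirectional=True, batch_first=True)

    def forward(self, src):                          # src: (batch, src_len)
        outputs, _ = self.lstm(self.embedding(src))
        return outputs                               # (batch, src_len, 2 * hid_dim)

class BahdanauAttention(nn.Module):
    """Additive attention: score(s, h) = v^T tanh(W_dec s + W_enc h)."""
    def __init__(self, enc_dim, dec_dim, attn_dim=256):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, dec_dim); enc_outputs: (batch, src_len, enc_dim)
        scores = self.v(torch.tanh(
            self.W_enc(enc_outputs) + self.W_dec(dec_state).unsqueeze(1)))
        weights = torch.softmax(scores.squeeze(-1), dim=-1)     # (batch, src_len)
        # Context vector: attention-weighted sum of encoder states.
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

# Example usage with hypothetical shapes: 4 source sentences of 10 tokens each.
# enc = Encoder(vocab_size=8000)
# out = enc(torch.randint(0, 8000, (4, 10)))          # (4, 10, 1024)
# attn = BahdanauAttention(enc_dim=1024, dec_dim=512)
# ctx, w = attn(torch.zeros(4, 512), out)             # ctx: (4, 1024)
```

At decoding time, the context vector would be fed to the decoder LSTM at each step, and a beam search over the decoder's output distribution (beam width 5, as reported) would replace greedy decoding.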
Top-30 Journals
- Multimedia Tools and Applications: 2 publications, 40%
- Natural Language Processing: 1 publication, 20%

Publishers
- Springer Nature: 2 publications, 40%
- Institute of Electrical and Electronics Engineers (IEEE): 2 publications, 40%
- Cambridge University Press: 1 publication, 20%
- We do not take into account publications without a DOI.
- Statistics recalculated weekly.
Metrics
Total citations: 5
Citations from 2024: 2 (40%)