Laboratory of Neural Systems and Deep Learning
Publications: 56
Citations: 466
h-index: 9
The Laboratory of Neural Systems and Deep Learning at MIPT conducts research on deep neural network architectures for natural language text (NLP & Conversational AI), as well as on deep learning theory and neural architecture search.
- Reinforcement learning
- Natural Language Processing (NLP)
- Machine learning
Mikhail Burtsev
Head of Laboratory
Research directions
Knowledge Graph-Based Dialogue Generation
Responses grounded in facts from a knowledge base such as Wikidata can be more meaningful than purely chit-chat generated responses. The system consists of three components: extraction of Wikidata triplets for the entities mentioned in the user's utterance; triplet ranking (selecting the triplet most appropriate for generating the response); and generation of the response utterance.
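The following is a minimal sketch of the three-stage pipeline described above. The function names, the toy in-memory knowledge base, and the word-overlap ranking heuristic are illustrative assumptions standing in for the lab's actual Wikidata extraction, learned ranker, and neural generator.

```python
from dataclasses import dataclass

@dataclass
class Triplet:
    subject: str
    relation: str
    obj: str

# Stage 1: extract candidate triplets for entities found in the user's
# utterance. A toy in-memory list stands in for Wikidata here.
def extract_triplets(utterance: str, kb: list[Triplet]) -> list[Triplet]:
    return [t for t in kb if t.subject.lower() in utterance.lower()]

# Stage 2: rank candidates and choose the most appropriate triplet.
# A naive word-overlap score stands in for a learned ranking model.
def rank_triplets(utterance: str, triplets: list[Triplet]) -> Triplet:
    words = set(utterance.lower().split())
    def score(t: Triplet) -> int:
        return len(words & set(f"{t.subject} {t.relation} {t.obj}".lower().split()))
    return max(triplets, key=score)

# Stage 3: generate the response utterance conditioned on the chosen
# triplet. A simple template stands in for a neural generator.
def generate_response(utterance: str, triplet: Triplet) -> str:
    return f"{triplet.subject} {triplet.relation} {triplet.obj}."

kb = [
    Triplet("Moscow", "is the capital of", "Russia"),
    Triplet("Moscow", "has a population of about", "13 million"),
]
user = "Tell me something about Moscow"
candidates = extract_triplets(user, kb)
best = rank_triplets(user, candidates)
print(generate_response(user, best))
```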
New Transformer Architectures
We are working on memory-augmented transformer-based language models, integration of knowledge graphs into language models, and knowledge distillation. We are also experimenting with modifications to the transformer architecture: multiple streams, bottlenecks, and sentence-level representations.
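Below is a minimal PyTorch sketch of one of the ideas mentioned above: augmenting a transformer with learnable memory tokens that are prepended to the input sequence so self-attention can read from and write to them. The dimensions, module choices, and class name are illustrative assumptions, not the lab's actual implementation.

```python
import torch
import torch.nn as nn

class MemoryAugmentedEncoder(nn.Module):
    def __init__(self, d_model: int = 256, n_heads: int = 4,
                 n_layers: int = 2, n_mem_tokens: int = 8):
        super().__init__()
        # Learnable memory slots shared across all inputs.
        self.memory = nn.Parameter(torch.randn(n_mem_tokens, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.n_mem = n_mem_tokens

    def forward(self, token_embeddings: torch.Tensor) -> torch.Tensor:
        # token_embeddings: (batch, seq_len, d_model)
        batch = token_embeddings.size(0)
        mem = self.memory.unsqueeze(0).expand(batch, -1, -1)
        # Prepend memory tokens, run the standard encoder, then drop the
        # memory positions to return ordinary token representations.
        hidden = self.encoder(torch.cat([mem, token_embeddings], dim=1))
        return hidden[:, self.n_mem:, :]

x = torch.randn(2, 10, 256)   # a toy batch of embedded tokens
out = MemoryAugmentedEncoder()(x)
print(out.shape)              # torch.Size([2, 10, 256])
```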
Publications and patents
Lab address
Dolgoprudny, Institutskiy Pereulok, 9