NLP #6: The next generation of language models.

Mikhail Burtsev gives a talk on the problems of transformer-based neural network architectures (above all, BERT and its variants) as applied to the task of language modeling, and offers research directions for overcoming these problems. Speaker: Mikhail Burtsev is head of the DeepPavlov project and the Neural Networks and Deep Learning Lab at MIPT.