DL2022: Transformer (Part 2)

The “Deep Learning (Глубокое обучение)” course; course page: ; course author: Александр Дьяконов ().

In this lecture:
- BERT = Bidirectional Encoder Representations from Transformers
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
- SpanBERT
- ALBERT = A Lite BERT
- T5: Text-to-Text Transfer Transformer
- ELECTRA = Efficiently Learning an Encoder that Classifies Token Replacements Accurately