The Illustrated Retrieval Transformer

The latest batch of language models can be much smaller than GPT-3 yet achieve comparable performance by being able to query a database or search the web for information. This is a key indication that building larger and larger models is not the only way to improve performance. This video provides a gentle intro to RETRO, DeepMind's retrieval-augmented Transformer (a minimal code sketch of the retrieval step appears at the end of this description). There are more details in the associated blog post:

Corrections:
1:15 - *knowledge-intensive tasks

---

Contents:
Introduction (0:00)
Retrieval-enhanced language models (0:40)
The Illustrated Retrieval Transformer (1:55)
World knowledge vs. language knowledge (2:57)
The database and how to query it and retrieve information (4:45)
Feeding the prompt and retrieved information into the model (5:39)
The architecture of the model (6:00)

---

Paper:
Authors: Sebastian Borgeaud, Arthur Mensch, Jordan Hoffmann, Trevor Cai, Eliza Rutherford, Katie Millican, George van den Driessche, Jean-Baptiste Lespiau, Bogdan Damoc, Aidan Clark, Diego de Las Casas, Aurelia Guy, Jacob Menick, Roman Ring, Tom Hennigan, Saffron Huang, Loren Maggiore, Chris Jones, Albin Cassirer, Andy Brock, Michela Paganini, Geoffrey Irving, Oriol Vinyals, Simon Osindero, Karen Simonyan, Jack W. Rae, Erich Elsen, Laurent Sifre

---

Twitter:
Blog:
Mailing List:

---

More videos by Jay:
Experience Grounds Language: Improving language models beyond the world of text
Language Processing with BERT: The 3 Minute Intro (Deep learning for NLP)
Explainable AI Cheat Sheet - Five Key Categories
The Narrated Transformer Language Model
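As a rough illustration of the retrieval step the video walks through, here is a minimal Python sketch. It is not code from the video or the paper: the embed() function is a toy stand-in for the frozen BERT embeddings RETRO actually uses, the three-entry database stands in for RETRO's trillions of tokens of 64-token chunks, and concatenating retrieved text into the prompt is a simpler, common approximation of RETRO's chunked cross-attention.

import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    # Toy embedding: hashed bag of character trigrams, L2-normalized.
    # RETRO embeds chunks with a frozen BERT; this stand-in only keeps
    # the example self-contained and runnable.
    v = np.zeros(dim)
    for i in range(len(text) - 2):
        v[hash(text[i:i + 3]) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

# 1. Build the database: split source text into chunks and embed each one.
database = [
    "The Eiffel Tower is located in Paris, France.",
    "RETRO retrieves neighbouring chunks from a large text database.",
    "Water boils at 100 degrees Celsius at sea level.",
]
db_vectors = np.stack([embed(chunk) for chunk in database])

# 2. Query: embed the prompt and take its nearest neighbours by cosine
#    similarity (RETRO uses an approximate nearest-neighbour index to
#    make this fast at trillion-token scale).
prompt = "Where is the Eiffel Tower?"
scores = db_vectors @ embed(prompt)
top_k = np.argsort(scores)[::-1][:2]
retrieved = [database[i] for i in top_k]

# 3. Feed the prompt plus retrieved chunks to the language model. RETRO
#    attends to retrieved chunks inside the transformer via chunked
#    cross-attention; prepending them to the prompt is the simplified
#    version shown here.
augmented_prompt = "\n".join(retrieved) + "\n" + prompt
print(augmented_prompt)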