Attention and Transformers Part 1/3

Event page: Hello dear friends 🌎 🌍, we hope you are enjoying our latest sessions 🏖️ deploying models and cool applications to our phones 📱 and edge devices 🕹️. Our friend Dmitri has offered to lead some very interesting papers, code, and content, all on the attention mechanism in deep learning models and transformers! 😲

Attention and Transformers Part 1/3

Presentation slides:
- Introduce "Attention Is All You Need", the groundbreaking 2017 paper describing the attention mechanism for the encoder-decoder architecture 💪 (10-15 minutes)
- Go over key (but not all) visuals explaining attention in Jay Alammar's blog 📚 (15-20 minutes)
- Run and discuss a TensorFlow tutorial notebook implementing the classical transformer for neural machine translation
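As a warm-up before the session, here is a minimal sketch of the core operation from the paper, scaled dot-product attention, written in plain NumPy (the function name and toy shapes are illustrative, not from the TensorFlow tutorial itself):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Compute softmax(Q K^T / sqrt(d_k)) V, as in "Attention Is All You Need"."""
    d_k = q.shape[-1]
    # Similarity scores between each query and each key, scaled by sqrt(d_k)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)  # (batch, seq_q, seq_k)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted average of the value vectors
    return weights @ v, weights

# Toy example: batch of 1, sequence length 3, dimension d_k = 4
rng = np.random.default_rng(0)
q = rng.standard_normal((1, 3, 4))
k = rng.standard_normal((1, 3, 4))
v = rng.standard_normal((1, 3, 4))
out, w = scaled_dot_product_attention(q, k, v)
```

Each row of `w` sums to 1, so every output vector is a convex combination of the value vectors; the full transformer adds multi-head projections, masking, and positional encodings on top of this.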