But what is a GPT? Visual intro to transformers | Chapter 5, Deep Learning
Breaking down how Large Language Models work
Instead of sponsored ad reads, these lessons are funded directly by viewers:
---
Here are a few other relevant resources:
Build a GPT from scratch, by Andrej Karpathy
If you want a conceptual understanding of language models from the ground up, @vcubingx just started a short series of videos on the topic:
If you’re interested in the herculean task of interpreting what these large networks might actually be doing, the Transformer Circuits posts by Anthropic are great. In particular, it was only after reading one of these that I started thinking of the combination of the value and output matrices as a single low-rank map from the embedding space to itself, which, at least to my mind, made things much clearer than other sources did (see the sketch after this list).
Site with exercises related to ML programming and GPTs
History of language models by Brit Cruise, @ArtOfTheProblem
An early paper on how directions in embedding spaces have meaning:
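To illustrate that value/output point, here is a minimal NumPy sketch (the dimensions are made up for illustration, not taken from the video): the per-head value and output projections multiply into one matrix that maps the embedding space to itself, and its rank is capped by the much smaller head dimension.

```python
import numpy as np

# Hypothetical dimensions, chosen only for this example
d_model, d_head = 64, 8   # embedding dimension, per-head dimension

rng = np.random.default_rng(0)
W_V = rng.standard_normal((d_model, d_head))  # value projection: embedding -> head space
W_O = rng.standard_normal((d_head, d_model))  # output projection: head space -> embedding

# Composed, they act as a single map from the embedding space to itself...
VO = W_V @ W_O
print(VO.shape)                   # (64, 64)

# ...but its rank can never exceed d_head, so it is a low-rank map.
print(np.linalg.matrix_rank(VO))  # 8
```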
---
Timestamps
0:00 - Predict, sample, repeat
3:03 - Inside a transformer
6:36 - Chapter layout
7:20 - The premise of Deep Learning
12:27 - Word embeddings
18:25 - Embeddings beyond words
20:22 - Unembedding
22:22 - Softmax with temperature
26:03 - Up next
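For the softmax-with-temperature chapter, here is a small Python sketch of the usual formula (the function name and example logits are mine, not from the video): divide the logits by a temperature before exponentiating and normalizing, so lower temperatures sharpen the distribution and higher ones flatten it.

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    """Turn raw logits into a probability distribution.

    Lower temperature sharpens the distribution toward the largest logit;
    higher temperature flattens it, making sampling more varied.
    """
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()            # subtract the max for numerical stability
    exps = np.exp(scaled)
    return exps / exps.sum()

logits = [2.0, 1.0, 0.1]
print(softmax_with_temperature(logits, 1.0))   # baseline
print(softmax_with_temperature(logits, 0.5))   # sharper
print(softmax_with_temperature(logits, 2.0))   # flatter
```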