#eleuther #gptneo #gptj
EleutherAI announces GPT-NeoX-20B, a 20 billion parameter open-source language model, inspired by GPT-3. Connor joins me to discuss the process of training, how the group got their hands on the necessary hardware, what the new model can do, and how anyone can try it out!
OUTLINE:
0:00 - Intro
1:00 - Start of interview
2:00 - How did you get all the hardware?
3:50 - What’s the scale of this model?
6:00 - A look into the experimental results
11:15 - Why are there GPT-Neo, GPT-J, and GPT-NeoX?
14:15 - How difficult is training these big models?
17:00 - Try out the model on GooseAI
19:00 - Final thoughts
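
If you'd rather experiment locally instead of via GooseAI, here is a minimal sketch using the Hugging Face transformers library and the "EleutherAI/gpt-neox-20b" checkpoint. This route is my own assumption, not something shown in the video, and it assumes you have enough memory (the fp16 weights alone are roughly 40 GB):

```python
# Minimal sketch (assumption): load GPT-NeoX-20B from the Hugging Face Hub
# and generate a short continuation. Requires substantial RAM/VRAM.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-neox-20b"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "EleutherAI is"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; adjust max_new_tokens / temperature as needed.
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```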
Read the announcement:
Try out the model:
Check out EleutherAI:
Read the code:
Hardware sponsor:
Links:
TabNine Code Completion (Referral):