Checking out a 6-billion-parameter GPT model, GPT-J, from EleutherAI

An actually-open AI model, GPT-J, is available for you to download the weights of and tinker with. It's a 6-billion-parameter large language model that can do long-form natural language generation, write code, act as a chatbot, translate between languages, and much more (a minimal usage sketch follows the links below).

GPT-J text examples:
GPT-J GitHub:
The Pile paper:
GPT-J web demo:
EleutherAI:
Multimodal Few-Shot Learning with Frozen Language Models:
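To give a concrete sense of what tinkering with the weights can look like, here's a minimal sketch of loading GPT-J through the Hugging Face transformers library, which hosts the checkpoint under the name "EleutherAI/gpt-j-6B" (the original release is a Mesh Transformer JAX codebase, so treat this as one convenient route rather than the official one). The model and prompt below are just illustrative assumptions, not from the video itself.

```python
# A rough sketch (not from the video) of loading GPT-J via Hugging Face transformers.
# Assumes `transformers` and `torch` are installed; the 6B checkpoint is large,
# so expect a long first download and roughly 12+ GB of memory in half precision.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"  # GPT-J checkpoint on the Hugging Face Hub

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,   # half precision to roughly halve memory use
    low_cpu_mem_usage=True,
)
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# GPT-J handles code completion as well as prose, so a code-style prompt works too.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If you just want to try prompts without downloading anything, the web demo linked above is the quickest way in.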