10,000 Of These Train ChatGPT In 4 Minutes!

❤️ Check out Lambda here and sign up for their GPU Cloud:
NVIDIA H200:
The Bitter Lesson:

Fellow Scholars! The ChatGPT and Stable Diffusion training times at 3:12 were recorded on a set of H100 GPUs. I made sure not to say that the training was done on one card, but I’d like to add a note here for clarity. I apologize; I should have made this clearer.

📝 My latest paper on simulations that look almost like reality is available for free here:
Or here is the original Nature Physics link with clickable citations:

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible: Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Benji Rabhan, Bret Brizzee, Bryan Learn, B Shang, Christian Ahlin, Gaston Ingaramo, Geronimo Moralez, Gordon Child, Jace O’Brien, Jack Lukic, John Le, Kenneth Davis, Klaus Busse, Kyle Davis, Lukas Biewald, Martin, Matthew Valle, Michael Albrecht, Michael Tedder, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Rajarshi Nigam, Ramsey Elbasheer, Richard Sundvall, Steef, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here:

Thumbnail background design: Felícia Zsolnai-Fehér
Károly Zsolnai-Fehér’s research works: ~zsolnai/
Twitter:
#nvidia