JAX Crash Course - Accelerating Machine Learning Code!

Learn how to get started with JAX in this crash course. JAX is NumPy on the CPU, GPU, and TPU, with great automatic differentiation for high-performance machine learning research.

Get your Free Token for AssemblyAI Speech-To-Text API 👇

Colab:
Website:
JAX blog post:

Chapters:
00:00 Intro & Outline
01:22 What is JAX
02:55 Speed Comparison
05:00 Drop-in Replacement for NumPy
06:56 jit(): Just-in-Time Compiler
11:32 Limitations of JIT
14:35 grad(): Automatic Gradients
19:57 vmap(): Automatic Vectorization
21:48 pmap(): Automatic Parallelization
22:18 Example Training Loop
24:38 What’s the catch?

#MachineLearning #DeepLearning
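To give a quick taste of the APIs listed in the chapters above, here is a minimal sketch of jit(), grad(), vmap(), and a bare-bones training loop. This is not code from the video: the linear model, parameter names, and data shapes are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def predict(w, x):
    # Simple linear model (an assumed example, not from the video).
    return jnp.dot(x, w)

def loss(w, x, y):
    # Mean squared error between predictions and targets.
    return jnp.mean((predict(w, x) - y) ** 2)

# grad(): build a function that returns d(loss)/d(w).
grad_loss = jax.grad(loss)

# jit(): compile the gradient computation with XLA for speed.
fast_grad_loss = jax.jit(grad_loss)

# vmap(): map predict over the batch axis of x without an explicit loop.
batched_predict = jax.vmap(predict, in_axes=(None, 0))

# Toy data, purely for illustration.
key = jax.random.PRNGKey(0)
key_w, key_x = jax.random.split(key)
w = jax.random.normal(key_w, (3,))
x = jax.random.normal(key_x, (10, 3))
y = jnp.ones(10)

print(batched_predict(w, x).shape)  # (10,) -- one prediction per example

# Example training loop: a few steps of plain gradient descent.
learning_rate = 0.1
for step in range(100):
    w = w - learning_rate * fast_grad_loss(w, x, y)

print(loss(w, x, y))  # loss should have decreased
```

pmap() follows the same pattern as vmap() but distributes the mapped computation across multiple devices, so it is left out of this single-device sketch.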