The Complete Mathematics of Neural Networks and Deep Learning
A complete guide to the mathematics behind neural networks and backpropagation.
In this lecture, I aim to explain the mathematics, a combination of linear algebra and optimization, that underlies the most important algorithm in data science today: the feedforward neural network.
Through a plethora of examples, geometric intuitions, and not-too-tedious proofs, I will guide you from how backpropagation works in a single neuron to how it works in entire networks, and explain why we need backpropagation in the first place.
It’s a long lecture, so I encourage you to break your study into sessions: keep a notebook, take some notes, and see if you can prove the theorems yourself.
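As a small taste of what the lecture covers, here is a minimal sketch of backpropagation in a single neuron: a sigmoid unit trained on one example by applying the chain rule by hand. This is an illustrative example I've constructed, not code from the lecture itself; the inputs, weights, and learning rate are arbitrary choices.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One training example (arbitrary illustrative values): inputs x, target y.
x = [0.5, -1.0]
y = 1.0
w = [0.1, 0.2]   # initial weights
b = 0.0          # initial bias
lr = 0.5         # learning rate

for step in range(100):
    # Forward pass: z = w.x + b, a = sigmoid(z), loss = (a - y)^2 / 2
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    a = sigmoid(z)
    loss = 0.5 * (a - y) ** 2

    # Backward pass (chain rule):
    #   dL/da = a - y,   da/dz = a * (1 - a),   dz/dwi = xi,  dz/db = 1
    dz = (a - y) * a * (1.0 - a)
    w = [wi - lr * dz * xi for wi, xi in zip(w, x)]
    b = b - lr * dz

print(loss)  # the loss shrinks toward 0 as the neuron fits the example
```

Everything in the full lecture (gradients, Jacobians, the backward pass through layers) generalizes this one chain-rule computation to many neurons at once.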
As for me: I’m Adam Dhalla, a high school student from Vancouver, BC. I’m interested in how we can use algorithms from computer science to gain intuition about natural systems and environments.
My website:
I write here a lot:
Contact me: adamdhalla@pr