Neural Networks from Scratch - P.7 Calculating Loss with Categorical Cross-Entropy
To perform backpropagation and optimization, we need a measure of how wrong the model is. For this, we use a loss function. In our case, with a softmax classifier, we'll be using categorical cross-entropy: the negative log of the predicted probability for the correct class, averaged over the batch.
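As a minimal sketch of the idea (the softmax outputs and target indices below are made-up illustrative values, and the clipping bound of 1e-7 is one common choice, not the only one):

```python
import numpy as np

# Softmax outputs for a batch of 3 samples over 3 classes;
# each row is a probability distribution summing to 1.
softmax_outputs = np.array([[0.70, 0.10, 0.20],
                            [0.10, 0.50, 0.40],
                            [0.02, 0.90, 0.08]])

# Ground-truth class index for each sample (sparse labels)
class_targets = np.array([0, 1, 1])

# Clip predictions away from 0 and 1 so log() stays finite
clipped = np.clip(softmax_outputs, 1e-7, 1 - 1e-7)

# Categorical cross-entropy: negative log of the probability
# the model assigned to each sample's correct class
confidences = clipped[range(len(clipped)), class_targets]
sample_losses = -np.log(confidences)

# Mean loss over the batch
loss = np.mean(sample_losses)
print(loss)  # ~0.385
```

A confident correct prediction (0.9 for the right class) contributes a small loss (-log 0.9 ≈ 0.105), while an unsure one (0.5) contributes more (-log 0.5 ≈ 0.693), which is exactly the behavior we want the optimizer to push against.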
#nnfs #python #neuralnetworks