Simulated Annealing x SGD x Mini-batch | Machine Learning w TensorFlow & scikit-learn #9
📚About
This lecture is dedicated to variations of the gradient descent algorithm. We cover Stochastic and Mini-batch Gradient Descent along with Simulated Annealing. Python implementations are done in Jupyter.
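To give a flavor of the first topic, here is a minimal Python sketch of classic simulated annealing on a one-dimensional non-convex function. The objective, cooling factor, and step size are illustrative assumptions, not the lecture's exact notebook:

```python
import numpy as np

def simulated_annealing(f, x0, T0=1.0, cooling=0.95, n_iter=500, step=0.5, rng=None):
    """Minimize a scalar function f with basic simulated annealing (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    T = T0
    for _ in range(n_iter):
        # Propose a random neighbor of the current point.
        x_new = x + rng.normal(scale=step)
        fx_new = f(x_new)
        # Always accept improvements; accept worse moves with probability exp(-delta/T).
        if fx_new < fx or rng.random() < np.exp(-(fx_new - fx) / T):
            x, fx = x_new, fx_new
            if fx < best_fx:
                best_x, best_fx = x, fx
        T *= cooling  # cool down: uphill moves become less likely over time
    return best_x, best_fx

# Example: a function with many local minima.
f = lambda x: x**2 + 10 * np.sin(x)
print(simulated_annealing(f, x0=5.0))
```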
⏲Outline⏲
00:00 Introduction
00:26 Simulated Annealing
02:42 Stochastic Gradient Descent with variable step size
10:12 Mini-batch Gradient Descent
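As a companion to the outline above, the sketch below combines mini-batch gradient descent with a decaying (annealing-style) step size on a simple linear-regression problem. The synthetic data, learning schedule, and hyperparameters are assumptions for illustration, not the lecture's exact notebook:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic linear-regression data: y = 4 + 3x + noise.
m = 200
X = 2 * rng.random((m, 1))
y = 4 + 3 * X + rng.normal(scale=0.5, size=(m, 1))
X_b = np.c_[np.ones((m, 1)), X]           # add bias column

def learning_schedule(t, t0=5.0, t1=50.0):
    # Step size shrinks over time, in the spirit of simulated annealing.
    return t0 / (t + t1)

theta = rng.normal(size=(2, 1))           # random initialization
n_epochs, batch_size = 50, 20

for epoch in range(n_epochs):
    shuffled = rng.permutation(m)
    for i in range(0, m, batch_size):
        idx = shuffled[i:i + batch_size]
        xb, yb = X_b[idx], y[idx]
        # MSE gradient on the current mini-batch.
        gradients = 2 / len(idx) * xb.T @ (xb @ theta - yb)
        eta = learning_schedule(epoch * m + i)
        theta -= eta * gradients

print(theta)   # should approach [[4.], [3.]]
```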
🔴 Subscribe for more videos on Machine Learning and Python.
👍 Smash that like button if you find this tutorial useful.
👁🗨 Speak up and comment.