Gradients are Not All You Need (Machine Learning Research Paper Explained)
#deeplearning #backpropagation #simulation
More and more systems are being made differentiable, meaning that exact gradients of their dynamics can be computed via backpropagation. While this development has enabled many advances, there are situations where backpropagation is a very bad idea. This paper characterizes one class of such systems: iterated dynamical systems, often with some source of stochasticity, that exhibit chaotic behavior. Backpropagating through many iterations of such a system multiplies Jacobians whose spectrum exceeds one, so the exact gradient explodes and becomes useless for optimization. In these systems, it is often better to estimate gradients with black-box methods than to compute them exactly.
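A minimal sketch of that core phenomenon, assuming JAX and using a logistic-map rollout as a stand-in iterated system (neither the library nor this particular map is prescribed by the video; they illustrate the paper's setting): backpropagating through many chaotic iterations yields an exact but astronomically large gradient, while a simple black-box (ES-style) estimator of a Gaussian-smoothed objective stays bounded.

import jax
import jax.numpy as jnp

def rollout(r, x0=0.5, steps=100):
    # Iterate the logistic map x <- r * x * (1 - x); chaotic for r near 3.9.
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

# Exact gradient: backpropagation through all 100 iterations.
exact_grad = jax.grad(rollout)(3.9)

# Black-box alternative: perturb r with Gaussian noise and use the
# score-function (ES-style) estimator of the smoothed objective's gradient.
key = jax.random.PRNGKey(0)
sigma, n = 0.05, 4096
eps = jax.random.normal(key, (n,))
values = jax.vmap(lambda e: rollout(3.9 + sigma * e))(eps)
es_grad = jnp.mean(values * eps) / sigma

print(exact_grad)  # huge magnitude: the product of Jacobians has spectrum > 1
print(es_grad)     # bounded estimate of the smoothed objective's gradient

The black-box estimate targets a Gaussian-smoothed version of the loss, trading a little bias for bounded variance; that trade-off is what the paper quantifies.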
OUTLINE:
0:00 - Foreword
1:15 - Intro & Overview
3:40 - Backpropagation through iterated systems
12:10 - Connection to the spectrum of the Jacobian
15:35 - The Reparameterization Trick
21:30 - Problems of reparameterization
26:35 - Example 1: Policy Learning in Simulation
33:05 - Example 2: Meta-Learning Optimizers
36:15 - Example 3: Disk packing
37:45 - Analysis of Jacobians
40:20 - What can be done?
45:40 - Just use Black-Box methods
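As a companion to the segments at 15:35 and 21:30, a minimal sketch of the reparameterization trick, again assuming JAX and a toy Gaussian objective of my own choosing: the sample z ~ N(mu, sigma^2) is rewritten as mu + sigma * eps with eps ~ N(0, 1), so the randomness moves outside the computation graph and gradients flow to mu and sigma.

import jax
import jax.numpy as jnp

def objective(params, eps):
    mu, log_sigma = params
    z = mu + jnp.exp(log_sigma) * eps  # reparameterized sample of N(mu, sigma^2)
    return jnp.mean(z ** 2)            # toy loss: Monte Carlo estimate of E[z^2]

key = jax.random.PRNGKey(0)
eps = jax.random.normal(key, (4096,))
params = jnp.array([1.0, 0.0])         # mu = 1.0, log(sigma) = 0.0

grad = jax.grad(objective)(params, eps)
# E[z^2] = mu^2 + sigma^2, so d/d mu = 2*mu = 2 and d/d log_sigma = 2*sigma^2 = 2;
# the Monte Carlo gradient should land near [2, 2].
print(grad)

The problems discussed at 21:30 appear when the function between eps and the loss is itself a long chaotic rollout: the pathwise gradient then inherits the same exploding Jacobian product as plain backpropagation.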