Directions in ML: Taking Advantage of Randomness in Expensive Optimization Problems
Optimization is at the heart of machine learning, and gradient computation is central to many optimization techniques. Stochastic optimization, in particular, has taken center stage as the principal method of fitting many models, from deep neural networks to variational Bayesian posterior approximations. Generally, one uses data subsampling to efficiently construct unbiased gradient estimators for stochastic optimization, but this is only one possibility. In this talk, I discuss two alternative approaches to constructing unbiased gradient estimates in machine learning problems. The first approach uses randomized truncation of objective functions defined as loops or limits. Such objectives arise in settings ranging from hyperparameter selection, to fitting parameters of differential equations, to variational inference using lower bounds on the log-marginal likelihood. The second approach revisits the Jacobian accumulation problem at the heart of automatic differentiation, observing that it is possible to collapse the linearized computational graph at random and still obtain unbiased gradient estimates.
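
As a concrete illustration of the first approach, the sketch below builds a single-sample "Russian roulette" estimator: an objective defined as a limit L = lim_n L(n) is rewritten as the telescoping sum L(0) + sum_n [L(n) - L(n-1)], truncated at a random depth, with each increment importance-weighted so the estimate stays unbiased. This is a minimal sketch assuming a geometric stopping rule; the function russian_roulette, its continuation probability p, and the toy objective are illustrative and not code from the talk.

    import numpy as np

    def russian_roulette(L, p=0.6, rng=None):
        # Single-sample unbiased estimate of lim_{n -> inf} L(n).
        # Stop after a geometrically distributed number of terms and divide
        # the n-th increment by P(N >= n) = p**n, which leaves the
        # expectation equal to the full limit.
        rng = np.random.default_rng() if rng is None else rng
        est = prev = L(0)
        n, tail = 0, 1.0
        while rng.random() < p:            # continue with probability p
            n += 1
            tail *= p                      # running value of P(N >= n)
            cur = L(n)
            est += (cur - prev) / tail     # reweighted telescoping increment
            prev = cur
        return est

    # Toy check: partial sums of a geometric series, whose limit is 2.0.
    L = lambda n: sum(0.5 ** k for k in range(n + 1))
    print(np.mean([russian_roulette(L) for _ in range(100_000)]))  # ~2.0

Each sample only evaluates the loop to a random finite depth, so an expensive inner computation (an unrolled inner optimization, say, or an ODE solve) is paid for in full only occasionally, in exchange for additional variance.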
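The second approach can be previewed with a toy version of the chain rule: the Jacobian of a composition is a product of local Jacobians, and subsampling the inner dimensions of that product with appropriate reweighting yields an unbiased estimate without ever accumulating the full product. The sketch below is only a stand-in under that reading; the actual method operates on the linearized computational graph inside an AD system rather than on dense matrices, and sampled_chain_jacobian, its uniform sampling scheme, and n_samples are illustrative choices.

    import numpy as np

    def sampled_chain_jacobian(jacobians, n_samples=2, rng=None):
        # Unbiased estimate of J_k @ ... @ J_1 given per-stage Jacobians
        # jacobians = [J_1, ..., J_k]. At each stage, sample a few inner
        # coordinates (paths through the linearized graph) uniformly; the
        # factor d / n_samples makes each stage unbiased for the exact
        # matrix product.
        rng = np.random.default_rng() if rng is None else rng
        J = jacobians[0]
        for Jk in jacobians[1:]:
            d = Jk.shape[1]                    # inner dimension of Jk @ J
            idx = rng.integers(0, d, size=n_samples)
            J = (d / n_samples) * Jk[:, idx] @ J[idx, :]
        return J

    # Toy check: the average of many sampled products approaches the exact one.
    Js = [np.random.randn(4, 4) for _ in range(3)]
    exact = Js[2] @ Js[1] @ Js[0]
    approx = np.mean([sampled_chain_jacobian(Js) for _ in range(200_000)], axis=0)

Unbiasedness follows stage by stage: conditioned on the previous estimate J, uniform sampling gives E[(d / n_samples) * sum_s Jk[:, i_s] J[i_s, :]] = Jk @ J, and the tower property carries this through the whole chain. The practical appeal of such randomized accumulation is trading variance for reduced computation and memory in the differentiation pass.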