A Review of Learning Rules in Machine Learning - March 8, 2021
In this research meeting, our research intern Alex Cuozzo reviews notable papers and explains high-level concepts related to learning rules in machine learning. Moving away from backpropagation with gradient descent, he surveys attempts at biologically plausible learning regimes that avoid the weight transport problem and use only local information at the neuron level. He then discusses work that infers a learning rule from observed weight updates, and further work that uses machine learning to create novel optimizers and local learning rules.
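One of the ideas above, feedback alignment from the Lillicrap et al. paper listed below, can be sketched in a few lines. This is a minimal illustrative NumPy sketch, not code from the talk: the toy linear-regression task, layer sizes, and learning rate are all assumptions chosen for the example. The key point is that the backward pass routes the error through a fixed random matrix `B` instead of `W2.T`, so no layer needs to "transport" another layer's weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-layer network on a hypothetical toy regression task.
# Standard backprop sends the output error backward through W2.T,
# which requires "transporting" the forward weights into the backward
# pass. Feedback alignment (Lillicrap et al.) instead uses a fixed
# random matrix B, keeping each update local to its layer.
n_in, n_hid, n_out = 4, 16, 1
W1 = rng.standard_normal((n_in, n_hid)) * 0.1
W2 = rng.standard_normal((n_hid, n_out)) * 0.1
B = rng.standard_normal((n_out, n_hid)) * 0.1  # fixed random feedback

X = rng.standard_normal((128, n_in))
y = X @ rng.standard_normal((n_in, n_out))  # assumed linear target

def loss():
    return float(np.mean((np.tanh(X @ W1) @ W2 - y) ** 2))

loss_before = loss()
lr = 0.05
for _ in range(500):
    h = np.tanh(X @ W1)
    err = h @ W2 - y                  # output error
    dW2 = h.T @ err / len(X)          # local update for W2
    dh = (err @ B) * (1 - h ** 2)     # error routed through B, not W2.T
    dW1 = X.T @ dh / len(X)           # local update for W1
    W2 -= lr * dW2
    W1 -= lr * dW1

loss_after = loss()
```

Empirically the forward weights tend to align with the random feedback path during training, which is why this scheme can still reduce the loss despite never using `W2.T` in the backward pass.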
Papers / Talks mentioned (in order of presentation):
• “Random synaptic feedback weights support error backpropagation for deep learning” by Lillicrap et al.:
• Talk: A Theoretical Framework for Target Propagation:
• “Decoupled Neural Interfaces using Synthetic Gradients” by DeepMind:
• Talk: Brains@Bay Meetup (Ra