Neural Networks Demystified [Part 6: Training]

After all that work, it's finally time to train our neural network. We'll use the BFGS numerical optimization algorithm and have a look at the results.

Supporting Code:
Yann LeCun's Efficient BackProp paper:
More on BFGS: Broyden–Fletcher–Goldfarb–Shanno algorithm

In this series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday.

Part 1: Data and Architecture
Part 2: Forward Propagation
Part 3: Gradient Descent
Part 4: Backpropagation
Part 5: Numerical Gradient Checking
Part 6: Training
Part 7: Overfitting, Testing, and Regularization

Follow me on Twitter for updates: @stephencwelch
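A minimal sketch of the idea, not the video's exact code: we flatten the network's weights into a single vector, define a cost function over that vector, and hand both to scipy's BFGS optimizer. The 2-3-1 architecture, the sigmoid activations, and the toy sleep/study data are assumptions chosen to match the style of this series.

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(params, n_in=2, n_hidden=3, n_out=1):
    # Reshape the flat parameter vector back into two weight matrices.
    W1 = params[:n_in * n_hidden].reshape(n_in, n_hidden)
    W2 = params[n_in * n_hidden:].reshape(n_hidden, n_out)
    return W1, W2

def cost(params, X, y):
    # Sum-of-squares error between predictions and targets.
    W1, W2 = unpack(params)
    yHat = sigmoid(sigmoid(X @ W1) @ W2)
    return 0.5 * np.sum((y - yHat) ** 2)

# Illustrative data: hours of sleep / study -> test score, scaled to [0, 1].
X = np.array([[3.0, 5.0], [5.0, 1.0], [10.0, 2.0]]) / 10.0
y = np.array([[0.75], [0.82], [0.93]])

np.random.seed(0)
params0 = np.random.randn(2 * 3 + 3 * 1)  # random starting weights

# Without an explicit gradient, minimize() approximates it numerically;
# passing the backprop gradient as jac= would be much faster in practice.
res = minimize(cost, params0, args=(X, y), method='BFGS',
               options={'maxiter': 200})
print('initial cost:', cost(params0, X, y))
print('final cost:  ', res.fun)
```

BFGS builds up an approximation to the inverse Hessian from successive gradients, which typically converges in far fewer iterations than plain gradient descent on a small problem like this.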