
Backpropagation

Author
[email protected] (Ben Jaffe and Katie Malone)
Published
Mon 29 Feb 2016
Episode Link
https://soundcloud.com/linear-digressions/backpropagation

The reason neural nets are taking over the world right now is that they can be efficiently trained with the backpropagation algorithm. In short, backprop lets you adjust the weights of the neural net based on how well it is classifying training examples, so it gets better and better at making predictions. In this episode, we talk backpropagation, and how it makes it possible to train the neural nets we know and love.
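To make the weight-adjustment idea concrete, here is a minimal sketch of backpropagation in plain NumPy: a one-hidden-layer network learning XOR, where the chain rule carries the prediction error backward through each layer to produce a gradient for every weight. The network size, sigmoid activations, squared-error loss, learning rate, and dataset are illustrative assumptions, not details from the episode.

```python
# Minimal backpropagation sketch (illustrative, not from the episode):
# a 2-3-1 sigmoid network trained on XOR with squared-error loss.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialized weights and biases: 2 inputs -> 3 hidden units -> 1 output.
W1, b1 = rng.normal(size=(2, 3)), np.zeros((1, 3))
W2, b2 = rng.normal(size=(3, 1)), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 1.0
for step in range(5000):
    # Forward pass: compute predictions with the current weights.
    h = sigmoid(X @ W1 + b1)       # hidden activations
    y_hat = sigmoid(h @ W2 + b2)   # network output
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: apply the chain rule layer by layer to get the
    # gradient of the loss with respect to every weight and bias.
    d_yhat = 2 * (y_hat - y) / len(X)        # dLoss / d y_hat
    d_out = d_yhat * y_hat * (1 - y_hat)     # through the output sigmoid
    grad_W2 = h.T @ d_out                    # dLoss / d W2
    grad_b2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)       # through the hidden sigmoid
    grad_W1 = X.T @ d_h                      # dLoss / d W1
    grad_b1 = d_h.sum(axis=0, keepdims=True)

    # Gradient descent step: nudge each parameter to reduce the loss.
    W1 -= learning_rate * grad_W1
    b1 -= learning_rate * grad_b1
    W2 -= learning_rate * grad_W2
    b2 -= learning_rate * grad_b2

print("final loss:", loss)
print("predictions:", y_hat.round(2).ravel())
```

With a reasonable initialization this toy network typically drives the loss close to zero and predicts XOR correctly, though how quickly (or whether) it converges depends on the random seed and learning rate; the point is only to show the forward pass, the chain-rule backward pass, and the weight update that together make up backprop.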
