
KL Divergence

Author
[email protected] (Ben Jaffe and Katie Malone)
Published
Mon 07 Aug 2017
Episode Link
https://soundcloud.com/linear-digressions/kl-divergence

Kullback-Leibler divergence, or KL divergence, is a measure of the information lost when you try to approximate one distribution with another distribution.  It comes to us originally from information theory, but today it underpins other, more machine-learning-focused algorithms like t-SNE.  And boy oh boy can it be tough to explain.  But we're trying our hardest in this episode!
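For readers who want to see the idea in code: the blurb above describes KL divergence as the information lost when one distribution approximates another. A minimal sketch for discrete distributions (not from the episode; the function name, the example coin distributions, and the choice of log base 2 are illustrative assumptions) might look like:

```python
import math

def kl_divergence(p, q):
    """D(P || Q) for discrete distributions given as lists of probabilities.

    Measures the information lost (in bits, since we use log base 2)
    when Q is used to approximate P. Terms where p_i == 0 contribute
    nothing, so they are skipped.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative example: approximating a biased coin P with a fair coin Q.
p = [0.9, 0.1]  # biased coin
q = [0.5, 0.5]  # fair coin

print(kl_divergence(p, q))  # positive: the fair coin loses information about P
print(kl_divergence(p, p))  # zero: approximating P with itself loses nothing
```

Note that KL divergence is not symmetric: `kl_divergence(p, q)` and `kl_divergence(q, p)` generally differ, which is why it is called a divergence rather than a distance.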
