
Ensemble Algorithms

Author
[email protected] (Ben Jaffe and Katie Malone)
Published
Mon 23 Jan 2017
Episode Link
https://soundcloud.com/linear-digressions/ensemble-algorithms

If one machine learning model is good, are two models better? In a lot of cases, the answer is yes. If you build many okay models, and then bring them all together and use them in combination to make your final predictions, you've just created an ensemble model. It feels a little bit like cheating, like you just got something for nothing, but the results don't lie: algorithms like Random Forests and Gradient Boosting Trees (two types of ensemble algorithms) are some of the strongest out-of-the-box algorithms for classic supervised classification problems. What makes a Random Forest random, and what does it mean to gradient boost a tree? Have a listen and find out.
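As a quick illustration of the idea, here is a minimal sketch (assuming scikit-learn and its built-in breast cancer toy dataset, neither of which is mentioned in the episode) comparing a single decision tree against the two ensemble methods named above:

```python
# A hedged sketch: compare one decision tree with two tree ensembles
# on a toy dataset, using cross-validated accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

models = {
    "single decision tree": DecisionTreeClassifier(random_state=0),
    "random forest (ensemble)": RandomForestClassifier(n_estimators=100, random_state=0),
    "gradient boosted trees (ensemble)": GradientBoostingClassifier(random_state=0),
}

for name, model in models.items():
    # 5-fold cross-validated accuracy; the ensembles typically edge out the lone tree.
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```

On most runs the two ensembles score higher than the single tree, which is the "something for nothing" effect the episode digs into.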
