Folks know that gradient-boosted trees (GBT) generally perform better than a random forest, but that performance comes at a price: GBT has several hyperparameters to tune, whereas a random forest is practically tuning-free. Let’s look at what the literature says about how these two methods compare.

Supervised learning in 2005

In 2005, Caruana et al. made an empirical comparison of supervised learning algorithms [vi