University of Massachusetts Amherst

Abraham Wyner - Explaining the Success of AdaBoost, Random Forests and Deep Neural Nets as Interpolating Classifiers

March 26, 4:00pm
LGRT 1634

Refreshments will be served at 3:45pm

Abstract: AdaBoost, random forests and deep neural networks are the present-day workhorses of the machine learning universe. We introduce a novel perspective on AdaBoost and random forests which proposes that the two algorithms work for similar reasons. While both classifiers achieve similar predictive accuracy, random forests cannot be conceived of as a direct optimization procedure. Rather, random forests is a self-averaging, "interpolating" algorithm that creates what we call a "spiked-smooth" classifier, and we view AdaBoost in the same light. We conjecture that both AdaBoost and random forests succeed because of this mechanism, and we provide a number of examples to support the explanation. We conclude with a brief mention of new research suggesting that deep neural networks are effective (at least in part and in some contexts) for the same reasons.
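
As a rough illustration of the "interpolating, self-averaging" behavior the abstract describes (a sketch of the general idea, not material from the talk), the Python snippet below fits a scikit-learn random forest of fully grown trees to a small noisy dataset: the ensemble fits the training data essentially perfectly, yet the averaged vote stays far from 0/1 away from the training points. The dataset, hyperparameters, and the make_moons example are illustrative choices, not Wyner's experiments.

# Minimal sketch (assumed setup, not from the talk): fully grown trees
# "interpolate" the training labels, while averaging over many trees
# keeps the aggregate classifier locally smooth away from the data.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier

# Small, noisy two-class problem (illustrative choice).
X, y = make_moons(n_samples=500, noise=0.25, random_state=0)

forest = RandomForestClassifier(
    n_estimators=500,   # heavy self-averaging over many trees
    max_depth=None,     # grow each tree until its leaves are pure
    bootstrap=True,
    random_state=0,
).fit(X, y)

# Near-perfect training accuracy: the ensemble (nearly) interpolates the data.
print("training accuracy:", forest.score(X, y))

# Averaged class probabilities at a few random points in the input range
# are typically far from 0 or 1, reflecting the smoothing effect of averaging.
grid = np.random.uniform(X.min(axis=0), X.max(axis=0), size=(5, 2))
print(forest.predict_proba(grid).round(2))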