Comparison of the Random Forest & Bagging Algorithms

Expecting combined classifiers to perform better than a single classifier is an important guideline of machine learning. Therefore, we can expect better classification performance when we build numerous decision trees from the same data set, as opposed to having only one decision tree. For instance, the authors in show empirically that a random forest composed of 100 trees yields a significant (although small) improvement in accuracy. In fact, techniques such as Random Forest and Bagging base their predictions on the combination of the outcomes of all the decision trees that compose them.
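The combination step described above can be sketched in a few lines of Python. This is a minimal, illustrative example only: it uses hypothetical toy data and trivial one-threshold "stump" classifiers in place of full decision trees, and shows the bagging recipe (bootstrap sampling plus majority vote); a Random Forest would additionally randomize the features considered at each split.

```python
import random
from collections import Counter

def train_stump(data):
    # A toy "stump": predicts 1 when x exceeds the mean of its
    # bootstrap sample (stand-in for a real decision tree).
    threshold = sum(x for x, _ in data) / len(data)
    return lambda x: 1 if x > threshold else 0

def bagging_fit(data, n_estimators=25, seed=0):
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_estimators):
        # Bootstrap: sample the training set with replacement.
        sample = [rng.choice(data) for _ in data]
        stumps.append(train_stump(sample))
    return stumps

def bagging_predict(stumps, x):
    # Combine the outcomes of all estimators by majority vote.
    votes = Counter(s(x) for s in stumps)
    return votes.most_common(1)[0][0]

# Hypothetical toy data: label is 1 when x >= 5.
data = [(x, int(x >= 5)) for x in range(10)]
ensemble = bagging_fit(data)
print(bagging_predict(ensemble, 8))  # → 1
print(bagging_predict(ensemble, 1))  # → 0
```

Because each estimator sees a different bootstrap sample, its threshold varies slightly, and the majority vote smooths out these individual differences, which is the source of the accuracy improvement the text describes.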