Random forest on classification

The XGBoost documentation states:

For instance, objective will typically be reg:squarederror for regression and binary:logistic for classification

Since binary:logistic only accepts 0 and 1 as labels, do random forests in XGBoost only support binary classification?

The objective should be set to multi:softprob when performing multi-class classification.


Hi, I have a follow-up question. Do the random forests in XGBoost perform classification as a true random forest, or do they use gradient-boosted trees to simulate one? I have implemented a random forest program for classification and want to compare it against a good random forest implementation, but I do not know whether the core of XGBoost's random forests is also a genuine random forest classifier.