Training a Random Forest with eta <> 1?

Hi there,

I’d like to ask what happens if you train an XGBoost random forest with the learning_rate set lower than 1. According to the documentation on the official website, "eta (alias: learning_rate) must be set to 1 when training random forest regression". Could you please elaborate on the reason? What happens if the random forest is trained with eta set to, say, 0.3?

Thanks a lot in advance!


XGBoost is a gradient boosting library, so learning_rate scales the contribution of each boosting iteration. A random forest in XGBoost is a single boosting iteration that holds all the trees in parallel (one base model), so that one iteration must carry the full model and its learning rate has to be 1. If you set eta to 0.3, the averaged output of the trees in that single round is multiplied by 0.3 before being added to the base score, so the predictions are systematically shrunk toward the base score and the model is biased.
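A minimal pure-Python sketch of this (not the actual XGBoost implementation): after one boosting round, the prediction is roughly base_score + eta * (averaged tree output), and a random forest is exactly one round, so any eta below 1 shrinks the forest's output. The function name, tree outputs, and base_score default of 0.5 here are illustrative assumptions.

```python
def rf_prediction(tree_outputs, eta=1.0, base_score=0.5):
    """One boosting round containing a whole forest: the trees'
    averaged output is scaled by eta before being added to base_score."""
    forest_output = sum(tree_outputs) / len(tree_outputs)
    return base_score + eta * forest_output

# Suppose the trees in the forest each predict values around 2.0,
# i.e. the target is roughly base_score + 2.0 = 2.5.
trees = [1.9, 2.1, 2.0, 2.05, 1.95]

print(rf_prediction(trees, eta=1.0))  # 2.5: the forest's full averaged prediction
print(rf_prediction(trees, eta=0.3))  # ~1.1: shrunk toward base_score, biased low
```

In regular boosting the shrinkage is harmless because later rounds correct the residual; with a single-round random forest there are no later rounds, which is why the documentation requires eta = 1.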