My model is overfitting. Should I increase eta or not?
I'm currently using eta=0.02, and models with a lower eta usually take longer to train. Thanks.
Eta and overfitting
No, don't increase eta. Use the parameters that control overfitting, such as:
- max_depth
- min_child_weight
- gamma
- subsample
- colsample_bytree
(Note that nfold is not one of these: it only sets the number of cross-validation folds in xgb.cv and has no effect on overfitting itself.)
The docs have suggestions on overfitting:
> There are in general two ways that you can control overfitting in XGBoost:
>
> The first way is to directly control model complexity. This includes max_depth, min_child_weight and gamma.
>
> The second way is to add randomness to make training robust to noise. This includes subsample and colsample_bytree.
>
> You can also reduce stepsize eta. Remember to increase num_round when you do so.