Eta and overfitting

My model is overfitting. Should I increase eta or decrease it?
It currently uses eta=0.02, and models with a lower eta usually take longer to train. Thanks.

Tune the parameters that control overfitting, such as:

  1. max_depth
  2. min_child_weight
  3. gamma
  4. subsample
  5. colsample_bytree
  6. nfold (a cross-validation setting in xgb.cv; it helps you detect overfitting rather than control it)
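
As a minimal sketch, these parameters are typically passed to `xgb.train` in a parameter dict. The values below are illustrative starting points, not tuned recommendations:

```python
# Illustrative XGBoost parameter dict; values are example starting points only.
params = {
    "max_depth": 4,            # shallower trees = lower model complexity
    "min_child_weight": 5,     # larger values make splits more conservative
    "gamma": 1.0,              # minimum loss reduction required to split
    "subsample": 0.8,          # sample rows per tree to add randomness
    "colsample_bytree": 0.8,   # sample columns per tree to add randomness
    "eta": 0.02,               # step size; lower it together with more rounds
    "objective": "binary:logistic",
}
# Used as, e.g.: xgb.train(params, dtrain, num_boost_round=500)
```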

The docs have suggestions on overfitting:

There are in general two ways that you can control overfitting in XGBoost:

The first way is to directly control model complexity.
    This includes max_depth, min_child_weight and gamma.
The second way is to add randomness to make training robust to noise.
    This includes subsample and colsample_bytree.
    You can also reduce stepsize eta. Remember to increase num_round when you do so.
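
A rough heuristic for that last point, assuming you scale num_round inversely with eta (the exact numbers here are illustrative; in practice early stopping finds the right count for you):

```python
# If eta is lowered, scale num_boost_round up roughly in proportion so the
# ensemble can take the same total amount of boosting steps.
base_eta, base_rounds = 0.02, 500
new_eta = 0.01
new_rounds = int(round(base_rounds * base_eta / new_eta))

# In practice, set num_boost_round generously and rely on early stopping:
# xgb.train(params, dtrain, num_boost_round=new_rounds,
#           evals=[(dvalid, "valid")], early_stopping_rounds=50)
```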