I'm using the Python API XGBRegressor() (XGBoost version 1.0.2) with the objective set to 'reg:squaredlogerror'; most other parameters are left at their defaults. When I run it on the Ames Housing dataset I get a massive RMSLE of 7.572. However, when I change the objective to 'reg:squarederror', I get a much more reasonable RMSLE of 0.138. In theory, squared log error should optimise the RMSLE metric better than squared error does. Am I missing something obvious, or is this a bug? I have checked that neither the targets nor the predictions are negative, so that can't account for the problem.
xgb_model_params = {
    'objective': 'reg:squaredlogerror',
    'booster': 'gbtree',
    'random_state': 69,
    'n_estimators': 100,
    'learning_rate': 0.05,
    'tree_method': 'hist',
    'verbosity': 1,
}
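For reference, this is what I mean by RMSLE (a minimal pure-Python sketch of the metric; the sample target/prediction values below are made up for illustration and are not from the Ames dataset):

```python
import math

def rmsle(y_true, y_pred):
    """Root mean squared log error:
    sqrt(mean((log1p(pred) - log1p(true)) ** 2))."""
    squared_log_diffs = [
        (math.log1p(p) - math.log1p(t)) ** 2
        for t, p in zip(y_true, y_pred)
    ]
    return math.sqrt(sum(squared_log_diffs) / len(squared_log_diffs))

# Illustrative house-price-scale values (hypothetical, for demonstration only)
y_true = [200000.0, 150000.0, 300000.0]
y_pred = [210000.0, 140000.0, 310000.0]
print(rmsle(y_true, y_pred))  # small, since predictions are close in log space
```

Since the metric works on log1p-transformed values, it only cares about relative error, which is why an objective that minimises squared log error should in principle track it better than plain squared error.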