XGBoost automatically changes objective function?

Hi,

I am running XGBoost for a regression model. My response variable takes only three values, [-1, 0, 1]. The underlying problem is really ordinal classification, but I "pretend" it is linear regression. However, I found that XGBoost automatically changed my objective function from `objective='reg:linear'` to `objective='multi:softprob'`. Is this expected?

Also, since what I actually want is ordinal classification: would `objective='rank:pairwise'` be more appropriate for what I am trying to do?

N_ESTIMATOR = 400
model_overall = XGBClassifier(
    learning_rate=0.036,
    n_estimators=N_ESTIMATOR,
    max_depth=5,
    min_child_weight=1,
    gamma=0.1,
    reg_alpha=0,
    subsample=0.8,
    colsample_bytree=0.6,
    objective='reg:linear',
    nthread=4,
    scale_pos_weight=1,
    seed=14
)

model_overall

Output:
XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
       colsample_bytree=0.6, gamma=0.1, learning_rate=0.036,
       max_delta_step=0, max_depth=5, min_child_weight=1, missing=None,
       n_estimators=40, n_jobs=1, nthread=4, objective='reg:linear',
       random_state=0, reg_alpha=0, reg_lambda=1, scale_pos_weight=1,
       seed=14, silent=True, subsample=0.8)

After fitting:

model_overall.fit(X,Y, verbose=False)
XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
       colsample_bytree=0.6, gamma=0.1, learning_rate=0.036,
       max_delta_step=0, max_depth=5, min_child_weight=1, missing=None,
       n_estimators=40, n_jobs=1, nthread=4, objective='multi:softprob',
       random_state=0, reg_alpha=0, reg_lambda=1, scale_pos_weight=1,
       seed=14, silent=True, subsample=0.8)

I realized I need to use **XGBRegressor** instead of **XGBClassifier** in my example. `XGBClassifier.fit` detects a multi-class target and overrides the objective with `multi:softprob`, which is why my `reg:linear` setting did not survive fitting. Problem solved!