XGBoost classification for ordinal classes with reg:squarederror

I am trying the following:

import xgboost as xgb

model = xgb.XGBClassifier(learning_rate=0.3,
                          n_estimators=800,
                          max_depth=8,
                          eval_metric='rmse',
                          # eta=0.1,
                          colsample_bytree=1,
                          objective='reg:squarederror',
                          subsample=1.0,
                          min_child_weight=4,
                          num_class=6)

I would like to verify that this actually works, i.e. that it optimizes the squared error between the classes.

When I call the fit function and reprint the parameters of the model object, I see the objective has been reset to multi:softprob. Is there any way I can make this work?

You may try xgb.XGBRegressor() instead of xgb.XGBClassifier(). XGBClassifier is built for classification and silently replaces a regression objective with multi:softprob during fit; a regressor will actually minimize the squared error on your ordinal labels, and you can round its continuous predictions back onto the class scale afterwards.
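
As a minimal sketch of that idea (the toy data, hyperparameter values, and the round-and-clip post-processing step are my own illustration, not part of the original question):

    import numpy as np
    import xgboost as xgb

    # Toy ordinal data: 6 classes (0..5) driven by one feature plus noise.
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 6, size=(500, 1))
    y = np.clip(np.round(X[:, 0] + rng.normal(0, 0.3, 500)), 0, 5).astype(int)

    # The regressor minimizes squared error on the ordinal labels directly;
    # note num_class is not needed -- it is a classification-only parameter.
    model = xgb.XGBRegressor(objective='reg:squarederror',
                             n_estimators=100,
                             max_depth=4,
                             learning_rate=0.3)
    model.fit(X, y)

    # Round and clip the continuous predictions back onto the ordinal scale.
    preds = np.clip(np.round(model.predict(X)), 0, 5).astype(int)

This treats the class labels as a numeric target, so it only makes sense when the classes are genuinely ordered and roughly equally spaced.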