Unbalanced multi-class data using XGBOOST

I am training an XGBoost classifier on a highly unbalanced dataset with 50 classes. Following the suggestion in https://datascience.stackexchange.com/a/60134/117860 , I tried to add a sample_weight parameter:
xgb_model = xgb.XGBClassifier(learning_rate=0.001,
                              max_depth=3,
                              n_estimators=100,
                              objective='multi:softmax', num_classes=50,
                              eval_metric='merror', sample_weight=classes_weights)

But I see no improvement in performance. Also, I don't see a sample_weight parameter anywhere in the XGBoost documentation.

I need your help here.

Thanks!

You need to pass the weights to the fit method, not to the constructor; check the XGBClassifier documentation for fit. Also note that it is a sample weight (one weight per training row), not a class weight.
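
A minimal sketch of what that could look like, assuming placeholder training data X_train / y_train and using scikit-learn's compute_sample_weight to turn the class imbalance into per-row weights:

import xgboost as xgb
from sklearn.utils.class_weight import compute_sample_weight

# One weight per training row, inversely proportional to its class frequency.
# X_train / y_train are placeholders for your own data.
sample_weights = compute_sample_weight(class_weight='balanced', y=y_train)

xgb_model = xgb.XGBClassifier(learning_rate=0.001,
                              max_depth=3,
                              n_estimators=100,
                              objective='multi:softmax')  # number of classes is inferred from y

# sample_weight belongs in fit(), not in the constructor.
xgb_model.fit(X_train, y_train, sample_weight=sample_weights)

With class_weight='balanced', each row of class c gets weight n_samples / (n_classes * count(c)), so rare classes contribute roughly as much total weight to the loss as common ones.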