Creating a custom objective function for imbalanced classification task

I am new to using custom loss functions, particularly with XGBoost. I have a highly imbalanced binary classification problem, and I need to predict the probabilities for the minority class (1). For this I am using objective = 'binary:logistic'. I built an XGBoost model with this objective, with the average precision score as my evaluation metric, and the score seems decent enough. But now I want to build a custom objective function for the model. After searching online and reading through many posts, I came up with this custom objective function:

import numpy as np

scale_pos_weight = 75

def obj_func(preds, y_train):
    weights = np.where(y_train == 1.0, scale_pos_weight, 1.0)  # same weight I pass to my XGBClassifier, to upweight the minority class
    preds = 1.0 / (1.0 + np.exp(-preds))  # raw margin scores -> probabilities
    grad = preds - y_train                # gradient (1st derivative of the log loss)
    hess = preds * (1.0 - preds)          # Hessian (2nd derivative of the log loss)
    return grad * weights, hess * weights
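One way to sanity-check an implementation like this is to compare the analytic gradient against a finite-difference gradient of the weighted log loss it is supposed to differentiate. The sketch below repeats obj_func so it runs on its own; the random scores and labels are made-up toy data:

```python
import numpy as np

scale_pos_weight = 75

def weighted_logloss(raw, y):
    # weighted negative log-likelihood on raw (pre-sigmoid) scores
    p = 1.0 / (1.0 + np.exp(-raw))
    w = np.where(y == 1.0, scale_pos_weight, 1.0)
    return -np.sum(w * (y * np.log(p) + (1.0 - y) * np.log(1.0 - p)))

def obj_func(preds, y_train):
    weights = np.where(y_train == 1.0, scale_pos_weight, 1.0)
    preds = 1.0 / (1.0 + np.exp(-preds))
    grad = preds - y_train
    hess = preds * (1.0 - preds)
    return grad * weights, hess * weights

rng = np.random.default_rng(0)
raw = rng.normal(size=5)                      # toy raw margin scores
y = np.array([0.0, 1.0, 0.0, 1.0, 0.0])      # toy labels

grad, hess = obj_func(raw, y)

# central finite-difference approximation of the gradient
eps = 1e-6
num_grad = np.array([
    (weighted_logloss(raw + eps * np.eye(5)[i], y)
     - weighted_logloss(raw - eps * np.eye(5)[i], y)) / (2 * eps)
    for i in range(5)
])

print(np.max(np.abs(grad - num_grad)))  # near zero: analytic and numeric gradients agree
```

If the two gradients agree to within numerical precision, the math of the objective itself is fine, and any remaining problem lies elsewhere (e.g. in how the function is wired into the classifier).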

I am not sure if this implementation is right. My XGBClassifier is defined as:

xgb = XGBClassifier(learning_rate=0.07,
                    objective=obj_func,
                    scale_pos_weight=75)

model_xgb =, y_train)

After fitting the model I evaluate it on the validation set using average_precision_score from sklearn.metrics. The evaluation was decent with the library's default binary:logistic objective, but with the custom objective the average precision has dropped to 0.03 (from 0.65 with the default objective). Is there something wrong with my objective function, or do I need to add something more?
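For reference, the evaluation step looks like this (toy labels and probabilities, just to illustrate the metric; average precision is computed from the predicted probability of class 1, not from hard labels):

```python
import numpy as np
from sklearn.metrics import average_precision_score

# hypothetical validation labels and predicted probabilities for class 1
y_val = np.array([0, 0, 1, 0, 1, 0])
proba = np.array([0.2, 0.7, 0.8, 0.3, 0.6, 0.1])

ap = average_precision_score(y_val, proba)
print(round(ap, 3))  # → 0.833
```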

Maybe you should remove the hyperparameter scale_pos_weight from XGBClassifier, since you are already applying the weight directly to the loss function.

Thank you for the reply. I will try that, but there was a question I had in mind: it is suggested that we should not use weights when we are mainly concerned about probabilities (please correct me if I am wrong). So is there any other custom loss function we could use for a highly imbalanced classification problem where we care about the probability of the minority class?

Can’t really think of any good custom loss function. I suggest that you use a hyperparameter search and use a custom evaluation metric that gives great importance to the minority class. This way, you can favor the set of hyperparameters that optimizes the custom evaluation metric.


Yes, a custom evaluation metric makes more sense, I guess. Out of curiosity: would it be the same if I took out the scale_pos_weight hyperparameter and applied the weights in my objective function instead, or vice versa? Or do weights work differently inside an objective function compared to the hyperparameter?

scale_pos_weight is equivalent to assigning greater weight to the data points belonging to the positive class.
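In gradient terms (a toy NumPy check; the raw scores are made up), multiplying the positive examples' gradient and Hessian by scale_pos_weight produces exactly the same result as building the per-sample weight vector the custom objective above uses:

```python
import numpy as np

scale_pos_weight = 75
y = np.array([0.0, 1.0, 0.0, 1.0])
raw = np.array([-0.3, 0.1, 0.5, -1.2])  # made-up raw margin scores
p = 1.0 / (1.0 + np.exp(-raw))

# unweighted log-loss gradient/Hessian, as in binary:logistic
grad, hess = p - y, p * (1.0 - p)

# route 1: scale_pos_weight scales the positive examples' terms
grad_param = np.where(y == 1.0, scale_pos_weight * grad, grad)
hess_param = np.where(y == 1.0, scale_pos_weight * hess, hess)

# route 2: explicit per-sample weights inside a custom objective
w = np.where(y == 1.0, scale_pos_weight, 1.0)
grad_custom, hess_custom = w * grad, w * hess

print(np.allclose(grad_param, grad_custom),
      np.allclose(hess_param, hess_custom))  # True True
```

So doing both at once (the hyperparameter and the in-objective weights) double-counts the weighting, which is why removing one of the two was suggested earlier.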