XGBClassifier with custom objective function gives different results in different Spark versions

I am training an XGBoost binary classifier with a custom objective function in Databricks. I was getting decent results when my cluster ran Spark 2.4.5 (and Python 3.7). Databricks updated the Spark version to 3.1.2 (with Python 3.8), and my code started to produce only 0 predictions. The data is the same and the code is the same, but the results are different. If I use the default settings of XGBClassifier, I get the same results in both environments; it is only the custom objective function logic that no longer works in the new environment. I searched the documentation to see whether the custom objective function implementation changed in newer versions, but could not find anything helpful.
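
For reference, a quick check I run on each cluster to confirm which Python and xgboost versions are in play (nothing specific to my pipeline, just version introspection):

import sys
import xgboost

print(sys.version)           # Python version on the cluster
print(xgboost.__version__)   # xgboost version available to the notebook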

Below is the code that works well on the old cluster but not on the new one.

import xgboost as xgb

def logregobj(label, preds):
    # Gradient and hessian of a logistic loss that weights positive examples by beta
    # (preds are treated here as predicted probabilities)
    y = label
    p = preds
    beta = 4
    grad = (beta - 1) * p * y + p - beta * y
    hess = ((beta - 1) * y + 1) * (p * (1.0 - p))
    return grad, hess

params = {'learning_rate': 0.1, 'n_estimators': 4, 'subsample': 0.8, 'max_depth': 7, 'min_child_weight': 5,
          'gamma': 0, 'colsample_bytree': 0.7, 'scale_pos_weight': 1,
          'tree_method': 'hist', 'random_state': 42, 'disable_default_eval_metric': 1}

XGB = xgb.XGBClassifier(**params, objective=logregobj)
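
For completeness, this is roughly how I fit and score the model; the make_classification dataset below is only a stand-in for my real Databricks data:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Placeholder data standing in for the actual training set
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

XGB.fit(X_train, y_train)
preds = XGB.predict(X_test)
print(preds.sum())  # count of positive predictions; with my real data this is 0 on the new cluster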