I failed to load a booster trained with a new XGBoost version into an old Spark deployment. Here are the details:
- Booster trained with Python xgboost 1.6.2:
  - trained on GPU with objective "reg:pseudohubererror"
  - `mod = xgb.dask.DaskXGBRegressor().fit`
  - saved by `mod.get_booster().save_model(bst_path)`
- Loaded for production by Scala (2.11) + Spark (2.4.0) + xgboost (1.1.1):
  - load method is the standard way described at https://github.com/dmlc/xgboost/issues/3689
  - it first failed with "Unknown objective function: reg:pseudohubererror" - well understood, because 1.1.1 does not have this objective
  - then I modified the booster in the Python training env with `booster.set_param('objective', 'reg:squarederror')` followed by `booster.save_model`; now the model loads without any issue, but fails at actual scoring time (`mod.transform(df)`) with:

```
Caused by: ml.dmlc.xgboost4j.java.XGBoostError: [10:39:43] /xgboost/src/learner.cc:946: Check failed: mparam_.num_feature != 0 (0 vs. 0) : 0 feature is supplied. Are you using raw Booster interface?
```
I am sure my data is correct. Any help solving this is highly appreciated - or is backward compatibility across these versions simply impossible?
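In case it helps with debugging the num_feature check: the JSON model file can be inspected with only the standard library before shipping it to the old loader. As I understand the JSON model schema, the feature count lives at `learner.learner_model_param.num_feature` (stored as a string) and the objective at `learner.objective.name`; the `model` dict below is a hand-built stand-in for a parsed `model.json`, not output from a real model.

```python
import json

# hand-built stand-in for json.load(open("model.json")); field names are
# my reading of the JSON model schema, verify against your actual file
model = {
    "learner": {
        "learner_model_param": {"num_feature": "4"},
        "objective": {"name": "reg:squarederror"},
    },
    "version": [1, 6, 2],
}

# the old loader's failing check (mparam_.num_feature != 0) roughly
# corresponds to this field being non-zero
num_feature = int(model["learner"]["learner_model_param"]["num_feature"])
objective = model["learner"]["objective"]["name"]
print(num_feature, objective)
assert num_feature != 0, "old xgboost4j would fail the num_feature check"
```

If `num_feature` is non-zero in the saved file but the old loader still reports 0, that would point at the 1.1.1 loader not parsing the 1.6.2 file rather than at the file itself.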