GPU-trained model not using GPU when predicting

Dear everyone,

I trained an XGBoost model using the gpu_hist tree method and verified that it was indeed trained on a GPU (via watch -n 1 nvidia-smi).
Then I saved the model via xgb_model.save_model(model_data_path).
Next, I load the model with
model = xgb.Booster()
model.load_model(os.path.abspath(path_to_xgboost_model))

However, when I now predict using
model.predict(data_to_predict)
the predictions are performed on the CPU. What am I doing wrong?

XGBoost 1.2.0, Python 3.7

Thank you very much

Try this: model.set_param({'predictor': 'gpu_predictor'}).
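
In context, a minimal sketch of the load-then-predict flow with the predictor set explicitly (the file name model.json and the NumPy input are placeholders, not from the original post):

```python
import numpy as np
import xgboost as xgb

# Load the previously saved booster from disk.
model = xgb.Booster()
model.load_model("model.json")  # placeholder path

# A booster loaded from disk defaults to the CPU predictor;
# request the GPU predictor explicitly.
model.set_param({"predictor": "gpu_predictor"})

# Booster.predict expects a DMatrix, not a raw array.
X = np.random.rand(1000, 20)  # placeholder features
preds = model.predict(xgb.DMatrix(X))
```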


Thanks, this solved it.
Is this documented somewhere? I couldn't find anything saying I had to set the predictor to gpu_predictor manually; if anything, I read that it uses the GPU by default.

Yes, it's documented at https://xgboost.readthedocs.io/en/latest/parameter.html. The GPU predictor is used by default when you train with gpu_hist. When you load the model from disk, however, the default is the CPU predictor, so that you can train a model on a GPU and then serve it on a machine without one.
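
For completeness, a sketch of the training side under the same placeholder assumptions (synthetic data, file name model.json); training with gpu_hist runs on the GPU, but per the docs above the predictor choice does not carry over to a booster loaded from the saved file:

```python
import numpy as np
import xgboost as xgb

# Synthetic training data (placeholders).
X = np.random.rand(1000, 20)
y = np.random.randint(2, size=1000)
dtrain = xgb.DMatrix(X, label=y)

# gpu_hist trains on the GPU and implies the GPU predictor
# for this in-memory session.
params = {"tree_method": "gpu_hist", "objective": "binary:logistic"}
booster = xgb.train(params, dtrain, num_boost_round=50)

# A booster that later loads this file will default to the CPU
# predictor, as described above.
booster.save_model("model.json")
```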