Is "gpu_hist" only supported by GPU-enabled machine?

Hi folks,
I just trained a model (XGBRegressor) using the “gpu_hist” tree method on a GPU-enabled machine.
I used pickle.dump(…) to store the model.
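Roughly what the training side looks like (a minimal sketch; the data and file name are placeholders, assuming an XGBoost version that still accepts tree_method="gpu_hist"):

```python
import pickle

import numpy as np
from xgboost import XGBRegressor

# Placeholder training data for illustration.
X_train = np.random.rand(1000, 10)
y_train = np.random.rand(1000)

# Train on the GPU-enabled machine with the GPU histogram tree method.
model = XGBRegressor(tree_method="gpu_hist")
model.fit(X_train, y_train)

# Serialize the fitted estimator with pickle.
with open("xgb_gpu_model.pkl", "wb") as f:
    pickle.dump(model, f)
```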

I want to load and apply this model on a machine without any GPU. Is that feasible?

Currently I use pickle.load(…) to load the model, e.g. XGBreg_load = pickle.load(…).
Then I call XGBreg_load.predict(…) to make predictions, but the results are not correct.
Is it possible to get correct predictions on a machine without any GPU?
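Concretely, this is roughly what I run on the CPU-only machine (file name and input are placeholders matching the sketch above):

```python
import pickle

import numpy as np

# Load the pickled model that was trained with gpu_hist on the GPU machine.
with open("xgb_gpu_model.pkl", "rb") as f:
    XGBreg_load = pickle.load(f)

# Placeholder input; must have the same feature count as the training data.
X_test = np.random.rand(5, 10)
preds = XGBreg_load.predict(X_test)  # these values look wrong on the CPU-only box
print(preds)
```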

Best

Sounds like a bug. Can you create a new GitHub issue?

Hi Philip, thank you so much. I created the GitHub issue here. I don’t want to re-train the model, so it would be great to know how to take a model trained with gpu_hist on a GPU-enabled machine and use it on a machine without a GPU.
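One thing I am considering in the meantime (not sure this is the right fix) is to re-export the model with XGBoost’s native save_model/load_model instead of pickle, so the CPU-only machine loads a clean copy without re-training. A minimal sketch, assuming XGBoost ≥ 1.0 and placeholder file names:

```python
import pickle

from xgboost import XGBRegressor

# On the GPU machine: load the existing pickle and re-save in XGBoost's
# native JSON format (no re-training needed).
with open("xgb_gpu_model.pkl", "rb") as f:
    model = pickle.load(f)
model.save_model("xgb_model.json")

# On the CPU-only machine: load into a fresh estimator and predict.
XGBreg_load = XGBRegressor()
XGBreg_load.load_model("xgb_model.json")
```

Would setting predictor="cpu_predictor" on the loaded model (e.g. via set_params) also help here, or does that parameter not apply in my version?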