XGBoost model trained on GPU is using the GPU for scoring even with predictor = "cpu_predictor"

I am using xgboost 0.81.0.1 in R, and I would like to train my models on GPUs and then score them on machines without GPU cards. However, when I try to do that, it fails. I also notice that even when I score the models on a machine with a GPU, about 291 MB is taken up on the card. Roughly what I am doing is sketched below.
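
A minimal sketch of the workflow (the data here is just the agaricus example bundled with xgboost, not my real dataset):

```r
library(xgboost)

# Illustrative data only: the agaricus example shipped with the package.
data(agaricus.train, package = "xgboost")
data(agaricus.test, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)
dtest  <- xgb.DMatrix(agaricus.test$data,  label = agaricus.test$label)

# Train on the GPU while asking for the CPU predictor at scoring time.
bst <- xgb.train(
  params = list(
    objective   = "binary:logistic",
    tree_method = "gpu_hist",
    predictor   = "cpu_predictor"
  ),
  data    = dtrain,
  nrounds = 50
)
xgb.save(bst, "model.xgb")

# Later, on the scoring machine:
bst2  <- xgb.load("model.xgb")
preds <- predict(bst2, dtest)  # still allocates ~291 MB on the GPU
```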

Is there a way to not use the GPU when scoring?

Can you set n_gpus to 0?
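
Something along these lines should keep scoring off the card entirely. This is just a sketch assuming the booster is loaded from disk; `xgb.parameters<-` sets parameters on an existing booster before prediction:

```r
library(xgboost)

# Illustrative test data (agaricus example bundled with the package).
data(agaricus.test, package = "xgboost")
dtest <- xgb.DMatrix(agaricus.test$data, label = agaricus.test$label)

bst <- xgb.load("model.xgb")

# Disable GPU use for prediction on the loaded booster.
xgb.parameters(bst) <- list(predictor = "cpu_predictor", n_gpus = 0)

preds <- predict(bst, dtest)  # runs on the CPU only
```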


Thanks! That worked. 🙂