When using the DART booster, is GPU support possible?

I am wondering if GPU support is available when using the DART booster? Currently it is using the CPU, which is rather slow.

https://xgboost.readthedocs.io/en/latest/tutorials/dart.html

It uses the GPU if I use the standard booster, since I am setting 'tree_method': 'gpu_hist'. I have the latest version of XGBoost installed under Python 3.7.
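For reference, here is roughly what I am trying, with DART swapped in for the standard booster. The data and most parameter values below are placeholders rather than my actual configuration:

```python
import numpy as np
import xgboost as xgb

# Placeholder data just to illustrate the parameter setup.
X = np.random.rand(10_000, 20)
y = np.random.rand(10_000)
dtrain = xgb.DMatrix(X, label=y)

params = {
    'booster': 'dart',           # DART instead of the default 'gbtree'
    'tree_method': 'gpu_hist',   # same GPU histogram method as with the standard booster
    'objective': 'reg:squarederror',
    'eval_metric': 'rmse',
    'rate_drop': 0.1,            # DART-specific dropout rate (placeholder value)
    'skip_drop': 0.5,            # DART-specific (placeholder value)
}

bst = xgb.train(params, dtrain, num_boost_round=100,
                evals=[(dtrain, 'train')])
```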

It’s supported, but it might not help much, as the bottleneck is in prediction. We plan to do some optimization there for the next release.
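If you want to see where the time goes, a rough (and unscientific) way is to time training and prediction separately. This is only a sketch with random placeholder data, not a proper benchmark:

```python
import time

import numpy as np
import xgboost as xgb

X = np.random.rand(10_000, 20)
y = np.random.rand(10_000)
dtrain = xgb.DMatrix(X, label=y)
params = {'booster': 'dart', 'tree_method': 'gpu_hist',
          'objective': 'reg:squarederror'}

t0 = time.time()
bst = xgb.train(params, dtrain, num_boost_round=200)
print('train  :', time.time() - t0)

t0 = time.time()
bst.predict(dtrain)  # prediction is where the DART bottleneck currently sits
print('predict:', time.time() - t0)
```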

You are correct, the GPU is used. I used a larger dataset, and I can see GPU usage spiking occasionally.

My feeling is that the GPU is mainly used to calculate the RMSE metric: GPU usage is 0% while the trees are being trained and 25% (very briefly) when the RMSE is calculated. But this is observational only; I haven’t looked at the XGBoost code.
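One way to test this guess would be to run the same training twice, with and without an evaluation set, and watch nvidia-smi in another terminal while each run is going. Again, just a sketch with placeholder data:

```python
import numpy as np
import xgboost as xgb

X = np.random.rand(100_000, 20)
y = np.random.rand(100_000)
dtrain = xgb.DMatrix(X, label=y)
params = {'booster': 'dart', 'tree_method': 'gpu_hist',
          'objective': 'reg:squarederror', 'eval_metric': 'rmse'}

# Run `nvidia-smi -l 1` in another terminal during each of these runs
# and compare the GPU utilization pattern.
bst_no_eval = xgb.train(params, dtrain, num_boost_round=200)
bst_with_eval = xgb.train(params, dtrain, num_boost_round=200,
                          evals=[(dtrain, 'train')])
```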