Apologies if this has been addressed elsewhere, but our lab has run into a problem upgrading from v0.90 to v1.0.2 (Python 3.8.2). Using the exact same training script (same input data, hyperparameters, etc.), performance drops significantly (more than 10 points in AUROC) when moving from v0.90 to v1.0.2. The drop persists after upgrading to the newest version, v1.1.0. I looked through the changelog (https://github.com/dmlc/xgboost/blob/master/NEWS.md) but didn't see anything that would obviously change model performance this much. Any direction or insight you're able to provide would be greatly appreciated!
Thanks to the xgboost team for all the hard work you put into building an awesome package!