Performance drop with v1.0.2

Hello!

I hope this issue isn’t addressed elsewhere, but our lab has run into a problem upgrading from v0.90 to v1.0.2 (Python 3.8.2). Using the same training script (identical input data, hyperparameters, etc.), performance drops significantly (more than 10 points in AUROC) when moving from v0.90 to v1.0.2. The drop persists after upgrading to the newest version, v1.1.0. I took a look at the changelog (https://github.com/dmlc/xgboost/blob/master/NEWS.md) but didn’t see anything that would fundamentally change model performance. Any direction or insight you’re able to provide would be greatly appreciated!

Thanks to the xgboost team for all the hard work you guys do to build an awesome package!

@bccummings It looks like a bug. Can you file a new issue in our GitHub repository (https://github.com/dmlc/xgboost)? To help developers address the issue quickly, please include an example script that reproduces the problem.

Hi hcho3,

Thanks so much for the quick response! I’ll see what I can come up with as an example and post an issue. Thank you!