XGBoost 0.72 vs 0.80

From the release notes, I can’t find much information about model performance in the XGBoost 0.80 release. If I use the same hyper-parameters, will upgrading XGBoost improve the solution? If not, will the new version do better at mitigating over-fitting? Thanks!

There was no significant change made to the core algorithm implementation. Such changes would have been part of the release notes.
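If you want to check this empirically, a minimal sketch along the following lines can help: train with identical hyper-parameters and a fixed seed under each installed version and compare the evaluation log. The dataset and the specific parameter values below are illustrative placeholders, not a recommendation.

```python
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

# Illustrative dataset; substitute your own data.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test, label=y_test)

# Keep these fixed across the 0.72 and 0.80 runs.
params = {
    "objective": "binary:logistic",
    "eval_metric": "logloss",
    "max_depth": 4,
    "eta": 0.1,
    "seed": 42,
}

evals_result = {}
booster = xgb.train(
    params,
    dtrain,
    num_boost_round=100,
    evals=[(dtest, "test")],
    evals_result=evals_result,
    verbose_eval=False,
)

# Record the version alongside the final test logloss so the two runs
# can be compared after upgrading.
print(xgb.__version__, evals_result["test"]["logloss"][-1])
```

Running this script once per installed version and comparing the printed metrics should show whether the upgrade changes your results in any meaningful way.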