I encountered an odd behaviour of xgboost4j under Linux (Ubuntu 17.10).
Namely, if I set eta to a value smaller than 1.0, e.g. 0.3 (the default listed in the documentation), the resulting model appears to have learned nothing: it outputs the same probabilities for all inputs when the objective multi:softprob is used.
Note that this happens with both versions 0.72 and 0.81.
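For reference, here is a minimal sketch of the parameter map I build before training. The actual XGBoost4J training call and data loading are omitted; num_class is an assumed placeholder value, and the parameter names follow the standard XGBoost documentation:

```java
import java.util.HashMap;
import java.util.Map;

public class ParamSetup {
    // Build the booster parameters; any eta < 1.0 reproduces the problem for me.
    public static Map<String, Object> buildParams(double eta) {
        Map<String, Object> params = new HashMap<>();
        params.put("objective", "multi:softprob"); // multiclass, outputs per-class probabilities
        params.put("num_class", 3);                // placeholder class count (assumption)
        params.put("eta", eta);                    // learning rate, e.g. 0.3 (documented default)
        return params;
    }

    public static void main(String[] args) {
        // With eta = 0.3 the trained model gives identical probabilities
        // for every input; with eta = 1.0 it trains normally.
        System.out.println(buildParams(0.3));
    }
}
```

These params would then be passed to XGBoost.train together with the training DMatrix and the number of boosting rounds.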
Did anyone else encounter this issue or can tell me how to avoid it?
Thank you very much.