I’ve recently run into some strange behaviour.
I trained an XGBoost model, passing both the training data as well as some test data in the eval set.
Just for testing purposes I removed the training data from the eval set, and suddenly I get different predictions. All other parameters are unchanged. The test data was always the second entry in the eval tuple.
Has anyone experienced something similar, or can anybody explain this behaviour?
Thanks and best regards