I’m looking to add a new integer member variable to the TreeParam class.
The first time I tried, I hit a static assert at line 48 of TreeParam that appears to hardcode the byte size of the class.
My current workaround adds one more integer to the size calculation in that assert, but I'm wondering whether that could cause trouble down the road, and what the purpose of the 64-bit alignment is.
For a new variant of the model I'm trying out, I need to carry some extra information for each tree.
Assuming I only train and never explicitly save/load the model, will adding new parameters cause any issues? I know rabit tries to write the model at the end of the learning process, but that's about it.