Bug in C library inference?

I have a very simple dataset (30 rows, 32 columns).

I wrote a Python program to load the data and train an XGBoost model, then save the model to disk.

I also compiled a C++ program that uses libxgboost (the C API) and loads the saved model for inference.

When using the SAME saved model, Python and C++ give different results for the same input (a single row of all zeros).

The XGBoost version is 0.90, and I have attached all files (including the NumPy data files) here:

I have no idea how to even debug this. Any help would be appreciated, thanks!

Here are the outputs of the two programs (the sources of which are in the .tar file):

The Python program

(It prints some status strings and then a single number, which is to be compared against the C++ output.)
$ python3 jl_functions_tiny.py
Loading data
Creating model
Training model
Saving model
Deleting model
Loading model
Testing model
[587558.2]

The C++ program

(The single number here doesn’t match the single number output of the Python program)
$ ./jl_functions
628180.062500