Wrong results after conversion to CoreML

I’m trying to convert an XGBoost Booster model (a regressor) to Core ML using coremltools. The resulting model gives a very different output. For the same dataset:
XGBoost: 3.018615e-05
MLModel: 10.701606790712008

The output should lie in the [0, 1] interval.

Here is the script I use for the model conversion:

import sys
import xgboost
import coremltools

# Load the trained XGBoost model from the path given as the first argument.
xgbModel = xgboost.Booster()
xgbModel.load_model(sys.argv[1])

# The booster stores no feature names, so generate f0..f9999 to match
# the 10000-dimensional hashed input.
featureNames = ['f{}'.format(i) for i in range(10000)]

# Convert to Core ML and save to the path given as the second argument.
cmlModel = coremltools.converters.xgboost.convert(xgbModel, featureNames)
cmlModel.save(sys.argv[2])

The original model accepts a hash array (10000 doubles) and has no feature names, so I have to specify them explicitly.
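For reference, here is a quick sanity check (just a sketch; the example input dict is a placeholder) confirming that the generated names line up with the hasher's output width:

from sklearn.feature_extraction import FeatureHasher

n_features = 10000
feature_names = ['f{}'.format(i) for i in range(n_features)]

# The hashed vector must be as wide as the list of generated names,
# otherwise the Core ML inputs will not line up with the booster's features.
hasher = FeatureHasher(n_features)
hashed = hasher.transform([{'placeholder_feature': 1.0}])
assert hashed.shape[1] == len(feature_names)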

Here is the test run:

import sys
import xgboost as xgb
import coremltools
from sklearn.feature_extraction import FeatureHasher

raw_features = {...}  # features dict
n_features = 10000

# Hash the raw features into a 10000-dimensional vector, the same way
# the training data was prepared.
hasher = FeatureHasher(n_features)
hash_matrix = hasher.transform([raw_features])

# Core ML prediction: the converted model expects a dict keyed by feature name.
coreml_model = coremltools.models.MLModel(sys.argv[1])  # path to the converted .mlmodel
features_dict = {"f{}".format(i): hash_matrix[0, i] for i in range(hash_matrix.shape[1])}
coreml_predictions = coreml_model.predict(features_dict)
print("CoreML:")
print(coreml_predictions)

# XGBoost prediction on the same hashed input.
xgb_model = xgb.Booster(model_file=sys.argv[2])  # path to the original XGBoost model
dmatrix = xgb.DMatrix(hash_matrix)
xgb_predictions = xgb_model.predict(dmatrix)
print("XGBoost:")
print(xgb_predictions)

Output:

CoreML:
{'target': 10.701606790712008}
XGBoost:
[3.018615e-05]

You should post this issue to https://github.com/apple/coremltools.

I found a workaround. Apply this transform to the MLModel predictions:

f(x) = 1 / (1 + exp(0.5 - x))

For more details, see the GitHub issue.
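For example, a minimal sketch of applying that transform to the value returned by the converted model (assuming the default output name 'target', as shown in the output above):

import math

def xgb_like_score(coreml_prediction, output_key='target'):
    # Apply the workaround transform f(x) = 1 / (1 + exp(0.5 - x))
    # to the raw value returned by the converted Core ML model.
    x = coreml_prediction[output_key]
    return 1.0 / (1.0 + math.exp(0.5 - x))

# Example with the prediction dict from the test run above:
# xgb_like_score({'target': 10.701606790712008})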

After trying it out, I found that the results are still somewhat different from the Android XGBoost output. How can I improve the accuracy? Thanks!