Numba JIT-compiled custom objective function

I wonder if there is a way to implement a custom objective function for an XGBoost classification model with Numba nopython JIT compilation?

My tests with the XGBoost 1.0 RC, on both Google Colab and an XPS 15 running WSL 2, have shown a 2x+ increase in training time when the Numba variant of the objective is provided.
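For reference, here is a minimal sketch of what the two custom objectives being compared presumably look like (the exact code is in the linked notebook): `logregobj` is the classic pure-NumPy logistic objective from the XGBoost demos, and `nb_obj` wraps a Numba nopython-mode kernel computing the same gradient and hessian. `dm` in the timings below is the DMatrix built in the notebook; its construction is omitted here.

```python
import numpy as np
from numba import njit

# Classic pure-NumPy logistic objective, as in the XGBoost custom-objective demo.
def logregobj(preds, dtrain):
    labels = dtrain.get_label()
    p = 1.0 / (1.0 + np.exp(-preds))  # sigmoid of the raw margin
    grad = p - labels                 # first-order gradient
    hess = p * (1.0 - p)              # second-order gradient
    return grad, hess

# Numba nopython-mode kernel. It only sees plain NumPy arrays, because
# DMatrix methods cannot be called inside nopython-compiled code.
@njit
def _grad_hess(preds, labels):
    p = 1.0 / (1.0 + np.exp(-preds))
    return p - labels, p * (1.0 - p)

def nb_obj(preds, dtrain):
    return _grad_hess(preds, dtrain.get_label())
```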

```
%time xgb.train({'max_depth': 2, "seed": 123, 'tree_method': 'hist', 'objective': 'binary:logistic'}, dm, num_boost_round=50)
CPU times: user 1min 34s, sys: 493 ms, total: 1min 34s
Wall time: 25 s
<xgboost.core.Booster at 0x7f7c8c8b5a20>

%timeit xgb.train({'max_depth': 2, "seed": 123, 'tree_method': 'hist'}, dm, num_boost_round=50, obj=logregobj)
1 loop, best of 3: 23.8 s per loop

%timeit xgb.train({'max_depth': 2, "seed": 123, 'tree_method': 'hist'}, dm, num_boost_round=50, obj=nb_obj)
1 loop, best of 3: 52.9 s per loop
```
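One diagnostic worth trying (an assumption on my part, not something from the notebook) is compiling the kernel eagerly with explicit signatures, so that JIT compilation cost cannot leak into the timed runs:

```python
import numpy as np
from numba import njit, float32, float64

# Passing explicit signatures forces compilation at decoration time,
# so no compile cost is paid inside the training loop. XGBoost hands
# the objective float32 preds/labels; the float64 signature is
# included defensively.
@njit([(float32[:], float32[:]), (float64[:], float64[:])], cache=True)
def _grad_hess(preds, labels):
    p = 1.0 / (1.0 + np.exp(-preds))
    return p - labels, p * (1.0 - p)
```

Since %timeit reports the best of three full runs, one-time compilation alone is unlikely to explain a sustained 2x gap, but eager compilation rules it out explicitly.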

Full Jupyter notebook with the code: https://colab.research.google.com/drive/1OoO-TfQvDrWfdHEhpiCHIjhrG54BS9l8

You could adapt this Python + Numba implementation of XGBoost instead: only a few hundred lines of Python code, and 3x faster than C++ XGBoost.