Cross validation with XGBoost.jl

I'm wondering how others are performing cross validation with the Julia package XGBoost.jl. Are you using MLJ.jl, or have you written your own routine(s)? (The package does not ship its own convenience `cv` function.)

I have written my own routine, which works but feels kludgy. Part of this is my being traditional and running 10-fold CV at each round of trees. With a learning rate of 0.01 and nrounds in the 500 to 1500 range, processing times stretch out even for small datasets: for example, on the Boston housing dataset with eta = 0.01 and nrounds = 1500, running 10-fold CV at each round takes about 30 seconds.
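For concreteness, a hand-rolled routine along these lines might look like the sketch below. This is illustrative, not my exact code: the helper name `cv_rmse` is made up, and the keyword arguments (`num_round`, `eta`, `watchlist`) follow the XGBoost.jl v2 README, so they may differ on other versions of the package.

```julia
using XGBoost, Random, Statistics

# Minimal k-fold CV sketch for a regression task with XGBoost.jl.
# Assumes the v2-style entry point xgboost((X, y); kwargs...).
function cv_rmse(X, y; k = 10, num_round = 500, eta = 0.01, seed = 1)
    n = size(X, 1)
    idx = shuffle(MersenneTwister(seed), 1:n)
    folds = [idx[i:k:n] for i in 1:k]   # k roughly equal folds
    rmses = Float64[]
    for test in folds
        train = setdiff(idx, test)
        bst = xgboost((X[train, :], y[train]);
                      num_round, eta,
                      objective = "reg:squarederror",
                      watchlist = Dict())   # assumption: empty watchlist silences per-round logging
        ŷ = predict(bst, X[test, :])
        push!(rmses, sqrt(mean((ŷ .- y[test]) .^ 2)))
    end
    return mean(rmses), std(rmses)
end
```

Note that calling something like this once per candidate value of nrounds retrains every fold from scratch, which is where most of my runtime goes.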

I'd like to hear how others approach this.