Distributed cross-validation training

Hello,

I see how to train single models on Dask with `xgb.dask.train`. Is there a way to also run cross-validation on Dask, with early stopping and other callbacks, the same way `xgb.cv` works for non-distributed XGBoost?