XGBoost Regressor one-step-ahead training

I have some time series data (multiple streams of it) and I'm using an XGBoost regressor to forecast it. The results look pretty good.

I only need to predict the next timestep. Right now I'm training on roughly 80% of the data and using the last 20% as a validation set (using the error on the validation set to compare candidate models).

Well, I realized there's probably a better way. What if I trained on the data up to each timestep, predicted that timestep, gathered the error, and then used the aggregate error over the entire dataset to choose my model?

From some research, I think this is called one-step-ahead (or walk-forward) validation? Right? I don't know.

Can I ask XGBoost Regressor to train that way out of the box? If not, how do I go about doing it? Does anyone have an example of this?
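To make sure I'm describing the loop correctly, here's roughly what I have in mind, sketched in plain Python with a naive last-value forecast standing in where the XGBoost fit/predict would go (the function name `walk_forward_errors` and the `min_train` parameter are just my own placeholders):

```python
# Sketch of one-step-ahead (walk-forward) evaluation over an expanding window.
# A naive "predict the last observed value" model stands in for
# XGBRegressor().fit(...) / .predict(...).

def walk_forward_errors(series, min_train=3):
    """At each step t, train on series[:t] and predict series[t]."""
    errors = []
    for t in range(min_train, len(series)):
        train = series[:t]
        prediction = train[-1]  # placeholder: naive last-value forecast
        actual = series[t]
        errors.append(abs(actual - prediction))
    return errors

series = [1.0, 2.0, 3.0, 5.0, 8.0]
errors = walk_forward_errors(series)
mae = sum(errors) / len(errors)  # aggregate error used to compare models
print(errors)  # [2.0, 3.0]
print(mae)     # 2.5
```

So the question is basically: is there a built-in way to get this loop with XGBoost, or do I have to write it myself (and refit the model at every step)?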

Thanks!