Very High Max Depth in XGBoost

I’m working with a dataset of 5 million samples, 100 features, and 10 regression targets. With a max_depth below 20, I achieve an R² score of around 0.89. Increasing max_depth above 80, along with a min_child_weight of 30 and 100 estimators, boosts the score to 97.5% (99.1% on the training set). Despite this improvement, the model shows signs of overfitting: training RMSE of around 28 versus testing RMSE of around 38, even with lambda (L2 regularization) set to 380. Should I stick with the higher max depth, or revert to a lower value to mitigate overfitting?