Generic optimization with random forest and a non-differentiable objective function

Would it be possible to optimize an objective function that isn't differentiable (so there is no gradient/Hessian to supply to XGBoost), and, instead of gblinear or gradient-boosted trees, use a random forest algorithm?

No, XGBoost is a gradient boosting library. Consider using a dedicated random forest implementation instead, such as the one in scikit-learn.
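
As a minimal sketch of that suggestion (the data, hyperparameters, and metric below are placeholders, not from the answer): a random forest in scikit-learn never needs a gradient or Hessian, since trees are grown by impurity-based splitting, and a non-differentiable metric such as mean absolute error can still be used to evaluate or tune the fitted model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic data purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# No objective derivatives are required: splits are chosen by impurity criteria.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# A non-differentiable evaluation metric (mean absolute error) applied after fitting.
pred = model.predict(X_test)
mae = np.mean(np.abs(pred - y_test))
print(f"MAE: {mae:.4f}")
```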