Does XGBoost use classification trees or regression trees when doing classification?

When I use the 'objective': 'binary:logistic' objective for classification, the tree splits are based on continuous values instead of discrete values.
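For illustration, here is a minimal sketch (an assumed setup, not the asker's actual code, using a made-up toy dataset) that reproduces the observation: with 'binary:logistic', the dumped trees split on continuous thresholds and their leaves hold real values rather than class labels.

```python
import xgboost as xgb
from sklearn.datasets import make_classification

# Toy data, purely for illustration.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

booster = xgb.train(
    {"objective": "binary:logistic", "max_depth": 2},
    xgb.DMatrix(X, label=y),
    num_boost_round=2,
)

# The dump shows splits like "f2<0.53..." and leaves like "leaf=0.12...",
# i.e. continuous thresholds and continuous leaf values.
print(booster.get_dump()[0])
```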

XGBoost uses CART (Classification and Regression Trees), first developed by Leo Breiman.

Currently, XGBoost is able to generate splits with continuous variables and binary discrete variables (0/1). It cannot yet directly generate splits from multi-level discrete (categorical) variables. You will need to convert multi-level discrete variables into binary dummy variables using One-Hot Encoding.
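A minimal sketch of that conversion (the column names and data below are made up for illustration), using pandas to expand a multi-level categorical column into binary dummy columns before training:

```python
import pandas as pd
import xgboost as xgb

df = pd.DataFrame({
    "color": ["red", "green", "blue", "green"],  # multi-level categorical
    "size": [1.2, 3.4, 2.2, 0.7],                # continuous
    "label": [0, 1, 1, 0],
})

# One-hot encode "color" into 0/1 dummy columns so XGBoost can split on them.
X = pd.get_dummies(df[["color", "size"]], columns=["color"], dtype=int)
y = df["label"]

model = xgb.XGBClassifier(objective="binary:logistic", n_estimators=10)
model.fit(X, y)
print(model.predict(X))
```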

I think that when using the xgboost classification algorithm, the regression tree variant of CART is always used rather than the classification tree. Am I right?

In XGBoost, the trees always produce a continuous output, even for classification tasks: with binary:logistic, each tree's leaves hold real-valued scores that are summed across trees and passed through the logistic (sigmoid) function to yield a probability. So you are correct, assuming that "regression trees" refers to trees with continuous outputs.
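A short sketch (again on made-up toy data) that checks this relationship: the raw margin returned with output_margin=True is the sum of the continuous leaf values (plus the base score), and applying the sigmoid to it recovers the predicted probability.

```python
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
booster = xgb.train(
    {"objective": "binary:logistic", "max_depth": 2},
    xgb.DMatrix(X, label=y),
    num_boost_round=3,
)

dmat = xgb.DMatrix(X[:3])
margin = booster.predict(dmat, output_margin=True)  # continuous raw scores
prob = booster.predict(dmat)                        # probabilities

# p = 1 / (1 + exp(-margin)) should match the probability output.
print(np.allclose(prob, 1.0 / (1.0 + np.exp(-margin))))  # expected: True
```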