XGBoost returned objective value (R)

My question is whether the training loss values returned by the R package xgboost (version 1.2.0.1) include the regularization terms. Specifically, in the following example, does train-logloss include the regularization terms associated with gamma, lambda, and alpha? If not, how do I obtain the objective value including regularization? Could someone point me to the relevant code in the R package xgboost? Thanks!

library("xgboost")
x <- matrix(rnorm(100 * 2), 100, 2)
g2 <- sample(c(0, 1), 100, replace = TRUE)
fit2 <- xgboost(x, g2, nrounds = 5, objective = "binary:logistic",
                eval_metric = "logloss", gamma = 1, lambda = 1, alpha = 1)
 [1] train-logloss:0.646722
 [2] train-logloss:0.591888
 [3] train-logloss:0.530950
 [4] train-logloss:0.520571
 [5] train-logloss:0.491039

Here is the implementation of the logloss metric:

It does not contain regularization terms. The regularization terms only affect how the leaf-node outputs are computed during training. Once the leaf outputs are determined, the metric is the average of -y * log(py) - (1 - y) * log(1 - py) over the training instances, where py is the predicted probability obtained by applying the logistic transform to the summed leaf outputs.
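This can be checked empirically: recompute the logloss by hand from the model's predictions and compare it with the value xgboost logged. A minimal sketch in R, assuming the same setup as the question (a seed is added here for reproducibility, so the numbers will differ from those shown above):

```r
library(xgboost)

set.seed(1)  # assumption: any seed; values will not match the question's log
x  <- matrix(rnorm(100 * 2), 100, 2)
g2 <- sample(c(0, 1), 100, replace = TRUE)

fit2 <- xgboost(x, g2, nrounds = 5, objective = "binary:logistic",
                eval_metric = "logloss", gamma = 1, lambda = 1, alpha = 1,
                verbose = 0)

# Predicted probabilities after the final round
py <- predict(fit2, x)

# Plain mean logloss, with no regularization term added
manual <- mean(-g2 * log(py) - (1 - g2) * log(1 - py))

# fit2$evaluation_log holds the train-logloss values printed during training;
# the hand-computed value should match the round-5 entry
logged <- fit2$evaluation_log$train_logloss[5]
all.equal(manual, logged, tolerance = 1e-5)
```

If the reported metric included the gamma/lambda/alpha penalties, the two values would disagree; their agreement confirms that train-logloss is the unpenalized data-fit term only. To obtain the regularized objective, you would have to add the penalty terms yourself from the fitted trees (e.g. via `xgb.model.dt.tree()`), since the package does not report them.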