GridSearchCV with XGBRegressor: scoring vs. objective

I am trying to use XGBRegressor with scikit-learn's GridSearchCV for a regression problem, so I set the objective of XGBRegressor to 'reg:linear'. To my knowledge, XGBoost then tries to maximise the explained variance for its linear models. Is that correct? Does XGBoost provide other metrics, such as minimising the RMSE?
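For example, as far as I can tell from the docs, a different metric such as RMSE can at least be monitored explicitly via eval_metric. A minimal sketch on synthetic data (I'm assuming the fit-time eval_metric parameter of the sklearn wrapper here; in some xgboost versions it is passed to the constructor instead):

    from sklearn.datasets import make_regression
    from sklearn.model_selection import train_test_split
    from xgboost import XGBRegressor

    X, y = make_regression(n_samples=200, n_features=5, random_state=0)
    X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

    # My understanding: 'objective' sets the loss that is optimised,
    # while eval_metric only controls what is reported on eval_set.
    model = XGBRegressor(objective='reg:linear')
    model.fit(X_train, y_train,
              eval_set=[(X_val, y_val)],
              eval_metric='rmse',
              verbose=False)

Is that the right way to think about the relationship between the two?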

GridSearchCV has a parameter called scoring that can be specified in order to rank the models. According to GridSearchCV's documentation:

If scoring is not specified, the estimator’s default scorer (if available) is used.

In the case of XGBRegressor, would that default scorer be the explained variance?
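One quick check I came up with (a sketch on synthetic data, assuming XGBRegressor follows the usual scikit-learn convention where the estimator's .score method acts as the default scorer):

    from sklearn.datasets import make_regression
    from sklearn.metrics import r2_score
    from xgboost import XGBRegressor

    X, y = make_regression(n_samples=200, n_features=5, random_state=0)
    model = XGBRegressor(objective='reg:linear').fit(X, y)

    # If the default scorer is R^2 (the standard sklearn regressor default),
    # these two numbers should agree:
    print(model.score(X, y))
    print(r2_score(y, model.predict(X)))

I suspect the default is R^2, which is related to, but not exactly the same as, explained variance; I would appreciate confirmation.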

And if I manually set scoring to 'neg_mean_squared_error', how will that interact with XGBoost's default metric? Should the two be the same (i.e. should I leave GridSearchCV's scoring parameter empty so that it falls back on the estimator's default)?

If it helps, here is an example of how I use XGBRegressor with GridSearchCV:

    from xgboost import XGBRegressor
    from sklearn.model_selection import GridSearchCV

    xgb1 = XGBRegressor()
    parameters = {
        'objective': ['reg:linear'],
        'learning_rate': [0.045, 0.05, 0.06],
        'max_depth': [3, 4, 5],
        'min_child_weight': [2, 3, 4],
        'silent': [1],
        'subsample': [0.5, 0.55, 0.6],
        'colsample_bytree': [0.7, 0.8, 0.85],
        'n_estimators': [650, 750, 800],
    }

    xgb_grid = GridSearchCV(xgb1,
                            parameters,
                            scoring='neg_mean_squared_error',  # comment this line out to use the estimator's default scorer
                            cv=5,
                            n_jobs=5,
                            verbose=True)

    # a is my feature matrix, b my target vector
    xgb_grid.fit(a, b)
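
After fitting, I inspect the results like this (with scoring='neg_mean_squared_error', best_score_ is a negated MSE, so values closer to zero are better):

    print(xgb_grid.best_score_)   # negated MSE under the scoring above
    print(xgb_grid.best_params_)  # parameter combination that won the search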