Hi. I was waiting for feature_weights, and XGBoost now has it. However, I haven't seen a tutorial/demo for it yet. I tried it myself; it does change something, but when I plot the trees I see it doesn't work as I expected.

> **feature_weights** (*array_like*) – Weight for each feature, defines the probability of each feature being selected when colsample is being used. All values must be greater than 0, otherwise a ValueError is thrown. Only available for hist, gpu_hist and exact tree methods.
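From that description, my mental model is that the weights get normalized into per-feature selection probabilities, used while colsample is active. A minimal numpy sketch of the behaviour I expect (the normalization and sampling below are my assumption about the semantics, not XGBoost's actual code):

```python
import numpy as np

# My assumption: feature_weights are normalized into per-feature
# selection probabilities, used only when colsample_* is in effect.
feature_weights = np.array([0.8, 0.0, 0.19], dtype=np.float32)

p = feature_weights.astype(np.float64)
p /= p.sum()  # normalize to probabilities

rng = np.random.default_rng(0)
# Simulate 1000 single-column draws, the way colsample would pick columns.
draws = rng.choice(len(p), size=1000, p=p)

# Under this model, a feature with weight 0 is never sampled...
assert (draws != 1).all()
```

...which is why I expected the zero-weight feature to disappear from the trees.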

I've tried changing tree_method to see a difference, but the results don't change.

I have 3 features.

1)

When I define feature_weights in that way:

```python
feature_weights = np.array([0.8, 0, 0.19]).astype(np.float32)

bst = xgb.XGBRegressor(**param, tree_method="exact")
bst.fit(X_train, y_train, feature_weights=feature_weights, eval_set=[(X_valid, y_valid)])
```

I expect not to see the second feature when I plot the trees, but I still see it.

2)

When I define feature_weights in that way:

```python
feature_weights = np.array([0.8, 0, 0.2]).astype(np.float32)

bst = xgb.XGBRegressor(**param, tree_method="exact")
bst.fit(X_train, y_train, feature_weights=feature_weights, eval_set=[(X_valid, y_valid)])
```

It gives a constant result; from this it looks to me like the sum of the weights has to be less than one.

3)

When I define feature_weights in that way by increasing my number of features to 5:

```python
feature_weights = np.array([0, 0, 0, 0, 0.999]).astype(np.float32)

bst = xgb.XGBRegressor(**param, tree_method="exact")
bst.fit(X_train, y_train, feature_weights=feature_weights, eval_set=[(X_valid, y_valid)])
```

In the tree plots I see only the first features, even though I set their weights to zero.

4)

When I define feature_weights in that way by increasing my number of features to 5:

```python
feature_weights = np.array([0, 0, 0, 0, 1]).astype(np.float32)

bst = xgb.XGBRegressor(**param, tree_method="exact")
bst.fit(X_train, y_train, feature_weights=feature_weights, eval_set=[(X_valid, y_valid)])
```

Again I see only the first features in the plots, and the predictions become constantly zero again.

What should I do? At the very least, I think we need a tutorial.