Monotone_constraints does not work

I’m looking for a linear classifier that can constrain its coefficients to be non-negative, and I arrived at XGBoost.

Following tutorials and some search results, I wrote the code below.
However, it still produces negative coefficients.

What’s wrong here?

import xgboost as xgb

import numpy as np
from sklearn.model_selection import KFold
from sklearn.metrics import confusion_matrix
from sklearn.datasets import load_iris

iris = load_iris()
X = iris['data']
y = iris['target']

assert np.min(X) >= 0
assert np.min(y) >= 0

kf = KFold(n_splits=2, shuffle=True, random_state=42)
for train_index, test_index in kf.split(X):
	xgb_model = xgb.XGBClassifier(booster='gblinear', monotone_constraints="(1,1,1,1)").fit(X[train_index], y[train_index])
	predictions = xgb_model.predict(X[test_index])
	actuals = y[test_index]
	print("\nConfusion matrix ---------------")
	print(confusion_matrix(actuals, predictions))
	print("\nCoefficients -------------------")
	print(xgb_model.coef_)

Output:

Confusion matrix ---------------
[[29  0  0]
 [ 0  7 16]
 [ 0  0 23]]

Coefficients -------------------
[[-0.0196637   0.0086542   0.00804164  0.120476  ]
 [-0.0631702  -0.0526377  -0.298369    0.057412  ]
 [ 0.188522   -0.146681    0.00242743  0.143865  ]]

Confusion matrix ---------------
[[21  0  0]
 [ 1  4 22]
 [ 0  0 27]]

Coefficients -------------------
[[-0.00676357 -0.00036048  0.00594936  0.148731  ]
 [-0.0714969  -0.077248   -0.312422    0.0771748 ]
 [ 0.201741   -0.155078    0.0128066   0.142495  ]]

Sorry, monotone constraints are only available for tree boosters (gbtree and dart); the gblinear booster does not apply the monotone_constraints parameter, so its coefficients remain unconstrained.