Get access to trained booster parameters

Hi, here is my code to train a model:

import xgboost as xgb

params = {"max_depth": 10, "eta": 0.05, "objective": "binary:logistic", "subsample": 1}
xgb_model = xgb.train(params=params, dtrain=dtrain, num_boost_round=8)

How can I use the xgb_model variable to access the values of "max_depth", "objective", etc.?
I'm asking because I'm working on a visualization library, https://github.com/parrt/dtreeviz, and I didn't find a way to get the parameter values from the trained model (not from my predefined 'params' variable).
I've looked through the API and haven't found a solution.

The dtreeviz library will contain very cool visualizations for XGBoost :slight_smile:

Thanks.

@tlapusan Thanks for putting XGBoost on dtreeviz. I wanted to do it but could not find the time.

You can call the save_config() method to obtain the Booster parameters:

import json

bst = xgb.train(...)
config = json.loads(bst.save_config())   # save_config() returns the parameters as a JSON string
print(config)
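
If you need a specific value such as "max_depth", note that the exact nesting inside that config dict depends on the booster and updater in use, so a small recursive lookup is a safe way to dig it out. This is only a sketch continuing from the config dict parsed above; find_param is a hypothetical helper, not part of the XGBoost API:

def find_param(node, name):
    # Recursively search the parsed save_config() output for a parameter name
    if isinstance(node, dict):
        if name in node:
            return node[name]
        for value in node.values():
            found = find_param(value, name)
            if found is not None:
                return found
    elif isinstance(node, list):
        for item in node:
            found = find_param(item, name)
            if found is not None:
                return found
    return None

print(find_param(config, "max_depth"))    # values are typically stored as strings, e.g. '10'
print(find_param(config, "objective"))    # e.g. 'binary:logistic'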

Also make sure to use the JSON format so that you can parse it easily:

bst.save_model('my_model.json')   # Notice the use of JSON extension
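
For completeness, here is a rough sketch of the round trip: the saved JSON file can be loaded into a fresh Booster, and save_config() works the same way on the restored model (variable names below are mine):

import json
import xgboost as xgb

# Load the model back from the JSON file saved above
bst2 = xgb.Booster()
bst2.load_model('my_model.json')

# The restored booster exposes its parameters through save_config() as well
restored_config = json.loads(bst2.save_config())
print(list(restored_config.keys()))   # top-level sections of the configuration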

@hcho3 thanks a lot.
I’ve tried your suggestion and it works.

Hi @hcho3, we've just released the XGBoost implementation in the dtreeviz library: https://github.com/parrt/dtreeviz.

People can try it with pip: 'pip install -U dtreeviz'.
If you have time to try the new visualizations for XGBoost, I would be happy to hear your feedback.

Thanks for your support!
