How to measure how much each feature contributes to each individual prediction?

Most discussions focus on feature importance aggregated over all trees in XGBoost.

But I would like to know how features contribute to each individual output. Since each observation may follow a different path through the trees, different observations arrive at their outputs via different features, and each feature may contribute more or less to a given output.

For regression, I imagine XGBoost might measure the contribution with something like R-squared, but I'm not sure that's correct. And I have no idea what it would look like for classification in XGBoost.

Does anyone have any ideas? Thank you.

Have you looked into using SHAP? https://github.com/slundberg/shap

That seems to be another, more general way to measure contributions.
I'll take a look at SHAP! Thank you for your help.

Also, given how the trees are constructed, I thought there might be some way to measure contributions directly from values like "Gain" or "Gini" computed at the splits?
Or do people mostly use methods like LIME and SHAP for this?