Unifying local and global model explanations by functional decomposition of low dimensional structures

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Standard

Unifying local and global model explanations by functional decomposition of low dimensional structures. / Hiabu, Munir; Meyer, Joseph T.; Wright, Marvin N.

Proceedings of The 26th International Conference on Artificial Intelligence and Statistics. PMLR, 2023. pp. 7040-7060 (Proceedings of Machine Learning Research, Vol. 206).

Harvard

Hiabu, M, Meyer, JT & Wright, MN 2023, Unifying local and global model explanations by functional decomposition of low dimensional structures. in Proceedings of The 26th International Conference on Artificial Intelligence and Statistics. PMLR, Proceedings of Machine Learning Research, vol. 206, pp. 7040-7060, 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023, Valencia, Spain, 25/04/2023. <https://proceedings.mlr.press/v206/hiabu23a.html>

APA

Hiabu, M., Meyer, J. T., & Wright, M. N. (2023). Unifying local and global model explanations by functional decomposition of low dimensional structures. In Proceedings of The 26th International Conference on Artificial Intelligence and Statistics (pp. 7040-7060). PMLR. Proceedings of Machine Learning Research Vol. 206 https://proceedings.mlr.press/v206/hiabu23a.html

Vancouver

Hiabu M, Meyer JT, Wright MN. Unifying local and global model explanations by functional decomposition of low dimensional structures. In Proceedings of The 26th International Conference on Artificial Intelligence and Statistics. PMLR. 2023. p. 7040-7060. (Proceedings of Machine Learning Research, Vol. 206).

Author

Hiabu, Munir ; Meyer, Joseph T. ; Wright, Marvin N. / Unifying local and global model explanations by functional decomposition of low dimensional structures. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics. PMLR, 2023. pp. 7040-7060 (Proceedings of Machine Learning Research, Vol. 206).

BibTeX

@inproceedings{5a2c4a25b1754d8e961e3184ba55081a,
title = "Unifying local and global model explanations by functional decomposition of low dimensional structures",
abstract = "We consider a global representation of a regression or classification function by decomposing it into the sum of main and interaction components of arbitrary order. We propose a new identification constraint that allows for the extraction of interventional SHAP values and partial dependence plots, thereby unifying local and global explanations. With our proposed identification, a feature's partial dependence plot corresponds to the main effect term plus the intercept. The interventional SHAP value of feature k is a weighted sum of the main component and all interaction components that include k, with the weights given by the reciprocal of the component's dimension. This brings a new perspective to local explanations such as SHAP values which were previously motivated by game theory only. We show that the decomposition can be used to reduce direct and indirect bias by removing all components that include a protected feature. Lastly, we motivate a new measure of feature importance. In principle, our proposed functional decomposition can be applied to any machine learning model, but exact calculation is only feasible for low-dimensional structures or ensembles of those. We provide an algorithm and efficient implementation for gradient-boosted trees (xgboost) and random planted forest. Conducted experiments suggest that our method provides meaningful explanations and reveals interactions of higher orders. The proposed methods are implemented in an R package, available at https://github.com/PlantedML/glex.",
author = "Munir Hiabu and Meyer, {Joseph T.} and Wright, {Marvin N.}",
note = "Publisher Copyright: Copyright {\textcopyright} 2023 by the author(s); 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 ; Conference date: 25-04-2023 Through 27-04-2023",
year = "2023",
language = "English",
series = "Proceedings of Machine Learning Research",
pages = "7040--7060",
booktitle = "Proceedings of The 26th International Conference on Artificial Intelligence and Statistics",
publisher = "PMLR",

}
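
The abstract above states the central identities only in prose. A minimal sketch in LaTeX notation, reconstructed from that wording (the component symbols m_S, the feature index set {1, ..., p}, and the SHAP symbol \phi_k are assumed notation for this sketch, not taken from the paper):

\[
\hat{m}(x) = \sum_{S \subseteq \{1,\dots,p\}} m_S(x_S)
\quad \text{(intercept } m_\emptyset \text{, main effects } m_{\{k\}} \text{, interactions for } |S| \ge 2 \text{)}
\]
\[
\mathrm{PDP}_k(x_k) = m_\emptyset + m_{\{k\}}(x_k),
\qquad
\phi_k(x) = \sum_{S \ni k} \frac{1}{|S|}\, m_S(x_S)
\]

Read this way, each component containing feature k contributes its value divided by the number of features it involves, matching the "weights given by the reciprocal of the component's dimension" in the abstract; dropping every m_S whose index set S contains a protected feature gives the bias-reduced predictor the abstract describes.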

RIS

TY - GEN

T1 - Unifying local and global model explanations by functional decomposition of low dimensional structures

AU - Hiabu, Munir

AU - Meyer, Joseph T.

AU - Wright, Marvin N.

N1 - Publisher Copyright: Copyright © 2023 by the author(s)

PY - 2023

Y1 - 2023

N2 - We consider a global representation of a regression or classification function by decomposing it into the sum of main and interaction components of arbitrary order. We propose a new identification constraint that allows for the extraction of interventional SHAP values and partial dependence plots, thereby unifying local and global explanations. With our proposed identification, a feature's partial dependence plot corresponds to the main effect term plus the intercept. The interventional SHAP value of feature k is a weighted sum of the main component and all interaction components that include k, with the weights given by the reciprocal of the component's dimension. This brings a new perspective to local explanations such as SHAP values which were previously motivated by game theory only. We show that the decomposition can be used to reduce direct and indirect bias by removing all components that include a protected feature. Lastly, we motivate a new measure of feature importance. In principle, our proposed functional decomposition can be applied to any machine learning model, but exact calculation is only feasible for low-dimensional structures or ensembles of those. We provide an algorithm and efficient implementation for gradient-boosted trees (xgboost) and random planted forest. Conducted experiments suggest that our method provides meaningful explanations and reveals interactions of higher orders. The proposed methods are implemented in an R package, available at https://github.com/PlantedML/glex.

AB - We consider a global representation of a regression or classification function by decomposing it into the sum of main and interaction components of arbitrary order. We propose a new identification constraint that allows for the extraction of interventional SHAP values and partial dependence plots, thereby unifying local and global explanations. With our proposed identification, a feature's partial dependence plot corresponds to the main effect term plus the intercept. The interventional SHAP value of feature k is a weighted sum of the main component and all interaction components that include k, with the weights given by the reciprocal of the component's dimension. This brings a new perspective to local explanations such as SHAP values which were previously motivated by game theory only. We show that the decomposition can be used to reduce direct and indirect bias by removing all components that include a protected feature. Lastly, we motivate a new measure of feature importance. In principle, our proposed functional decomposition can be applied to any machine learning model, but exact calculation is only feasible for low-dimensional structures or ensembles of those. We provide an algorithm and efficient implementation for gradient-boosted trees (xgboost) and random planted forest. Conducted experiments suggest that our method provides meaningful explanations and reveals interactions of higher orders. The proposed methods are implemented in an R package, available at https://github.com/PlantedML/glex.

M3 - Article in proceedings

AN - SCOPUS:85165133091

T3 - Proceedings of Machine Learning Research

SP - 7040

EP - 7060

BT - Proceedings of The 26th International Conference on Artificial Intelligence and Statistics

PB - PMLR

T2 - 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023

Y2 - 25 April 2023 through 27 April 2023

ER -

ID: 382991169