Provides SHAP explanations of machine learning models. In applied machine learning, there is a widespread belief that a balance must be struck between interpretability and accuracy. However, in the field of Interpretable Machine Learning, there is a growing number of methods for explaining black-box models. One of the best-known methods for local explanations is SHapley Additive exPlanations (SHAP), introduced by Lundberg and Lee (2017) <doi:10.48550/arXiv.1705.07874>. The SHAP method calculates the influence of each variable on a particular observation. It is based on Shapley values, a concept from cooperative game theory. The R package 'shapper' is a port of the Python library 'shap'.
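A minimal sketch of the intended workflow, assuming the individual_variable_effect() interface described in the package vignettes, a DALEX explainer, and the 'apartments' dataset shipped with DALEX; the randomForest model is an illustrative stand-in for any black-box model, and the Python 'shap' library must be available (shapper provides install_shap() for that):

```r
library(shapper)
library(DALEX)
library(randomForest)

# Fit any black-box model -- here a random forest predicting m2.price
# on the apartments data from DALEX.
model <- randomForest(m2.price ~ ., data = apartments)

# Wrap the model in a DALEX explainer.
explainer <- DALEX::explain(model,
                            data = apartments[, -1],
                            y = apartments$m2.price)

# Compute Shapley-based attributions (SHAP values) for one observation.
new_obs <- apartments[1, -1]
ive <- individual_variable_effect(explainer, new_observation = new_obs)

# Plot the per-variable contributions for that observation.
plot(ive)
```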
Version: | 0.1.3 |
Imports: | reticulate, DALEX, ggplot2 |
Suggests: | covr, knitr, randomForest, rpart, testthat, markdown, qpdf |
Published: | 2020-08-28 |
DOI: | 10.32614/CRAN.package.shapper |
Author: | Szymon Maksymiuk [aut, cre], Alicja Gosiewska [aut], Przemyslaw Biecek [aut], Mateusz Staniak [ctb], Michal Burdukiewicz [ctb] |
Maintainer: | Szymon Maksymiuk <sz.maksymiuk at gmail.com> |
BugReports: | https://github.com/ModelOriented/shapper/issues |
License: | GPL-2 | GPL-3 [expanded from: GPL] |
URL: | https://github.com/ModelOriented/shapper |
NeedsCompilation: | no |
Materials: | NEWS |
In views: | MachineLearning |
CRAN checks: | shapper results |
Reference manual: | shapper.pdf |
Vignettes: | How to use shapper for classification, How to use shapper for regression |
Package source: | shapper_0.1.3.tar.gz |
Windows binaries: | r-devel: shapper_0.1.3.zip, r-release: shapper_0.1.3.zip, r-oldrel: shapper_0.1.3.zip |
macOS binaries: | r-release (arm64): shapper_0.1.3.tgz, r-oldrel (arm64): shapper_0.1.3.tgz, r-release (x86_64): shapper_0.1.3.tgz, r-oldrel (x86_64): shapper_0.1.3.tgz |
Old sources: | shapper archive |
Please use the canonical form https://CRAN.R-project.org/package=shapper to link to this page.