---
output: github_document
---
```{r, include = FALSE}
knitr::opts_chunk$set(
collapse = TRUE,
comment = "#>",
fig.path = "man/figures/README-",
out.width = "100%"
)
```
# **innsight** - Get the insights of your neural network
<a href='https://bips-hb.github.io/innsight/'><img src='man/figures/logo.png' align="right" width="200" /></a>
<!-- badges: start -->
[![R-CMD-check](https://github.com/bips-hb/innsight/actions/workflows/R-CMD-check.yaml/badge.svg)](https://github.com/bips-hb/innsight/actions/workflows/R-CMD-check.yaml)
[![CRAN status](https://www.r-pkg.org/badges/version/innsight)](https://CRAN.R-project.org/package=innsight)
[![Lifecycle: experimental](https://img.shields.io/badge/lifecycle-experimental-orange.svg)](https://lifecycle.r-lib.org/articles/stages.html#experimental)
[![Codecov test coverage](https://codecov.io/gh/bips-hb/innsight/branch/master/graph/badge.svg)](https://app.codecov.io/gh/bips-hb/innsight?branch=master)
<!-- badges: end -->
## Table of contents
* [Introduction](#introduction)
* [Installation](#installation)
* [Usage](#usage)
* [Examples](#examples)
* [Contributing and future work](#contributing-and-future-work)
* [Citation](#citation)
* [Funding](#funding)
## Introduction
**innsight** is an R package that interprets the behavior of modern neural
networks and explains their individual predictions. Many methods for explaining
individual predictions already exist, but hardly any of them are available in
R. Most of these so-called *feature attribution* methods are implemented only
in Python and are therefore difficult for the R community to access or use. The
package **innsight** closes this gap by providing a common interface to various
methods for the interpretability of neural networks and can therefore be
considered an R analogue to
[iNNvestigate](https://github.com/albermax/innvestigate) or
[Captum](https://captum.ai/) for Python.
This package implements several model-specific
interpretability (feature attribution) methods for neural networks in R, e.g.,
* Layer-wise Relevance Propagation ([LRP](https://doi.org/10.1371/journal.pone.0130140))
* Including propagation rules: $\varepsilon$-rule and $\alpha$-$\beta$-rule
* Deep Learning Important Features ([DeepLift](https://arxiv.org/abs/1704.02685))
* Including propagation rules for non-linearities: Rescale rule and RevealCancel rule
* [DeepSHAP](https://proceedings.neurips.cc/paper/2017/hash/8a20a8621978632d76c43dfd28b67767-Abstract.html)
* Gradient-based methods:
* Vanilla Gradient, including [Gradient x Input](https://www.jmlr.org/papers/v11/baehrens10a.html)
* Smoothed gradients ([SmoothGrad](https://arxiv.org/abs/1706.03825)), including
SmoothGrad x Input
* [Integrated gradients](https://arxiv.org/abs/1703.01365)
* [Expected gradients](https://doi.org/10.1038/s42256-021-00343-w)
* Connection Weights
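Taking LRP as an example, the propagation rule and its parameter can typically
be selected when applying the method. The following is a minimal sketch only:
the function `run_lrp` and the arguments `rule_name` and `rule_param` are
assumptions based on the package's `run_*` interface, and `converter` and
`data` stand for an already converted model and prepared input data (see the
Usage section below).

```{r, eval = FALSE}
# Assumed sketch: apply LRP with the alpha-beta rule (here alpha = 2)
# 'converter' and 'data' are placeholders for a converted model and input data
result <- run_lrp(converter, data,
                  rule_name = "alpha_beta",
                  rule_param = 2)
```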
Example results for these methods on ImageNet with the pretrained VGG19
network (see
[Example 3: ImageNet with
**keras**](https://bips-hb.github.io/innsight/articles/Example_3_imagenet.html)
for details):
![vgg19](https://github.com/bips-hb/innsight/blob/master/man/images/Vgg19_result.png?raw=true)
The package **innsight** aims to be as flexible as possible and independent of
the specific deep learning package in which the passed network was trained.
Neural networks created with the libraries
[**torch**](https://torch.mlverse.org/),
[**keras**](https://tensorflow.rstudio.com/) and
[**neuralnet**](https://CRAN.R-project.org/package=neuralnet) can be passed
directly and are internally converted into a **torch** model carrying the
additional information needed for interpretation. It is also possible to pass
an arbitrary network in the form of a named list (see the
[vignette](https://bips-hb.github.io/innsight/articles/detailed_overview.html#model-as-named-list)
for details).
## Installation
The stable version can be installed directly from CRAN and the development
version from GitHub with the following commands (the development version
requires a working installation of [`devtools`](https://devtools.r-lib.org/)):
```{r, eval = FALSE}
# Stable version
install.packages("innsight")
# Development version
devtools::install_github("bips-hb/innsight")
```
Internally, any passed model is converted to a **torch** model, so the correct
functioning of this package relies on a complete and working installation
of **torch**. For this reason, the following command must be run manually to
install the required backend libraries LibTorch and LibLantern:
```{r, eval = FALSE}
torch::install_torch()
```
> **`r knitr::asis_output("\U1F4DD")` Note**
> Currently, this can lead to problems on Windows if the Visual
> Studio runtime is not pre-installed; see
> [this GitHub issue](https://github.com/mlverse/torch/issues/246#issuecomment-695097121).
> For more information and other problems with installing **torch**, see the
> official installation
> [vignette](https://CRAN.R-project.org/package=torch/vignettes/installation.html)
> of **torch**.
## Usage
Suppose you have a trained neural network `model` and model input data `data`,
and you want to interpret individual data points or the overall behavior using
the methods from the package **innsight**. Then stick to the following
pseudo code:
```{r, eval=FALSE}
# --------------- Step 0: Train your model -----------------
# 'model' has to be an instance of either torch::nn_sequential,
# keras::keras_model_sequential, keras::keras_model or neuralnet::neuralnet
model <- ...
# -------------- Step 1: Convert your model ----------------
# For keras and neuralnet
converter <- convert(model)
# For a torch model the argument 'input_dim' is required
converter <- convert(model, input_dim = model_input_dim)
# -------------- Step 2: Apply method ----------------------
# Apply global method
result <- run_method(converter) # no data argument is needed
# Apply local methods
result <- run_method(converter, data)
# -------------- Step 3: Get and plot results --------------
# Get the results as an array
res <- get_result(result)
# Plot individual results
plot(result)
# Plot an aggregated summary of all data points given in argument 'data'
plot_global(result)
boxplot(result) # alias of `plot_global` for tabular and signal data
# Interactive plots can also be created for both methods
plot(result, as_plotly = TRUE)
```
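As a concrete illustration of the steps above, the pseudo code might look like
this for a small **neuralnet** model on the iris data. This is a minimal
sketch: the method function `run_grad` and its argument `times_input` are
assumptions based on the package's `run_*` interface and are not part of the
pseudo code above.

```{r, eval = FALSE}
library(innsight)
library(neuralnet)

# Step 0: Train a small model on the iris data
model <- neuralnet(Species ~ Petal.Length + Petal.Width,
                   data = iris, hidden = 3, linear.output = FALSE)

# Step 1: Convert the model
converter <- convert(model)

# Step 2: Apply a local method, e.g., Gradient x Input (assumed interface)
data <- iris[1:5, c("Petal.Length", "Petal.Width")]
result <- run_grad(converter, data, times_input = TRUE)

# Step 3: Get and plot the results
res <- get_result(result)
plot(result)
```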
For a more detailed high-level introduction, see the
[introduction](https://bips-hb.github.io/innsight/articles/innsight.html)
vignette, and for a full in-depth explanation with all the possibilities, see
the ["In-depth explanation"](https://bips-hb.github.io/innsight/articles/detailed_overview.html)
vignette.
## Examples
- Iris dataset with **torch** model (numeric tabular data)
[→ vignette](https://bips-hb.github.io/innsight/articles/Example_1_iris.html)
- Penguin dataset with **torch** model and trained with **luz** (numeric and categorical tabular data)
[→ vignette](https://bips-hb.github.io/innsight/articles/Example_2_penguin.html)
- ImageNet dataset with pre-trained models in **keras** (image data)
[→ article](https://bips-hb.github.io/innsight/articles/Example_3_imagenet.html)
## Contributing and future work
If you would like to contribute, please open an issue or submit a pull request.
This package becomes even more valuable when people use it for
their analyses. So don't hesitate to write to me (<[email protected]>)
or to create a feature request if you are missing something for your analyses
or have great ideas for extending this package. Currently, we are working on:
- [ ] GPU support
- [ ] More methods, e.g., Grad-CAM
- [ ] More examples and documentation (contact me if you have a non-trivial
application)
## Citation
If you use this package in your research, please cite it as follows:
```{}
@Article{,
title = {Interpreting Deep Neural Networks with the Package {innsight}},
author = {Niklas Koenen and Marvin N. Wright},
journal = {Journal of Statistical Software},
year = {2024},
volume = {111},
number = {8},
pages = {1--52},
doi = {10.18637/jss.v111.i08},
}
```
## Funding
This work is funded by the German Research Foundation (DFG) in the context
of the Emmy Noether Grant 437611051.