How to calculate SHAP values for simple MLP Neural Network? #1304
Comments
any update on this one
any update, please?
In my case, I'm not even able to run KernelExplainer in a Jupyter notebook; it crashes after using all the RAM. I'm using an MLP + TF-IDF on 3000 samples with a ~9000-word vocabulary. What do you suggest?
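KernelExplainer's memory and runtime grow with the size of the background dataset it is given times the number of coalition samples, so passing all 3000 TF-IDF rows as background is the likely cause of the RAM blow-up. A common mitigation is to summarize the background first (shap provides `shap.sample(X, k)` and `shap.kmeans(X, k)` for exactly this). A minimal sketch of the subsampling step in plain numpy, with the shapes from the comment above assumed as placeholders:

```python
import numpy as np

# Hypothetical data matching the comment above: 3000 docs x ~9000-term TF-IDF.
rng = np.random.default_rng(0)
X = rng.random((3000, 9000)).astype(np.float32)

# Pass only a small summary of the data as the explainer's background set;
# a random subsample is the simplest option (shap.kmeans gives a weighted one).
background = X[rng.choice(len(X), size=50, replace=False)]

print(background.shape)  # a 50-row background instead of 3000 rows
```

The explainer is then constructed as `shap.KernelExplainer(model.predict, background)` rather than with the full matrix, which cuts both memory use and the number of model evaluations per explained row.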
This issue has been inactive for two years, so it's been automatically marked as 'stale'. We value your input! If this issue is still relevant, please leave a comment below. This will remove the 'stale' label and keep it open. If there's no activity in the next 90 days the issue will be closed.
This issue has been automatically closed due to lack of recent activity. Your input is important to us! Please feel free to open a new issue if the problem persists or becomes relevant again.
I'm building an AutoML package where I provide SHAP explanations for different models. I have a problem with SHAP explanations for neural networks (issue). The NN I construct is a simple multi-layer perceptron with 2 hidden layers. Computing SHAP values takes a very long time (several hours) even for very small datasets.
Below is a simple example:
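The original snippet was not captured in this page, so the following is a hedged stand-in for the kind of setup described: a 2-hidden-layer MLP on a small synthetic regression dataset. All dataset and hyperparameter choices here are assumptions, not the author's code.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Small synthetic dataset standing in for the author's "very small dataset".
rng = np.random.default_rng(0)
X = rng.random((300, 10))
y = X @ rng.random(10)

# A simple MLP with 2 hidden layers, as described in the issue body.
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
model.fit(X, y)

# The slow step reported in the issue (hours even for small data):
# import shap
# explainer = shap.KernelExplainer(model.predict, X)
# shap_values = explainer.shap_values(X)  # one model call per coalition sample per row

print(model.score(X, y))
```

The KernelExplainer calls are left commented because they are the expensive part; the point is that the model itself trains in seconds, so the bottleneck is the explainer, not the MLP.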
On this simple dataset, computing SHAP values takes > 8 hours. Is there a faster way to compute the SHAP values? For the other algorithms (Xgboost, CatBoost, Extra Trees, LightGBM, Random Forest, Linear Regression) all computations finish in under 1 minute.
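The tree-based models are fast because shap has a model-specific TreeExplainer for them; the MLP falls back to the model-agnostic KernelExplainer, which estimates Shapley values by sampling feature coalitions and needs many model evaluations per explained row. For TensorFlow/Keras networks, shap's DeepExplainer or GradientExplainer are the usual much faster alternatives. To illustrate the sampling idea itself (this is not shap's actual implementation), here is a minimal Monte Carlo Shapley estimator in plain numpy, averaging marginal contributions over random feature orderings:

```python
import numpy as np

def sampling_shap(f, x, background, n_permutations=500, seed=None):
    """Monte Carlo Shapley estimate for a single row x: for each random
    ordering, reveal features of x one by one on top of a background row
    and credit each feature with the change in the model output."""
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    phi = np.zeros(d)
    for _ in range(n_permutations):
        order = rng.permutation(d)
        z = background[rng.integers(len(background))].copy()
        prev = f(z)
        for j in order:
            z[j] = x[j]          # reveal feature j of the explained row
            cur = f(z)
            phi[j] += cur - prev  # marginal contribution of feature j
            prev = cur
    return phi / n_permutations

# Toy linear model: exact Shapley values are w * (x - mean(background)).
w = np.array([1.0, -2.0, 0.5])
f = lambda z: float(z @ w)
background = np.zeros((10, 3))
x = np.array([1.0, 1.0, 1.0])
phi = sampling_shap(f, x, background, seed=0)
print(phi)  # recovers [1.0, -2.0, 0.5] exactly for this linear model
```

The cost is linear in the number of permutations rather than exponential in the number of features, which is the trade-off KernelExplainer's `nsamples` parameter controls: fewer samples and a smaller background set make it faster at the price of noisier estimates.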