shap : SystemError: initialization of _internal failed without raising an exception

Question:

I am using an SVC to predict a target, and I am trying to use shap to get feature importances, but it fails.

Here is my simple code, copied from the official shap documentation:

import shap
from sklearn.svm import SVC

svc_linear = SVC(C=1.2, probability=True)
svc_linear.fit(X_train, Y_train)
explainer = shap.KernelExplainer(svc_linear.predict_proba, X_train)
shap_values = explainer.shap_values(X_test)
shap.force_plot(explainer.expected_value[0], shap_values[0], X_test)

But I get this:

---------------------------------------------------------------------------
SystemError                               Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_110123923049429.py in <module>
----> 1 import shap
      2 svc_linear = SVC(C=1.2, probability=True)
      3 svc_linear.fit(X_train, Y_train)
      4 explainer = shap.KernelExplainer(svc_linear.predict_proba, X_train)
      5 shap_values = explainer.shap_values(X_test)

~\Anaconda3\lib\site-packages\shap\__init__.py in <module>
     10     warnings.warn("As of version 0.29.0 shap only supports Python 3 (not 2)!")
     11 
---> 12 from ._explanation import Explanation, Cohorts
     13 
     14 # explainers

~\Anaconda3\lib\site-packages\shap\_explanation.py in <module>
     10 from slicer import Slicer, Alias, Obj
     11 # from ._order import Order
---> 12 from .utils._general import OpChain
     13 from .utils._exceptions import DimensionError
     14 

~\Anaconda3\lib\site-packages\shap\utils\__init__.py in <module>
----> 1 from ._clustering import hclust_ordering, partition_tree, partition_tree_shuffle, delta_minimization_order, hclust
      2 from ._general import approximate_interactions, potential_interactions, sample, safe_isinstance, assert_import, record_import_error
      3 from ._general import shapley_coefficients, convert_name, format_value, ordinal_str, OpChain, suppress_stderr
      4 from ._show_progress import show_progress
      5 from ._masked_model import MaskedModel, make_masks

~\Anaconda3\lib\site-packages\shap\utils\_clustering.py in <module>
      2 import scipy as sp
      3 from scipy.spatial.distance import pdist
----> 4 from numba import jit
      5 import sklearn
      6 import warnings

~\Anaconda3\lib\site-packages\numba\__init__.py in <module>
     40 
     41 # Re-export vectorize decorators and the thread layer querying function
---> 42 from numba.np.ufunc import (vectorize, guvectorize, threading_layer,
     43                             get_num_threads, set_num_threads)
     44 

~\Anaconda3\lib\site-packages\numba\np\ufunc\__init__.py in <module>
      1 # -*- coding: utf-8 -*-
      2 
----> 3 from numba.np.ufunc.decorators import Vectorize, GUVectorize, vectorize, guvectorize
      4 from numba.np.ufunc._internal import PyUFunc_None, PyUFunc_Zero, PyUFunc_One
      5 from numba.np.ufunc import _internal, array_exprs

~\Anaconda3\lib\site-packages\numba\np\ufunc\decorators.py in <module>
      1 import inspect
      2 
----> 3 from numba.np.ufunc import _internal
      4 from numba.np.ufunc.parallel import ParallelUFuncBuilder, ParallelGUFuncBuilder
      5 

SystemError: initialization of _internal failed without raising an exception

I don’t know why this happens. Does anyone know?

PS:

Python version: 3.9.13

shap version: 0.40.0

Asked By: ikram zouaoui


Answers:

As per Hiran’s comment on the question, this also worked for me:
uninstall shap and then install it again.

pip uninstall shap

pip install shap

Answered By: CHOI

In my case, reinstalling shap didn’t help.

The problem is most likely caused by a bug in the Numba library. More details: https://github.com/numba/numba/issues/8718 and https://github.com/numba/numba/issues/8615

It should be fixed in the next release (0.57).
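If you want to confirm whether this bug is likely the culprit before reinstalling anything, one hedged sanity check is to print the installed versions of the packages in the failing import chain (the linked numba issues involve numba failing to initialize against certain NumPy builds). This sketch uses only the standard library, so it works even when `import numba` itself raises the SystemError:

```python
# Print the installed versions of the packages in the failing import
# chain, using importlib.metadata so we never actually import them
# (importing numba is what triggers the SystemError in the first place).
from importlib.metadata import version, PackageNotFoundError

for pkg in ("shap", "numba", "numpy"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "is not installed")
```

If the printed numba version predates the release that fixes the linked issues, upgrading numba (or reinstalling it, as below) is the likely fix.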

EDIT:
When I reinstalled numba (pip uninstall numba ; pip install numba), the problem disappeared. I think it might be related to updated packages on my system.

Answered By: Chris Ociepa