Visualizing SHAP explainability

Explainable AI in Python

Fouad Trad

Machine Learning Engineer

Dataset

age  gender  bmi     children  smoker  charges
19   0       27.900  0         1       16884.92
18   1       33.770  1         0       1725.55
28   1       33.000  3         0       4449.46
33   1       22.705  0         0       21984.47
32   1       28.880  0         0       3866.85

Model: a random forest regressor trained to predict charges

import shap

# TreeExplainer is optimized for tree-based models like random forests
explainer = shap.TreeExplainer(model)

# One SHAP value per feature, per sample
shap_values = explainer.shap_values(X)

Feature importance plot

  • Shows the contribution of each feature to the model output
shap.summary_plot(shap_values, X, plot_type="bar")

Feature importance plot showing that smoking status is the most important feature.
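The bar plot ranks features by their mean absolute SHAP value. That ranking can be computed directly; a sketch with synthetic SHAP values standing in for the real ones (the feature names and per-feature scales are assumptions chosen so 'smoker' dominates, as in the slide):

```python
import numpy as np

# Synthetic SHAP values standing in for the real ones; per-feature
# scales are assumptions chosen so 'smoker' dominates, as in the slide
rng = np.random.default_rng(1)
features = ["age", "gender", "bmi", "children", "smoker"]
shap_values = rng.normal(0.0, [1.0, 0.2, 0.5, 0.1, 3.0], size=(100, 5))

# The bar summary plot ranks features by mean absolute SHAP value
importance = np.abs(shap_values).mean(axis=0)
for name, imp in sorted(zip(features, importance), key=lambda t: -t[1]):
    print(f"{name:>8}: {imp:.3f}")
```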


Beeswarm plot

  • Shows the distribution of SHAP values for each feature
  • Highlights the direction and magnitude of each feature's impact on the prediction
    • Red → high feature value
    • Blue → low feature value
    • SHAP value > 0 → increases the outcome
    • SHAP value < 0 → decreases the outcome

Beeswarm plot showing the SHAP value distribution for each feature, with colors indicating feature values.


Same beeswarm plot highlighting that high feature values of the 'smoker' feature have positive SHAP values.

shap.summary_plot(shap_values, X, plot_type="dot")

Same beeswarm plot highlighting that low feature values of the 'smoker' feature have negative SHAP values.
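The red-high/positive and blue-low/negative pairing on the 'smoker' row amounts to an association between the feature's value and the sign of its SHAP values. A minimal numeric check, with synthetic SHAP values standing in for the real ones (the names and magnitudes are assumptions mimicking the slide's pattern):

```python
import numpy as np

# Synthetic stand-ins mimicking the slide's 'smoker' row:
# smokers get positive contributions, non-smokers negative
rng = np.random.default_rng(2)
smoker = rng.integers(0, 2, 500)
shap_smoker = np.where(smoker == 1, 8000.0, -4000.0) + rng.normal(0, 500, 500)

# Mean SHAP value by feature value reproduces the color/sign pairing
high = shap_smoker[smoker == 1].mean()   # red points, expected > 0
low = shap_smoker[smoker == 0].mean()    # blue points, expected < 0
print(high > 0, low < 0)
```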


Partial dependence plot

  • Shows the relationship between a feature and the predicted outcome
  • Shows the feature's impact across its range of values
  • Verifies whether the relationship matches expectations

A generic partial dependence plot showing the relationship between a feature and the outcome as a decreasing function.


Partial dependence plot

  • For each sample:
    • Vary the value of the selected feature
    • Hold all other features constant
    • Predict the outcome
  • Average the results over all samples
shap.partial_dependence_plot("age", 
                             model.predict, 
                             X)

Partial dependence plot for the age feature showing that as age increases, the predicted value increases.
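The steps above can be sketched without the shap helper; a minimal manual computation on synthetic two-feature data (the feature names and the positive age effect are toy assumptions):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic two-feature data; the positive age effect is a toy assumption
rng = np.random.default_rng(3)
X = np.column_stack([
    rng.integers(18, 65, 300),   # age
    rng.normal(30, 6, 300),      # bmi
])
y = 300 * X[:, 0] + 50 * X[:, 1] + rng.normal(0, 500, 300)
model = RandomForestRegressor(random_state=0).fit(X, y)

# Manual partial dependence for "age" (feature 0):
grid = np.linspace(X[:, 0].min(), X[:, 0].max(), 20)
pd_values = []
for v in grid:
    X_mod = X.copy()
    X_mod[:, 0] = v                                # vary the selected feature
    pd_values.append(model.predict(X_mod).mean())  # hold others, then average

print(pd_values[0], pd_values[-1])  # expect an increasing trend with age
```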


Same partial dependence plot highlighting the distribution of the age values in the dataset.


Let's practice!

