TSR
Wrapper class for saliency calculation. Automatically calls the corresponding PyTorch or TensorFlow implementation.
__new__(model, NumTimeSteps, NumFeatures, method='GRAD', mode='time', device='cpu', normalize=True, tsr=True)
Initialization
Arguments:
model [torch.nn.Module, tf.keras.Model]: model to be explained
NumTimeSteps int: number of time steps
NumFeatures int: number of features
method str: saliency method to be used
mode str: second dimension, 'time' -> (1, time, feat) or 'feat' -> (1, feat, time)
device str: device to run the computation on, e.g. 'cpu'
normalize bool: whether to normalize the attribution weights
tsr bool: if True, time series rescaling according to [1] is applied
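A minimal usage sketch with the PyTorch backend (the import path, toy model, and shapes are illustrative assumptions, not part of the documented API):

```python
import torch.nn as nn
# assumed import path; adjust to your installation of the library
from TSInterpret.InterpretabilityModels.Saliency.TSR import TSR

# hypothetical toy classifier over inputs of shape (batch, time, feat)
model = nn.Sequential(nn.Flatten(), nn.Linear(50 * 3, 2))

# the wrapper detects the PyTorch model and dispatches to Saliency_PTY
explainer = TSR(model, NumTimeSteps=50, NumFeatures=3,
                method='IG', mode='time', device='cpu')
```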
Saliency_PTY(model, NumTimeSteps, NumFeatures, method='GRAD', mode='time', tsr=True, normalize=True, device='cpu')
Bases: Saliency
PyTorch implementation for saliency calculation based on [1]. The saliency methods are based on the library captum [2]. For PyTorch the following saliency methods are available:

+ Gradients (GRAD)
+ Integrated Gradients (IG)
+ Gradient Shap (GS)
+ DeepLift (DL)
+ DeepLiftShap (DLS)
+ SmoothGrad (SG)
+ Shapley Value Sampling (SVS)
+ Feature Ablation (FA)
+ Occlusion (FO)

References
[1] Ismail, Aya Abdelsalam, et al. "Benchmarking deep learning interpretability in time series predictions." Advances in neural information processing systems 33 (2020): 6441-6452.
[2] Kokhlikyan, Narine, et al. "Captum: A unified and generic model interpretability library for pytorch." arXiv preprint arXiv:2009.07896 (2020).
Initialization
Arguments:
model [torch.nn.Module]: model to be explained
NumTimeSteps int: number of time steps
NumFeatures int: number of features
method str: saliency method to be used
mode str: second dimension, 'time' -> (1, time, feat) or 'feat' -> (1, feat, time)
tsr bool: if True, time series rescaling according to [1] is applied
normalize bool: whether to normalize the attribution weights
device str: device to run the computation on, e.g. 'cpu'
explain(item, labels, TSR=None, **kwargs)
Method to explain the model based on the item.
Arguments:
item np.array: item to get feature attribution for; if mode = 'time' -> (1, time, feat), if mode = 'feat' -> (1, feat, time)
labels int: label
TSR bool: if True, time series rescaling according to [1] is used, else plain (scaled) weights are returned
Returns:
np.array: feature attribution weights; if mode = 'time' -> (time, feat), if mode = 'feat' -> (feat, time)
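A minimal sketch of a call to explain, continuing the PyTorch example above (the item contents and class index are illustrative):

```python
import numpy as np

item = np.random.randn(1, 50, 3).astype(np.float32)  # (1, time, feat) for mode='time'
label = 0  # class index to attribute

# TSR=True applies the temporal saliency rescaling from [1];
# TSR=False returns the plain (scaled) saliency weights instead
weights = explainer.explain(item, labels=label, TSR=True)
print(weights.shape)  # expected: (50, 3), i.e. (time, feat)
```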
Saliency_TF(model, NumTimeSteps, NumFeatures, method='GRAD', mode='time', tsr=True, device='cpu')
Bases: Saliency
TensorFlow implementation for saliency calculation based on [1]. The saliency methods are based on the libraries tf-explain [2] and shap [3]. For TensorFlow the following saliency methods are available:

+ Gradients (GRAD)
+ Integrated Gradients (IG)
+ Gradient Shap (GS)
+ DeepLiftShap (DLS)
+ SmoothGrad (SG)
+ Occlusion (FO)
Attention: GS and DLS only work for Python < 3.10.
References
[1] Ismail, Aya Abdelsalam, et al. "Benchmarking deep learning interpretability in time series predictions." Advances in neural information processing systems 33 (2020): 6441-6452.
[2] Meudec, Raphael: tf-explain. https://github.com/sicara/tf-explain
[3] Lundberg, Scott M., and Su-In Lee. "A unified approach to interpreting model predictions." Advances in neural information processing systems 30 (2017). https://shap.readthedocs.io/
| PARAMETER | DESCRIPTION |
|---|---|
| `model` | model to be explained (TYPE: `tf.keras.Model`) |
| `NumTimeSteps` | number of time steps (TYPE: `int`) |
| `NumFeatures` | number of features (TYPE: `int`) |
| `method` | saliency method to be used (TYPE: `str`) |
| `mode` | second dimension: `'time'` -> (1, time, feat) or `'feat'` -> (1, feat, time) (TYPE: `str`) |
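A minimal sketch for the TensorFlow backend (the import path and toy model are illustrative assumptions):

```python
import tensorflow as tf
# assumed import path; adjust to your installation of the library
from TSInterpret.InterpretabilityModels.Saliency.TSR import TSR

# hypothetical toy Keras classifier over inputs of shape (time, feat)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(50, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2, activation='softmax'),
])

# the wrapper detects the Keras model and dispatches to Saliency_TF
explainer = TSR(model, NumTimeSteps=50, NumFeatures=3,
                method='GRAD', mode='time')
```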
explain(item, labels, TSR=None)
Method to explain the model based on the item.
Arguments:
item np.array: item to get feature attribution for; if mode = 'time' -> (1, time, feat), if mode = 'feat' -> (1, feat, time)
labels int: label
TSR bool: if True, time series rescaling according to [1] is used, else plain (scaled) weights are returned
Returns:
np.array: feature attribution weights; if mode = 'time' -> (time, feat), if mode = 'feat' -> (feat, time)
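Continuing the TensorFlow sketch above, a minimal example of the explain call (item contents and class index are illustrative):

```python
import numpy as np

item = np.random.randn(1, 50, 3).astype(np.float32)  # (1, time, feat) for mode='time'

# TSR=True applies the temporal saliency rescaling from [1]
weights = explainer.explain(item, labels=0, TSR=True)
print(weights.shape)  # expected: (50, 3), i.e. (time, feat)
```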