# 0.0.8
## Install steps

```
pip install flashtorch
```
## Upgrade steps

```
pip install flashtorch -U
```
## Breaking changes

- None
## New features

- None
## Bug fixes

- Fixes #2
## Improvements

- Users can explicitly set the device to use when calculating gradients with an instance of `Backprop`, by setting `use_gpu=True`. If it is `True` and `torch.cuda.is_available()` returns `True`, the computation is moved to the GPU. It defaults to `False` if not provided. A sketch of this device-selection behaviour follows the example below.

  ```python
  from flashtorch.saliency import Backprop

  ...  # Prepare input and target_class

  model = model()  # placeholder: instantiate your own model here
  backprop = Backprop(model)
  gradients = backprop.calculate_gradients(input, target_class, use_gpu=True)
  ```
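The note above describes the behaviour of the flag rather than showing the underlying check, so here is a minimal, hypothetical sketch of how a `use_gpu` flag can gate device selection. The `_resolve_device` helper is illustrative only and is not part of flashtorch's API.

```python
import torch

def _resolve_device(use_gpu=False):
    """Illustrative helper: use CUDA only when requested and actually available."""
    if use_gpu and torch.cuda.is_available():
        return torch.device('cuda')
    return torch.device('cpu')

# For example, a tensor can be moved to the resolved device before the backward pass.
input_ = torch.randn(1, 3, 224, 224)
input_ = input_.to(_resolve_device(use_gpu=True))
```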
## Other changes

- `setup.py` has better indications of the supported Python versions
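For readers unfamiliar with how a package declares supported Python versions, the snippet below is a hypothetical `setup.py` excerpt; the version bounds and classifiers are illustrative assumptions, not copied from flashtorch's actual `setup.py`.

```python
from setuptools import setup, find_packages

setup(
    name='flashtorch',
    version='0.0.8',
    packages=find_packages(),
    # Enforce a minimum interpreter version at install time (illustrative value).
    python_requires='>=3.5',
    # Advertise the supported versions on PyPI (illustrative values).
    classifiers=[
        'Programming Language :: Python :: 3.5',
        'Programming Language :: Python :: 3.6',
        'Programming Language :: Python :: 3.7',
    ],
)
```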