0.0.8

@MisaOgura released this 08 Jul 10:45
5b77eb3

Install steps

  • pip install flashtorch

Upgrade steps

  • pip install flashtorch -U

Breaking changes

  • None

New features

  • None

Bug fixes

  • Fixes #2

Improvements

  • Users can now explicitly choose the device used for gradient calculation by passing use_gpu=True to calculate_gradients on a Backprop instance. When use_gpu=True and torch.cuda.is_available() returns True, the computation is moved to the GPU; otherwise it stays on the CPU. The flag defaults to False.

    from flashtorch.saliency import Backprop
    
    ... # Prepare input and target_class
    
    model = Model()  # instantiate your model class
    backprop = Backprop(model)
    gradients = backprop.calculate_gradients(input, target_class, use_gpu=True)
    

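The device-selection behaviour described above can be sketched as a small standalone helper. Note that `resolve_device` is a hypothetical illustration, not part of the flashtorch API:

```python
def resolve_device(use_gpu: bool, cuda_available: bool) -> str:
    """Mirror the documented behaviour: computation moves to the GPU
    only when the caller requests it AND CUDA is actually available."""
    return "cuda" if use_gpu and cuda_available else "cpu"

# use_gpu defaults to False, so the CPU is used unless explicitly requested.
print(resolve_device(True, True))    # cuda
print(resolve_device(True, False))   # cpu
print(resolve_device(False, True))   # cpu
```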
Other changes

  • setup.py now indicates the supported Python versions more clearly
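For illustration, supported versions are typically declared in setup.py via `python_requires` and trove classifiers. The version numbers below are assumptions for the sketch, not the actual values from this release:

```python
# Hypothetical sketch of declaring supported Python versions in setup.py;
# the versions listed here are assumptions, not flashtorch's actual metadata.
from setuptools import setup

setup(
    name="flashtorch",
    version="0.0.8",
    python_requires=">=3.5",  # pip will refuse to install on older interpreters
    classifiers=[
        "Programming Language :: Python :: 3.5",
        "Programming Language :: Python :: 3.6",
        "Programming Language :: Python :: 3.7",
    ],
)
```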