
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward. #228

Answered by frgfm
karnikkanojia asked this question in Q&A


Thanks! The error can be avoided since the library exposes low-level PyTorch options such as retain_graph:

import torch
import torchxrayvision as xrv
from torchcam.methods import GradCAM

# Load the chest X-ray and convert it to a tensor
image = xrv.utils.load_image(<path>)
image = torch.from_numpy(image)

# Pretrained DenseNet, in eval mode for inference
model = xrv.models.DenseNet(weights="densenet121-res224-all").eval()

# Attach Grad-CAM to the last layer of the feature extractor
cam = GradCAM(model=model, target_layer=model.features[-2][-1][-1])

preds = model(image.unsqueeze(0))
# retain_graph=True keeps the autograd graph alive, so each class
# can be backpropagated through the same forward pass
cam_outputs = [cam(class_idx=idx, scores=preds, retain_graph=True) for idx in range(len(model.pathologies))]

This piece of code doesn't crash on my end 👍
It might be a bit slow, as it performs a backward pass for each pathology (18 apparently). One option that would use more RAM…
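To illustrate why retain_graph=True is what fixes the original RuntimeError, here is a minimal plain-PyTorch sketch (hypothetical toy tensors, not the torchxrayvision model): calling .backward() several times through the same forward pass fails unless the graph is retained, which is exactly what happens when computing one CAM per class.

```python
import torch

# Toy "model": one forward pass producing scores for 2 classes
x = torch.randn(1, 3, requires_grad=True)
w = torch.randn(3, 2)
scores = x @ w  # shared autograd graph for both classes

# Backpropagating each class score through the SAME graph:
# without retain_graph=True, the second backward() would raise
# "RuntimeError: Trying to backward through the graph a second time ..."
for class_idx in range(scores.shape[1]):
    if x.grad is not None:
        x.grad.zero_()  # keep per-class gradients separate
    scores[0, class_idx].backward(retain_graph=True)
```

The trade-off mentioned above applies here too: retaining the graph means one full backward pass per class, which is slower but keeps memory bounded compared to batching all class gradients at once.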

Replies: 2 comments 5 replies

4 replies
@karnikkanojia
@frgfm
frgfm Jan 3, 2024 (Maintainer)
Answer selected by karnikkanojia
@karnikkanojia
@frgfm
frgfm Jan 4, 2024 (Maintainer)

1 reply
@frgfm
frgfm Nov 6, 2024 (Maintainer)

Category: Q&A · Labels: none yet · 3 participants