Can't TorchScript LightningModule when using Metric #4416
Comments
Hi! Thanks for your contribution! Great first issue!
@hudeven as a workaround, can you override `to_torchscript()`?
Yeah, it works by overriding `to_torchscript()` and deleting the metric attributes there.
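The delete-before-scripting workaround can be sketched with plain `torch` (a minimal sketch; `FakeMetric` and `Model` are hypothetical stand-ins, not the actual Lightning or Metric code):

```python
import torch
from torch import nn


class FakeMetric(nn.Module):
    """Stand-in for a Metric: its forward takes *args/**kwargs,
    which TorchScript cannot compile."""

    def forward(self, *args, **kwargs):
        return torch.tensor(0.0)


class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 2)
        self.metric = FakeMetric()  # attached as a submodule, unused in forward

    def forward(self, x):
        return self.layer(x)


model = Model()

# The workaround: delete the metric attribute before scripting.
# Metrics aren't needed for inference, so nothing is lost.
del model.metric
scripted = torch.jit.script(model)
out = scripted(torch.randn(1, 4))
```

The same idea applies inside an overridden `to_torchscript()`: remove the metric submodules from (a copy of) the module, then call the parent implementation.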
A controversial suggestion, but could we use another method name instead of forward? I understand using the class name directly is convenient. Also, curious why the Metric class is an nn.Module — is it to avoid the pains of syncing across distributed environments instead of using torch.distributed?
Having Metrics somehow unique would also be helpful when saving/loading .ckpt. While this is intended behavior (it's an nn.Module), I also ran into trouble: #4361. For inference, Metrics are not necessary, but if you use them, you need to set
@hudeven I don't know if you need to use TorchScript in combination with … Add this to your script:

```python
def training_step(self, batch, batch_idx):
    # use the first batch to create an example input
    if self.example_input_array is None:
        # we only need 1 sample, not a whole batch, but keep the batch dimension
        self.example_input_array = batch[0, :].unsqueeze(dim=0)
```

And change … This does not solve the actual issue at hand, but can be a workaround for some here.
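Why an example input helps: tracing records only the ops actually executed by `forward`, so a metric-like submodule that is never called does not need to be compiled. A minimal sketch with plain `torch.jit.trace` (`FakeMetric` and `Model` are hypothetical stand-ins, not Lightning code):

```python
import torch
from torch import nn


class FakeMetric(nn.Module):
    # forward with *args is what breaks scripting, but tracing never compiles it
    def forward(self, *args):
        return torch.tensor(0.0)


class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 2)
        self.metric = FakeMetric()  # attached but unused in forward

    def forward(self, x):
        return self.layer(x)


model = Model().eval()

# emulate example_input_array: one sample, keeping the batch dimension
batch = torch.randn(8, 4)
example_input = batch[0, :].unsqueeze(dim=0)

# tracing runs forward once and records the executed ops only
traced = torch.jit.trace(model, example_input)
out = traced(example_input)
```

This is why the example-input route can sidestep the Metric problem that scripting hits.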
@snisarg, we are using an …
@NumesSanguis thanks for the workaround! I intend to use 'script'. This issue is fixed in #4428
🐛 Bug
Please reproduce using the BoringModel and post here
Able to reproduce it in https://colab.research.google.com/drive/1MscNHxIc_LIbZxALHbZOAkooNu0TzVly?usp=sharing
To Reproduce
Expected behavior
Able to TorchScript a LightningModule regardless of whether a Metric is used.
It seems hard to make Metric torchscriptable, as *args and **kwargs are useful in Python but not supported in TorchScript.
As Metric is not needed for inference, I think it should be excluded when calling LightningModule.to_torchscript().
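The *args/**kwargs limitation is easy to demonstrate directly (a minimal sketch, not the Metric code itself):

```python
import torch


def variadic(*args):
    # variadic signatures are fine in Python...
    return args[0]


# ...but TorchScript rejects functions with a variable number of arguments
try:
    torch.jit.script(variadic)
    script_ok = True
except Exception:
    script_ok = False
```

Since `Metric.forward` is defined with `*args, **kwargs`, any module holding a Metric hits this wall as soon as `torch.jit.script` tries to compile it.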
Environment
Note: Bugs with code are solved faster! Colab Notebook should be made public!

Please copy and paste the output from our environment collection script (or fill out the checklist below manually). You can get the script and run it with:
Additional context