
Summary not working for model on GPU with multiple inputs #537

Closed
@VSJMilewski

Description


Describe the bug
Generating a summary does not work for a model whose forward method requires multiple input parameters. You can set self.example_input_array to a tuple, and there is code for unpacking it when calling the forward method. However, if the model is on CUDA, the summary code tries to move this input directly to CUDA without checking whether it is a tuple or list.

The line that raises the error is here:
pytorch-lightning/blob/master/pytorch_lightning/root_module/memory.py#L53

An example of how it should be checked (a few lines further down in the same file):
pytorch-lightning/blob/master/pytorch_lightning/root_module/memory.py#L61
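
The move to CUDA needs the same tuple/list check that the forward call already has. A minimal sketch of the kind of guard that seems to be missing (the helper name and exact structure are illustrative, not the actual pytorch-lightning code):

```python
import torch

def move_example_input(input_, device):
    """Move the example input to `device`, handling multi-input models.

    Sketch only: when `example_input_array` is a tuple or list (one tensor
    per forward argument), each element has to be moved individually.
    Calling `.cuda()` on the tuple itself raises an AttributeError.
    """
    if isinstance(input_, (list, tuple)):
        return [x.to(device) for x in input_]
    return input_.to(device)
```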

To Reproduce
Steps to reproduce the behavior:

  1. create a model that requires multiple inputs in the forward method.
  2. set self.example_input_array to be a tuple
  3. run the model on GPU
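
For example, a minimal module along these lines triggers the error as soon as the summary runs on GPU (names are illustrative; the usual training_step / dataloader / optimizer methods are omitted for brevity):

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl

class MultiInputModel(pl.LightningModule):
    """Toy model whose forward takes two separate inputs."""

    def __init__(self):
        super().__init__()
        self.encoder_a = nn.Linear(16, 8)
        self.encoder_b = nn.Linear(32, 8)
        # tuple example input: one tensor per forward argument
        self.example_input_array = (torch.rand(4, 16), torch.rand(4, 32))

    def forward(self, a, b):
        return self.encoder_a(a) + self.encoder_b(b)
```

On CPU the summary works because the tuple is only unpacked when forward is called; on GPU the `.cuda()` call on the tuple fails before forward is ever reached.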

Expected behavior
A list of all layers together with the input and output shapes of each layer.

Screenshots
[screenshot of the error attached in the original issue]

Desktop (please complete the following information):

  • OS: Linux Mint 19.2
  • Browser: Chrome
  • Version: 8.0.3904.97 (Official Build) (64-bit)

Labels

bug
