Description
Describe the bug
When you want a summary for a model whose forward method requires multiple input parameters, this doesn't work. You can set self.example_input_array to be a tuple, and there is code for unpacking it into the forward method. However, if the model is on CUDA, the example input is moved to CUDA directly, without a check for whether it is a tuple or list.
The line with the error is here:
pytorch-lightning/blob/master/pytorch_lightning/root_module/memory.py#L53
An example of how it should be checked is here:
pytorch-lightning/blob/master/pytorch_lightning/root_module/memory.py#L61
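A minimal sketch of the kind of check I would expect before the `.cuda()` call (the function and variable names here are illustrative, not the actual code in memory.py):

```python
def move_example_input_to_cuda(input_):
    """Move an example input (a tensor, or a tuple/list of tensors) to the GPU."""
    if isinstance(input_, (list, tuple)):
        # Move each element individually; calling .cuda() on the container fails.
        return type(input_)(x.cuda() for x in input_)
    return input_.cuda()
```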
To Reproduce
Steps to reproduce the behavior:
- create a model that requires multiple inputs in the forward method.
- set self.example_input_array to be a tuple
- run the model on a GPU (a minimal sketch follows below)
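A toy module of the kind that triggers the error (class and attribute names are made up for illustration; training_step, configure_optimizers, etc. are omitted for brevity):

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl


class TwoInputModel(pl.LightningModule):
    """Toy module whose forward takes two tensors."""

    def __init__(self):
        super().__init__()
        self.encoder_a = nn.Linear(16, 8)
        self.encoder_b = nn.Linear(4, 8)
        # Tuple of example inputs, one entry per forward argument.
        self.example_input_array = (torch.rand(2, 16), torch.rand(2, 4))

    def forward(self, a, b):
        return self.encoder_a(a) + self.encoder_b(b)
```

Requesting the model summary while training this on a GPU should hit the `.cuda()` call on the tuple in memory.py.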
Expected behavior
A list of all layers with the input and output shapes of each layer.
Desktop (please complete the following information):
- OS: Linux Mint 19.2