einops and tensorboard graphs #107
In the meantime I realized that repeat is not available as a Layer. My proposal is to provide the function repeat as a Layer as well. My workaround is to define a simple wrapper layer myself. Importantly, the parameters are set when calling forward, not when instantiating; e.g. the batch size is only available during layer execution.
By the way: the same applies to einsum, which has the same problem with tensorboard graphs.
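The same wrapper idea works for einsum. A hedged sketch using torch.einsum (the class name EinsumLayer is my own):

```python
import torch
from torch import nn

class EinsumLayer(nn.Module):
    """Hypothetical wrapper: einsum equation fixed in __init__, operands passed in forward."""

    def __init__(self, equation: str):
        super().__init__()
        self.equation = equation

    def forward(self, *tensors):
        # operands only exist at execution time; the equation is static
        return torch.einsum(self.equation, *tensors)

layer = EinsumLayer("ij,jk->ik")
out = layer(torch.randn(2, 3), torch.randn(3, 4))
print(out.shape)  # torch.Size([2, 4])
```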
Hi @JBOE22175, thanks for the very detailed description of the issue and the notebook.
I also don't see tensorboard showing the actual parameters of layers; that's a bummer.
Hi, concerning the usage of pattern and parameters for repeat:
Case 2: here b = batch_size is not known when calling init:
In both cases the pattern should be provided in init, but the parameter b in case 2 must be added in forward.
I'm running into the same issue in case 2.
First of all: I like einops; readability in NN models is very much improved!
a) Using einops functions as operations creates nearly unreadable graphs in tensorboard. To avoid this, you should use the einops layers instead. You should add this to your documentation and perhaps cover layers more prominently.
b) parse_shape is also a nice function, but it produces a warning when used with tensorboard's add_graph():
"RuntimeWarning: Iterating over a tensor might cause the trace to be incorrect. Passing a tensor of different shape won't change the number of iterations executed (and might lead to errors or silently give incorrect results)."
Platform: Windows with PyTorch, tensorboard 2.2.0.
See the attached Jupyter notebook:
einops_tensorboard.zip