🐛 Describe the bug
When aggregating messages in PNAConv, scalers are applied one after the other and then stacked, but the variable used in the for loop is always the same one. That is, the scalers are applied cumulatively to the same variable, instead of each scaler being applied independently to the same base variable.
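For illustration, here is a simplified sketch of the pattern (not the exact library code; the scalers and names are reduced for clarity). Because `out` is reassigned inside the loop, each later scaler operates on the already-scaled tensor rather than on the original aggregation:

```python
import torch

# Simplified stand-ins for the aggregated messages and node degrees
out = torch.ones(4, 2)
deg = torch.tensor([1., 2., 3., 4.]).view(-1, 1)

outs = []
for scaler in ['amplification', 'linear']:
    if scaler == 'amplification':
        out = out * torch.log(deg + 1)   # reassigns `out`
    elif scaler == 'linear':
        out = out * deg                  # applied on top of the previous scaler
    outs.append(out)

# The second entry now contains log(deg + 1) * deg, i.e. both scalers combined,
# instead of the 'linear' scaler applied to the original aggregation alone.
print(torch.cat(outs, dim=-1))
```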
This is the latest code in the repository: pytorch_geometric/torch_geometric/nn/aggr/scaler.py, lines 68 to 90 at commit 8bcc77c, and the variable `out` is reused over the entire loop.

Proposed solution
The solution is as simple as using a different variable name within the for loop and doing, e.g., `out_i = out` at the start of the loop in line 77. Since it is such a simple change, I am happy to open a pull request if requested.

I might be wrong here, so please correct me if that's the case, but looking at the original paper and the official implementation, I think this is a bug:
https://github.com/lukecavabarrett/pna/blob/0c630c2e2d88bb1ef784c850dd8f3a069fcd9489/models/pytorch_geometric/pna.py#L158
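For completeness, a minimal sketch of the proposed change, reusing the toy setup above (illustrative names only, not the actual patch):

```python
import torch

out = torch.ones(4, 2)
deg = torch.tensor([1., 2., 3., 4.]).view(-1, 1)

outs = []
for scaler in ['amplification', 'linear']:
    out_i = out                           # start each scaler from the unscaled aggregation
    if scaler == 'amplification':
        out_i = out_i * torch.log(deg + 1)
    elif scaler == 'linear':
        out_i = out_i * deg
    outs.append(out_i)

# Each entry of `outs` now holds exactly one scaler applied to the same base tensor.
print(torch.cat(outs, dim=-1))
```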
Environment

- How you installed PyTorch and PyG (conda, pip, source):
- Any other relevant information (e.g., version of torch-scatter):