Is CEBRA invariant to permutations in the input channels? #182
icarosadero asked this question in Q&A
Should I still get the same embedding even if I shuffle the order of the input neurons?
Answered by stes on Oct 20, 2024
Replies: 1 comment, 1 reply
Hi @icarosadero, you would not get the same embedding, but an embedding with very high consistency. So yes, the ordering does not matter, provided you train and embed from scratch. The input layer itself is not permutation invariant: you cannot train a model, feed it permuted channels, and expect to get the same embedding. Does that answer the question?
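The distinction above can be illustrated with a minimal NumPy sketch. This is a hypothetical linear stand-in for a trained encoder's input layer, not CEBRA's actual (nonlinear) model: feeding permuted channels into fixed trained weights changes the output, while a weight matrix whose rows are permuted to match (which a fresh training run can effectively re-learn) recovers the same embedding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" linear input layer: 8 input channels -> 3 latent dims.
# (Illustrative stand-in; CEBRA's real encoders are nonlinear, but the
# input layer's sensitivity to channel order works the same way.)
X = rng.normal(size=(100, 8))   # 100 samples, 8 channels
W = rng.normal(size=(8, 3))     # fixed, "already trained" weights

perm = rng.permutation(8)       # shuffle the channel order
X_perm = X[:, perm]

# Feeding permuted channels into the SAME trained layer changes the output:
assert not np.allclose(X_perm @ W, X @ W)

# Permuting the weight rows to match undoes the shuffle, i.e. a model
# retrained from scratch on shuffled channels can reach an equivalent
# (highly consistent) solution:
assert np.allclose(X_perm @ W[perm, :], X @ W)
```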
Answer selected by icarosadero