Missing einops.layers.....Repeat #185
Comments
Hi Christoph, please read the relevant discussion here: #126. As far as I can see, in most cases repeat can be replaced with rearrange + broadcasting (which is also more memory-efficient). A Repeat layer is necessary e.g. if you need to overwrite components of the output. Let me know if that's the case for you.
Yes, I agree that rearrange + broadcasting is usually the better option. One hyperparameter of my experiment specifies that only the times when a speaker is active should be estimated (i.e. no frequency resolution). So yes, you can always use rearrange and broadcasting, but sometimes it is better to spend some extra computation time if the resulting code is easier to read. By that argument, you could also say that the repeat function is never needed: for computations, there is always a way to express them with rearrange and broadcasting.
Not really: np.repeat and np.tile behavior can't be obtained from other functions. It's just my observation that use cases for a Repeat layer mostly fall into the "you could broadcast it" bin. Anyway, I see your point about convenience. I've also recalled that repeat now uses torch.expand-like behavior, so it should be quite performant anyway. Will try to incorporate it.
Hello, I have this use case where I want to augment an image with multiple different (random) transformations:

```python
torch.nn.Sequential(
    einops.layers.torch.Repeat("B C H W -> (B N) C H W", N=10),
    RandomImageAugmentation(),  # B C H W -> B C H W, random edit per image in batch
    einops.layers.torch.Rearrange("(B N) C H W -> B N C H W", N=10),
)
```
I tried to use repeat in torch and needed a layer, but strangely it was not there. I know that I could use `einops.layers.torch.Reduce('...', reduction='repeat', ...)`, but that is confusing to read. What do you think about adding `einops.layers.....Repeat` functions to einops? Here is a toy example, where the last line fails because the function counterpart is missing and the first line is difficult to read:
Since `einops.repeat` exists, I think the same use cases would be valid for a layer. I have one case where I want to use it in PyTorch. Something like the following in each layer's backend file:
I would say yes, it is the counterpart of the function `einops.repeat`. I think this is obvious; currently I use `einops.layers.torch.Reduce('...', reduction='repeat', ...)` and that is confusing.