Issues: lucidrains/memory-efficient-attention-pytorch

Example for long embedding sequence prediction using this attention
#8, opened Jun 3, 2024 by ramdhan1989

Making this work with relative position bias from XTransformers
#5, opened Dec 2, 2022 by pfeatherstone

save_for_backward can only save variables, but argument 5 is of type bool
#4, opened Oct 26, 2022 by abalikhan (see the first sketch below)

Checkpointing is not compatible with .grad() or when an `inputs` parameter is passed to .backward()
#3, opened Oct 5, 2022 by vrobot (see the second sketch below)
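
Issue #4 quotes a standard PyTorch restriction: ctx.save_for_backward accepts only tensors, so passing a boolean (for example a causal flag) through it raises the error in the title. The following is a minimal sketch, not code from this repository, showing the usual remedy of storing non-tensor arguments as plain attributes on ctx; the ScaledAttentionStub class and its placeholder math are illustrative only.

```python
import torch
from torch.autograd import Function

class ScaledAttentionStub(Function):
    # Hypothetical custom autograd function, for illustration only.
    @staticmethod
    def forward(ctx, q, k, v, causal):
        # ctx.save_for_backward(q, k, v, causal)  # would raise the TypeError from issue #4
        ctx.save_for_backward(q, k, v)      # only tensors may be saved this way
        ctx.causal = causal                 # non-tensor flags go directly on ctx
        return q @ k.transpose(-1, -2) @ v  # placeholder computation, not real attention

    @staticmethod
    def backward(ctx, grad_out):
        q, k, v = ctx.saved_tensors
        # placeholder gradients; a real backward would recompute attention blockwise
        return torch.zeros_like(q), torch.zeros_like(k), torch.zeros_like(v), None

q, k, v = (torch.randn(2, 16, 8, requires_grad=True) for _ in range(3))
out = ScaledAttentionStub.apply(q, k, v, True)
out.sum().backward()
```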
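
Issue #3 quotes the RuntimeError raised by PyTorch's reentrant gradient checkpointing (which memory-efficient attention implementations typically rely on) whenever gradients are requested via torch.autograd.grad instead of a plain .backward(). A minimal sketch of the general behavior, assuming a recent PyTorch that exposes the use_reentrant flag and using a toy function rather than this library's attention:

```python
import torch
from torch.utils.checkpoint import checkpoint

def block(x):
    # stand-in for a checkpointed attention block
    return torch.tanh(x) * x

x = torch.randn(4, 8, requires_grad=True)

# Reentrant checkpointing (the historical default) fails under torch.autograd.grad:
# y = checkpoint(block, x, use_reentrant=True)
# torch.autograd.grad(y.sum(), x)  # RuntimeError: "Checkpointing is not compatible with .grad() ..."

# Non-reentrant checkpointing supports .grad() and backward(inputs=...):
y = checkpoint(block, x, use_reentrant=False)
(grad_x,) = torch.autograd.grad(y.sum(), x)
print(grad_x.shape)
```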