Support for Kaggle P100 GPU #4

Closed · pchankh opened this issue Jun 3, 2022 · 1 comment

Comments
@pchankh

pchankh commented Jun 3, 2022

First and foremost, congratulations on the great R&D work here. Are there any plans to support Kaggle GPUs (the P100)?
I suspect there will be a great deal of interest from the community in testing this.

Cheers
Dr.Patrick

@tridao
Contributor

tridao commented Jun 3, 2022

Thanks for the interest!
We're currently prioritizing support for Turing (T4) & Volta (V100) architectures as mentioned in the roadmap.
We've been talking to the xformers team and they're also working on something similar (targeting fp32 instead of fp16).
There's a chance we might be able to borrow what they have to support P100 eventually.
facebookresearch/xformers#267
facebookresearch/xformers#281
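
The architecture question above comes down to CUDA compute capability: the Kaggle P100 is Pascal (sm 6.0), which lacks the fp16 tensor cores that Volta (7.0), Turing (7.5), and Ampere (8.0) provide. As a minimal illustrative sketch (not part of FlashAttention's actual device check, and with a support table assumed from this thread), one could map capabilities to the statuses discussed:

```python
# Illustrative sketch only: rough support statuses inferred from this issue
# thread, NOT the project's real dispatch logic. Keys are CUDA compute
# capabilities as (major, minor) tuples.
SUPPORT = {
    (8, 0): "supported (Ampere, e.g. A100)",
    (7, 5): "on the roadmap (Turing, e.g. T4)",
    (7, 0): "on the roadmap (Volta, e.g. V100)",
    (6, 0): "not yet (Pascal, e.g. P100 -- no fp16 tensor cores; "
            "would need an fp32 path like the xformers work)",
}

def describe_support(major: int, minor: int) -> str:
    """Return a rough support status for a given compute capability."""
    return SUPPORT.get((major, minor), "unknown")

# The Kaggle P100 reports compute capability 6.0:
print(describe_support(6, 0))
```

In a real environment one would obtain the `(major, minor)` pair at runtime, e.g. via PyTorch's `torch.cuda.get_device_capability()`, rather than hard-coding it.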

@tridao tridao closed this as completed Nov 10, 2022
groenenboomj referenced this issue in ROCm/flash-attention Feb 17, 2023
kuizhiqing pushed a commit to kuizhiqing/flash-attention that referenced this issue May 26, 2023
* add for alpha_fold2
* add some extra setting
* fix some bugs
* fix some changes
* fix some bugs 2nd
* Add another initialization of Gmem_tile_qkv and Gmem_tile_o
* add some compensation for try..catch
* fix mistake in flash_attn_fwd
* commit for code style and bug check
* fix some bugs for flash_attn_with_bias-mask
* add more print for pointer debug
* add some bug test cases
* backward function
* fix bugs
* make some changes for backward
* Fix compiling error.
* quote all printf debug
* quote all printf debug and fix interface error
* quote all printf debug and fix interface error, fix typo
* remove all printf
* split files
* remove useless debug code
* split fwd and bwd execution function
* split fwd and bwd execution function
* remove useless codes
* remove useless codes
* remove useless codes 3rd time
* remove useless codes 4th time
* Fix compiling error.
* Remove const.
njhill pushed a commit to njhill/flash-attention that referenced this issue Sep 27, 2024