Releases: kingbri1/flash-attention
v2.4.2
In line with the parent repo's tag.
Built for CUDA 12.x and PyTorch 2.1.2 and 2.2.
v2.4.3 and up cannot be built on Windows at this time.
v2.4.1
Add Windows workflows
2.3.3-windows
In parity with the original upstream tag.
Built with PyTorch 2.1.1 and CUDA 12.2. This wheel should work with PyTorch 2.1+ and CUDA 12+.
Full Changelog: https://github.com/bdashore3/flash-attention/commits/2.3.3
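The compatibility claim above (PyTorch 2.1+, CUDA 12+) can be sanity-checked before installing a wheel. The snippet below is an illustrative sketch, not part of this repo: it compares version strings against those stated minimums using only the standard library, and the helper names (`parse_version`, `wheel_compatible`) are hypothetical.

```python
# Illustrative sketch (not part of this repo): check that locally
# installed PyTorch and CUDA versions meet the wheel's stated minimums
# (PyTorch 2.1+, CUDA 12+) before attempting to install it.

def parse_version(version: str) -> tuple:
    """Turn a version string like '2.1.1+cu122' into (2, 1, 1),
    ignoring any local '+...' suffix and non-numeric parts."""
    core = version.split("+")[0]
    return tuple(int(part) for part in core.split(".") if part.isdigit())

def wheel_compatible(torch_version: str, cuda_version: str) -> bool:
    """True if the versions satisfy PyTorch >= 2.1 and CUDA >= 12."""
    return (parse_version(torch_version) >= (2, 1)
            and parse_version(cuda_version) >= (12,))

# Example checks against versions mentioned in the release notes:
print(wheel_compatible("2.1.1+cu122", "12.2"))  # True
print(wheel_compatible("2.0.1", "12.1"))        # False (PyTorch too old)
```

In practice the torch version comes from `torch.__version__` and the CUDA version from `torch.version.cuda`; tuple comparison handles versions of different lengths correctly (e.g. `(12, 2) >= (12,)`).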
2.3.2-windows
CUDA 12.1 only. Please see the original repo for more information.
2.3.2-2-windows
Updated wheels to CUDA 12.2 and 12.1 versions. The 12.2 wheel is backward compatible with CUDA 12.1 (tested on my 3090 Ti system).
2.3.2-1-windows
Tests a "unified" wheel.