
Releases: Dao-AILab/flash-attention

v2.4.1 (24 Dec 05:01)
Bump to v2.4.1

v2.4.0.post1 (22 Dec 18:10)
[CI] Don't compile for python 3.7 pytorch 2.2

v2.4.0 (22 Dec 08:10)
Bump to v2.4.0

v2.3.6 (28 Nov 00:24)
Bump to v2.3.6

v2.3.5 (27 Nov 03:09)
Bump to v2.3.5

v2.3.4 (20 Nov 07:22)
Bump to v2.3.4

v2.3.3 (24 Oct 07:24)
Bump to v2.3.3

v2.3.2 (09 Oct 00:22)
Bump to v2.3.2

v2.3.1.post1 (04 Oct 05:21)
[CI] Use official Pytorch 2.1, add CUDA 11.8 for Pytorch 2.1

v2.3.1 (04 Oct 02:57)
Bump to v2.3.1
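
Since each tag above corresponds to a published flash-attn package release, a project can pin one of these versions and verify at runtime that the pinned release is actually the one imported. The snippet below is a minimal sketch, not from this release page: it assumes the package is installed under the name flash-attn (e.g. via pip with a version pin such as flash-attn==2.4.1) and that the flash_attn module exposes a __version__ attribute.

    # Minimal sketch: confirm the imported flash-attn matches the pinned release.
    # Assumes installation via e.g. `pip install flash-attn==2.4.1` and that
    # flash_attn exposes __version__ (an assumption about the package layout).
    import flash_attn

    expected = "2.4.1"  # release tag being pinned, without the leading "v"
    actual = flash_attn.__version__
    assert actual == expected, f"expected flash-attn {expected}, got {actual}"
    print(f"flash-attn {actual} is installed")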