Releases · Dao-AILab/flash-attention
v2.4.1
Bump to v2.4.1
v2.4.0.post1
[CI] Don't compile for Python 3.7 with PyTorch 2.2
v2.4.0
Bump to v2.4.0
v2.3.6
Bump to v2.3.6
v2.3.5
Bump to v2.3.5
v2.3.4
Bump to v2.3.4
v2.3.3
Bump to v2.3.3
v2.3.2
Bump to v2.3.2
v2.3.1.post1
[CI] Use official PyTorch 2.1, add CUDA 11.8 for PyTorch 2.1
v2.3.1
Bump to v2.3.1