Releases · tridao/flash-attention-wheels
v2.0.6.post3
Switch to CUTLASS 3.0
v2.0.6.post2
Only enable torch 2.1.0.dev and CUDA 12.1
v2.0.6.post1
Update FlashAttention to v2.0.6 for testing
v2.0.8.post17
Set MAX_JOBS=1
v2.0.8.post16
Don't set --threads 4
v2.0.8.post15
Set MAX_JOBS=2
v2.0.8.post14
Set MAX_JOBS=3
v2.0.8.post13
Set MAX_JOBS=4
v2.0.8.post9
Use torch 2.1.0.dev20230731
v2.0.8.post8
Don't use the manylinux Docker image
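
Several of these releases only tweak build parallelism: MAX_JOBS is the environment variable that torch.utils.cpp_extension reads to cap the number of parallel ninja compile jobs, and --threads is nvcc's flag for running compilation steps of a single file in parallel. Both trade build speed against peak memory on CI runners, which is why the wheel builds step MAX_JOBS down from 4 to 1. Below is a minimal, hypothetical sketch (not this repo's actual setup.py) of where MAX_JOBS takes effect in a torch CUDA extension build; the extension name and source file are placeholders.

```python
# Hypothetical sketch of a torch CUDA extension build; not this repo's
# actual setup.py. torch.utils.cpp_extension reads MAX_JOBS at build time
# to cap how many ninja compile jobs run in parallel, bounding peak memory.
import os

os.environ.setdefault("MAX_JOBS", "1")  # serialize compilation on small CI runners

from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

setup(
    name="example_ext",  # placeholder package name
    ext_modules=[
        CUDAExtension(
            name="example_ext",          # placeholder extension name
            sources=["example_ext.cu"],  # placeholder CUDA source
            # extra_compile_args={"nvcc": ["--threads", "4"]},  # the nvcc
            # flag toggled off in v2.0.8.post16
        )
    ],
    cmdclass={"build_ext": BuildExtension},
)
```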