Releases: tridao/flash-attention-wheels

v2.0.6.post3 (14 Aug 15:56)
Switch to CUTLASS 3.0

v2.0.6.post2 (14 Aug 15:35)
Only enable builds for torch 2.1.0.dev and CUDA 12.1

v2.0.6.post1 (14 Aug 15:32)
Update FlashAttention to v2.0.6 for testing

v2.0.8.post17 (13 Aug 16:40)
Set MAX_JOBS=1
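
For context: MAX_JOBS is the environment variable that PyTorch's extension builder (torch.utils.cpp_extension) reads to cap how many parallel compiler jobs ninja launches. This entry and the post15 through post13 entries below step it from 4 down to 1, presumably to curb peak memory, since each parallel nvcc job compiling the FlashAttention kernels can need several gigabytes of RAM on a CI runner. A minimal sketch of the convention follows; the fallback value and the print are illustrative:

```python
import multiprocessing
import os

# Sketch of the MAX_JOBS convention honored by torch.utils.cpp_extension:
# MAX_JOBS=1 forces serial compilation, while leaving it unset typically
# falls back to the CPU count, which can exhaust RAM when every job is an
# nvcc instance instantiating large CUDA templates.
max_jobs = int(os.environ.get("MAX_JOBS", multiprocessing.cpu_count()))
print(f"compiling with {max_jobs} parallel job(s)")
```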

v2.0.8.post16 (13 Aug 15:50)
Don't set --threads 4
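
For reference: --threads N is an nvcc option that runs a compilation's separate -gencode architecture targets in parallel threads; stacked on top of ninja-level parallelism it multiplies peak build memory. A hedged sketch of the kind of flag list involved, with illustrative architectures:

```python
# Illustrative nvcc flag list for a multi-architecture build. Appending
# "--threads", "4" would let one nvcc invocation compile its -gencode
# targets in parallel, at the cost of a correspondingly higher memory
# peak; this release leaves it out.
nvcc_flags = [
    "-O3",
    "-gencode", "arch=compute_80,code=sm_80",
    "-gencode", "arch=compute_90,code=sm_90",
]
print(" ".join(nvcc_flags))
```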

v2.0.8.post15 (13 Aug 07:36)
Set MAX_JOBS=2

v2.0.8.post14 (13 Aug 07:15)
Set MAX_JOBS=3

v2.0.8.post13 (13 Aug 06:55)
Set MAX_JOBS=4

v2.0.8.post9 (12 Aug 06:51)
Use torch 2.1.0.dev20230731

v2.0.8.post8 (12 Aug 06:02)
Don't use the manylinux Docker image