Releases: Dao-AILab/flash-attention

v1.0.0 (12 Apr 06:34)
Bump version to 1.0.0

v0.2.8 (19 Jan 21:19)
Bump to v0.2.8

v0.2.7 (07 Jan 01:38)
Bump to v0.2.7

v0.2.6 (28 Dec 06:08)
Bump to v0.2.6

v0.2.5 (21 Dec 22:33)
Bump to v0.2.5

v0.2.4 (14 Dec 22:52)
Bump to v0.2.4

v0.2.3 (13 Dec 09:39)
Bump to v0.2.3

v0.2.2 (26 Nov 00:30)
Speed up compilation by splitting into separate .cu files
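The v0.2.2 change can be illustrated with a minimal build-config sketch. This is a hypothetical setup.py, not the repository's actual build script, and every file name in it is invented for illustration; it only shows the general technique of listing several smaller .cu files (e.g. one per kernel variant) so the extension build can compile them in parallel and recompile only the files that changed, instead of rebuilding one monolithic translation unit.

```python
# Hypothetical setup.py sketch of the "split into separate .cu files" idea.
# All paths and names below are invented for illustration.
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

setup(
    name="flash_attn_example",
    ext_modules=[
        CUDAExtension(
            name="flash_attn_cuda_example",
            sources=[
                "csrc/api.cpp",
                # One .cu per kernel variant instead of a single large .cu,
                # so each file compiles independently (and in parallel):
                "csrc/fwd_hdim64.cu",
                "csrc/fwd_hdim128.cu",
                "csrc/bwd_hdim64.cu",
                "csrc/bwd_hdim128.cu",
            ],
        )
    ],
    # BuildExtension uses ninja when available, which compiles the
    # individual .cu files concurrently.
    cmdclass={"build_ext": BuildExtension},
)
```

The payoff is largest for template-heavy CUDA code, where each explicit instantiation is expensive for nvcc: splitting instantiations across files turns one long serial compile into several shorter parallel ones.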

v0.2.1 (21 Nov 06:36)
Bump version to 0.2.1