Tags
Tags mark specific points in the repository's history as important.
This project is mirrored from https://github.com/fla-org/flash-linear-attention.git. Pull mirroring last updated successfully on Oct 14, 2025.
v0.3.2 · f7d95fa0 · Bump fla to `v0.3.2` · Sep 10, 2025
v0.3.1 · 80acaebc · [Mamba] Fix errors in Triton backend (#576) · Aug 26, 2025
v0.3.0 · 17dd5662 · [README] Update model list · Jul 14, 2025
v0.2.2 · a46f204c · [ShortConv] Add Triton kernels for inference · Jun 05, 2025
v0.2.1 · a670dff4 · Bump `fla` to v0.2.1 · Apr 24, 2025
v0.2.0 · 6bfd5e67 · [Token Mixing] Remove the `head_first` arg from token mixing layers (#347) · Apr 12, 2025
v0.1.2 · 53b3ac7e · Bump `fla` to v0.1.2 (#264) · Mar 31, 2025
v0.1.1 · 09fe6c2c · Bump `fla` to v0.1.1 · Mar 24, 2025
v0.1.0 · cb0d1bb0 · [Misc.] Use pkg name `flash-linear-attention` · Mar 20, 2025