| Commit message | Author | Date | Files | Lines |
|---|---|---|---|---|
| Flash-Attn upgrade / SoftCap Candle-FlashAttn [1/n] (#2688) | Michael Feil | 2024-12-31 | 1 | -6/+9 |
| Update the flash attn kernels. (#2333) | Laurent Mazare | 2024-07-15 | 1 | -161/+263 |
| Use flash-attn in gemma. (#2195) | Laurent Mazare | 2024-05-18 | 1 | -0/+4 |
| chore: update flash attention kernels (#1518) | OlivierDehaene | 2024-01-05 | 1 | -30/+33 |
| Add flash attention (#241) | Laurent Mazare | 2023-07-26 | 1 | -0/+251 |