Flash Attention v2 does not support Turing GPUs (T4, RTX 2080). In the meantime, this layer can be used as a drop-in replacement for the official flash attention Candle layer.
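Below is a minimal sketch of what swapping this crate in might look like. It assumes the crate is imported as `candle_flash_attn_v1` and that it exposes a `flash_attn` function mirroring the signature of the official `candle_flash_attn::flash_attn` (q/k/v tensors of shape `(batch, seq_len, num_heads, head_dim)` in f16 on a CUDA device, a softmax scale, and a causal flag); check the crate's source for the exact API.

```rust
use candle_core::{DType, Device, Result, Tensor};

/// Attention wrapper: same call shape as the official flash attention layer,
/// but routed through the v1 kernel so it runs on Turing GPUs.
fn attention(q: &Tensor, k: &Tensor, v: &Tensor, softmax_scale: f32) -> Result<Tensor> {
    // Assumed entry point; the official candle-flash-attn crate exposes
    // `flash_attn(q, k, v, softmax_scale, causal)` and this sketch assumes
    // candle-flash-attn-v1 mirrors it.
    candle_flash_attn_v1::flash_attn(q, k, v, softmax_scale, /* causal = */ true)
}

fn main() -> Result<()> {
    let device = Device::new_cuda(0)?;
    let (b, s, h, d) = (1, 128, 8, 64);
    // Flash attention kernels expect half precision inputs on a CUDA device.
    let q = Tensor::randn(0f32, 1., (b, s, h, d), &device)?.to_dtype(DType::F16)?;
    let k = Tensor::randn(0f32, 1., (b, s, h, d), &device)?.to_dtype(DType::F16)?;
    let v = Tensor::randn(0f32, 1., (b, s, h, d), &device)?.to_dtype(DType::F16)?;
    let out = attention(&q, &k, &v, (d as f32).powf(-0.5))?;
    println!("{:?}", out.dims()); // expected: [1, 128, 8, 64]
    Ok(())
}
```

On Ampere or newer GPUs, the same call site can point back at the official flash attention layer; only the crate behind the `flash_attn` call needs to change.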
License: dual-licensed under Apache-2.0 (LICENSE-APACHE) or MIT (LICENSE-MIT).