author    SebastianRueClausen <51479502+SebastianRueClausen@users.noreply.github.com>  2024-01-12 17:47:07 +0100
committer GitHub <noreply@github.com>  2024-01-12 17:47:07 +0100
commit    a46864bd5650c4707753f3d95d7b4ff6b0905995 (patch)
tree      7806fc222a27773dabd19acf44d31dcc3bb842fc /README.md
parent    bafe95b660048999a3bb000b3509d04fb1bb1789 (diff)
download  candle-a46864bd5650c4707753f3d95d7b4ff6b0905995.tar.gz
          candle-a46864bd5650c4707753f3d95d7b4ff6b0905995.tar.bz2
          candle-a46864bd5650c4707753f3d95d7b4ff6b0905995.zip
Fix "Minimal Mamba" link in README. (#1577)
Diffstat (limited to 'README.md')
-rw-r--r--  README.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
@@ -66,7 +66,7 @@ We also provide a some command line based examples using state of the art models
 - [Phi-1, Phi-1.5, and Phi-2](./candle-examples/examples/phi/): 1.3b and 2.7b general LLMs with performance on par with LLaMA-v2 7b.
 - [StableLM-3B-4E1T](./candle-examples/examples/stable-lm/): a 3b general LLM pre-trained on 1T tokens of English and code datasets.
-- [Minimal Mamba](./candle-examples/examples/minimal-mamba/): a minimal
+- [Minimal Mamba](./candle-examples/examples/mamba-minimal/): a minimal
 implementation of the Mamba state space model.
 - [Mistral7b-v0.1](./candle-examples/examples/mistral/): a 7b general LLM with better performance than all publicly available 13b models as of 2023-09-28.