author    SebastianRueClausen <51479502+SebastianRueClausen@users.noreply.github.com>  2024-01-12 17:47:07 +0100
committer GitHub <noreply@github.com>  2024-01-12 17:47:07 +0100
commit    a46864bd5650c4707753f3d95d7b4ff6b0905995 (patch)
tree      7806fc222a27773dabd19acf44d31dcc3bb842fc /README.md
parent    bafe95b660048999a3bb000b3509d04fb1bb1789 (diff)
download  candle-a46864bd5650c4707753f3d95d7b4ff6b0905995.tar.gz
          candle-a46864bd5650c4707753f3d95d7b4ff6b0905995.tar.bz2
          candle-a46864bd5650c4707753f3d95d7b4ff6b0905995.zip
Fix "Minimal Mamba" link in README. (#1577)
Diffstat (limited to 'README.md')
-rw-r--r--  README.md | 2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/README.md b/README.md
index c4f27548..14172742 100644
--- a/README.md
+++ b/README.md
@@ -66,7 +66,7 @@ We also provide a some command line based examples using state of the art models
- [Phi-1, Phi-1.5, and Phi-2](./candle-examples/examples/phi/): 1.3b and 2.7b general LLMs with performance on par with LLaMA-v2 7b.
- [StableLM-3B-4E1T](./candle-examples/examples/stable-lm/): a 3b general LLM
pre-trained on 1T tokens of English and code datasets.
-- [Minimal Mamba](./candle-examples/examples/minimal-mamba/): a minimal
+- [Minimal Mamba](./candle-examples/examples/mamba-minimal/): a minimal
implementation of the Mamba state space model.
- [Mistral7b-v0.1](./candle-examples/examples/mistral/): a 7b general LLM with
better performance than all publicly available 13b models as of 2023-09-28.