From a46864bd5650c4707753f3d95d7b4ff6b0905995 Mon Sep 17 00:00:00 2001
From: SebastianRueClausen <51479502+SebastianRueClausen@users.noreply.github.com>
Date: Fri, 12 Jan 2024 17:47:07 +0100
Subject: Fix "Minimal Mamba" link in README. (#1577)

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index c4f27548..14172742 100644
--- a/README.md
+++ b/README.md
@@ -66,7 +66,7 @@ We also provide a some command line based examples using state of the art models
 - [Phi-1, Phi-1.5, and Phi-2](./candle-examples/examples/phi/): 1.3b and 2.7b general LLMs with performance on par with LLaMA-v2 7b.
 - [StableLM-3B-4E1T](./candle-examples/examples/stable-lm/): a 3b general LLM pre-trained on 1T tokens of English and code datasets.
-- [Minimal Mamba](./candle-examples/examples/minimal-mamba/): a minimal
+- [Minimal Mamba](./candle-examples/examples/mamba-minimal/): a minimal
   implementation of the Mamba state space model.
 - [Mistral7b-v0.1](./candle-examples/examples/mistral/): a 7b general LLM
   with better performance than all publicly available 13b models as of
   2023-09-28.