author    Laurent Mazare <laurent.mazare@gmail.com> 2023-10-06 21:26:04 +0100
committer GitHub <noreply@github.com> 2023-10-06 21:26:04 +0100
commit    955e00b2e895c6e53f9348b1dd672ee4ebcf8f53 (patch)
tree      72645b09fb60c7b8629c7d92ded57225e5f19a79 /README.md
parent    d5f7267087bc253a2fe93c95ae78a164053646c1 (diff)
Add to the readmes for stable-lm. (#1047)
Diffstat (limited to 'README.md')
-rw-r--r--  README.md  3
1 file changed, 3 insertions(+), 0 deletions(-)
diff --git a/README.md b/README.md
index 64e1c451..af308ede 100644
--- a/README.md
+++ b/README.md
@@ -62,6 +62,8 @@ We also provide some command line based examples using state of the art models
- [LLaMA and LLaMA-v2](./candle-examples/examples/llama/): general LLM.
- [Falcon](./candle-examples/examples/falcon/): general LLM.
- [Phi-v1.5](./candle-examples/examples/phi/): a 1.3b general LLM with performance on par with LLaMA-v2 7b.
+- [StableLM-3B-4E1T](./candle-examples/examples/stable-lm/): a 3b general LLM
+ pre-trained on 1T tokens of English and code datasets.
- [Mistral7b-v0.1](./candle-examples/examples/mistral/): a 7b general LLM with
performance larger than all publicly available 13b models as of 2023-09-28.
- [StarCoder](./candle-examples/examples/bigcode/): LLM specialized to code generation.
@@ -152,6 +154,7 @@ If you have an addition to this list, please submit a pull request.
- StarCoder.
- Phi v1.5.
- Mistral 7b v0.1.
+ - StableLM-3B-4E1T.
- T5.
- Bert.
- Whisper (multi-lingual support).