Diffstat (limited to 'candle-examples/examples/quantized-t5/README.md')
-rw-r--r-- | candle-examples/examples/quantized-t5/README.md | 25 |
1 file changed, 23 insertions, 2 deletions
diff --git a/candle-examples/examples/quantized-t5/README.md b/candle-examples/examples/quantized-t5/README.md
index 4a1ee5bf..8b8179eb 100644
--- a/candle-examples/examples/quantized-t5/README.md
+++ b/candle-examples/examples/quantized-t5/README.md
@@ -1,5 +1,7 @@
 # candle-quantized-t5
 
+## Seq2Seq example
+
 This example uses a quantized version of the t5 model.
 
 ```bash
@@ -8,6 +10,8 @@ $ cargo run --example quantized-t5 --release -- --prompt "translate to German: A
 Eine schöne Kerze.
 ```
 
+## Generating Quantized weight files
+
 The weight file is automatically retrieved from the hub. It is also possible to
 generate quantized weight files from the original safetensors file by using the
 `tensor-tools` command line utility via:
@@ -16,8 +20,11 @@ generate quantized weight files from the original safetensors file by using the
 $ cargo run --example tensor-tools --release -- quantize --quantization q6k PATH/TO/T5/model.safetensors /tmp/model.gguf
 ```
 
-To use a different model, specify the `model-id`. For example, you can use
-quantized [CoEdit models](https://huggingface.co/jbochi/candle-coedit-quantized).
+## Using custom models
+
+To use a different model, specify the `model-id`.
+
+For example, for text editing, you can use quantized [CoEdit models](https://huggingface.co/jbochi/candle-coedit-quantized).
 
 ```bash
 $ cargo run --example quantized-t5 --release -- \
@@ -26,6 +33,7 @@ $ cargo run --example quantized-t5 --release -- \
   --temperature 0
 ...
 Although their flight is weak, they run quickly through the tree canopy.
+```
 
 By default, it will look for `model.gguf` and `config.json`, but you can specify
 custom local or remote `weight-file` and `config-file`s:
@@ -40,3 +48,16 @@ cargo run --example quantized-t5 --release -- \
 ...
 Note that a storm surge is what forecasters consider a hurricane's most dangerous part.
 ```
+
+### [MADLAD-400](https://arxiv.org/abs/2309.04662)
+
+MADLAD-400 is a series of multilingual machine translation T5 models trained on 250 billion tokens covering over 450 languages using publicly available data. These models are competitive with significantly larger models.
+
+```bash
+cargo run --example quantized-t5 --release -- \
+  --model-id "jbochi/madlad400-3b-mt" --weight-file "model-q4k.gguf" \
+  --prompt "<2de> How are you, my friend?" \
+  --temperature 0
+...
+ Wie geht es dir, mein Freund?
+```
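
The quantize-then-run workflow this README change documents can be sketched as a small shell helper. This is an illustrative dry run that only prints the cargo commands rather than executing them; the helper names `quantize` and `run_quantized_t5` are hypothetical, the `/tmp/model.gguf` output path and `q6k` level come from the README, and `PATH/TO/T5/model.safetensors` is the README's own placeholder:

```shell
#!/bin/sh
# Sketch of the workflow described in the diff above: quantize an original
# safetensors file with tensor-tools, then point quantized-t5 at the result.
# Dry run only: each helper prints the cargo command instead of executing it.
set -eu

quantize() {
  # $1: source safetensors file, $2: output gguf file (q6k, as in the README)
  echo "cargo run --example tensor-tools --release -- quantize --quantization q6k $1 $2"
}

run_quantized_t5() {
  # $1: weight file, $2: prompt
  echo "cargo run --example quantized-t5 --release -- --weight-file $1 --prompt \"$2\" --temperature 0"
}

quantize "PATH/TO/T5/model.safetensors" "/tmp/model.gguf"
run_quantized_t5 "/tmp/model.gguf" "translate to German: A beautiful candle."
```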