author    Patrick von Platen <patrick.v.platen@gmail.com>  2023-08-23 08:52:53 +0000
committer Patrick von Platen <patrick.v.platen@gmail.com>  2023-08-23 08:52:53 +0000
commit 7c0ca80d3a4c238543cc705400643bf05d474007 (patch)
tree   7fef9eed02b4389a631814f452a91a78a23d96fd /candle-book
parent b558d08b85f07a30cc29da00bcd2dcd2fddfc1e7 (diff)
move installation to book
Diffstat (limited to 'candle-book')
-rw-r--r--  candle-book/src/guide/installation.md  40
1 file changed, 40 insertions(+), 0 deletions(-)
diff --git a/candle-book/src/guide/installation.md b/candle-book/src/guide/installation.md
index d2086e0c..69752391 100644
--- a/candle-book/src/guide/installation.md
+++ b/candle-book/src/guide/installation.md
@@ -1,5 +1,44 @@
# Installation
+- **With CUDA support**:
+
+1. First, make sure that CUDA is correctly installed.
+- `nvcc --version` should print information about your CUDA compiler driver.
+- `nvidia-smi --query-gpu=compute_cap --format=csv` should print your GPU's compute capability, e.g. something
+like:
+```
+compute_cap
+8.9
+```
+
+If either of the above commands errors out, make sure your CUDA installation is up to date.
+
+2. Create a new app and add [`candle-core`](https://github.com/huggingface/candle/tree/main/candle-core) with CUDA support
+
+Start by creating a new cargo project:
+
+```bash
+cargo new myapp
+cd myapp
+```
+
+Make sure to add the `candle-core` crate with the `cuda` feature enabled:
+
+```bash
+cargo add --git https://github.com/huggingface/candle.git candle-core --features "cuda"
+```
+
+Run `cargo build` to make sure everything can be correctly built:
+
+```bash
+cargo build
+```
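+
+To check that the GPU is actually picked up, the following minimal `main.rs` is a sketch you can try; it assumes the `Device::new_cuda`, `Tensor::randn` and `matmul` APIs shown below, so adjust to the current `candle-core` API if needed:
+
+```rust
+use candle_core::{Device, Tensor};
+
+fn main() -> Result<(), Box<dyn std::error::Error>> {
+    // Allocate tensors directly on the first CUDA device.
+    let device = Device::new_cuda(0)?;
+    let a = Tensor::randn(0f32, 1.0, (2, 3), &device)?;
+    let b = Tensor::randn(0f32, 1.0, (3, 4), &device)?;
+    // A small matmul is enough to exercise the GPU kernels.
+    let c = a.matmul(&b)?;
+    println!("{c}");
+    Ok(())
+}
+```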
+
+- **Without CUDA support**:
+
+Create a new app and add [`candle-core`](https://github.com/huggingface/candle/tree/main/candle-core) as follows:
+
+
Start by creating a new app:
```bash
@@ -20,5 +59,6 @@ You can check everything works properly:
cargo build
```
+- **With mkl support**:
You can also enable the `mkl` feature, which may give faster inference on CPU. See [Using mkl](./advanced/mkl.md).
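+
+Assuming the `mkl` feature flag is exposed the same way as `cuda` (not shown in this diff), enabling it would look like:
+
+```bash
+# Hypothetical: add candle-core with the mkl feature enabled.
+cargo add --git https://github.com/huggingface/candle.git candle-core --features "mkl"
+```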