author | zachcp <zachcp@users.noreply.github.com> | 2024-11-15 02:30:15 -0500 |
---|---|---|
committer | GitHub <noreply@github.com> | 2024-11-15 08:30:15 +0100 |
commit | f689ce5d39c6f1475dfc71503288ea2905c8f685 (patch) | |
tree | 10b35ae68f1f5683edfebdcf92970de78ba05283 /candle-transformers/src/models/mixtral.rs | |
parent | 0ed24b9852ccc7dfb92d555afba3d56c2a3f3224 (diff) | |
Documentation Pass for Models (#2617)
* links in chinese_clip
* links for clip model
* add mod docs for flux and llava
* module doc for MMDIT and MIMI
* add docs for a few more models
* mod docs for bert naser and beit
* add module docs for convmixer colpali codegeex and chatglm
* add another series of moddocs
* add fastvit-llama2_c
* module docs mamba -> mobileone
* module docs from moondream-phi3
* mod docs for quantized and qwen
* update to yi
* fix long names
* Update llama2_c.rs
* Update llama2_c_weights.rs
* Fix the link for mimi + tweaks
---------
Co-authored-by: Laurent Mazare <laurent.mazare@gmail.com>
Diffstat (limited to 'candle-transformers/src/models/mixtral.rs')
-rw-r--r-- | candle-transformers/src/models/mixtral.rs | 17 |
1 file changed, 17 insertions, 0 deletions
diff --git a/candle-transformers/src/models/mixtral.rs b/candle-transformers/src/models/mixtral.rs
index a578d6fe..70115e10 100644
--- a/candle-transformers/src/models/mixtral.rs
+++ b/candle-transformers/src/models/mixtral.rs
@@ -1,3 +1,20 @@
+//! Mixtral Model, a sparse mixture of expert model based on the Mistral architecture
+//!
+//! See Mixtral model details at:
+//! - [Hugging Face](https://huggingface.co/docs/transformers/model_doc/mixtral)
+//! - [Mixtral-8x7B Blog Post](https://mistral.ai/news/mixtral-of-experts/)
+//!
+//! The model uses a mixture of experts architecture with:
+//! - 8 experts per layer
+//! - Top 2 expert routing
+//! - Sliding window attention
+//! - RoPE embeddings
+//!
+//! References:
+//! - [Hugging Face Implementation](https://github.com/huggingface/transformers/blob/main/src/transformers/models/mixtral/modeling_mixtral.py)
+//! - [Mixtral Blog Post](https://mistral.ai/news/mixtral-of-experts/)
+//!
+
 use crate::models::with_tracing::{linear_no_bias, Linear, RmsNorm};
 /// Mixtral Model
 /// https://github.com/huggingface/transformers/blob/main/src/transformers/models/mixtral/modeling_mixtral.py
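The module doc added above mentions "Top 2 expert routing". As a minimal sketch of what that routing step computes (this is plain illustrative Rust, not the candle implementation; the function name `top2_routing` and the example logits are hypothetical), each token's router logits are ranked, the two highest-scoring experts are selected, and their gate weights are obtained by a softmax over just those two logits:

```rust
// Hypothetical sketch of Mixtral-style top-2 expert routing.
// Given per-expert router logits for one token, select the two
// largest and renormalize their softmax weights so they sum to 1.

/// Returns ((best_expert, weight), (second_expert, weight)).
fn top2_routing(logits: &[f32]) -> ((usize, f32), (usize, f32)) {
    assert!(logits.len() >= 2, "need at least two experts");
    // Rank expert indices by descending logit.
    let mut idx: Vec<usize> = (0..logits.len()).collect();
    idx.sort_by(|&a, &b| logits[b].partial_cmp(&logits[a]).unwrap());
    let (i0, i1) = (idx[0], idx[1]);
    // Numerically stable softmax over only the two selected logits.
    let m = logits[i0].max(logits[i1]);
    let e0 = (logits[i0] - m).exp();
    let e1 = (logits[i1] - m).exp();
    let z = e0 + e1;
    ((i0, e0 / z), (i1, e1 / z))
}

fn main() {
    // 8 experts per layer, as in Mixtral-8x7B (logits are made up).
    let logits = [0.1, 2.0, -1.0, 0.5, 1.5, 0.0, -0.5, 0.3];
    let ((i0, w0), (i1, w1)) = top2_routing(&logits);
    println!("expert {i0} w={w0:.3}, expert {i1} w={w1:.3}");
    assert_eq!(i0, 1);
    assert_eq!(i1, 4);
    assert!((w0 + w1 - 1.0).abs() < 1e-6);
}
```

The token's output is then the weighted sum of the two selected experts' feed-forward outputs, which is what makes the model sparse: only 2 of the 8 expert MLPs run per token.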