| author | zachcp <zachcp@users.noreply.github.com> | 2024-11-15 02:30:15 -0500 |
|---|---|---|
| committer | GitHub <noreply@github.com> | 2024-11-15 08:30:15 +0100 |
| commit | f689ce5d39c6f1475dfc71503288ea2905c8f685 (patch) | |
| tree | 10b35ae68f1f5683edfebdcf92970de78ba05283 /candle-transformers/src/models/qwen2_moe.rs | |
| parent | 0ed24b9852ccc7dfb92d555afba3d56c2a3f3224 (diff) | |
Documentation Pass for Models (#2617)
* links in chinese_clip
* links for clip model
* add mod docs for flux and llava
* module doc for MMDIT and MIMI
* add docs for a few more models
* mod docs for bert naser and beit
* add module docs for convmixer colpali codegeex and chatglm
* add another series of mod docs
* add fastvit-llama2_c
* module docs mamba -> mobileone
* module docs from moondream-phi3
* mod docs for quantized and qwen
* update to yi
* fix long names
* Update llama2_c.rs
* Update llama2_c_weights.rs
* Fix the link for mimi + tweaks
---------
Co-authored-by: Laurent Mazare <laurent.mazare@gmail.com>
Diffstat (limited to 'candle-transformers/src/models/qwen2_moe.rs')
-rw-r--r-- | candle-transformers/src/models/qwen2_moe.rs | 18 |
1 file changed, 18 insertions, 0 deletions
```diff
diff --git a/candle-transformers/src/models/qwen2_moe.rs b/candle-transformers/src/models/qwen2_moe.rs
index 8d1d2f70..40e02797 100644
--- a/candle-transformers/src/models/qwen2_moe.rs
+++ b/candle-transformers/src/models/qwen2_moe.rs
@@ -1,3 +1,21 @@
+//! Qwen2 model implementation with Mixture of Experts support.
+//!
+//! Qwen2 is a large language model using sparse Mixture of Experts (MoE).
+//! This implementation provides support for sparsely activated MoE layers.
+//!
+//! Key characteristics:
+//! - Mixture of Experts architecture
+//! - Sparse expert activation
+//! - Shared expert routing mechanism
+//! - Grouped query attention (GQA)
+//! - RMSNorm for layer normalization
+//! - Rotary positional embeddings (RoPE)
+//!
+//! References:
+//! - [Qwen2 Paper](https://arxiv.org/abs/2401.08985)
+//! - [Model Card](https://huggingface.co/Qwen/Qwen2-7B-beta)
+//!
+
 use crate::models::with_tracing::{linear, linear_no_bias, Linear, RmsNorm};
 use candle::{DType, Device, Module, Result, Tensor, D};
 use candle_nn::{Activation, VarBuilder};
```
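The new module doc describes the MoE variant in prose only. For readers landing on this module, here is a minimal usage sketch. It assumes the `Config`/`Model` API that `qwen2_moe` shares with candle's other decoder-only models (`Model::new(&cfg, vb)` plus `forward(input_ids, seqlen_offset)`), that `Config` derives `serde::Deserialize`, and that `anyhow` is available for error handling; the file paths are placeholders, not files shipped with the crate.

```rust
use candle::{DType, Device, Tensor};
use candle_nn::VarBuilder;
use candle_transformers::models::qwen2_moe::{Config, Model};

fn main() -> anyhow::Result<()> {
    let device = Device::Cpu;

    // Placeholder paths: point these at a downloaded Qwen MoE checkpoint.
    // Assumes Config implements serde::Deserialize, as most candle configs do.
    let config: Config = serde_json::from_slice(&std::fs::read("config.json")?)?;
    let vb = unsafe {
        VarBuilder::from_mmaped_safetensors(&["model.safetensors"], DType::F32, &device)?
    };
    let mut model = Model::new(&config, vb)?;

    // Feed a short batch of token ids; `0` is the starting position offset
    // used by the rotary embeddings / KV cache.
    let tokens = Tensor::new(&[[1u32, 2, 3]], &device)?;
    let logits = model.forward(&tokens, 0)?;
    println!("logits shape: {:?}", logits.shape());
    Ok(())
}
```

In a generation loop one would call `forward` repeatedly, passing the newly sampled token and the running sequence offset, mirroring how the other candle decoder models are driven.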