path: root/candle-transformers/src/models/t5.rs
author    zachcp <zachcp@users.noreply.github.com>  2024-11-15 02:30:15 -0500
committer GitHub <noreply@github.com>               2024-11-15 08:30:15 +0100
commit    f689ce5d39c6f1475dfc71503288ea2905c8f685 (patch)
tree      10b35ae68f1f5683edfebdcf92970de78ba05283 /candle-transformers/src/models/t5.rs
parent    0ed24b9852ccc7dfb92d555afba3d56c2a3f3224 (diff)
Documentation Pass for Models (#2617)
* links in chinese_clip
* links for clip model
* add mod docs for flux and llava
* module doc for MMDIT and MIMI
* add docs for a few more models
* mod docs for bert naser and beit
* add module docs for convmixer colpali codegeex and chatglm
* add another series of moddocs
* add fastvit-llama2_c
* module docs mamba -> mobileone
* module docs from moondream-phi3
* mod docs for quantized and qwen
* update to yi
* fix long names
* Update llama2_c.rs
* Update llama2_c_weights.rs
* Fix the link for mimi + tweaks

---------

Co-authored-by: Laurent Mazare <laurent.mazare@gmail.com>
Diffstat (limited to 'candle-transformers/src/models/t5.rs')
-rw-r--r--  candle-transformers/src/models/t5.rs | 18
1 file changed, 16 insertions(+), 2 deletions(-)
diff --git a/candle-transformers/src/models/t5.rs b/candle-transformers/src/models/t5.rs
index 8ba0c1c1..9da0c1af 100644
--- a/candle-transformers/src/models/t5.rs
+++ b/candle-transformers/src/models/t5.rs
@@ -1,5 +1,19 @@
-// T5 Text Model
-// https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/modeling_t5.py
+//! T5 model implementation.
+//!
+//! T5 (Text-to-Text Transfer Transformer) is a unified text-to-text transformer model.
+//! This implementation follows the original model architecture.
+//!
+//! Key characteristics:
+//! - Text-to-text framework
+//! - Relative positional embeddings
+//! - T5-specific layer normalization
+//! - Encoder-decoder architecture
+//! - Support for sequence-to-sequence tasks
+//!
+//! References:
+//! - [T5 Paper](https://arxiv.org/abs/1910.10683)
+//! - [HuggingFace T5](https://huggingface.co/docs/transformers/model_doc/t5)
+//! - [GH Model](https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/modeling_t5.py)
 use crate::models::with_tracing::Embedding;
 use candle::{DType, Device, Module, Result, Tensor, D};
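The "relative positional embeddings" bullet in the new module docs refers to T5's bucketed relative attention bias: each (query, key) offset is mapped to one of a fixed number of buckets, with exact buckets for small distances and logarithmic binning for larger ones. As an illustrative sketch only (not part of this diff; a scalar simplification of the tensor logic in `t5.rs`, with the function name chosen here to mirror the upstream Python reference), the bucketing works roughly like this:

```rust
/// Map a relative position (key_pos - query_pos) to a bucket id in
/// `0..num_buckets`, in the style of T5's relative attention bias.
/// Small offsets get one bucket each; larger offsets share buckets
/// on a logarithmic scale up to `max_distance`.
fn relative_position_bucket(
    relative_position: i64,
    bidirectional: bool,
    num_buckets: i64,
    max_distance: i64,
) -> i64 {
    let mut num_buckets = num_buckets;
    let mut bucket = 0i64;
    let mut rel = relative_position;
    if bidirectional {
        // Split the buckets: lower half for rel <= 0, upper half for rel > 0.
        num_buckets /= 2;
        if rel > 0 {
            bucket += num_buckets;
        }
        rel = rel.abs();
    } else {
        // Decoder self-attention only looks backwards: clamp future offsets to 0.
        rel = -rel.min(0);
    }
    let max_exact = num_buckets / 2;
    if rel < max_exact {
        // One bucket per offset for nearby positions.
        bucket + rel
    } else {
        // Log-spaced buckets for distant positions, capped at the last bucket.
        let log_ratio = ((rel as f64) / (max_exact as f64)).ln()
            / ((max_distance as f64) / (max_exact as f64)).ln();
        let large = max_exact + (log_ratio * (num_buckets - max_exact) as f64) as i64;
        bucket + large.min(num_buckets - 1)
    }
}

fn main() {
    // Encoder self-attention is bidirectional; 32 buckets and
    // max_distance 128 are the usual T5 defaults.
    println!("{}", relative_position_bucket(0, true, 32, 128)); // 0
    println!("{}", relative_position_bucket(-3, true, 32, 128)); // 3
    println!("{}", relative_position_bucket(3, true, 32, 128)); // 19
    println!("{}", relative_position_bucket(-100, true, 32, 128)); // 15
}
```

Because the bias depends only on the bucket id, sequences of any length reuse the same small learned bias table, which is why T5 needs no absolute position embeddings.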