path: root/candle-transformers/src/models/bert.rs
Commit message                                                                  Author          Date        Files  Lines
* Change/bert encoder public (#2658)                                            Justin Sing     2024-12-04  1      -21/+30
* More Model Module Docs (#2623)                                                zachcp          2024-11-17  1      -50/+0
* Module Docs (#2620)                                                           zachcp          2024-11-16  1      -3/+56
* Documentation Pass for Models (#2617)                                         zachcp          2024-11-15  1      -0/+6
* Add BertForMaskedLM to support SPLADE Models (#2550)                          Akshay Ballal   2024-10-07  1      -0/+97
* Clippy fixes for 1.81.0. (#2461)                                              Laurent Mazare  2024-09-05  1      -3/+3
* Fix the device for the bert attention mask. (#2414)                           Laurent Mazare  2024-08-14  1      -1/+2
* bert attention mask (#1934)                                                   Zheng Li        2024-08-01  1      -17/+32
* Use candle_nn::embedding instead of local copies in a few models. (#1562)     Jani Monoses    2024-01-10  1      -6/+1
* Speed up bert with approx gelu (#1410)                                        Juarez Bochi    2023-12-06  1      -2/+4
* Share the layer-norm implementation. (#1248)                                  Laurent Mazare  2023-11-03  1      -56/+1
* Consolidate the with-tracing usage. (#1234)                                   Laurent Mazare  2023-11-01  1      -35/+1
* Add the jina-bert embeddings model. (#1187)                                   Laurent Mazare  2023-10-26  1      -2/+18
* Use the gelu-erf activation. (#969)                                           Laurent Mazare  2023-09-26  1      -3/+1
* Move some models to candle-transformers so that it's easier to re-use. (#794) Laurent Mazare  2023-09-10  1      -0/+568