Commit log for candle-transformers/src/models/bert.rs in forks/candle.git (branch: main)

Commit message                                                                  Author          Date        Files  Lines
Change/bert encoder public (#2658)                                              Justin Sing     2024-12-04  1      -21/+30
More Model Module Docs (#2623)                                                  zachcp          2024-11-17  1      -50/+0
Module Docs (#2620)                                                             zachcp          2024-11-16  1      -3/+56
Documentation Pass for Models (#2617)                                           zachcp          2024-11-15  1      -0/+6
Add BertForMaskedLM to support SPLADE Models (#2550)                            Akshay Ballal   2024-10-07  1      -0/+97
Clippy fixes for 1.81.0. (#2461)                                                Laurent Mazare  2024-09-05  1      -3/+3
Fix the device for the bert attention mask. (#2414)                             Laurent Mazare  2024-08-14  1      -1/+2
bert attention mask (#1934)                                                     Zheng Li        2024-08-01  1      -17/+32
Use candle_nn::embedding instead of local copies in a few models. (#1562)       Jani Monoses    2024-01-10  1      -6/+1
Speed up bert with approx gelu (#1410)                                          Juarez Bochi    2023-12-06  1      -2/+4
Share the layer-norm implementation. (#1248)                                    Laurent Mazare  2023-11-03  1      -56/+1
Consolidate the with-tracing usage. (#1234)                                     Laurent Mazare  2023-11-01  1      -35/+1
Add the jina-bert embeddings model. (#1187)                                     Laurent Mazare  2023-10-26  1      -2/+18
Use the gelu-erf activation. (#969)                                             Laurent Mazare  2023-09-26  1      -3/+1
Move some models to candle-transformers so that it's easier to re-use. (#794)   Laurent Mazare  2023-09-10  1      -0/+568
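
The log above is pure history, but for orientation: the file it tracks defines candle's BERT implementation, whose public surface grew from the initial move into candle-transformers (#794) to BertForMaskedLM (#2550) and a public encoder (#2658). Below is a minimal sketch of loading that model, following the pattern in candle's own bert example; the load_bert helper, the file-path arguments, and the anyhow/serde_json error handling are illustrative assumptions, not anything taken from this log.

    // Sketch: load the BertModel defined in bert.rs from local files.
    use candle_core::Device;
    use candle_nn::VarBuilder;
    use candle_transformers::models::bert::{BertModel, Config, DTYPE};

    fn load_bert(weights_path: &str, config_path: &str) -> anyhow::Result<BertModel> {
        let device = Device::Cpu;
        // config.json deserializes straight into the module's Config struct.
        let config: Config = serde_json::from_str(&std::fs::read_to_string(config_path)?)?;
        // Memory-map the safetensors weights at the module's default dtype (DTYPE = f32).
        let vb = unsafe { VarBuilder::from_mmaped_safetensors(&[weights_path], DTYPE, &device)? };
        // BertModel::load builds the embeddings and encoder from the loaded weights.
        Ok(BertModel::load(vb, &config)?)
    }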