forks/candle.git (branch: main)
path: root/candle-nn/src/ops.rs
Commit message | Author | Date | Files | Lines
Depth Anything v2 (#2279) | Jeroen Vlek | 2024-06-24 | 1 | -1/+22
Add the layernorm specialized op. (#2212) | Laurent Mazare | 2024-05-24 | 1 | -5/+253
Fix sigmoid gradient calculation and move sigmoid into a specialized op (#2114) | MilkFather | 2024-04-29 | 1 | -2/+186
Apply the cast before the scaling. (#2135) | Laurent Mazare | 2024-04-28 | 1 | -1/+1
RmsNorm kernel for metal. (#1895) | Laurent Mazare | 2024-03-21 | 1 | -1/+46
Custom op for RmsNorm (#1890) | Laurent Mazare | 2024-03-21 | 1 | -4/+167
add clone to candle dropout (#1814) | Kirpal Grewal | 2024-03-08 | 1 | -1/+1
Improve metal buffer usage (#1807) | ivarflakstad | 2024-03-07 | 1 | -1/+2
feat: add silu activation function (#1706) | OlivierDehaene | 2024-02-14 | 1 | -3/+2
Clippy pass. | Nicolas Patry | 2023-12-18 | 1 | -3/+3
Addressing a lot of comments. | Nicolas Patry | 2023-12-15 | 1 | -1/+2
Remove `unwrap()`. | Nicolas Patry | 2023-12-15 | 1 | -2/+2
Renamed all kernel names. | Nicolas Patry | 2023-12-15 | 1 | -3/+3
Fixing softmax. | Nicolas Patry | 2023-12-15 | 1 | -1/+1
Working with merging encoders and using fences. | Nicolas Patry | 2023-12-14 | 1 | -2/+0
Lots of updates including some stack of command buffers. | nicolas | 2023-12-12 | 1 | -1/+3
Starting to fix some tests. | Nicolas Patry | 2023-11-30 | 1 | -0/+40
Add the swiglu activation from the chatglm PR. (#1246) | Laurent Mazare | 2023-11-02 | 1 | -0/+5
Add hard-sigmoid and hard-swish activations (#1244) | jamjamjon | 2023-11-02 | 1 | -0/+5
Allow for different behavior between training and eval (#1213) | Laurent Mazare | 2023-10-29 | 1 | -0/+6
Fix the leaky relu. (#898) | Laurent Mazare | 2023-09-19 | 1 | -1/+2
Replication pad (#861) | Laurent Mazare | 2023-09-15 | 1 | -0/+15
DiffNeXt/unet (#859) | Laurent Mazare | 2023-09-15 | 1 | -0/+24
Add the upblocks. (#853) | Laurent Mazare | 2023-09-14 | 1 | -0/+4
Softmax implementation for cuda. (#747) | Laurent Mazare | 2023-09-05 | 1 | -8/+51
Tweaks to softmax. (#745) | Laurent Mazare | 2023-09-05 | 1 | -5/+3
Add a custom softmax implementation. (#744) | Laurent Mazare | 2023-09-05 | 1 | -1/+68
More robust tests (so that they pass on accelerate). (#679) | Laurent Mazare | 2023-08-30 | 1 | -4/+4
Add a Dropout layer (#676) | Laurent Mazare | 2023-08-30 | 1 | -0/+35
Implement group-norm. (#334) | Laurent Mazare | 2023-08-07 | 1 | -0/+6
Add a stable diffusion example (#328) | Laurent Mazare | 2023-08-06 | 1 | -0/+4
Softmax numerical stability. (#267) | Laurent Mazare | 2023-07-28 | 1 | -0/+24
Move some shared functions to the nn module. (#221) | Laurent Mazare | 2023-07-22 | 1 | -0/+10
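The log above includes "Softmax numerical stability. (#267)", which refers to the standard max-subtraction trick. As a minimal standalone sketch of that technique (not candle's actual implementation in ops.rs, which operates on tensors):

```rust
// Numerically stable softmax over a slice of logits.
// Subtracting the row maximum before exponentiating keeps exp()
// from overflowing, even for very large inputs.
fn stable_softmax(xs: &[f32]) -> Vec<f32> {
    let max = xs.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = xs.iter().map(|x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    // A naive exp-then-normalize would overflow to inf on these logits;
    // the shifted version stays finite and sums to 1.
    let probs = stable_softmax(&[1000.0, 1001.0, 1002.0]);
    println!("{probs:?}");
    let total: f32 = probs.iter().sum();
    assert!((total - 1.0).abs() < 1e-6);
    assert!(probs.iter().all(|p| p.is_finite()));
}
```

The shift changes nothing mathematically (softmax is invariant to adding a constant to every logit) but bounds the largest exponent at exp(0) = 1.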