path: root/candle-nn/src/ops.rs
Commit message | Author | Date | Files | Lines
* feat: add silu activation function (#1706) | OlivierDehaene | 2024-02-14 | 1 | -3/+2
* Clippy pass. | Nicolas Patry | 2023-12-18 | 1 | -3/+3
* Addressing a lot of comments. | Nicolas Patry | 2023-12-15 | 1 | -1/+2
* Remove `unwrap()`. | Nicolas Patry | 2023-12-15 | 1 | -2/+2
* Renamed all kernel names. | Nicolas Patry | 2023-12-15 | 1 | -3/+3
* Fixing softmax. | Nicolas Patry | 2023-12-15 | 1 | -1/+1
* Working with merging encoders and using fences. | Nicolas Patry | 2023-12-14 | 1 | -2/+0
* Lots of updates including some stack of command buffers. | nicolas | 2023-12-12 | 1 | -1/+3
* Starting to fix some tests. | Nicolas Patry | 2023-11-30 | 1 | -0/+40
* Add the swiglu activation from the chatglm PR. (#1246) | Laurent Mazare | 2023-11-02 | 1 | -0/+5
* Add hard-sigmoid and hard-swish activations (#1244) | jamjamjon | 2023-11-02 | 1 | -0/+5
* Allow for different behavior between training and eval (#1213) | Laurent Mazare | 2023-10-29 | 1 | -0/+6
* Fix the leaky relu. (#898) | Laurent Mazare | 2023-09-19 | 1 | -1/+2
* Replication pad (#861) | Laurent Mazare | 2023-09-15 | 1 | -0/+15
* DiffNeXt/unet (#859) | Laurent Mazare | 2023-09-15 | 1 | -0/+24
* Add the upblocks. (#853) | Laurent Mazare | 2023-09-14 | 1 | -0/+4
* Softmax implementation for cuda. (#747) | Laurent Mazare | 2023-09-05 | 1 | -8/+51
* Tweaks to softmax. (#745) | Laurent Mazare | 2023-09-05 | 1 | -5/+3
* Add a custom softmax implementation. (#744) | Laurent Mazare | 2023-09-05 | 1 | -1/+68
* More robust tests (so that they pass on accelerate). (#679) | Laurent Mazare | 2023-08-30 | 1 | -4/+4
* Add a Dropout layer (#676) | Laurent Mazare | 2023-08-30 | 1 | -0/+35
* Implement group-norm. (#334) | Laurent Mazare | 2023-08-07 | 1 | -0/+6
* Add a stable diffusion example (#328) | Laurent Mazare | 2023-08-06 | 1 | -0/+4
* Softmax numerical stability. (#267) | Laurent Mazare | 2023-07-28 | 1 | -0/+24
* Move some shared functions to the nn module. (#221) | Laurent Mazare | 2023-07-22 | 1 | -0/+10
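The softmax commits in this log (#267, #744, #745, #747) revolve around the standard max-subtraction trick for numerical stability: since softmax(x) = softmax(x - c) for any constant c, subtracting the row maximum keeps every exponent at or below exp(0) = 1 and avoids overflow on large logits. The sketch below is a minimal self-contained illustration of that trick, not candle's tensor implementation, which dispatches to custom CPU/CUDA/Metal kernels.

```rust
/// Numerically stable softmax over a single row of logits.
fn softmax(logits: &[f32]) -> Vec<f32> {
    // Subtract the row maximum so the largest exponent is exp(0) = 1.
    let max = logits.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = logits.iter().map(|&x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

fn main() {
    // Without the max subtraction, exp(1002.0) would overflow to inf
    // and every output would be NaN.
    println!("{:?}", softmax(&[1000.0, 1001.0, 1002.0]));
}
```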
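Several commits add activations to this file: silu (#1706), swiglu (#1246), hard-sigmoid and hard-swish (#1244), and a leaky relu fix (#898). As a rough guide to what these ops compute, here is a minimal scalar sketch; the real ops.rs versions operate on candle Tensors with dedicated kernels, so these signatures are illustrative only.

```rust
fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp())
}

/// SiLU (a.k.a. swish): x * sigmoid(x).
fn silu(x: f32) -> f32 {
    x * sigmoid(x)
}

/// Hard-sigmoid: piecewise-linear approximation, relu6(x + 3) / 6.
fn hard_sigmoid(x: f32) -> f32 {
    (x + 3.0).clamp(0.0, 6.0) / 6.0
}

/// Hard-swish: x * hard_sigmoid(x).
fn hard_swish(x: f32) -> f32 {
    x * hard_sigmoid(x)
}

/// Leaky ReLU: x for x > 0, negative_slope * x otherwise.
fn leaky_relu(x: f32, negative_slope: f32) -> f32 {
    if x > 0.0 { x } else { negative_slope * x }
}

/// SwiGLU: split the input into halves a and b, return silu(a) * b.
fn swiglu(xs: &[f32]) -> Vec<f32> {
    let (a, b) = xs.split_at(xs.len() / 2);
    a.iter().zip(b).map(|(&a, &b)| silu(a) * b).collect()
}

fn main() {
    println!("silu(1.0) = {}", silu(1.0)); // ~0.731
    println!("hard_swish(1.0) = {}", hard_swish(1.0)); // ~0.667
    println!("{:?}", swiglu(&[1.0, 2.0, 3.0, 4.0]));
}
```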
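Commits #676 (Dropout layer) and #1213 (different behavior between training and eval) are related: dropout must randomly zero activations during training but act as the identity during inference, which is why a `train` flag gets threaded through the forward pass. Below is a minimal sketch of inverted dropout, assuming the `rand` crate (0.8 API); the function name and slice-based signature are hypothetical, not candle's API.

```rust
use rand::Rng; // rand = "0.8"

/// Inverted dropout: in training, zero each element with probability p
/// and scale survivors by 1 / (1 - p) so the expected value is
/// unchanged; in eval, return the input untouched.
fn dropout(xs: &[f32], p: f32, train: bool) -> Vec<f32> {
    if !train || p == 0.0 {
        return xs.to_vec();
    }
    let scale = 1.0 / (1.0 - p);
    let mut rng = rand::thread_rng();
    xs.iter()
        .map(|&x| if rng.gen::<f32>() < p { 0.0 } else { x * scale })
        .collect()
}

fn main() {
    let xs = [1.0_f32; 8];
    println!("train: {:?}", dropout(&xs, 0.5, true)); // random zeros, survivors scaled to 2.0
    println!("eval:  {:?}", dropout(&xs, 0.5, false)); // identity
}
```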