Commit history for path: root/candle-core/src/backprop.rs
| Commit message | Author | Date | Files | Lines |
|---|---|---|---|---|
| 20241118 docs (#2629) | zachcp | 2024-11-19 | 1 | -1/+1 |
| optimize gradient for silu a bit (#2393) | MilkFather | 2024-08-04 | 1 | -2/+2 |
| Add get_ids to GradStore (#2379) | Takanori MAEHARA | 2024-08-01 | 1 | -0/+5 |
| Fix for backprop in ConvTranspose2D with stride of 2 (#2337) | Ivor Wanders | 2024-07-17 | 1 | -2/+2 |
| Fix Elu gradient NaN on large input (#2328) | Alexey Gerasev | 2024-07-16 | 1 | -1/+2 |
| Fix the silu gradient issue on 0. (#2083) | Laurent Mazare | 2024-04-18 | 1 | -1/+1 |
| Add support for "sign" on tensors (#2012) | Thomas Santerre | 2024-04-04 | 1 | -10/+8 |
| Backwards for ConvTranspose2D (#1910) | Kirpal Grewal | 2024-03-23 | 1 | -3/+35 |
| Add grads for interpolate1d (#1742) | Kirpal Grewal | 2024-02-22 | 1 | -4/+13 |
| Support for groups in conv-transpose1d. (#1731) | Laurent Mazare | 2024-02-18 | 1 | -0/+1 |
| feat: add silu activation function (#1706) | OlivierDehaene | 2024-02-14 | 1 | -0/+7 |
| Detach the tensors on batch-norm eval. (#1702) | Laurent Mazare | 2024-02-13 | 1 | -1/+1 |
| Upsample grad (#1420) | KGrewal1 | 2023-12-10 | 1 | -4/+22 |
| Detach all grads during backprop. (#1243) | Laurent Mazare | 2023-11-05 | 1 | -4/+21 |
| feat: add backprop for elu (#1269) | drbh | 2023-11-04 | 1 | -1/+10 |
| feat: impl backprop for erf and gelu-erf (#1258) | drbh | 2023-11-03 | 1 | -3/+16 |
| Backprop support for conv1d (cpu only for now). (#1255) | Laurent Mazare | 2023-11-03 | 1 | -1/+38 |
| Add the conv-transpose1d op. (#1251) | Laurent Mazare | 2023-11-03 | 1 | -0/+8 |
| Fix the conv2d gradient computation. (#1214) | Laurent Mazare | 2023-10-29 | 1 | -0/+7 |
| derivative for GELU (#1160) | KGrewal1 | 2023-10-23 | 1 | -1/+9 |
| Avoid trying to backprop through non-differentiable layers. (#1094) | Laurent Mazare | 2023-10-14 | 1 | -2/+12 |
| Add the rounding operators. (#1030) | Laurent Mazare | 2023-10-04 | 1 | -0/+10 |
| Add slice-scatter. (#927) | Laurent Mazare | 2023-09-22 | 1 | -1/+11 |
| Add the erf function. (#917) | Laurent Mazare | 2023-09-21 | 1 | -0/+1 |
| Add an erf based gelu op (#900) | Laurent Mazare | 2023-09-19 | 1 | -0/+3 |
| Do not backprop through argmin/argmax. (#865) | Laurent Mazare | 2023-09-15 | 1 | -1/+3 |
| Add 1d upsampling. (#839) | Laurent Mazare | 2023-09-13 | 1 | -0/+4 |
| Add tanh. (#675) | Laurent Mazare | 2023-08-30 | 1 | -0/+5 |
| Add the powf op. (#664) | Laurent Mazare | 2023-08-29 | 1 | -0/+6 |
| Simplify usage of the pool functions. (#662) | Laurent Mazare | 2023-08-29 | 1 | -1/+1 |
| Dilated convolutions (#657) | Laurent Mazare | 2023-08-29 | 1 | -4/+11 |
| Backprop support for pooling ops. (#652) | Laurent Mazare | 2023-08-29 | 1 | -2/+35 |
| Backprop for conv2d. (#638) | Laurent Mazare | 2023-08-28 | 1 | -1/+24 |
| Add conv-transpose. (#635) | Laurent Mazare | 2023-08-28 | 1 | -0/+8 |
| Fix the minimum/maximum gradient computations. (#534) | Laurent Mazare | 2023-08-21 | 1 | -3/+8 |
| Add a yolo-v3 example. (#528) | Laurent Mazare | 2023-08-20 | 1 | -0/+10 |
| Add the permute op (similar to pytorch). (#504) | Laurent Mazare | 2023-08-18 | 1 | -0/+10 |
| add max_pool2d (#371) | LeeeSe | 2023-08-09 | 1 | -0/+2 |
| Skeleton for the avg-pool2d and upsample-nearest2d ops. (#337) | Laurent Mazare | 2023-08-07 | 1 | -0/+12 |
| Add the recip op + use it in stable-diffusion. (#331) | Laurent Mazare | 2023-08-06 | 1 | -0/+5 |
| Remove the embedding ops in favor of index-select. (#299) | Laurent Mazare | 2023-08-02 | 1 | -4/+0 |
| Softmax numerical stability. (#267) | Laurent Mazare | 2023-07-28 | 1 | -2/+0 |
| Support backprop for a few more ops. (#254) | Laurent Mazare | 2023-07-26 | 1 | -11/+30 |
| Add the copy op. (#227) | Laurent Mazare | 2023-07-23 | 1 | -0/+5 |
| Add the gather op. (#219) | Laurent Mazare | 2023-07-22 | 1 | -0/+7 |
| Polish the index-add op and use it in the index-select backprop (#218) | Laurent Mazare | 2023-07-22 | 1 | -19/+1 |
| Start adding index-add. | laurent | 2023-07-21 | 1 | -1/+4 |
| Add binary and ternary custom ops. (#217) | Laurent Mazare | 2023-07-21 | 1 | -4/+33 |
| Custom ops with a single argument (#214) | Laurent Mazare | 2023-07-21 | 1 | -1/+7 |
| Refactor the reduce ops in order to introduce argmin/argmax. (#212) | Laurent Mazare | 2023-07-21 | 1 | -0/+6 |
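Several entries in this log (#1706, #2083, #2393) touch the gradient of silu, where silu(x) = x · σ(x) and therefore d/dx silu(x) = σ(x) · (1 + x · (1 − σ(x))). As a hedged, standalone sketch of that derivative on plain `f64` values (not candle's actual `Tensor`-based implementation in `backprop.rs`; the function names here are illustrative only):

```rust
// Illustrative sketch, not the candle implementation.
// silu(x) = x * sigmoid(x), so by the product rule:
// silu'(x) = sigmoid(x) + x * sigmoid(x) * (1 - sigmoid(x))
//          = sigmoid(x) * (1 + x * (1 - sigmoid(x)))
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn silu_grad(x: f64) -> f64 {
    let s = sigmoid(x);
    s * (1.0 + x * (1.0 - s))
}

fn main() {
    // At x = 0, sigmoid(0) = 0.5, so the gradient is exactly 0.5
    // (the value relevant to the "silu gradient issue on 0" fix, #2083).
    println!("silu'(0)  = {}", silu_grad(0.0));
    println!("silu'(10) = {}", silu_grad(10.0)); // approaches 1 for large x
}
```

Writing the derivative in the factored form `s * (1 + x * (1 - s))` reuses the already-computed sigmoid, which is the usual way autograd engines keep the backward pass cheap.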