| Commit message | Author | Age | Files | Lines |
|---|---|---|---|---|
| Add flash-attn support. (#912) | Laurent Mazare | 2023-09-20 | 1 | -1/+2 |
| Line-up the wuerstchen model with the python implementation. (#901) | Laurent Mazare | 2023-09-19 | 1 | -2/+2 |
| Only use classifier free guidance for the prior. (#896) | Laurent Mazare | 2023-09-19 | 1 | -0/+28 |
| Specialized attention module for Wuerstchen. (#890) | Laurent Mazare | 2023-09-18 | 1 | -4/+4 |
| More Wuerstchen fixes. (#882) | Laurent Mazare | 2023-09-17 | 1 | -3/+3 |
| Remove the parameters for the Wuerstchen layer-norm. (#879) | Laurent Mazare | 2023-09-17 | 1 | -16/+23 |
| Add the attention block. (#846) | Laurent Mazare | 2023-09-14 | 1 | -0/+41 |
| Start adding the Wuerstchen diffusion pipeline (#843) | Laurent Mazare | 2023-09-14 | 1 | -0/+126 |