Commit Graph

38 Commits

feffy380
6b3148fd3f Fix min-snr-gamma for v-prediction and ZSNR.
This fixes min-snr for v-prediction combined with zero terminal SNR by dividing directly by SNR+1.
The old implementation did it in two steps, (min-snr/snr) * (snr/(snr+1)), which causes a division by zero when combined with --zero_terminal_snr, since SNR is zero at the terminal timestep.
2023-11-07 23:02:25 +01:00
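A minimal sketch of the weighting described above, in plain Python; the function name and scalar signature are illustrative, not the repository's actual implementation:

```python
def min_snr_vpred_weight(snr: float, gamma: float = 5.0) -> float:
    """Min-SNR-gamma loss weight for v-prediction (illustrative sketch).

    Fixed form: divide min(SNR, gamma) by (SNR + 1) in a single step.
    The old two-step form, (min(snr, gamma) / snr) * (snr / (snr + 1)),
    is algebraically identical but divides by snr, which is exactly 0
    at the terminal timestep when --zero_terminal_snr is enabled.
    """
    return min(snr, gamma) / (snr + 1.0)
```

At snr = 0 the fixed form returns 0 instead of raising a ZeroDivisionError; for snr > 0 both forms agree.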
青龍聖者@bdsqlsz
202f2c3292 Debiased Estimation loss (#889)
* update for bnb 0.41.1

* fixed generate_controlnet_subsets_config for training

* Revert "update for bnb 0.41.1"

This reverts commit 70bd3612d8.

* add debiased_estimation_loss

* add train_network

* Revert "add train_network"

This reverts commit 6539363c5c.

* Update train_network.py
2023-10-23 22:59:14 +09:00
Yuta Hayashibe
27f9b6ffeb update typos to v1.16.15 and fix typos 2023-10-01 21:51:24 +09:00
Disty0
b64389c8a9 Intel ARC support with IPEX 2023-09-19 18:05:05 +03:00
Kohya S
73a08c0be0 Merge pull request #630 from ddPn08/sdxl
make tracker init_kwargs configurable
2023-07-20 22:05:55 +09:00
Kohya S
6d2d8dfd2f add zero_terminal_snr option 2023-07-18 23:17:23 +09:00
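The zero_terminal_snr option presumably follows the beta-schedule rescaling from "Common Diffusion Noise Schedules and Sample Steps Are Flawed" (Lin et al.); a pure-Python sketch under that assumption, not the repository's actual code:

```python
import math

def enforce_zero_terminal_snr(betas):
    """Rescale a beta schedule so SNR is exactly zero at the final timestep
    (sketch of the rescaling from Lin et al.; not the repository's code)."""
    alphas = [1.0 - b for b in betas]
    # sqrt of the cumulative product of alphas
    abar_sqrt, prod = [], 1.0
    for a in alphas:
        prod *= a
        abar_sqrt.append(math.sqrt(prod))
    first, last = abar_sqrt[0], abar_sqrt[-1]
    # shift so the last value becomes 0, rescale so the first is preserved
    abar_sqrt = [(x - last) * first / (first - last) for x in abar_sqrt]
    # convert back to betas
    abar = [x * x for x in abar_sqrt]
    alphas_new = [abar[0]] + [abar[i] / abar[i - 1] for i in range(1, len(abar))]
    return [1.0 - a for a in alphas_new]
```

The first beta is preserved and the last becomes 1, so the terminal signal-to-noise ratio is exactly zero.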
ddPn08
b841dd78fe make tracker init_kwargs configurable 2023-07-11 10:21:45 +09:00
Kohya S
68ca0ea995 Fix to show template type 2023-07-10 22:28:26 +09:00
Kohya S
ea182461d3 add min/max_timestep 2023-07-03 20:44:42 +09:00
Kohya S
5114e8daf1 fix training scripts (except controlnet) not working 2023-06-22 08:46:53 +09:00
Kohya S
19dfa24abb Merge branch 'main' into original-u-net 2023-06-16 20:59:34 +09:00
青龍聖者@bdsqlsz
e97d67a681 Support for Prodigy (DAdapt variant for DyLoRA) (#585)
* Update train_util.py for DAdaptLion

* Update train_README-zh.md for dadaptlion

* Update train_README-ja.md for DAdaptLion

* add DAdapt V3

* Alignment

* Update train_util.py for experimental

* Update train_util.py V3

* Update train_README-zh.md

* Update train_README-ja.md

* Update train_util.py fix

* Update train_util.py

* support Prodigy

* add lower
2023-06-15 21:12:53 +09:00
Kohya S
9806b00f74 add arbitrary dataset feature to each script 2023-06-15 20:39:39 +09:00
ykume
9e1683cf2b support sdpa 2023-06-11 21:26:15 +09:00
ykume
0315611b11 remove workaround for accelerate 0.15, fix XTI 2023-06-11 18:32:14 +09:00
Kohya S
ec2efe52e4 scale v-pred loss like noise pred 2023-06-03 10:52:22 +09:00
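Presumably this multiplies the v-prediction loss by SNR/(SNR+1) so that its implicit per-timestep weighting matches that of epsilon (noise) prediction; a one-line sketch under that assumption (the function name is illustrative):

```python
def scale_vpred_loss_like_noise_pred(loss: float, snr: float) -> float:
    # Hypothetical sketch: scaling the v-prediction loss by SNR/(SNR+1)
    # gives it the same effective timestep weighting as an epsilon loss.
    return loss * (snr / (snr + 1.0))
```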
Kohya S
968bbd2f47 Merge pull request #480 from yanhuifair/main
fix printing of "saving" and "epoch" on a new line
2023-05-11 21:05:37 +09:00
Kohya S
09c719c926 add adaptive noise scale 2023-05-07 18:09:08 +09:00
Fair
b08154dc36 fix printing of "saving" and "epoch" on a new line 2023-05-07 02:51:01 +08:00
Kohya S
2127907dd3 refactor selection and logging for DAdaptation 2023-05-06 18:14:16 +09:00
青龍聖者@bdsqlsz
164a1978de Support for more DAdaptation optimizers (#455)
* Update train_util.py for add DAdaptAdan and DAdaptSGD

* Update train_util.py for DAdaptAdam

* Update train_network.py for dadapt

* Update train_README-ja.md for DAdapt

* Update train_util.py for DAdapt

* Update train_network.py for DAdaptAdaGrad

* Update train_db.py for DAdapt

* Update fine_tune.py for DAdapt

* Update train_textual_inversion.py for DAdapt

* Update train_textual_inversion_XTI.py for DAdapt
2023-05-06 17:30:09 +09:00
ykume
69579668bb Merge branch 'dev' of https://github.com/kohya-ss/sd-scripts into dev 2023-05-03 11:17:43 +09:00
Kohya S
2e688b7cd3 Merge pull request #471 from pamparamm/multires-noise
Multi-Resolution Noise
2023-05-03 11:17:21 +09:00
ykume
2fcbfec178 make transform_DDP more intuitive 2023-05-03 11:07:29 +09:00
Isotr0py
e1143caf38 Fix DDP issues and Support DDP for all training scripts (#448)
* Fix DDP bugs

* Fix DDP bugs for finetune and db

* refactor model loader

* fix DDP network

* try to fix DDP network in train unet only

* remove unused DDP import

* refactor DDP transform

* refactor DDP transform

* fix sample images bugs

* change DDP transform location

* add autocast to train_db

* support DDP in XTI

* Clear DDP import
2023-05-03 10:37:47 +09:00
Pam
b18d099291 Multi-Resolution Noise 2023-05-02 09:42:17 +05:00
Kohya S
74008ce487 add save_every_n_steps option 2023-04-24 23:22:24 +09:00
Plat
27ffd9fe3d feat: support wandb logging 2023-04-20 01:41:12 +09:00
Kohya S
9fc27403b2 support disk cache: same as #164, might fix #407 2023-04-13 21:40:34 +09:00
Kohya S
2e9f7b5f91 cache latents to disk in dreambooth method 2023-04-12 23:10:39 +09:00
Kohya S
6a5f87d874 disable weighted captions in TI/XTI training 2023-04-08 21:45:57 +09:00
Kohya S
541539a144 change method name, repo is private in default etc 2023-04-05 23:16:49 +09:00
ddPn08
16ba1cec69 change async uploading to optional 2023-04-02 17:45:26 +09:00
ddPn08
8bfa50e283 small fix 2023-04-02 17:39:23 +09:00
ddPn08
3cc4939dd3 Implement huggingface upload for all scripts 2023-04-02 17:39:22 +09:00
Kohya S
cb53a77334 show warning message for sample images in XTI 2023-03-30 21:33:57 +09:00
Jakaline-dev
24e3d4b464 disabled sampling (for now) 2023-03-30 02:20:04 +09:00
Jakaline-dev
a35d7ef227 Implement XTI 2023-03-26 05:26:10 +09:00