Commit Graph

87 Commits

Author SHA1 Message Date
Kohya S
c142dadb46 support sai model spec 2023-08-06 21:50:05 +09:00
Kohya S
73a08c0be0 Merge pull request #630 from ddPn08/sdxl
make tracker init_kwargs configurable
2023-07-20 22:05:55 +09:00
Kohya S
6d2d8dfd2f add zero_terminal_snr option 2023-07-18 23:17:23 +09:00
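
Note: zero_terminal_snr refers to the noise-schedule rescaling from "Common Diffusion Noise Schedules and Sample Steps are Flawed" (Lin et al., 2023), which forces the final timestep to have exactly zero SNR. A minimal sketch of that published algorithm, assuming a 1-D tensor of betas:

```python
import torch

def rescale_zero_terminal_snr(betas: torch.Tensor) -> torch.Tensor:
    # Algorithm 1 from Lin et al. (2023): shift sqrt(alpha_bar) so the
    # last timestep reaches zero SNR while the first stays fixed.
    alphas_bar_sqrt = torch.cumprod(1.0 - betas, dim=0).sqrt()
    first, last = alphas_bar_sqrt[0].clone(), alphas_bar_sqrt[-1].clone()
    alphas_bar_sqrt -= last
    alphas_bar_sqrt *= first / (first - last)
    # Convert the rescaled cumulative product back to per-step betas.
    alphas_bar = alphas_bar_sqrt ** 2
    alphas = torch.cat([alphas_bar[0:1], alphas_bar[1:] / alphas_bar[:-1]])
    return 1.0 - alphas
```
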
ddPn08
b841dd78fe make tracker init_kwargs configurable 2023-07-11 10:21:45 +09:00
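
Note: configurable tracker init_kwargs points at Accelerate's tracker API, where per-tracker keyword arguments are forwarded to the backend (e.g. wandb.init). A sketch; the project name, run name, and tags are purely illustrative:

```python
from accelerate import Accelerator

accelerator = Accelerator(log_with="wandb")
# init_kwargs is keyed by tracker name; the inner dict is passed
# straight through to wandb.init.
accelerator.init_trackers(
    "sd-training",
    init_kwargs={"wandb": {"name": "my-run", "tags": ["sdxl"]}},
)
```
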
Kohya S
ea182461d3 add min/max_timestep 2023-07-03 20:44:42 +09:00
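
Note: min_timestep/max_timestep restrict which diffusion timesteps are sampled during training. A hypothetical helper showing the idea (name and defaults are illustrative):

```python
import torch

def sample_timesteps(batch_size: int, min_timestep: int = 0,
                     max_timestep: int = 1000) -> torch.Tensor:
    # Train only on a sub-range of the schedule; torch.randint's upper
    # bound is exclusive, matching a 1000-step schedule's indices 0..999.
    return torch.randint(min_timestep, max_timestep, (batch_size,))
```
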
Kohya S
5114e8daf1 fix non-ControlNet training scripts not working 2023-06-22 08:46:53 +09:00
Kohya S
92e50133f8 Merge branch 'original-u-net' into dev 2023-06-17 21:57:08 +09:00
Kohya S
19dfa24abb Merge branch 'main' into original-u-net 2023-06-16 20:59:34 +09:00
青龍聖者@bdsqlsz
e97d67a681 Support for Prodigy (a DAdaptation-style optimizer, for DyLoRA) (#585)
* Update train_util.py for DAdaptLion

* Update train_README-zh.md for DAdaptLion

* Update train_README-ja.md for DAdaptLion

* add DAdapt V3

* Alignment

* Update train_util.py for experimental

* Update train_util.py V3

* Update train_README-zh.md

* Update train_README-ja.md

* Update train_util.py fix

* Update train_util.py

* support Prodigy

* add lower
2023-06-15 21:12:53 +09:00
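
Note: Prodigy, like the D-Adaptation optimizers, estimates its own step size, so the nominal learning rate is conventionally left at 1.0. A minimal usage sketch; the model here is a stand-in:

```python
import torch
from prodigyopt import Prodigy  # pip install prodigyopt

model = torch.nn.Linear(16, 16)  # stand-in for the real network
# lr stays at 1.0 because Prodigy adapts the effective step size itself.
optimizer = Prodigy(model.parameters(), lr=1.0, weight_decay=0.01)
```
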
Kohya S
9806b00f74 add arbitrary dataset feature to each script 2023-06-15 20:39:39 +09:00
ykume
9e1683cf2b support sdpa 2023-06-11 21:26:15 +09:00
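
Note: "support sdpa" refers to PyTorch 2.x's fused scaled_dot_product_attention, which can stand in for hand-rolled attention or xformers. A self-contained sketch with illustrative shapes:

```python
import torch
import torch.nn.functional as F

q = torch.randn(1, 8, 77, 64)  # (batch, heads, tokens, head_dim)
k = torch.randn(1, 8, 77, 64)
v = torch.randn(1, 8, 77, 64)

# Computes softmax(q @ k^T / sqrt(d)) @ v in one fused call, dispatching
# to FlashAttention or memory-efficient kernels when available.
out = F.scaled_dot_product_attention(q, k, v)
```
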
ykume
0315611b11 remove workaround for accelerate 0.15, fix XTI 2023-06-11 18:32:14 +09:00
Kohya S
ec2efe52e4 scale v-pred loss like noise pred 2023-06-03 10:52:22 +09:00
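
Note: scaling the v-prediction loss "like noise pred" presumably uses the identity that, per timestep, the v-objective weighs the underlying x0 error by (SNR + 1) while the epsilon-objective weighs it by SNR, so multiplying by SNR/(SNR + 1) puts both on the same scale. A sketch; the signature is illustrative:

```python
import torch

def scale_v_pred_loss_like_noise_pred(loss: torch.Tensor,
                                      snr_t: torch.Tensor) -> torch.Tensor:
    # L_v = (SNR + 1) * L_x0 and L_eps = SNR * L_x0, so this factor makes
    # a v-prediction loss comparable in scale to a noise-prediction loss.
    return loss * (snr_t / (snr_t + 1))
```
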
ddPn08
1214f35985 update diffusers to 0.16 | train_db 2023-06-01 20:39:31 +09:00
Kohya S
714846e1e1 revert perlin_noise 2023-05-15 23:12:11 +09:00
hkinghuang
bca6a44974 Perlin noise 2023-05-15 11:16:08 +08:00
Kohya S
968bbd2f47 Merge pull request #480 from yanhuifair/main
fix print "saving" and "epoch" in newline
2023-05-11 21:05:37 +09:00
Kohya S
09c719c926 add adaptive noise scale 2023-05-07 18:09:08 +09:00
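
Note: adaptive noise scale extends the noise-offset trick by tying the offset strength to the mean (roughly, the brightness) of the latents. A sketch of the idea; the names and the clamp are illustrative, not the exact implementation:

```python
import torch

def apply_adaptive_noise_offset(latents, noise, noise_offset, adaptive_scale):
    # Brighter or darker images receive a proportionally stronger offset.
    mean = latents.mean(dim=(2, 3), keepdim=True)
    offset = (noise_offset + adaptive_scale * mean).clamp(min=0.0)
    return noise + offset * torch.randn(
        latents.shape[0], latents.shape[1], 1, 1, device=latents.device)
```
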
Fair
b08154dc36 fix "saving" and "epoch" prints to start on a new line 2023-05-07 02:51:01 +08:00
Kohya S
2127907dd3 refactor selection and logging for DAdaptation 2023-05-06 18:14:16 +09:00
青龍聖者@bdsqlsz
164a1978de Support for more Dadaptation (#455)
* Update train_util.py to add DAdaptAdan and DAdaptSGD

* Update train_util.py for DAdaptAdam

* Update train_network.py for dadapt

* Update train_README-ja.md for DAdapt

* Update train_util.py for DAdapt

* Update train_network.py for DAdaptAdaGrad

* Update train_db.py for DAdapt

* Update fine_tune.py for DAdapt

* Update train_textual_inversion.py for DAdapt

* Update train_textual_inversion_XTI.py for DAdapt
2023-05-06 17:30:09 +09:00
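
Note: the D-Adaptation optimizers added here (DAdaptAdam, DAdaptAdan, DAdaptSGD, DAdaptAdaGrad, DAdaptLion) all estimate the learning rate automatically. A minimal usage sketch with a stand-in model:

```python
import torch
import dadaptation  # pip install dadaptation

model = torch.nn.Linear(16, 16)  # stand-in for the real network
# lr=1.0 is the convention: D-Adaptation finds the step size itself.
# decouple=True selects AdamW-style decoupled weight decay.
optimizer = dadaptation.DAdaptAdam(model.parameters(), lr=1.0, decouple=True)
```
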
ykume
69579668bb Merge branch 'dev' of https://github.com/kohya-ss/sd-scripts into dev 2023-05-03 11:17:43 +09:00
Kohya S
2e688b7cd3 Merge pull request #471 from pamparamm/multires-noise
Multi-Resolution Noise
2023-05-03 11:17:21 +09:00
ykume
2fcbfec178 make transform_DDP more intuitive 2023-05-03 11:07:29 +09:00
Isotr0py
e1143caf38 Fix DDP issues and Support DDP for all training scripts (#448)
* Fix DDP bugs

* Fix DDP bugs for finetune and db

* refactor model loader

* fix DDP network

* try to fix DDP network when training U-Net only

* remove unused DDP import

* refactor DDP transform

* refactor DDP transform

* fix sample image bugs

* change DDP transform location

* add autocast to train_db

* support DDP in XTI

* Clear DDP import
2023-05-03 10:37:47 +09:00
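
Note: the DDP fixes center on how Accelerate wraps models: under a multi-GPU launch, prepare() returns a DistributedDataParallel wrapper, and the raw module must be recovered before saving or reading attributes. A sketch of that pattern:

```python
import torch
from accelerate import Accelerator

model = torch.nn.Linear(16, 16)  # stand-in for the real network
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

accelerator = Accelerator()
model, optimizer = accelerator.prepare(model, optimizer)
# Under DDP, `model` is now a wrapper; unwrap it before saving state.
raw_model = accelerator.unwrap_model(model)
```
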
Pam
b18d099291 Multi-Resolution Noise 2023-05-02 09:42:17 +05:00
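
Note: multi-resolution ("pyramid") noise sums Gaussian noise drawn at progressively coarser resolutions, each upsampled and attenuated geometrically. A sketch of the usual formulation; the discount and loop depth are illustrative:

```python
import torch
import torch.nn.functional as F

def pyramid_noise_like(x: torch.Tensor, discount: float = 0.8) -> torch.Tensor:
    b, c, h, w = x.shape
    noise = torch.randn_like(x)
    for i in range(1, 6):
        r = 2 ** i  # each level halves the resolution again
        if h // r < 1 or w // r < 1:
            break
        low = torch.randn(b, c, h // r, w // r, device=x.device)
        noise += F.interpolate(low, size=(h, w), mode="bilinear") * discount ** i
    return noise / noise.std()  # renormalize to roughly unit variance
```
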
Kohya S
74008ce487 add save_every_n_steps option 2023-04-24 23:22:24 +09:00
Plat
27ffd9fe3d feat: support wandb logging 2023-04-20 01:41:12 +09:00
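
Note: wandb logging here goes through Accelerate's tracker interface rather than calling wandb directly. A minimal sketch; the project name and logged values are placeholders:

```python
from accelerate import Accelerator

accelerator = Accelerator(log_with="wandb")
accelerator.init_trackers("sd-scripts", config={"learning_rate": 1e-4})
accelerator.log({"loss": 0.123}, step=1)
accelerator.end_training()  # flushes and closes the wandb run
```
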
Kohya S
2e9f7b5f91 cache latents to disk in dreambooth method 2023-04-12 23:10:39 +09:00
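
Note: caching latents to disk means running the VAE encoder once per image and reloading the result in later epochs. A hypothetical sketch assuming a diffusers AutoencoderKL as `vae`:

```python
import numpy as np
import torch

@torch.no_grad()
def cache_latent_to_disk(vae, image: torch.Tensor, path: str) -> None:
    # image: a (C, H, W) tensor already normalized for the VAE.
    # 0.18215 is the SD v1/v2 latent scaling factor.
    latent = vae.encode(image.unsqueeze(0)).latent_dist.sample() * 0.18215
    np.savez(path, latents=latent.squeeze(0).float().cpu().numpy())
```
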
AI-Casanova
0d54609435 Merge branch 'kohya-ss:main' into weighted_captions 2023-04-07 14:55:40 -05:00
Kohya S
541539a144 change method name, make repo private by default, etc. 2023-04-05 23:16:49 +09:00
AI-Casanova
1892c82a60 Reinstantiate weighted captions after a necessary revert to Main 2023-04-02 19:43:34 +00:00
ddPn08
b5ff4e816f resume from huggingface repository 2023-04-02 17:39:21 +09:00
Kohya S
4f70e5dca6 fix to work with num_workers=0 2023-03-28 19:42:47 +09:00
Kohya S
6732df93e2 Merge branch 'dev' into min-SNR 2023-03-26 17:10:53 +09:00
Kohya S
4f42f759ea Merge pull request #322 from u-haru/feature/token_warmup
Add an option to train while gradually increasing the number of tags; minor bug fix related to persistent_workers
2023-03-26 17:05:59 +09:00
u-haru
a4b34a9c3c remove blueprint_args_conflict as it is unnecessary; fix a bug where shuffle ran every time 2023-03-26 03:26:55 +09:00
u-haru
4dc1124f93 support scripts other than LoRA as well 2023-03-26 02:19:55 +09:00
AI-Casanova
518a18aeff (ACTUAL) Min-SNR Weighting Strategy: fixed SNR calculation to match the authors' implementation 2023-03-23 12:34:49 +00:00
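
Note: Min-SNR weighting (Hang et al., 2023) clamps each timestep's loss weight at a cap gamma, down-weighting the low-noise steps that otherwise dominate training. For epsilon-prediction the weight is min(SNR_t, gamma)/SNR_t; a sketch using the paper's suggested gamma = 5:

```python
import torch

def min_snr_weight(snr_t: torch.Tensor, gamma: float = 5.0) -> torch.Tensor:
    # w_t = min(SNR_t, gamma) / SNR_t for epsilon-prediction targets.
    return snr_t.clamp(max=gamma) / snr_t
```
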
u-haru
dbadc40ec2 fix a bug where captions stopped changing when persistent_workers was enabled 2023-03-23 12:33:03 +09:00
u-haru
447c56bf50 fix typos, change step to global_step, fix bugs 2023-03-23 09:53:14 +09:00
u-haru
a9b26b73e0 implement token warmup 2023-03-23 07:37:14 +09:00
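
Note: token warmup trains on only the first few caption tags early on and gradually admits more as steps accumulate. A hypothetical sketch of the idea; the linear ramp and names are illustrative:

```python
def warmup_tags(tags: list[str], step: int, warmup_steps: int,
                min_tags: int = 1) -> list[str]:
    # Keep a prefix of the tag list whose length grows linearly with step.
    ratio = min(1.0, step / max(1, warmup_steps))
    return tags[: max(min_tags, int(len(tags) * ratio))]

# e.g. caption = ", ".join(warmup_tags(caption.split(", "), global_step, 1000))
```
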
AI-Casanova
64c923230e Min-SNR Weighting Strategy: Refactored and added to all trainers 2023-03-22 01:27:29 +00:00
Kohya S
2d86f63e15 update steps calc with max_train_epochs 2023-03-21 21:21:12 +09:00
Kohya S
1645698ec0 Merge pull request #306 from robertsmieja/main
Extract parser setup to helper function
2023-03-21 21:09:23 +09:00
Kohya S
1816ac3271 add vae_batch_size option for faster caching 2023-03-21 18:15:57 +09:00
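
Note: vae_batch_size speeds up latent caching by encoding several images per VAE forward pass instead of one at a time, trading VRAM for throughput. A hedged sketch, again assuming a diffusers AutoencoderKL:

```python
import torch

@torch.no_grad()
def cache_latents_batched(vae, images, vae_batch_size=4, device="cuda"):
    # images: a list of (C, H, W) tensors prepared for the VAE.
    latents = []
    for i in range(0, len(images), vae_batch_size):
        batch = torch.stack(images[i : i + vae_batch_size]).to(device)
        latents.append(vae.encode(batch).latent_dist.sample().cpu())
    return torch.cat(latents) * 0.18215  # SD v1/v2 latent scaling
```
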
Robert Smieja
eb66e5ebac Extract parser setup to helper function
- Allows users who `import` the scripts to examine the parser programmatically
2023-03-20 00:06:47 -04:00
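
Note: moving parser construction into a helper lets anyone who imports the script inspect or extend the arguments without executing it. A minimal sketch of the pattern; the single argument is illustrative:

```python
import argparse

def setup_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser()
    parser.add_argument("--learning_rate", type=float, default=1e-5)
    return parser

if __name__ == "__main__":
    args = setup_parser().parse_args()
```
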
Kohya S
64d85b2f51 fix num_processes, fix indent 2023-03-19 10:52:46 +09:00
Kohya S
ec7f9bab6c Merge branch 'dev' into dev 2023-03-19 10:25:22 +09:00
Kohya S
83e102c691 refactor config parse, feature to output config 2023-03-19 10:11:11 +09:00