Kohya S
09c719c926
add adaptive noise scale
2023-05-07 18:09:08 +09:00
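The adaptive noise scale extends the earlier noise_offset option by scaling the offset with the mean of the latents, so the effective offset adapts to each image's overall brightness. A minimal sketch of the idea (function and argument names are illustrative; the repo's exact formula may differ):

```python
import torch

def apply_adaptive_noise_offset(noise, latents, noise_offset=0.05, adaptive_noise_scale=0.005):
    # Shift the offset in proportion to the per-sample latent mean, then add a
    # per-(sample, channel) constant broadcast over the spatial dims (sketch).
    latent_mean = latents.mean(dim=(1, 2, 3), keepdim=True)
    offset = noise_offset + adaptive_noise_scale * latent_mean
    return noise + offset * torch.randn(
        (latents.shape[0], latents.shape[1], 1, 1), device=latents.device
    )
```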
Kohya S
2127907dd3
refactor selection and logging for DAdaptation
2023-05-06 18:14:16 +09:00
青龍聖者@bdsqlsz
164a1978de
Support for more DAdaptation (#455)
...
* Update train_util.py to add DAdaptAdan and DAdaptSGD
* Update train_util.py for DAdaptAdam
* Update train_network.py for dadapt
* Update train_README-ja.md for DAdapt
* Update train_util.py for DAdapt
* Update train_network.py for DAdaptAdaGrad
* Update train_db.py for DAdapt
* Update fine_tune.py for DAdapt
* Update train_textual_inversion.py for DAdapt
* Update train_textual_inversion_XTI.py for DAdapt
2023-05-06 17:30:09 +09:00
ykume
69579668bb
Merge branch 'dev' of https://github.com/kohya-ss/sd-scripts into dev
2023-05-03 11:17:43 +09:00
Kohya S
2e688b7cd3
Merge pull request #471 from pamparamm/multires-noise
...
Multi-Resolution Noise
2023-05-03 11:17:21 +09:00
ykume
2fcbfec178
make transform_DDP more intuitive
2023-05-03 11:07:29 +09:00
Isotr0py
e1143caf38
Fix DDP issues and support DDP for all training scripts (#448)
...
* Fix DDP bugs
* Fix DDP bugs for finetune and db
* refactor model loader
* fix DDP network
* try to fix DDP network in train unet only
* remove unused DDP import
* refactor DDP transform
* refactor DDP transform
* fix sample images bugs
* change DDP transform location
* add autocast to train_db
* support DDP in XTI
* Clear DDP import
2023-05-03 10:37:47 +09:00
Pam
b18d099291
Multi-Resolution Noise
2023-05-02 09:42:17 +05:00
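Multi-resolution (pyramid) noise mixes Gaussian noise drawn at several downscaled resolutions into the base noise, which encourages the model to vary low-frequency structure. A sketch of the general technique (parameter names like `discount` and `levels` are illustrative):

```python
import torch
import torch.nn.functional as F

def multires_noise(shape, levels: int = 4, discount: float = 0.8):
    # Start from full-resolution noise, then add upsampled low-resolution
    # noise at each pyramid level, scaled by discount**level (sketch).
    b, c, h, w = shape
    noise = torch.randn(shape)
    for level in range(1, levels):
        lh, lw = max(h >> level, 1), max(w >> level, 1)
        low = torch.randn(b, c, lh, lw)
        up = F.interpolate(low, size=(h, w), mode="bilinear", align_corners=False)
        noise += up * discount ** level
    return noise / noise.std()  # renormalize to roughly unit variance
```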
Kohya S
74008ce487
add save_every_n_steps option
2023-04-24 23:22:24 +09:00
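The save_every_n_steps option checkpoints on a step interval in addition to epoch boundaries. A minimal sketch of the trigger condition (the helper name is hypothetical):

```python
def should_save(global_step: int, save_every_n_steps: int) -> bool:
    # Save whenever the step count reaches a multiple of the interval;
    # a non-positive interval disables step-based saving (sketch).
    return save_every_n_steps > 0 and global_step % save_every_n_steps == 0
```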
Plat
27ffd9fe3d
feat: support wandb logging
2023-04-20 01:41:12 +09:00
Kohya S
2e9f7b5f91
cache latents to disk in dreambooth method
2023-04-12 23:10:39 +09:00
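Caching latents to disk lets the DreamBooth path skip re-encoding images through the VAE on every epoch. A sketch of one way to do it (the file naming and npz key are illustrative, not the repo's actual format):

```python
import os
import numpy as np

def latent_cache_path(image_path: str) -> str:
    # Keep the cached latent next to its source image (illustrative naming).
    return os.path.splitext(image_path)[0] + ".npz"

def save_latent(image_path: str, latent: np.ndarray) -> None:
    np.savez(latent_cache_path(image_path), latent=latent)

def load_latent(image_path: str) -> np.ndarray:
    return np.load(latent_cache_path(image_path))["latent"]
```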
AI-Casanova
0d54609435
Merge branch 'kohya-ss:main' into weighted_captions
2023-04-07 14:55:40 -05:00
Kohya S
541539a144
change method name, make repo private by default, etc.
2023-04-05 23:16:49 +09:00
AI-Casanova
1892c82a60
Reinstantiate weighted captions after a necessary revert to Main
2023-04-02 19:43:34 +00:00
ddPn08
b5ff4e816f
resume from huggingface repository
2023-04-02 17:39:21 +09:00
Kohya S
4f70e5dca6
fix to work with num_workers=0
2023-03-28 19:42:47 +09:00
Kohya S
6732df93e2
Merge branch 'dev' into min-SNR
2023-03-26 17:10:53 +09:00
Kohya S
4f42f759ea
Merge pull request #322 from u-haru/feature/token_warmup
...
Add an option to train while gradually increasing the number of tags; minor bug fix related to persistent_workers
2023-03-26 17:05:59 +09:00
u-haru
a4b34a9c3c
Remove blueprint_args_conflict since it is unnecessary; fix a bug where shuffle ran every time
2023-03-26 03:26:55 +09:00
u-haru
4dc1124f93
Support scripts other than LoRA as well
2023-03-26 02:19:55 +09:00
AI-Casanova
518a18aeff
(ACTUAL) Min-SNR Weighting Strategy: Fixed SNR calculation to match the authors' implementation
2023-03-23 12:34:49 +00:00
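Min-SNR weighting down-weights the loss at low-noise timesteps by clamping the signal-to-noise ratio at a constant gamma (typically 5). A sketch of the weight computation for epsilon-prediction, where SNR(t) = alpha_bar_t / (1 - alpha_bar_t):

```python
import torch

def compute_snr(alphas_cumprod: torch.Tensor, timesteps: torch.Tensor) -> torch.Tensor:
    # SNR(t) = alpha_bar_t / (1 - alpha_bar_t)
    ac = alphas_cumprod[timesteps]
    return ac / (1.0 - ac)

def min_snr_weights(snr: torch.Tensor, gamma: float = 5.0) -> torch.Tensor:
    # Per-timestep loss weight: min(SNR, gamma) / SNR (sketch for
    # epsilon-prediction; v-prediction uses a different denominator).
    return torch.clamp(snr, max=gamma) / snr
```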
u-haru
dbadc40ec2
Fix a bug where captions stopped changing when persistent_workers was enabled
2023-03-23 12:33:03 +09:00
u-haru
447c56bf50
Fix typos, change step to global_step, fix bugs
2023-03-23 09:53:14 +09:00
u-haru
a9b26b73e0
implement token warmup
2023-03-23 07:37:14 +09:00
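Token warmup trains on progressively longer caption tag lists, revealing more tags as global_step advances. A minimal sketch of the idea (option names are illustrative, not the repo's actual flags):

```python
def warmup_tokens(tags, global_step: int, warmup_steps: int, min_tokens: int = 1):
    # Reveal a fraction of the tag list proportional to training progress (sketch).
    if warmup_steps <= 0 or global_step >= warmup_steps:
        return list(tags)
    n = max(min_tokens, int(len(tags) * global_step / warmup_steps))
    return list(tags)[:n]
```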
AI-Casanova
64c923230e
Min-SNR Weighting Strategy: Refactored and added to all trainers
2023-03-22 01:27:29 +00:00
Kohya S
2d86f63e15
update steps calc with max_train_epochs
2023-03-21 21:21:12 +09:00
Kohya S
1645698ec0
Merge pull request #306 from robertsmieja/main
...
Extract parser setup to helper function
2023-03-21 21:09:23 +09:00
Kohya S
1816ac3271
add vae_batch_size option for faster caching
2023-03-21 18:15:57 +09:00
Robert Smieja
eb66e5ebac
Extract parser setup to helper function
...
- Allows users who `import` the scripts to examine the parser programmatically
2023-03-20 00:06:47 -04:00
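Extracting parser construction into a helper means callers who import a script can inspect or extend its arguments before parsing. A sketch of the pattern (the arguments shown are illustrative):

```python
import argparse

def setup_parser() -> argparse.ArgumentParser:
    # Build and return the parser instead of parsing immediately, so
    # importers can examine or modify it programmatically (sketch).
    parser = argparse.ArgumentParser(description="training script")
    parser.add_argument("--train_batch_size", type=int, default=1)
    parser.add_argument("--learning_rate", type=float, default=1e-6)
    return parser
```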
Kohya S
64d85b2f51
fix num_processes, fix indent
2023-03-19 10:52:46 +09:00
Kohya S
ec7f9bab6c
Merge branch 'dev' into dev
2023-03-19 10:25:22 +09:00
Kohya S
83e102c691
refactor config parse, feature to output config
2023-03-19 10:11:11 +09:00
Kohya S
c3f9eb10f1
format with black
2023-03-18 18:58:12 +09:00
Linaqruf
44d4cfb453
feat: added function to load training config with .toml
2023-03-12 11:52:37 +07:00
Isotr0py
eb68892ab1
add lr_scheduler_type etc
2023-03-09 16:51:22 +08:00
Kohya S
5602e0e5fc
change dataset config option to dataset_config
2023-03-02 21:51:58 +09:00
Kohya S
ed19a92bbe
fix typos
2023-03-01 21:01:10 +09:00
fur0ut0
8abb8645ae
add detailed dataset config feature via extra config file (#227)
...
* add config file schema
* change config file specification
* refactor config utility
* unify batch_size to train_batch_size
* fix indent size
* use batch_size instead of train_batch_size
* make cache_latents configurable on subset
* rename options
* bucket_reso_range
* shuffle_keep_tokens
* update readme
* revert to min_bucket_reso & max_bucket_reso
* use subset structure in dataset
* format import lines
* split mode specific options
* use only valid subset
* change valid subsets name
* manage multiple datasets by dataset group
* update config file sanitizer
* prune redundant validation
* add comments
* update type annotation
* rename json_file_name to metadata_file
* ignore when image dir is invalid
* fix tag shuffle and dropout
* ignore duplicated subset
* add method to check latent cachability
* fix format
* fix bug
* update caption dropout default values
* update annotation
* fix bug
* add option to enable bucket shuffle across dataset
* update blueprint generate function
* use blueprint generator for dataset initialization
* delete duplicated function
* update config readme
* delete debug print
* print dataset and subset info as info
* enable bucket_shuffle_across_dataset option
* update config readme for clarification
* compensate quotes for string option example
* fix bug of bad usage of join
* conserve trained metadata backward compatibility
* enable shuffle in data loader by default
* delete resolved TODO
* add comment for image data handling
* fix reference bug
* fix undefined variable bug
* prevent raise overwriting
* assert image_dir and metadata_file validity
* add debug message for ignoring subset
* fix inconsistent import statement
* loosen too strict validation on float value
* sanitize argument parser separately
* make image_dir optional for fine tuning dataset
* fix import
* fix trailing characters in print
* parse flexible dataset config deterministically
* use relative import
* print supplementary message for parsing error
* add note about different methods
* add note of benefit of separate dataset
* add error example
* add note for english readme plan
---------
Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
2023-03-01 20:58:08 +09:00
Kohya S
dd523c94ff
sample images in training (not fully tested)
2023-02-27 17:48:32 +09:00
Kohya S
9ab964d0b8
Add Adafactor optimizer
2023-02-22 21:09:47 +09:00
Kohya S
663aad2b0d
refactor get_scheduler etc.
2023-02-20 22:47:43 +09:00
mgz-dev
b29c5a750c
expand optimizer options and refactor
...
Refactor code to make it easier to add new optimizers, and support alternate optimizer parameters
- move redundant code to train_util for initializing optimizers
- add SGD Nesterov optimizers as an option (since they are already available)
- add new parameters which may be helpful for tuning existing and new optimizers
2023-02-19 17:45:09 -06:00
Kohya S
048e7cd428
add lion optimizer support
2023-02-19 15:26:14 +09:00
Kohya S
ffdfd5f615
fix name of loss for epoch
2023-02-16 22:21:36 +09:00
Kohya S
914d1505df
Merge pull request #189 from shirayu/improve_loss_track
...
Show the moving average loss
2023-02-16 22:00:26 +09:00
Kohya S
43c0a69843
Add noise_offset
2023-02-14 21:15:48 +09:00
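noise_offset adds a small per-(sample, channel) constant to the sampled noise, which helps the model learn very dark and very bright images. A sketch of the technique:

```python
import torch

def apply_noise_offset(noise: torch.Tensor, noise_offset: float = 0.1) -> torch.Tensor:
    # Shift each (sample, channel) plane by an independent Gaussian constant,
    # broadcast over the spatial dimensions (sketch of the published trick).
    return noise + noise_offset * torch.randn(
        (noise.shape[0], noise.shape[1], 1, 1), device=noise.device
    )
```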
Yuta Hayashibe
8aed5125de
Removed call to sum()
2023-02-14 21:11:30 +09:00
Yuta Hayashibe
21f5b618c3
Show the moving average loss
2023-02-14 19:46:27 +09:00
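Showing a moving average smooths the noisy per-step loss in the progress display. A minimal sketch of a windowed recorder (the class name is illustrative):

```python
from collections import deque

class LossRecorder:
    """Track the moving average of the last `window` loss values (sketch)."""

    def __init__(self, window: int = 100):
        self.losses = deque(maxlen=window)

    def add(self, loss: float) -> None:
        self.losses.append(loss)

    @property
    def moving_average(self) -> float:
        return sum(self.losses) / len(self.losses) if self.losses else 0.0
```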
Kohya S
3a72e6f003
add tag dropout
2023-02-09 21:35:27 +09:00
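Tag dropout randomly removes individual caption tags during training, a regularizer against over-reliance on any single tag. A sketch (argument names are illustrative):

```python
import random

def dropout_tags(tags, dropout_rate: float, rng: random.Random):
    # Keep each tag independently with probability 1 - dropout_rate (sketch).
    if dropout_rate <= 0:
        return list(tags)
    return [tag for tag in tags if rng.random() >= dropout_rate]
```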
Kohya S
e42b2f7aa9
conditional caption dropout (in progress)
2023-02-07 22:28:56 +09:00
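Caption dropout, unlike per-tag dropout, occasionally replaces the entire caption with an empty string so the model also trains unconditionally (useful for classifier-free guidance). A sketch:

```python
import random

def maybe_drop_caption(caption: str, dropout_rate: float, rng: random.Random) -> str:
    # With probability dropout_rate, train this sample unconditionally by
    # blanking the whole caption (sketch; names are illustrative).
    return "" if rng.random() < dropout_rate else caption
```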