kohya_ss/test/config
bmaltais 8f2476115e Add pytorch_optimizer.CAME to optimizer list. 2025-03-30 14:44:24 -04:00
Diag-OFT-AdamW8bit-toml.json Adding support for LyCORIS BOFT, QLyCORIS and DoRA 2024-03-19 21:05:10 -04:00
DyLoRA-Adafactor-toml.json Add new caption tool 2024-04-06 09:15:55 -04:00
LoKR-AdamW8bit-toml.json Fix accelerate issue on linux (#2281) 2024-04-13 16:52:27 -04:00
SDXL-Standard-Adafactor.json Fix [24.0.6] Train toml config seed type error #2370 2024-04-25 13:10:55 -04:00
SDXL-Standard-AdamW.json Update requirements 2025-03-28 15:27:39 -04:00
SDXL-Standard-AdamW8bit.json Update presets 2023-07-08 19:49:04 -04:00
Standard-AdamW.json Add support for LoRA-GGPO 2025-03-30 14:41:40 -04:00
Standard-AdamW8bit.json Update gradio in requirements 2025-03-28 15:17:25 -04:00
TI-AdamW8bit-SDXL.json v25.0.0 release (#3138) 2025-03-28 11:00:44 -04:00
TI-AdamW8bit-toml.json Merge dev-toml (#2131) 2024-03-19 18:21:40 -04:00
TI-AdamW8bit.json Add gui support for noise_offset_random_strength, ip_noise_gamma, ip_noise_gamma_random_strength 2024-03-26 19:12:48 -04:00
dataset-finetune.toml Merge dev-toml (#2131) 2024-03-19 18:21:40 -04:00
dataset-masked_loss.toml Add support for masked_loss to the GUI 2024-03-26 20:14:07 -04:00
dataset-multires.toml v25.0.0 release (#3138) 2025-03-28 11:00:44 -04:00
dataset.toml Add support for Main process port 2024-03-29 20:56:37 -04:00
dreambooth-Adafactor.json Making min and max steps parameters available on all trainers 2023-07-03 11:43:39 -04:00
dreambooth-AdamW.json Update gradio in requirements 2025-03-28 15:17:25 -04:00
dreambooth-AdamW8bit-masked_loss-toml.json Add support for masked_loss to the GUI 2024-03-26 20:14:07 -04:00
dreambooth-AdamW8bit-toml.json v25.0.0 release (#3138) 2025-03-28 11:00:44 -04:00
dreambooth-AdamW8bit.json Set `max_train_steps` to 0 if not specified in older `.json` config files 2024-04-26 07:11:07 -04:00
dreambooth-DAdaptAdam.json Add support for metadata parameters (#2295) 2024-04-15 13:26:39 -04:00
dreambooth-Prodigy-SDXL.json Dev pure (#2039) 2024-03-09 09:30:20 -05:00
dreambooth-Prodigy.json - Implement HuggingFace inputs in all training tabs (#2287) 2024-04-14 18:43:45 -04:00
dreambooth.json Making min and max steps parameters available on all trainers 2023-07-03 11:43:39 -04:00
finetune-AdamW-toml.json Fix issue with vae file path validation 2024-05-06 06:43:43 -04:00
finetune-AdamW.json Fix accelerate issue on linux (#2281) 2024-04-13 16:52:27 -04:00
iA3-Prodigy.json Update Finetuning tab structure 2023-07-03 06:56:58 -04:00
locon-Adafactor.json Update sd-script and add new test config 2024-04-01 11:30:54 -04:00
locon-AdamW.json Upgrade Gradio release 2024-04-10 20:48:12 -04:00
locon-AdamW8bit-masked_loss-toml.json Add support for masked_loss to the GUI 2024-03-26 20:14:07 -04:00
locon-AdamW8bit-toml.json Adding support for LyCORIS BOFT, QLyCORIS and DoRA 2024-03-19 21:05:10 -04:00
locon-AdamW8bit.json Align toml file content to sd-scripts defaults 2024-04-16 21:17:53 -04:00
locon-Prodigy.json Add new test cases 2023-06-23 15:21:07 -04:00
loha-Prodigy.json Update gradio release 2023-07-08 16:16:06 -04:00
meta-1_lat.json Merge dev-toml (#2131) 2024-03-19 18:21:40 -04:00
t5clrs.json Add pytorch_optimizer.CAME to optimizer list. 2025-03-30 14:44:24 -04:00
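
The `dataset*.toml` files in this directory follow the sd-scripts dataset configuration schema (`[general]` defaults, one or more `[[datasets]]`, each with `[[datasets.subsets]]`). A minimal sketch of that shape is below; the path and values are illustrative assumptions, not the contents of the repo's test files:

```toml
# Minimal sd-scripts-style dataset config sketch.
# Keys shown are from the sd-scripts dataset config schema;
# the directory path and numeric values are hypothetical.

[general]
resolution = 512            # training resolution, applied to all datasets
caption_extension = ".txt"  # sidecar caption files next to each image

[[datasets]]
batch_size = 1

  [[datasets.subsets]]
  image_dir = "/path/to/images"  # hypothetical image directory
  num_repeats = 10               # how many times each image is repeated per epoch
```

Variants such as `dataset-masked_loss.toml` and `dataset-multires.toml` add subset- or dataset-level options on top of this same structure (e.g. masked-loss and multi-resolution settings).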