Commit Graph

44 Commits (b738ddd42a156b8d77abb0e9afef7858c8902836)

Author SHA1 Message Date
gammagoat cd2c2dab77 Fix LoRA requires_grad
Everything should work correctly except for training without the unet, which
errors out. This is going to change everyone's training substantially.
Old parameters will give vastly different results.
2023-04-11 13:30:43 -04:00
d8ahazard 3e8ddce4dd One set of imports to rule them all. 2023-03-08 12:51:24 -06:00
ArrowM 6137a1a79e bye bye half lora 2023-03-04 16:28:32 -06:00
ArrowM f0576b4a51 save lora as fp16 2023-02-27 15:12:24 -06:00
ArrowM 0d50757a47 cleanup 2023-02-26 19:55:18 -06:00
ArrowM 9708072326 cleanup 2023-02-26 19:39:25 -06:00
gammagoat 45cd233c99 Merge branch 'dev' of https://github.com/gammagoat/sd_dreambooth_extension into dev
# Conflicts:
#	dreambooth/diff_to_sd.py
2023-02-26 19:28:10 -05:00
gammagoat 3d579490f2 Adds back support for applying Lora for ckpt
Makes apply_lora aware of use_extended. New merge_lora_to_model matches
the logic of merge_lora_to_pipe, but for a single model.

Tested on previous extended LoRAs.
2023-02-26 19:20:17 -05:00
ArrowM 8c29dd2711 fix loading model section of ui. fix a lora thing
fix 🔨 fix 🔨 fix 🔨
2023-02-26 09:38:01 -06:00
Markus Mayer b696d930b3 Resolve #957 by using nn.Parameter(weight) 2023-02-19 21:09:55 +01:00
d8ahazard f8a3513b7c Post-separation anxiety
I'm just kidding, this makes me nothing but giddy.

Bye Felicia!
2023-02-13 08:37:04 -06:00
d8ahazard b24fe0b22c I want to break free...
Add exception handling for all imports if, say, the code were launched by itself, versus as an extension...
2023-02-12 15:57:41 -06:00
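The commit above describes guarding imports so the module loads both standalone and as an Auto1111 extension. A minimal sketch of that pattern, with illustrative names (`modules.shared`, `RUNNING_AS_EXTENSION`, `get_device` are assumptions, not the extension's actual code):

```python
# Sketch: wrap extension-only imports so the module still loads when the
# code is launched by itself rather than inside the Auto1111 web UI.
try:
    from modules import shared  # only available inside the host web UI
    RUNNING_AS_EXTENSION = True
except ImportError:
    shared = None  # standalone run: fall back to defaults
    RUNNING_AS_EXTENSION = False

def get_device() -> str:
    """Use the host app's device if present, else a standalone default."""
    if RUNNING_AS_EXTENSION and getattr(shared, "device", None):
        return str(shared.device)
    return "cpu"
```

The key point is catching `ImportError` at module scope, so every downstream function can branch on one flag instead of re-trying the import.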
ExponentialML 910347dd4a Remove redundant code, fix path vars 2023-02-01 19:00:15 -08:00
d8ahazard 281002d185 Bad programming is bad 2023-02-01 14:26:02 -06:00
d8ahazard c4f897cac5 Lora fixes
We need weight_apply_lora (or something else), because we're not using a pipeline anymore to save our checkpoint.
2023-02-01 14:16:38 -06:00
d8ahazard 0df98e766e Merge branch 'dev' into ExtraEMA 2023-02-01 14:01:16 -06:00
ExponentialML c8cf6e7412 Fix LoRA merging. 2023-01-31 11:29:52 -08:00
d8ahazard 269bcc81ee EMA Nonsense 2023-01-31 09:09:28 -06:00
ExponentialML 3ceb581610 Set patch ti to False 2023-01-29 22:49:40 -08:00
ExponentialML 11facf4d2b Add pickle handlers for loras 2023-01-29 22:40:26 -08:00
ExponentialML 93c94a6abc Update imports and converting lora to checkpoint 2023-01-29 21:58:52 -08:00
ExponentialML 63446e3c60 modified: lora_diffusion/lora.py 2023-01-29 21:50:24 -08:00
ExponentialML 06924f04c3 Update patch_pipe for generating samples. 2023-01-29 21:42:58 -08:00
ExponentialML 0e765b8187 Update training lora functions and saving of weights. 2023-01-29 21:42:58 -08:00
ExponentialML c213521f63 Create LoRA save functions 2023-01-29 21:42:58 -08:00
ExponentialML 41c127cc49 Stage new LoRA file. 2023-01-29 21:42:58 -08:00
d8ahazard b63e1c09c7 Moar fixes 2023-01-26 14:31:30 -06:00
d8ahazard 97f03d93a9 Refactoring and cleanup and stuff, oh my.
Try to eliminate spaghetti by moving items to their own respective files.
Remove unused files, classes, methods.
Continue the good work of wrapping any dependencies on Auto1111 in dreambooth/shared.py.
2023-01-26 13:24:55 -06:00
Matt 4692019176 Use patch pipe for generating samples. 2023-01-20 16:38:44 -08:00
d8ahazard 245b074729 Lora fix 2023-01-20 17:14:09 -06:00
d8ahazard 6b314c6edd Fix lora params on load weights.
Otherwise, you're not gonna have a good time.
2023-01-19 14:00:21 -06:00
Matt bde92f19a5 Fixes LoRA inference for generating samples in the UI. 2023-01-07 22:06:33 -08:00
d8ahazard d3c8493e32 Fix lora reloading. 2023-01-07 22:50:05 -06:00
d8ahazard 1658fb28e7 LORA fixes
Fix/revert LoRA integration to how it was set up before bucketing was added.
Add lora rank to UI.
2023-01-07 22:20:22 -06:00
d8ahazard 07df22acbc Moar API, Image generation fixes. 2022-12-20 11:55:30 -06:00
d8ahazard 4f496d8f75 LINT 2022-12-15 13:01:02 -06:00
d8ahazard 4beb9f78ee Merge branch 'main' into ImageBuilder+ 2022-12-15 08:33:35 -06:00
d8ahazard da1fbb0511 Update lora.py 2022-12-14 23:03:01 -06:00
d8ahazard ea1603eddc Imagebuilder+
Add generate class button.
Fix saving params on method calls.
Add tensorboard profiling/flag.
Add custom apply lora weights method.
Add option to generate classifiers with txt2img.
Add save/load optimizer checkpointing.
Update extract/compile checkpoint code from diffusers.
2022-12-14 22:43:06 -06:00
Thomas-MMJ e346f4829b Merge branch 'd8ahazard:main' into patch-1 2022-12-12 17:31:56 -09:00
d8ahazard 6db38198c9 Add lora to text encoder training, fix non-lora training. 2022-12-12 15:50:29 -06:00
Thomas-MMJ 0a589ac429 Reduce memory usage by changing extract_lora_ups_down to a generator
Saving was frequently exceeding VRAM on a 6GB system; switching to a generator reduces peak memory usage.
2022-12-12 01:04:44 -09:00
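The change above swaps a list-building function for a generator so that only one up/down weight pair is materialized at a time during saving. A hedged sketch of the idea, with a hypothetical layer structure standing in for the real model walk:

```python
# Sketch of the generator refactor: instead of returning a full list of
# (up, down) weight pairs, yield them one at a time so the caller never
# holds every pair in memory at once. `layers` and the dict layout are
# illustrative; the real extract_lora_ups_down walks model modules.
def extract_lora_ups_down(layers):
    """Yield (up_weight, down_weight) pairs lazily."""
    for layer in layers:
        # In the real code these would be the LoRA up/down tensors.
        yield layer["up"], layer["down"]

fake_layers = [{"up": [1.0], "down": [2.0]},
               {"up": [3.0], "down": [4.0]}]

# The saving loop consumes pairs one at a time.
saved = [(up, down) for up, down in extract_lora_ups_down(fake_layers)]
```

Because the call site only iterated over the result, the list-to-generator change is a drop-in swap; the peak-memory win comes from never building the intermediate list of extracted weights.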
d8ahazard b689b8caee Lora cleanup... 2022-12-08 20:02:29 -06:00
d8ahazard 3e69ec8f01 Add LORA, v21 support 2022-12-08 09:42:13 -06:00