Commit Graph

130 Commits (6277c2db5979e9fa3a4e9b0aef419fe99bf69d73)

Author SHA1 Message Date
Matt 6277c2db59 Fix ordering of parameters. 2023-01-08 13:47:24 -08:00
Matt 881ed38413 Remove change handler function, fix api type, and move slider. 2023-01-08 13:29:55 -08:00
Matt 30c26ab8e7 Add comma for onchange handler. 2023-01-08 12:45:57 -08:00
Matt 34ae6bf2b0 Adds AdamW weight decay for the optimizer class. 2023-01-08 12:39:23 -08:00
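The commit above wires AdamW's decoupled weight decay into the optimizer setup. As a reference for what "decoupled" means here, a minimal single-step sketch (illustrative only, not the extension's code; the `lr` and `wd` defaults are assumptions):

```python
import math

def adamw_step(p, grad, m, v, t, lr=1e-4, beta1=0.9, beta2=0.999, eps=1e-8, wd=1e-2):
    """One AdamW update on a scalar parameter: weight decay is applied to the
    parameter directly, decoupled from the gradient-based Adam step."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    p = p - lr * (m_hat / (math.sqrt(v_hat) + eps) + wd * p)  # decay term added outside Adam
    return p, m, v
```

In plain Adam, L2 regularization would instead be folded into `grad`, where the adaptive scaling distorts it; decoupling is the whole point of AdamW.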
d8ahazard 1658fb28e7 LORA fixes
Fix/revert lora integration to how it was set up before bucketing was added.
Add lora rank to UI.
2023-01-07 22:20:22 -06:00
d8ahazard 5c186f3724 Reduce excessive params.
We don't need to pass lora params everywhere, as they're already stored in the config.
2023-01-07 14:43:09 -06:00
d8ahazard 75ff3762b0 Add Discord Webhook
Rework of PR #702 by @Janca
2023-01-06 09:36:04 -06:00
d8ahazard eb51824fb3 Squashed commit of the following:
commit 3933b0b47ec546ecf8a57c81a002ac2583e332dc
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Fri Jan 6 08:44:36 2023 -0600

    Ensure model params are set before creating new model.

commit 2d112955194c1094cdbe7f548b16cc131febb3e2
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Fri Jan 6 08:43:35 2023 -0600

    Revert to old loss calculation method.

    Use original loss weight calc method from HF script, versus the version used in Kohya_SS.

    Only clip tenc params if training tenc.

commit 9f437e2d68da68e46675607504edeb400c5f4efa
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Thu Jan 5 22:22:48 2023 -0600

    Python 3.8 fix

commit bdef26f0d00d6be4e0c0d5df9afcbb1d0e824463
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Thu Jan 5 22:22:36 2023 -0600

    Update hints

commit c9773476a45e5b416875fb7ab6790d516bb88194
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Thu Jan 5 22:22:24 2023 -0600

    Fix warnings and errors generating graphs

commit b071d59932febc83f947a349f0433b1d9e04c07b
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Thu Jan 5 16:02:32 2023 -0600

    Code cleanup

    Fix typos, grammar.
    Remove unused vars.
    Optimize imports.
    Rename dupe/repeated vars.

commit df20e2278164eea3bb61f203c4e2b2cfa3d3a07a
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Thu Jan 5 15:13:11 2023 -0600

    clean

commit 9477b2e55fdcc48eb1d2b73a57f34c99cdbff696
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Thu Jan 5 14:07:06 2023 -0600

    Bippity Boppity Booyah!

    Remove "class_buckets" param.
    Remove excess imports.
    Fix saving checkpoint counts during training.
    Ensure images returned to UI are images.
    Fix bucket matching, class images generation.
    Use same methods for bucketing and class generation.
    Add "Debug Buckets" button to UI.
    Fix sample image generation only generating samples for one concept.

commit 9ceb5c27980f19ec3e5ea50cb5186e28e3245d1c
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Wed Jan 4 13:13:46 2023 -0600

    Bad Bucket Matching Is Bad

    This is betterbuckets! We can't have bad bucket batching up in this bio...logy class. Better bucket bucketing is all about bucketing the buckets better while batching the buckets with the best bucket browsing.

commit e643624aa2adc788422941253cb96453d2689327
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Wed Jan 4 12:26:52 2023 -0600

    Building Better Bucket Resos

commit 690761f3a3bcdbb7a7ad99cec0f95a61ed0baf78
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Wed Jan 4 11:34:33 2023 -0600

    This too

commit efb79859101b82e9a2deddd5d47147881e24cb9d
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Wed Jan 4 11:34:15 2023 -0600

    More class generation work, fixes

commit 4325dfacf7fbb1de7c6eb671ea3f33a9ae5b6065
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Tue Jan 3 13:51:01 2023 -0600

    Remove old finetuning_dataset

commit e99940ca215660e2eda1e2e4fe54421623197f58
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Tue Jan 3 13:30:18 2023 -0600

    Cleanup print messages...

commit ff0ea01a965699d90bc8b949da8676ad34f39748
Merge: fe605c2 6d17489
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Tue Jan 3 12:56:44 2023 -0600

    Merge pull request #675 from Zuxier/FriedChickenBuckets

    Rework of stop text encoder and fix to reported total batch size

commit 6d1748935d
Author: Zuxier <120954436+Zuxier@users.noreply.github.com>
Date:   Tue Jan 3 18:57:39 2023 +0100

    add tenc encoder ratio and total batch size fix

commit fe605c250e
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Tue Jan 3 09:47:42 2023 -0600

    Cleanup

commit 9dfbebfc4c
Author: d8ahazard <d8ahazard@gmail.com>
Date:   Tue Jan 3 09:46:51 2023 -0600

    Better Bucketing

    Instead of randomly grabbing data out of one bucket, actually grab N instance images and N class images of matching resolution and caption, which should be more aligned with the idea of reg images.

    Refactor num_class_images to num_class_images_per - which ensures each instance image has N matching class images. Again, this should better align with the whole point of reg images.

    Better VRAM management while training as well.

    Add specific flag to allow debugging of vram stuff without constantly adding/removing print messages.
2023-01-06 08:49:26 -06:00
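The "Better Bucketing" commit in the squash above describes pairing each instance image with N class images from the same resolution bucket. A hypothetical sketch of that pairing logic (function and dict-key names are assumptions, not the extension's actual code):

```python
import random
from collections import defaultdict

def pair_instances_with_classes(instance_images, class_images, n_per_instance, seed=0):
    """For each instance image, pick n class (regularization) images from the
    same resolution bucket, so reg images match the instance they regularize."""
    rng = random.Random(seed)
    by_res = defaultdict(list)
    for img in class_images:
        by_res[img["res"]].append(img)  # bucket class images by resolution
    pairs = []
    for inst in instance_images:
        candidates = by_res.get(inst["res"], [])
        chosen = rng.sample(candidates, min(n_per_instance, len(candidates)))
        pairs.append((inst, chosen))
    return pairs
```

This also matches the `num_class_images` → `num_class_images_per` rename: N is per instance image, not a global pool size.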
d8ahazard e85a34648f Show days if training takes that long... 2023-01-01 14:13:15 -06:00
d8ahazard 657dfc0c3a Seriously, Gradio...
Why does ONE method insist on passing any values as a list, but every other one doesn't???
2023-01-01 12:44:32 -06:00
d8ahazard e5908ab783 Fix latents issue, move grad_acc_steps closer to batch in UI 2023-01-01 11:35:33 -06:00
d8ahazard 645e71725c Remove unused "max steps" param from concepts, UI fixes.
Remove unused "max steps" param. Epochs does the same thing with less confusion and math.
Add JS hints for missing elements.
Re-arrange gradients checkpoint steps so it's next to batch size.
2023-01-01 11:26:52 -06:00
d8ahazard 5fcd5d6ae7 Don't hide API key with new gradio. 2023-01-01 10:37:43 -06:00
d8ahazard 88d1dfa5a8 UI Improvements, Fix Defaults, Moar
Fix weight mismatch (again).
Update default params.
Add warnings when saving without loading params first.
Delete vae if not needed.
Wizardy wizard stuff.
Adjust hidden UI elements on params load.
2022-12-31 17:14:22 -06:00
d8ahazard ff2fbe4887 More fixes, cleanup
Really, fix step counting to align with epoch stop.
Select individual image prompts in gallery.
Suppress tensorflow message on import.
Remove print messages that break the status bar.
Update state messages when generating training dataset.
2022-12-31 11:28:57 -06:00
d8ahazard 30e30c0477 Fix float mismatch, ui "generate log" button. 2022-12-30 14:56:44 -06:00
d8ahazard f3469fcc1b REALLY fix class img path, graph smoothing, fix class img counts. 2022-12-30 14:33:21 -06:00
d8ahazard 9259feea02 Fun with schedulers, fix fp16 issues? 2022-12-30 11:35:54 -06:00
d8ahazard ea3fbaa7f5 Merge branch 'main' into ImageBuilder+ 2022-12-30 00:14:37 -06:00
d8ahazard 0480f9c05b Fix load params, step update 2022-12-30 00:06:36 -06:00
d8ahazard b05b8cb6f7 We got us a firesale!
Remove max train steps entirely. It's a terrible measure of training duration.
Remove save using vars, because we don't need them.
2022-12-29 23:30:56 -06:00
d8ahazard d55390a4cf Buckets and self-adjusting batch sizes for all!
Add resolution bucketing from kohya_ss.
Add text encoder training limit.
Remove dumb adam beta params.
Remove printm calls, mem_record.
Superdataset is all but dead.
Lr loss average still looks janky AF.
Add Clip skip UI Param
Remove "scale learning rate" param
2022-12-29 20:57:17 -06:00
d8ahazard a4de0e104e Fix memory pinning, cleanup 2022-12-28 17:48:59 -06:00
d8ahazard 1ce5116020 Another wishlist of changes...
Add "sanity prompt" and "sanity seed" options - generate a sample while training with an unrelated prompt to sanity check the overall state of the model.
Fixes for copying original config file when extracting/generating checkpoints.
Status update changes.
Better unique sample generation when selecting sample prompts.
Add cosine annealing with warm restarts lr scheduler.
Fix step counts (AGAIN).
Overlay graph for plot showing lr and loss avg.
Remove leftover use_cpu call.
2022-12-28 16:41:59 -06:00
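The cosine-annealing-with-warm-restarts scheduler added in the commit above has a simple closed form; a minimal sketch with a fixed restart period (the full scheduler also supports a growing period multiplier, assumed away here):

```python
import math

def cosine_warm_restart_lr(step, t0, lr_max, lr_min=0.0):
    """LR follows a cosine from lr_max down to lr_min over t0 steps,
    then jumps back to lr_max (a "warm restart") and repeats."""
    t_cur = step % t0  # position within the current restart cycle
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t_cur / t0))
```

The periodic jump back to `lr_max` is what distinguishes this from plain cosine annealing, and is visible as a sawtooth in the LR/loss overlay graph mentioned in the same commit.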
d8ahazard 9a5a51224e Stupid counters 2022-12-26 15:14:49 -06:00
d8ahazard 9040494e05 Update main.py 2022-12-26 14:52:50 -06:00
d8ahazard dfb92f8255 Moar fixes
Respect max_train_epoch if > 1.
Fix saving sample txt.
Fix saving snapshot when canceling.
Fix generating save/image if freq == 0.
2022-12-26 14:51:18 -06:00
d8ahazard 21fcfa33d9 Performance tweaking, UI Work
Attempt to show multiple images generated during training in gallery.
Add v2 tokenizer method to superdataset.
Add "set_diffusers_xformers_flag" method from kohya_ss
Adjust dataloader params to not pin memory and use 0 workers.
Revert breaking change in log parser (for now).
2022-12-26 12:25:51 -06:00
d8ahazard 1271aae9ab Work sync 2022-12-25 19:24:40 -06:00
d8ahazard 0a18203375 Fixes
Remove overlooked use_cpu call.
Try to prevent OOM on log image generation.
Fix VRAM log images/logging.
Fix imagic return type.
Fix UI sample generation.
2022-12-23 11:18:34 -06:00
d8ahazard 20b93e8f6a More massive messy commits!
Add fancy new graph generation showing LR and loss values.
Add LR and loss values to UI during updates.
Fix UI layout for progress bar and textinfos.
Remove hypernetwork junk from xattention, it's not applicable when training a model.
Add get_scheduler method from unreleased diffusers version to allow for new LR params.

Add wrapper class for training output - need to add for imagic yet.

Remove use CPU option entirely, replace with "use lora" in wizard.
Add grad accumulation steps to wizard.
Add lr cycles and lr power UI params for applicable LR schedulers.
Remove broken min_learning_rate param.
Remove unused "save class text" param.
Update js ui hints.
Bump diffusers version.
Make labels more useful, auto-adjusting as needed.
Add manual "check progress" button to UI, because "gradio".
2022-12-22 17:27:26 -06:00
d8ahazard 653c41dfb6 Break more things!
Start replacing references to native sd-webui methods with calls to a single wrapper class.
Add a minimum learning rate param for polynomial scheduler.
Add model epoch to UI.
Revert "actual_steps" value, because it's just more confusing...
Fix UI popups, regardless of if they work in main.
Add loss, vram usage to UI status text.
Fix tag shuffle, keep "first tag" in place.
Shuffle sample prompts?
Move DreamState class to db_shared.
2022-12-21 18:18:07 -06:00
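The 'Fix tag shuffle, keep "first tag" in place' line above implies shuffling caption tags while pinning the leading tag (typically the instance token). A hypothetical sketch of that behavior, assuming comma-separated captions (not the extension's actual code):

```python
import random

def shuffle_tags(caption, seed=None):
    """Shuffle comma-separated caption tags, keeping the first tag
    in its original position."""
    tags = [t.strip() for t in caption.split(",")]
    head, rest = tags[0], tags[1:]
    random.Random(seed).shuffle(rest)  # only the tail is shuffled
    return ", ".join([head] + rest)
```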
d8ahazard de6601cb85 MOAR Cleanup
Fix indenting, API return values, remove unused prints and messages.
Rename some invalid objects.
Add __future__ imports to try to not break python < 3.9
Remove reference to misnamed SD method that will eventually get fixed and then break everything.
Clean up that potty mouth.
2022-12-21 10:13:35 -06:00
d8ahazard b6cb2233c8 Save epoch properly 2022-12-20 16:37:29 -06:00
d8ahazard cdbe52f714 Updates and fixes 2022-12-20 16:01:52 -06:00
d8ahazard 927018e831 API Fix, Generate Classes from training
Fix number/params for generate classes.
Fix API savefile method.
2022-12-20 14:44:21 -06:00
d8ahazard 07df22acbc Moar API, Image generation fixes. 2022-12-20 11:55:30 -06:00
d8ahazard c08fb42854 API Work, implement save params
Implement save options for checkpoints, snapshots, etc.
Fix loading checkpoint values from snapshots.
Work on adding missing API features.
2022-12-19 09:17:53 -06:00
d8ahazard d909998d94 WORK
Add additional save settings to config.
Update saving code, not to use the new settings, but just to work better when using batches.
Move "Generate Sample Images" to dreambooth.py so we can reload it on the fly.
Hide progressbar on start of new function.
Progress bar style work/fixes.
2022-12-18 12:38:20 -06:00
d8ahazard 6daa355419 Hide cancel button when not needed. 2022-12-17 17:38:37 -06:00
d8ahazard 56034ca683 Making Gradio my biotch...
Create unique db_state object we can use for stuff.
Add custom method wrappers so we don't depend on main.
Fix gallery, status messages.
Move things to /scripts folder so we can reload them faster.
Moar ui improvements.
2022-12-17 17:29:15 -06:00
Llewellyn Pritchard 5af3ef2733 Set warmup steps to 0 by default. Add extra tooltip info on why it could be bad. 2022-12-16 17:00:32 +02:00
d8ahazard 4f496d8f75 LINT 2022-12-15 13:01:02 -06:00
d8ahazard 4beb9f78ee Merge branch 'main' into ImageBuilder+ 2022-12-15 08:33:35 -06:00
d8ahazard e8d788b17a Merge pull request #532 from ExponentialML/dev
Lora training fixes for saving and proper training.
2022-12-15 08:23:21 -06:00
Matt 6cf559be42 Add lora text alpha slider and change params to floats 2022-12-14 21:56:48 -08:00
d8ahazard ea1603eddc Imagebuilder+
Add generate class button.
Fix saving params on method calls.
Add tensorboard profiling/flag.
Add custom apply lora weights method.
Add option to generate classifiers with txt2img.
Add save/load optimizer checkpointing.
Update extract/compile checkpoint code from diffusers.
2022-12-14 22:43:06 -06:00
Matt eba043649f - Fixed saving and enabling params
- Fixed errors when piping parameters through functions
- Custom name saving for models and lora checkpoints
2022-12-14 14:23:16 -08:00
Matt 7d9f309b35 Add custom name values, functions, and parameters to compile_checkpoint 2022-12-14 12:23:56 -08:00
Matt a1e2a87b48 Create parameters for custom model name. 2022-12-14 12:10:57 -08:00