Commit Graph

583 Commits (4e1f8a2b711784636e27e3db2d9d5eb7ee7170a4)

Author SHA1 Message Date
Vladimir Mandic e9055c7cd8 major refactor 2023-09-04 11:31:29 -04:00
Vladimir Mandic 8258313555 Merge pull request #2128 from vladmandic/master: refresh dev 2023-09-03 15:59:45 -04:00
Vladimir Mandic d41baddf50 add styles to extra networks 2023-09-03 15:00:48 -04:00
Vladimir Mandic cbe779b541 test model jit tracing 2023-09-02 09:55:53 -04:00
Disty0 3287222177 Fix Kandinsky compile and Fix steps with non SDXL 2023-09-02 14:28:11 +03:00
Disty0 e5e1eeeac9 Cleanup 2023-09-02 01:34:01 +03:00
Vladimir Mandic 603ff9fc75 minor fixes 2023-08-31 12:00:29 -04:00
Disty0 418a085246 Update Readme 2023-08-31 14:43:29 +03:00
Disty0 77fce3e8c8 Fix OpenVINO on Windows 2023-08-31 12:58:39 +03:00
Vladimir Mandic d45e6a04cd improve vae reload 2023-08-30 08:49:27 -04:00
Vladimir Mandic 5e14178b3d fix tomed error 2023-08-30 02:38:25 -04:00
Vladimir Mandic 10f345f09b handle loading invalid model or pipeline 2023-08-29 15:59:57 -04:00
Vladimir Mandic 9e2e2b8b8f fix gallery, update js logging, add en caching 2023-08-29 14:16:37 -04:00
Vladimir Mandic 48c0ce9b2b fix model lookups 2023-08-27 08:01:29 +00:00
Vladimir Mandic c4263da0e6 fix refiner reload/unload 2023-08-26 13:35:59 +00:00
Disty0 8ef4aa7a4a OpenVINO fix resolution change 2023-08-25 16:12:10 +03:00
Vladimir Mandic 0512e2973c refactor settings ui 2023-08-24 10:01:39 +02:00
Vladimir Mandic 84f343687d fix python 3.9 compatibility 2023-08-23 08:18:40 +00:00
Disty0 863fa38c24 OpenVINO fix model reloading 2023-08-23 00:31:43 +03:00
Vladimir Mandic 6a4d4ea5b7 update logging and model hashing 2023-08-22 18:28:09 +00:00
Disty0 80d2cc46e8 Add precompile as an option 2023-08-22 13:21:25 +03:00
Vladimir Mandic 93bf3bb263 set default settings 2023-08-21 08:31:50 +00:00
Vladimir Mandic 4826197d5b Merge pull request #2047 from vladmandic/master: update dev 2023-08-20 14:43:09 +02:00
Disty0 05084a5f53 Diffusers fix dtypes 2023-08-19 19:27:06 +03:00
Disty0 f9718f068c Separate OpenVINO from IPEX 2023-08-19 17:52:15 +03:00
Vladimir Mandic 5eac99d3f5 optimize diffusers memory handling 2023-08-18 20:41:34 +00:00
Vladimir Mandic 3d914688cc update metadata 2023-08-15 05:50:15 +02:00
vladmandic 79c0131158 Deploying to master from @ vladmandic/automatic@b1ea529c08 🚀 2023-08-15 12:25:08 +00:00
Disty0 86ae8175e0 Separate OpenVINO from IPEX 2023-08-15 15:22:54 +03:00
Vladimir Mandic c5c817f482 img2img batching 2023-08-13 09:19:39 +00:00
Vladimir Mandic a156751857 fix pipeline autodetect 2023-08-10 21:56:26 +00:00
Vladimir Mandic f52249d5a8 fix prompt parser for sdxl and enable offloading 2023-08-10 21:20:56 +00:00
Vladimir Mandic 5bcd65d4c2 revert meta 2023-08-10 19:00:04 +00:00
Disty0 a9726b3319 Send to meta when unloading 2023-08-10 01:23:39 +03:00
Vladimir Mandic 0a3e821067 diffuser auto-pipeline and fix vae 2023-08-07 17:19:30 +00:00
Vladimir Mandic 7c4fdbff1b update taesd 2023-08-05 08:56:45 +02:00
Vladimir Mandic a12c9117e6 add requirement check install flag 2023-08-04 11:23:25 +02:00
Disty0 5f5a564d41 Update compile settings 2023-08-03 22:38:43 +03:00
Disty0 44b17b7418 Add compile type option 2023-08-03 17:51:29 +03:00
Disty0 4535a99fff Model compile support for IPEX 2023-08-03 17:25:15 +03:00
Disty0 80b834054b CPU offload mode check & Enable compile for IPEX 2023-08-01 02:51:50 +03:00
Disty0 8ffaea76ba Add Diffusers model and VAE variant loading option 2023-07-31 14:39:41 +03:00
Disty0 350be09282 workaround for model CPU offload refiner 2023-07-30 17:18:57 +03:00
Disty0 0180402563 Better move and accelerate handling 2023-07-30 11:36:47 +03:00
Disty0 66a6e783f0 has_accelerate = False for original backend 2023-07-30 01:22:29 +03:00
Kubuxu 4d00224082 Fix pipeline switching 2023-07-30 00:51:35 +03:00
Kubuxu 7e1030d499 Introduce sd_model.has_accelerate 2023-07-30 00:51:35 +03:00
Kubuxu f945cf14b0 Fix model offload by not forcing the model to GPU 2023-07-30 00:51:35 +03:00
Disty0 085d1da825 Fix force upcast VAE with Diffusers 2023-07-29 21:01:24 +03:00
Disty0 3258b27523 Update sequential CPU offload check 2023-07-29 19:13:29 +03:00
Disty0 025df86ac0 Cleanup 2023-07-29 02:06:34 +03:00
Kubuxu 1da64b9d08 Fix setting VAE Force Upcast in diffusers 2023-07-28 23:07:11 +01:00
Disty0 dbd7887632 Fix refiner unloading 2023-07-28 23:12:45 +03:00
Disty0 c8af2affaf Fix original backend reloading 2023-07-28 21:26:54 +03:00
Disty0 cfcf481992 cleanup 2023-07-28 21:10:38 +03:00
Disty0 e6cf3d72cd Fix sequential cpu offload 2023-07-28 20:01:32 +03:00
Seunghoon Lee 77de9cd093 Fix medvram with DirectML. 2023-07-28 23:18:28 +09:00
Seunghoon Lee 0f44332e5c
Make sequential CPU offload available for non-CUDA
Add settings override for DirectML.
Move `devices.set_cuda_params()` to correct line.
2023-07-28 23:11:57 +09:00
Vladimir Mandic 4a64eef568 Merge pull request #1831 from vladmandic/improve_diffusers_kandinsky_2: [Diffusers] Make all Kandinsky work 2023-07-26 16:04:08 -04:00
Vladimir Mandic ff9b8bc062 api endpoint refresh vaes 2023-07-26 15:51:19 -04:00
Vladimir Mandic e79bdcdea3 fix startup without model 2023-07-26 14:41:29 -04:00
Patrick von Platen e6baac124d fix ruff 2023-07-25 21:50:35 +00:00
Patrick von Platen a9d239ab51 [Diffusers] Make all Kandinsky work 2023-07-25 21:47:49 +00:00
Vladimir Mandic 4c2664dbc3 redesign diffuser vae handling 2023-07-21 14:30:57 -04:00
Vladimir Mandic b31fa98669 fixes... 2023-07-21 09:28:02 -04:00
Disty0 57d1d3ed16 Fix Kandinsky safety_checker and compile 2023-07-20 14:29:15 +03:00
Steven 82d2a601c2 When searching for model info for the checkpoint specified by the ckpt command-line argument, strip the path from the argument so that we only search for the model's filename. 2023-07-18 18:47:39 -04:00
Vladimir Mandic debec28be6 rework settings, image-watermark, safe config handling 2023-07-18 14:41:27 -04:00
Vladimir Mandic 57dd6652df fix compile logging 2023-07-17 19:30:53 +00:00
Vladimir Mandic 926a0fde1a diffusers code refactoring and exception handling 2023-07-17 12:22:51 -04:00
Vladimir Mandic 4e48173e37 enable sdxl vae 2023-07-16 17:58:08 -04:00
Vladimir Mandic 55c0269032 minor fixes 2023-07-15 10:18:08 -04:00
Vladimir Mandic 9308c32ad2 update samplers and callbacks 2023-07-15 08:44:02 -04:00
Jack Wooldridge 7fdda2c8b3 MPS fix 2023-07-14 15:34:14 -04:00
Disty0 2a9133bfec IPEX rework 2023-07-14 17:33:24 +03:00
Disty0 558b71f088 Fix img2img and hires for IPEX 2023-07-14 02:21:52 +03:00
Disty0 25389f737b Better logging & Set IPEX MemSize from 80% to 100% 2023-07-13 19:24:58 +03:00
Vladimir Mandic c55024fe1b set backend persist restarts 2023-07-13 11:53:23 -04:00
Vladimir Mandic fdb76ddf12 model load exception handling 2023-07-13 10:45:58 -04:00
Vladimir Mandic 186bd236f5 js monitor ops 2023-07-13 09:50:38 -04:00
Vladimir Mandic 6947776dcb fix imageview direction 2023-07-12 16:59:02 -04:00
Vladimir Mandic e4a682de2b fix priorpipeline 2023-07-12 16:46:25 -04:00
Vladimir Mandic 5c8ead7be0 update diffusers 2023-07-12 15:35:41 -04:00
Disty0 f0506fd517 Move ipex optimize to compile 2023-07-12 19:58:08 +03:00
Disty0 562ca33275 Fix Diffusers _conv_forward dtype error with IPEX 2023-07-12 02:03:45 +03:00
Vladimir Mandic ec99bad021 enable backend switching on-the-fly 2023-07-11 15:55:02 -04:00
Vladimir Mandic 6d277305f6 update processing 2023-07-11 11:40:53 -04:00
Disty0 a844a83d9d VRAM efficient refiner loading for compiler 2023-07-11 11:00:45 +03:00
Vladimir Mandic 75a8c1f9d0 enable basic img2img 2023-07-10 11:44:52 -04:00
Disty0 4152c2049b Update IPEX logging 2023-07-10 14:49:42 +03:00
Disty0 798c5f23c5 VRAM efficient IPEX Optimize for refiner 2023-07-09 19:14:30 +03:00
Vladimir Mandic a16eee1504 bugfix release 2023-07-09 10:06:47 -04:00
Disty0 a9dab70a2c Fix IPEX Optimize with Diffusers 2023-07-09 16:17:50 +03:00
Disty0 d915d8d659 Disable xpu.optimize for SD 1-2 2023-07-09 03:09:37 +03:00
Disty0 467197a7f1 xpu.optimize for diffusers 2023-07-09 02:35:12 +03:00
Vladimir Mandic 2a21196061 Merge branch 'master' into dev 2023-07-08 13:35:25 -04:00
Vladimir Mandic 3e61907bfe minor fixes 2023-07-08 13:17:12 -04:00
Vladimir Mandic a79b8c86c2 cleanup before merge 2023-07-08 12:20:37 -04:00
Vladimir Mandic 89a7ea6a3f overall quality fixes 2023-07-08 09:49:41 -04:00
Disty0 4459cc581a Prior device.type cuda or xpu 2023-07-07 23:03:29 +03:00
Vladimir Mandic 120710f28a force model variant 2023-07-07 13:38:04 -04:00
Vladimir Mandic 3e4ca0095e fix compile 2023-07-07 10:05:32 -04:00
Vladimir Mandic 3e1a6a96d0 add additional pipelines 2023-07-07 09:38:19 -04:00
Vladimir Mandic 9f96d4f657 update notes 2023-07-06 20:21:01 -04:00
Vladimir Mandic 7e11ff2b34 add sdxl support 2023-07-06 19:26:43 -04:00
Vladimir Mandic dd4602fd64 update dynamo logging 2023-07-05 18:58:26 -04:00
Vladimir Mandic be0bfbcd27 fix samplers config 2023-07-05 11:00:29 -04:00
Disty0 966eed8dd9 Autodetect IPEX 2023-07-04 23:37:36 +03:00
Vladimir Mandic 2524b6659c double package install pass 2023-07-04 16:04:22 -04:00
Vladimir Mandic 191da73d48 diffuser sampler settings 2023-07-04 14:10:31 -04:00
Vladimir Mandic 18ef9e6fd7 redo diffusers scheduler 2023-07-04 13:07:05 -04:00
Vladimir Mandic b216a35ddd update diffusers and extra networks 2023-07-04 09:28:48 -04:00
Vladimir Mandic 8241e33868 major diffusers update 2023-07-03 16:48:03 -04:00
Vladimir Mandic cc685a8729 wip diffusers 2023-07-02 21:07:26 -04:00
Vladimir Mandic a2caafe4df initial diffusers merge into dev 2023-07-02 14:04:54 -04:00
Vladimir Mandic fbbb56f6ca disallow ckpt option 2023-07-01 16:12:38 -04:00
Vladimir Mandic 2345237844 fix race condition on model load via api 2023-06-19 09:13:17 -04:00
Vladimir Mandic 457dddf7aa refactor html-info and do some linting cleanups 2023-06-18 11:38:42 -04:00
Vladimir Mandic 8d80b5f6d9 add server class 2023-06-17 13:44:55 -04:00
Vladimir Mandic c52c63128b handle sending of deleted images 2023-06-15 14:08:12 -04:00
Vladimir Mandic ba6f9fb4c9 fix callbacks 2023-06-15 10:31:30 -04:00
Vladimir Mandic 0ddf613b49 jumbo merge part two 2023-06-14 11:23:02 -04:00
Vladimir Mandic 1d9e490ef9 ruff linting fixes 2023-06-13 12:22:39 -04:00
Vladimir Mandic cb307399dd jumbo merge 2023-06-13 11:59:56 -04:00
Vladimir Mandic f510abed39 fix sd metadata 2023-06-13 07:22:48 -04:00
Vladimir Mandic eb47acf552 add metadata cache 2023-06-12 22:11:25 -04:00
Vladimir Mandic 1d0a18ef4a reorg server startup 2023-06-11 09:00:38 -04:00
Vladimir Mandic aaa0d46286 update installer and add sd_model_dict 2023-06-07 13:26:21 -04:00
Vladimir Mandic 82095082af fix model path on initial install 2023-06-06 15:32:22 -04:00
Patrick von Platen 9cf5888479 dedup code 2023-06-05 18:23:35 +00:00
Patrick von Platen 46d410687e Improve when loading from diffusers 2023-06-05 18:22:17 +00:00
Disty0 d6b3504f4d Compile with IPEX Optimize 2023-06-02 13:04:50 +03:00
Alexander Brown 2e7aa7eb15 Raise exception when failing to find diffuser model 2023-05-31 08:54:58 -07:00
Vladimir Mandic d9f72b066f precalc hashes 2023-05-31 09:14:34 -04:00
Vladimir Mandic 5f1fd7bd66 update common ui 2023-05-29 13:43:03 -04:00
Vladimir Mandic 54257dd226 refactoring for pylint 2023-05-28 17:09:58 -04:00
Vladimir Mandic 851d129680 more diffusers work 2023-05-27 15:49:54 -04:00
Vladimir Mandic efd3810860 diffusers merge 2023-05-26 22:42:03 -04:00
Disty0 8022de7464 Fix AVX512 error when using low or med vram with ipex 2023-05-26 23:43:37 +03:00
Vladimir Mandic 9a3a56dbb2 fix ipex device 2023-05-25 13:40:48 -04:00
Vladimir Mandic fc82ea2d7e cache loaded model 2023-05-25 08:51:46 -04:00
Disty0 5614a4c3fd Fix typo 2023-05-24 13:11:02 +03:00
Disty0 0412651a6c Send to CPU instead of XPU when unloading 2023-05-24 13:03:40 +03:00
Vladimir Mandic b6289d56c7 cleanup 2023-05-20 13:36:27 -04:00
Vladimir Mandic 0891b30ffe update 2023-05-20 08:29:29 -04:00
Vladimir Mandic 4c4e147baa fully localize data-dir 2023-05-19 15:23:26 -04:00
Vladimir Mandic 6221ccba4f change default model on download 2023-05-19 14:06:46 -04:00
Vladimir Mandic 325c0945d2 update model path 2023-05-18 14:04:06 -04:00
Vladimir Mandic 8b682183e3 update gradio 2023-05-18 10:41:24 -04:00
Vladimir Mandic df1fae7248 fix models path 2023-05-18 10:17:39 -04:00
Vladimir Mandic 6c66228cde fix models dir 2023-05-18 08:17:49 -04:00
Vladimir Mandic 0ccda9bc8b jumbo patch 2023-05-17 14:15:55 -04:00
Vladimir Mandic c99c1410f5 update 2023-05-14 20:25:27 -04:00
Vladimir Mandic 760f5fb89a add extra debug messages 2023-05-14 12:26:15 -04:00
Vladimir Mandic 85d67d6331 add interrupt to processing 2023-05-14 12:13:44 -04:00
Vladimir Mandic 1943bfea88 use cudnn workaround 2023-05-11 22:24:12 -04:00
Vladimir Mandic 05656a54fe update extra networks 2023-05-11 09:30:34 -04:00
Vladimir Mandic e038bf1549 aggressive gc 2023-05-10 16:03:55 -04:00
Vladimir Mandic d8a2c32918 xyz grid optimizations 2023-05-09 10:41:23 -04:00
Vladimir Mandic 8203bd5c97 update 2023-05-09 09:09:31 -04:00
Vladimir Mandic 8062f9197d run without checkpoint 2023-05-09 09:09:31 -04:00
Vladimir Mandic 41182009cb switch some cmdopts to opts 2023-05-08 09:27:50 -04:00
Vladimir Mandic 1360c6422a add fp16 test 2023-05-08 09:27:50 -04:00
Vladimir Mandic fe496f4ebc add train preprocess options 2023-05-05 09:06:06 -04:00
Vladimir Mandic c470f39913 merge fixes 2023-05-04 16:55:41 -04:00
Disty0 8171d57c36 Remove unnecessary IPEX imports 2023-05-04 02:34:34 +03:00
Disty0 7577a09528 Add IPEX Optimizers and use XPU instead of CPU when using IPEX 2023-05-03 18:12:38 +03:00
Vladimir Mandic 6f976c358f optimize model load 2023-05-02 21:30:34 -04:00
Vladimir Mandic eb03fce3e4 fix logger 2023-05-02 15:57:28 -04:00
Vladimir Mandic 7a083d322b merge commits 2023-05-02 15:06:06 -04:00
Vladimir Mandic cb4cff3929 redesign logging 2023-05-02 13:57:16 -04:00
Vladimir Mandic 22da90d4b8 fix lora memory leak 2023-05-01 10:13:21 -04:00
Disty0 68fc95b2e1 Merge remote-tracking branch 'upstream/master' 2023-04-30 18:28:22 +03:00
Disty0 de8d0bef9f More patches and Import IPEX after Torch 2023-04-30 18:19:37 +03:00
Vladimir Mandic 682330b172 new command line parser 2023-04-30 10:54:59 -04:00
Disty0 b075d3c8fd Intel ARC Support 2023-04-30 15:13:56 +03:00
Vladimir Mandic 20b64aad7b update samplers 2023-04-24 16:16:52 -04:00
Vladimir Mandic cf277e7326 fix dtype logic 2023-04-21 15:04:05 -04:00
Vladimir Mandic 4417d570aa Merge pull request #233 from Yan233th/master: Fix hasattr to in method 2023-04-21 09:29:51 -04:00
papuSpartan 9e8dc9843c port to vlad 2023-04-21 03:18:08 -05:00
Vladimir Mandic 7939a1649d parse model preload 2023-04-20 23:19:25 -04:00
Vladimir Mandic 0282832f12 fix vae path 2023-04-20 15:50:06 -04:00
Vladimir Mandic 5a0664c945 fixes 2023-04-20 15:35:40 -04:00
Vladimir Mandic 752b91d38a fix model download 2023-04-20 12:29:54 -04:00
Vladimir Mandic 0e7144186d jump patch 2023-04-20 11:20:27 -04:00
Vladimir Mandic 8b1f26324b optional model loader and integrate image info 2023-04-17 15:31:43 -04:00
Vladimir Mandic fd51bb90d0 enable quick launch 2023-04-15 11:51:58 -04:00
Vladimir Mandic ed8819b8fc lycoris, strong linting, model keyword, circular imports 2023-04-15 10:28:31 -04:00
Vladimir Mandic 2ece9782e4 handle duplicate extensions and redo exception handler 2023-04-14 09:57:53 -04:00
Vladimir Mandic 81b8294e93 switch cmdflags to settings 2023-04-12 10:40:11 -04:00
Vladimir Mandic ffc54d0938 update launcher 2023-04-06 11:23:25 -04:00
Vladimir Mandic 8cc3a64201 redo timers 2023-04-04 10:18:39 -04:00
Yan233_ 017398885a Fix hasattr to in method 2023-03-29 23:32:48 +08:00
Vladimir Mandic 86b83fc956 Merge pull request #66 from AUTOMATIC1111/master: merge from upstream 2023-03-28 16:43:39 -04:00
AUTOMATIC 1b63afbedc sort hypernetworks and checkpoints by name 2023-03-28 20:03:57 +03:00
AUTOMATIC1111 f1db987e6a Merge pull request #8958 from MrCheeze/variations-model: Add support for the unclip (Variations) models, unclip-h and unclip-l 2023-03-28 19:39:20 +03:00
Vladimir Mandic 6fe6eff9b4 improve error handling 2023-03-26 21:50:15 -04:00
MrCheeze 1f08600345 overwrite xformers in the unclip model config if not available 2023-03-26 16:55:29 -04:00
Vladimir Mandic 404a2a2cb2 fix broken generate and add progress bars 2023-03-26 14:23:45 -04:00
MrCheeze 8a34671fe9 Add support for the Variations models (unclip-h and unclip-l) 2023-03-25 21:03:07 -04:00
Vladimir Mandic 284bbcd67b update modules 2023-03-25 09:25:13 -04:00
AUTOMATIC1111 956ed9a737 Merge pull request #8780 from Brawlence/master: Unload and re-load checkpoint to VRAM on request (API & Manual) 2023-03-25 12:03:26 +03:00
carat-johyun 92e173d414 fix variable typo 2023-03-23 14:28:08 +09:00
Φφ 4cbbb881ee Unload checkpoints on Request to free VRAM

New Action buttons in the settings to manually free and reload checkpoints, essentially
juggling models between RAM and VRAM.
2023-03-21 09:28:50 +03:00
AUTOMATIC 6a04a7f20f fix an error loading Lora with empty values in metadata 2023-03-14 11:22:29 +03:00
AUTOMATIC c19530f1a5 Add view metadata button for Lora cards. 2023-03-14 09:10:26 +03:00
w-e-w 014e7323f6 when exists 2023-02-19 20:49:07 +09:00
w-e-w c77f01ff31 fix auto sd download issue 2023-02-19 20:37:40 +09:00
missionfloyd c4ea16a03f Add ".vae.ckpt" to ext_blacklist 2023-02-15 19:47:30 -07:00
missionfloyd 1615f786ee Download model if none are found 2023-02-14 20:54:02 -07:00
AUTOMATIC 668d7e9b9a make it possible to load SD1 checkpoints without CLIP 2023-02-05 11:21:00 +03:00
AUTOMATIC 3e0f9a7543 fix issue with switching back to checkpoint that had its checksum calculated during runtime mentioned in #7506 2023-02-04 15:23:16 +03:00
AUTOMATIC1111 c0e0b5844d Merge pull request #7470 from cbrownstein-lambda/update-error-message-no-checkpoint: Update error message WRT missing checkpoint file 2023-02-04 12:07:12 +03:00
AUTOMATIC 81823407d9 add --no-hashing 2023-02-04 11:38:56 +03:00
Cody Brownstein fb97acef63 Update error message WRT missing checkpoint file
The Safetensors format is also supported.
2023-02-01 14:51:06 -08:00
AUTOMATIC f6b7768f84 support for searching subdirectory names for extra networks 2023-01-29 10:20:19 +03:00
AUTOMATIC 5d14f282c2 fixed a bug where after switching to a checkpoint with unknown hash, you'd get empty space instead of checkpoint name in UI
fixed a bug where if you update a selected checkpoint on disk and then restart the program, a different checkpoint loads, but the name is shown for the old one.
2023-01-28 16:23:49 +03:00
Max Audron 5eee2ac398 add data-dir flag and set all user data directories based on it 2023-01-27 14:44:30 +01:00
AUTOMATIC 6f31d2210c support detecting midas model
fix broken api for checkpoint list
2023-01-27 11:54:19 +03:00
AUTOMATIC d2ac95fa7b remove the need to place configs near models 2023-01-27 11:28:12 +03:00
AUTOMATIC1111 1574e96729 Merge pull request #6510 from brkirch/unet16-upcast-precision: Add upcast options, full precision sampling from float16 UNet and upcasting attention for inference using SD 2.1 models without --no-half 2023-01-25 19:12:29 +03:00
Kyle ee0a0da324 Add instruct-pix2pix hijack
Allows loading instruct-pix2pix models via same method as inpainting models in sd_models.py and sd_hijack_ip2p.py

Adds ddpm_edit.py necessary for instruct-pix2pix
2023-01-25 08:53:23 -05:00
brkirch 84d9ce30cb Add option for float32 sampling with float16 UNet
This also handles type casting so that ROCm and MPS torch devices work correctly without --no-half. One cast is required for deepbooru in deepbooru_model.py, some explicit casting is required for img2img and inpainting. depth_model can't be converted to float16 or it won't work correctly on some systems (it's known to have issues on MPS) so in sd_models.py model.depth_model is removed for model.half().
2023-01-25 01:13:02 -05:00
AUTOMATIC c1928cdd61 bring back short hashes to sd checkpoint selection 2023-01-19 18:58:08 +03:00
AUTOMATIC a5bbcd2153 fix bug with "Ignore selected VAE for..." option completely disabling VAE selection
rework VAE resolving code to be more simple
2023-01-14 19:56:09 +03:00
AUTOMATIC 08c6f009a5 load hashes from cache for checkpoints that have them
add checkpoint hash to footer
2023-01-14 15:55:40 +03:00
AUTOMATIC febd2b722e update key to use with checkpoints' sha256 in cache 2023-01-14 13:37:55 +03:00
AUTOMATIC f9ac3352cb change hypernets to use sha256 hashes 2023-01-14 10:25:37 +03:00
AUTOMATIC a95f135308 change hash to sha256 2023-01-14 09:56:59 +03:00
AUTOMATIC 4bd490727e fix for an error caused by skipping initialization, for realsies this time: TypeError: expected str, bytes or os.PathLike object, not NoneType 2023-01-11 18:54:13 +03:00
AUTOMATIC 1a23dc32ac possible fix for fallback for fast model creation from config, attempt 2 2023-01-11 10:34:36 +03:00
AUTOMATIC 4fdacd31e4 possible fix for fallback for fast model creation from config 2023-01-11 10:24:56 +03:00
AUTOMATIC 0f8603a559 add support for transformers==4.25.1
add fallback for when quick model creation fails
2023-01-10 17:46:59 +03:00
AUTOMATIC ce3f639ec8 add more stuff to ignore when creating model from config
prevent .vae.safetensors files from being listed as stable diffusion models
2023-01-10 16:51:04 +03:00
AUTOMATIC 0c3feb202c disable torch weight initialization and CLIP downloading/reading checkpoint to speedup creating sd model from config 2023-01-10 14:08:29 +03:00
Vladimir Mandic 552d7b90bf allow model load if previous model failed 2023-01-09 18:34:26 -05:00
AUTOMATIC 642142556d use commandline-supplied cuda device name instead of cuda:0 for safetensors PR that doesn't fix anything 2023-01-04 15:09:53 +03:00
AUTOMATIC 68fbf4558f Merge remote-tracking branch 'Narsil/fix_safetensors_load_speed' 2023-01-04 14:53:03 +03:00
AUTOMATIC 0cd6399b8b fix broken inpainting model 2023-01-04 14:29:13 +03:00
AUTOMATIC 8d8a05a3bb find configs for models at runtime rather than when starting 2023-01-04 12:47:42 +03:00
AUTOMATIC 02d7abf514 helpful error message when trying to load 2.0 without config
failing to load model weights from settings won't break generation for currently loaded model anymore
2023-01-04 12:35:07 +03:00
AUTOMATIC 8f96f92899 call script callbacks for reloaded model after loading embeddings 2023-01-03 18:39:14 +03:00
AUTOMATIC 311354c0bb fix the issue with training on SD2.0 2023-01-02 00:38:09 +03:00
Vladimir Mandic f55ac33d44 validate textual inversion embeddings 2022-12-31 11:27:02 -05:00
Nicolas Patry 5ba04f9ec0 Attempting to solve slow loads for `safetensors`. Fixes #5893 2022-12-27 11:27:19 +01:00
Yuval Aboulafia 3bf5591efe fix F541 f-string without any placeholders 2022-12-24 21:35:29 +02:00
linuxmobile ( リナックス ) 5a650055de
Removed length check in sd_model at line 115
Commit eba60a4 is what is causing this error; delete the length check in sd_model starting at line 115 and it's fine.

https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/5971#issuecomment-1364507379
2022-12-24 09:25:35 -03:00
AUTOMATIC1111 eba60a42eb Merge pull request #5627 from deanpress/patch-1: fix: fallback model_checkpoint if it's empty 2022-12-24 12:20:31 +03:00
MrCheeze ec0a48826f unconditionally set use_ema=False if value not specified (True never worked, and all configs except v1-inpainting-inference.yaml already correctly set it to False) 2022-12-11 11:18:34 -05:00
Dean van Dugteren 59c6511494
fix: fallback model_checkpoint if it's empty
This fixes the following error when SD attempts to start with a deleted checkpoint:

```
Traceback (most recent call last):
  File "D:\Web\stable-diffusion-webui\launch.py", line 295, in <module>
    start()
  File "D:\Web\stable-diffusion-webui\launch.py", line 290, in start
    webui.webui()
  File "D:\Web\stable-diffusion-webui\webui.py", line 132, in webui
    initialize()
  File "D:\Web\stable-diffusion-webui\webui.py", line 62, in initialize
    modules.sd_models.load_model()
  File "D:\Web\stable-diffusion-webui\modules\sd_models.py", line 283, in load_model
    checkpoint_info = checkpoint_info or select_checkpoint()
  File "D:\Web\stable-diffusion-webui\modules\sd_models.py", line 117, in select_checkpoint
    checkpoint_info = checkpoints_list.get(model_checkpoint, None)
TypeError: unhashable type: 'list'
```
2022-12-11 17:08:51 +01:00
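The traceback above comes from `checkpoints_list.get(model_checkpoint, None)` raising `TypeError: unhashable type: 'list'` when the stored setting is not a string. A minimal sketch of the fallback idea (function and variable names here are hypothetical, not the actual sd_models.py code):

```python
def select_checkpoint(checkpoints_list, model_checkpoint):
    """Look up the configured checkpoint; fall back to the first available
    one when the setting is missing, stale, or not a usable (hashable) key."""
    info = None
    if isinstance(model_checkpoint, str):  # guards against list/None settings
        info = checkpoints_list.get(model_checkpoint)
    if info is None and checkpoints_list:
        info = next(iter(checkpoints_list.values()))  # first registered checkpoint
    return info

ckpts = {"v1-5-pruned-emaonly.ckpt": {"title": "v1-5-pruned-emaonly.ckpt"}}
print(select_checkpoint(ckpts, []))  # a list-valued setting no longer crashes
```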
MrCheeze bd81a09eac fix support for 2.0 inpainting model while maintaining support for 1.5 inpainting model 2022-12-10 11:29:26 -05:00
AUTOMATIC1111 ec5e072124 Merge pull request #4841 from R-N/vae-fix-none: Fix None option of VAE selector 2022-12-10 09:58:20 +03:00
Jay Smith 1ed4f0e228 Depth2img model support 2022-12-08 20:50:08 -06:00
AUTOMATIC 0376da180c make it possible to save nai model using safetensors 2022-11-28 08:39:59 +03:00
AUTOMATIC dac9b6f15d add safetensors support for model merging #4869 2022-11-27 15:51:29 +03:00
AUTOMATIC 6074175faa add safetensors to requirements 2022-11-27 14:46:40 +03:00
AUTOMATIC1111 f108782e30 Merge pull request #4930 from Narsil/allow_to_load_safetensors_file: Supporting `*.safetensors` format. 2022-11-27 14:36:55 +03:00
MrCheeze 1e506657e1 no-half support for SD 2.0 2022-11-26 13:28:44 -05:00
Nicolas Patry 0efffbb407 Supporting `*.safetensors` format.
If a model file exists with extension `.safetensors` then we can load it
more safely than with PyTorch weights.
2022-11-21 14:04:25 +01:00
Muhammad Rizqi Nur 8662b5e57f Merge branch 'a1111' into vae-fix-none 2022-11-19 16:38:21 +07:00
Muhammad Rizqi Nur 2c5ca706a7 Remove no longer necessary parts and add vae_file safeguard 2022-11-19 12:01:41 +07:00
Muhammad Rizqi Nur c7be83bf02 Misc
2022-11-19 11:44:37 +07:00
Muhammad Rizqi Nur abc1e79a5d Fix base VAE caching was done after loading VAE, also add safeguard 2022-11-19 11:41:41 +07:00
cluder eebf49592a restore #4035 behavior
- if checkpoint cache is set to 1, keep 2 models in cache (current +1 more)
2022-11-09 07:17:09 +01:00
cluder 3b51d239ac - do not use ckpt cache, if disabled
- cache model after it has been loaded from file
2022-11-09 05:43:57 +01:00
AUTOMATIC 99043f3360 fix one of previous merges breaking the program 2022-11-04 11:20:42 +03:00
AUTOMATIC1111 24fc05cf57 Merge branch 'master' into fix-ckpt-cache 2022-11-04 10:54:17 +03:00
digburn 3780ad3ad8 fix: loading models without vae from cache 2022-11-04 00:43:00 +00:00
Muhammad Rizqi Nur fb3b564801 Merge branch 'master' into fix-ckpt-cache 2022-11-02 20:53:41 +07:00
AUTOMATIC f2a5cbe6f5 fix #3986 breaking --no-half-vae 2022-11-02 14:41:29 +03:00
Muhammad Rizqi Nur 056f06d373 Reload VAE without reloading sd checkpoint 2022-11-02 12:51:46 +07:00
Muhammad Rizqi Nur f8c6468d42 Merge branch 'master' into vae-picker 2022-11-02 00:25:08 +07:00
Jairo Correa af758e97fa Unload sd_model before loading the other 2022-11-01 04:01:49 -03:00
Muhammad Rizqi Nur bf7a699845 Fix #4035 for real now 2022-10-31 16:27:27 +07:00
Muhammad Rizqi Nur 36966e3200 Fix #4035 2022-10-31 15:38:58 +07:00
Muhammad Rizqi Nur 726769da35 Checkpoint cache by combination key of checkpoint and vae 2022-10-31 15:22:03 +07:00
Muhammad Rizqi Nur cb31abcf58 Settings to select VAE 2022-10-30 21:54:31 +07:00
AUTOMATIC1111 9553a7e071 Merge pull request #3818 from jwatzman/master: Reduce peak memory usage when changing models 2022-10-29 09:16:00 +03:00
Antonio 5d5dc64064
Natural sorting for dropdown checkpoint list
Example:

Before                      After

11.ckpt                     11.ckpt
ab.ckpt                     ab.ckpt
ade_pablo_step_1000.ckpt    ade_pablo_step_500.ckpt
ade_pablo_step_500.ckpt     ade_pablo_step_1000.ckpt
ade_step_1000.ckpt          ade_step_500.ckpt
ade_step_1500.ckpt          ade_step_1000.ckpt
ade_step_2000.ckpt          ade_step_1500.ckpt
ade_step_2500.ckpt          ade_step_2000.ckpt
ade_step_3000.ckpt          ade_step_2500.ckpt
ade_step_500.ckpt           ade_step_3000.ckpt
atp_step_5500.ckpt          atp_step_5500.ckpt
model1.ckpt                 model1.ckpt
model10.ckpt                model10.ckpt
model1000.ckpt              model33.ckpt
model33.ckpt                model50.ckpt
model400.ckpt               model400.ckpt
model50.ckpt                model1000.ckpt
moo44.ckpt                  moo44.ckpt
v1-4-pruned-emaonly.ckpt    v1-4-pruned-emaonly.ckpt
v1-5-pruned-emaonly.ckpt    v1-5-pruned-emaonly.ckpt
v1-5-pruned.ckpt            v1-5-pruned.ckpt
v1-5-vae.ckpt               v1-5-vae.ckpt
2022-10-28 05:49:39 +02:00
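The before/after ordering shown above can be reproduced with a small natural-sort key; this is an illustrative sketch, not the code from the commit:

```python
import re

def natural_key(name: str):
    """Split a name into text and integer chunks so that embedded
    numbers compare numerically: model9 sorts before model10."""
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", name)]

names = ["model1000.ckpt", "model1.ckpt", "model10.ckpt", "model33.ckpt"]
print(sorted(names, key=natural_key))
# → ['model1.ckpt', 'model10.ckpt', 'model33.ckpt', 'model1000.ckpt']
```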
Josh Watzman b50ff4f4e4 Reduce peak memory usage when changing models
A few tweaks to reduce peak memory usage, the biggest being that if we
aren't using the checkpoint cache, we shouldn't duplicate the model
state dict just to immediately throw it away.

On my machine with 16GB of RAM, this change means I can typically change
models, whereas before it would typically OOM.
2022-10-27 22:01:06 +01:00
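The change described above can be sketched roughly as follows; the names are hypothetical and this is not the actual webui code, just the idea of skipping the duplicate copy when the cache is off:

```python
import copy

def checkpoint_state(raw_state_dict, cache, cache_size, key):
    """Only deep-copy the loaded state dict into the cache when the
    checkpoint cache is enabled; with the cache off, return the original
    object and avoid holding two full copies in RAM at once."""
    if cache_size > 0:
        cache[key] = copy.deepcopy(raw_state_dict)  # cached copy survives later mutation
    return raw_state_dict

cache = {}
sd = {"model.weight": [0.1, 0.2]}
assert checkpoint_state(sd, cache, cache_size=0, key="a.ckpt") is sd
assert cache == {}  # nothing duplicated when caching is disabled
```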
AUTOMATIC 321bacc6a9 call model_loaded_callback after setting shared.sd_model in case scripts refer to it using that 2022-10-22 20:15:12 +03:00
MrCheeze 0df94d3fcf fix aesthetic gradients doing nothing after loading a different model 2022-10-22 20:14:18 +03:00
AUTOMATIC 2b91251637 removed aesthetic gradients as built-in
added support for extensions
2022-10-22 12:23:58 +03:00
AUTOMATIC ac0aa2b18e loading SD VAE, see PR #3303 2022-10-21 17:35:51 +03:00
AUTOMATIC df57064093 do not load aesthetic clip model until it's needed
add refresh button for aesthetic embeddings
add aesthetic params to images' infotext
2022-10-21 16:10:51 +03:00
AUTOMATIC 7d6b388d71 Merge branch 'ae' 2022-10-21 13:35:01 +03:00
random_thoughtss 49533eed9e XY grid correctly re-assigns model when config changes 2022-10-20 16:01:27 -07:00
random_thoughtss 708c3a7bd8 Added PLMS hijack and made sure to always replace methods 2022-10-20 13:28:43 -07:00
random_thoughtss 8e7097d06a Added support for RunwayML inpainting model 2022-10-19 13:47:45 -07:00
AUTOMATIC f894dd552f fix for broken checkpoint merger 2022-10-19 12:45:42 +03:00
MalumaDev 2362d5f00e Merge branch 'master' into test_resolve_conflicts 2022-10-19 10:22:39 +02:00
AUTOMATIC 10aca1ca3e more careful loading of model weights (eliminates some issues with checkpoints that have weird cond_stage_model layer names) 2022-10-19 08:42:22 +03:00
MalumaDev 9324cdaa31 ui fix, re organization of the code 2022-10-16 17:53:56 +02:00
AUTOMATIC1111 af144ebdc7 Merge branch 'master' into ckpt-cache 2022-10-15 10:35:18 +03:00
Rae Fu e21f01f645 add checkpoint cache option to UI for faster model switching
switching time reduced from ~1500ms to ~280ms
2022-10-14 14:09:23 -06:00
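A checkpoint cache of this kind can be sketched as a small LRU structure; this illustrates the idea behind the option, not its actual implementation:

```python
from collections import OrderedDict

class CheckpointCache:
    """Keep the last N loaded state dicts in RAM so switching back to a
    recent checkpoint skips the load-from-disk step."""

    def __init__(self, max_items: int):
        self.max_items = max_items
        self._items = OrderedDict()

    def get(self, key):
        if key not in self._items:
            return None  # cache miss: caller loads from disk
        self._items.move_to_end(key)  # mark as most recently used
        return self._items[key]

    def put(self, key, state_dict):
        self._items[key] = state_dict
        self._items.move_to_end(key)
        while len(self._items) > self.max_items:
            self._items.popitem(last=False)  # evict least recently used
```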
AUTOMATIC bb295f5478 rework the code for lowram a bit 2022-10-14 20:03:41 +03:00
Ljzd-PRO 4a216ded43 load models to VRAM when using `--lowram` param
load models to VRAM instead of RAM (for machines which have bigger VRAM than RAM such as free Google Colab server)
2022-10-14 19:57:23 +03:00
AUTOMATIC 727e4d1086 no to different messages plus fix using != to compare to None 2022-10-10 20:46:55 +03:00
AUTOMATIC1111 b3d3b335cf Merge pull request #2131 from ssysm/upstream-master: Add VAE Path Arguments 2022-10-10 20:45:14 +03:00
ssysm af62ad4d25 change vae loading method 2022-10-10 13:25:28 -04:00