Vladimir Mandic
ebc1243238
Merge pull request #2014 from midcoastal/Issue1563/FS-Path-Cache
...
Add a FS directory/path cacher
2023-08-18 07:45:06 +02:00
Midcoastal
67f369ed25
A walk() optimization and lint fixes
2023-08-17 16:49:42 -04:00
Midcoastal
05850c2344
Upgrade Lora/TI model listers to use cache
2023-08-15 23:06:23 -04:00
Vladimir Mandic
3d914688cc
update metadata
2023-08-15 05:50:15 +02:00
Vladimir Mandic
8f6f8413b1
fix ti training
2023-07-27 09:30:41 -04:00
Vladimir Mandic
d778876010
minor css fixes
2023-07-26 19:48:36 -04:00
Vladimir Mandic
debec28be6
rework settings, image-watermark, safe config handling
2023-07-18 14:41:27 -04:00
Disty0
2a9133bfec
IPEX rework
2023-07-14 17:33:24 +03:00
Vladimir Mandic
2a21196061
Merge branch 'master' into dev
2023-07-08 13:35:25 -04:00
Vladimir Mandic
7e11ff2b34
add sdxl support
2023-07-06 19:26:43 -04:00
Disty0
966eed8dd9
Autodetect IPEX
2023-07-04 23:37:36 +03:00
Vladimir Mandic
8241e33868
major diffusers update
2023-07-03 16:48:03 -04:00
Disty0
6ea6f2448e
Revert loss scale for ipex
2023-06-19 23:59:18 +03:00
Disty0
71b1532bb0
Scale loss for ipex
2023-06-15 23:55:23 +03:00
Disty0
618097dac2
GradScaler patch for IPEX
2023-06-15 01:19:35 +03:00
Disty0
a9f66cb33e
loss / 2 for ipex train
2023-06-14 12:18:08 +03:00
Vladimir Mandic
cb307399dd
jumbo merge
2023-06-13 11:59:56 -04:00
Disty0
0d101b9def
Revert xpu.optimize in training
2023-06-13 15:36:52 +03:00
Disty0
c9e95bec3f
Fix loss=nan
2023-06-12 06:13:18 +03:00
Vladimir Mandic
1595c7a11c
minor fixes
2023-06-11 21:49:48 -04:00
Disty0
ab255b732b
Remove unnecessary ipex code
2023-06-12 04:45:29 +03:00
Disty0
c9e58c9604
Fix train for IPEX
2023-06-12 00:21:32 +03:00
Vladimir Mandic
409c9d4c9d
upstream ports
2023-06-10 18:23:28 -04:00
Vladimir Mandic
aaa0d46286
update installer and add sd_model_dict
2023-06-07 13:26:21 -04:00
Disty0
3bef3e3eee
Train patches for IPEX
2023-06-07 17:25:11 +03:00
Vladimir Mandic
5f1fd7bd66
update common ui
2023-05-29 13:43:03 -04:00
Vladimir Mandic
efd3810860
diffusers merge
2023-05-26 22:42:03 -04:00
Vladimir Mandic
9285326c6d
fix tqdm
2023-05-25 07:53:25 -04:00
Vladimir Mandic
a64bb4375a
minor updates
2023-05-22 10:50:59 -04:00
Vladimir Mandic
42280ef804
add theme mode toggle
2023-05-19 13:24:40 -04:00
Vladimir Mandic
0ccda9bc8b
jumbo patch
2023-05-17 14:15:55 -04:00
Vladimir Mandic
fe496f4ebc
add train preprocess options
2023-05-05 09:06:06 -04:00
Disty0
8171d57c36
Remove unnecessary IPEX imports
2023-05-04 02:34:34 +03:00
Vladimir Mandic
6f976c358f
optimize model load
2023-05-02 21:30:34 -04:00
Vladimir Mandic
7a083d322b
merge commits
2023-05-02 15:06:06 -04:00
Vladimir Mandic
cb4cff3929
redesign logging
2023-05-02 13:57:16 -04:00
Vladimir Mandic
4dc5941912
fix embedding logging
2023-04-30 21:01:49 -04:00
Disty0
68fc95b2e1
Merge remote-tracking branch 'upstream/master'
2023-04-30 18:28:22 +03:00
Disty0
de8d0bef9f
More patches and Import IPEX after Torch
2023-04-30 18:19:37 +03:00
Vladimir Mandic
b23b6a6e2c
update ti folders
2023-04-30 08:55:47 -04:00
Disty0
b075d3c8fd
Intel ARC Support
2023-04-30 15:13:56 +03:00
David Pina
2220316920
Fix typo that prevents training Textual Inversion
...
There was a small typo in line 529.
shared.ops.embeddings_train_log caused an attribute not found exception when training TIs.
2023-04-29 22:41:24 +02:00
Vladimir Mandic
21ff7bad67
configurable train log
2023-04-28 09:42:19 -04:00
Vladimir Mandic
11fa3aff6d
ti fixes
2023-04-25 09:21:38 -04:00
Vladimir Mandic
cf277e7326
fix dtype logic
2023-04-21 15:04:05 -04:00
Vladimir Mandic
fd51bb90d0
enable quick launch
2023-04-15 11:51:58 -04:00
Vladimir Mandic
2ece9782e4
handle duplicate extensions and redo exception handler
2023-04-14 09:57:53 -04:00
Vladimir Mandic
81b8294e93
switch cmdflags to settings
2023-04-12 10:40:11 -04:00
Vladimir Mandic
f181885f0c
Merge pull request #57 from AUTOMATIC1111/master
...
merge from upstream
2023-03-25 08:47:00 -04:00
butaixianran
803d44c474
Fix None type error for TI module
...
When a user uses model_name.png as a preview image, textual_inversion.py still treats it as an embedding and doesn't handle the resulting error, just letting Python throw a None-type error like the following:
```bash
File "D:\Work\Dev\AI\stable-diffusion-webui\modules\textual_inversion\textual_inversion.py", line 155, in load_from_file
name = data.get('name', name)
AttributeError: 'NoneType' object has no attribute 'get'
```
With just a simple `if data:` check as below, there is no error, nothing breaks, and the module now works fine with user preview images.
Old code:
```python
data = extract_image_data_embed(embed_image)
name = data.get('name', name)
```
New code:
```python
data = extract_image_data_embed(embed_image)
if data:
    name = data.get('name', name)
else:
    # if data is None, this is not an embedding, just a preview image
    return
```
Also, since there are no more errors in the textual inversion module, extra networks can now set "model_name.png" as the preview image for embeddings.
2023-03-25 02:05:00 +08:00
Vladimir Mandic
f275e43eb9
update torch load and external repos
2023-03-20 10:10:56 -04:00
Vladimir Mandic
f6679fcc77
add global exception handler
2023-03-17 10:08:07 -04:00
Vladimir Mandic
54ab13502f
Merge pull request #37 from AUTOMATIC1111/master
...
sync forks
2023-02-19 10:11:25 -05:00
Shondoit
edb10092de
Add ability to choose using weighted loss or not
2023-02-15 10:03:59 +01:00
Shondoit
bc50936745
Call weighted_forward during training
2023-02-15 10:03:59 +01:00
Vladimir Mandic
e92b66e2ea
update version
2023-01-29 13:44:33 -05:00
AUTOMATIC
aa6e55e001
do not display the message for TI unless the list of loaded embeddings changed
2023-01-29 11:53:05 +03:00
Alex "mcmonkey" Goodwin
e179b6098a
allow symlinks in the textual inversion embeddings folder
2023-01-25 08:48:40 -08:00
AUTOMATIC
40ff6db532
extra networks UI
...
rework of hypernets: rather than via settings, hypernets are added directly to prompt as <hypernet:name:weight>
2023-01-21 08:36:07 +03:00
AUTOMATIC
924e222004
add option to show/hide warnings
...
removed hiding warnings from LDSR
fixed/reworked few places that produced warnings
2023-01-18 23:04:24 +03:00
AUTOMATIC
d8b90ac121
big rework of progressbar/preview system to allow multiple users to submit prompts at the same time without getting previews of each other
2023-01-15 18:51:04 +03:00
AUTOMATIC
a95f135308
change hash to sha256
2023-01-14 09:56:59 +03:00
AUTOMATIC
82725f0ac4
fix a bug caused by merge
2023-01-13 15:04:37 +03:00
AUTOMATIC1111
9cd7716753
Merge branch 'master' into tensorboard
2023-01-13 14:57:38 +03:00
AUTOMATIC
a176d89487
print bucket sizes for training without resizing images #6620
...
fix an error when generating a picture with embedding in it
2023-01-13 14:32:15 +03:00
Shondoit
d52a80f7f7
Allow creation of zero vectors for TI
2023-01-12 09:22:29 +01:00
Vladimir Mandic
3f43d8a966
set descriptions
2023-01-11 10:28:55 -05:00
Lee Bousfield
f9706acf43
Support loading textual inversion embeddings from safetensors files
2023-01-10 18:40:34 -07:00
AUTOMATIC
1fbb6f9ebe
make a dropdown for prompt template selection
2023-01-09 23:35:40 +03:00
AUTOMATIC
43bb5190fc
remove/simplify some changes from #6481
2023-01-09 22:52:23 +03:00
AUTOMATIC1111
18c001792a
Merge branch 'master' into varsize
2023-01-09 22:45:39 +03:00
AUTOMATIC
085427de0e
make it possible for extensions/scripts to add their own embedding directories
2023-01-08 09:37:33 +03:00
AUTOMATIC
a0c87f1fdf
skip images in embeddings dir if they have a second .preview extension
2023-01-08 08:52:26 +03:00
dan
669fb18d52
Add checkbox for variable training dims
2023-01-08 02:31:40 +08:00
dan
448b9cedab
Allow variable img size
2023-01-08 02:14:36 +08:00
AUTOMATIC
79e39fae61
CLIP hijack rework
2023-01-07 01:46:13 +03:00
AUTOMATIC
683287d87f
rework saving training params to file #6372
2023-01-06 08:52:06 +03:00
AUTOMATIC1111
88e01b237e
Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-revised
...
Save hypernet and textual inversion settings to text file, revised.
2023-01-06 07:59:44 +03:00
Faber
81133d4168
allow loading embeddings from subdirectories
2023-01-06 03:38:37 +07:00
Kuma
fda04e620d
typo in TI
2023-01-05 18:44:19 +01:00
timntorres
b6bab2f052
Include model in log file. Exclude directory.
2023-01-05 09:14:56 -08:00
timntorres
b85c2b5cf4
Clean up ti, add same behavior to hypernetwork.
2023-01-05 08:14:38 -08:00
timntorres
eea8fc40e1
Add option to save ti settings to file.
2023-01-05 07:24:22 -08:00
AUTOMATIC1111
eeb1de4388
Merge branch 'master' into gradient-clipping
2023-01-04 19:56:35 +03:00
AUTOMATIC
525cea9245
use shared function from processing for creating dummy mask when training inpainting model
2023-01-04 17:58:07 +03:00
AUTOMATIC
184e670126
fix the merge
2023-01-04 17:45:01 +03:00
AUTOMATIC1111
da5c1e8a73
Merge branch 'master' into inpaint_textual_inversion
2023-01-04 17:40:19 +03:00
AUTOMATIC1111
7bbd984dda
Merge pull request #6253 from Shondoit/ti-optim
...
Save Optimizer next to TI embedding
2023-01-04 14:09:13 +03:00
Vladimir Mandic
192ddc04d6
add job info to modules
2023-01-03 10:34:51 -05:00
Shondoit
bddebe09ed
Save Optimizer next to TI embedding
...
Also adds a check to load only .PT and .BIN files as embeddings (since we add .optim files in the same directory).
2023-01-03 13:30:24 +01:00
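The extension check this commit body describes (treat only .PT and .BIN files as embeddings, so the .optim files saved alongside them are skipped) could be sketched as follows; the helper name and constant are hypothetical illustrations, not the repository's actual code:

```python
import os

# Hypothetical sketch of the filter described in the commit body:
# only .pt and .bin files are treated as embeddings, so the .optim
# optimizer-state files saved next to them are ignored when scanning
# the embeddings directory.
EMBEDDING_EXTS = {'.pt', '.bin'}

def is_embedding_file(filename):
    """Return True if the file looks like an embedding, not optimizer state."""
    _, ext = os.path.splitext(filename)
    return ext.lower() in EMBEDDING_EXTS
```

A directory scan would then simply skip anything for which `is_embedding_file` returns False, leaving the `.optim` companions untouched.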
Philpax
c65909ad16
feat(api): return more data for embeddings
2023-01-02 12:21:48 +11:00
AUTOMATIC
311354c0bb
fix the issue with training on SD2.0
2023-01-02 00:38:09 +03:00
AUTOMATIC
bdbe09827b
changed embedding accepted shape detection to use existing code and support the new alt-diffusion model, and reformatted messages a bit #6149
2022-12-31 22:49:09 +03:00
Vladimir Mandic
f55ac33d44
validate textual inversion embeddings
2022-12-31 11:27:02 -05:00
Yuval Aboulafia
3bf5591efe
fix F541 f-string without any placeholders
2022-12-24 21:35:29 +02:00
Jim Hays
c0355caefe
Fix various typos
2022-12-14 21:01:32 -05:00
AUTOMATIC1111
c9a2cfdf2a
Merge branch 'master' into racecond_fix
2022-12-03 10:19:51 +03:00
brkirch
4d5f1691dd
Use devices.autocast instead of torch.autocast
2022-11-30 10:33:42 -05:00
AUTOMATIC
b48b7999c8
Merge remote-tracking branch 'flamelaw/master'
2022-11-27 12:19:59 +03:00
flamelaw
755df94b2a
set TI AdamW default weight decay to 0
2022-11-27 00:35:44 +09:00