Commit Graph

155 Commits (c71467e7f05a778bb3e7eaf4e96a88caa8a63cf8)

Author SHA1 Message Date
Disty0 2a9133bfec IPEX rework 2023-07-14 17:33:24 +03:00
Disty0 966eed8dd9 Autodetect IPEX 2023-07-04 23:37:36 +03:00
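
The autodetect commit above probes for Intel's Extension for PyTorch at runtime instead of relying on a launch flag. A minimal sketch of such a probe, assuming the standard intel_extension_for_pytorch package name (an illustration, not the repository's exact detection code):

    import torch

    def ipex_available() -> bool:
        # Hypothetical probe: importing IPEX patches torch with an xpu backend.
        try:
            import intel_extension_for_pytorch  # noqa: F401
            return hasattr(torch, "xpu") and torch.xpu.is_available()
        except ImportError:
            return False
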
Vladimir Mandic 9740b9d217 new training and models interface 2023-06-22 07:46:48 -04:00
Disty0 6ea6f2448e Revert loss scale for ipex 2023-06-19 23:59:18 +03:00
Disty0 71b1532bb0 Scale loss for ipex 2023-06-15 23:55:23 +03:00
Disty0 618097dac2 GradScaler patch for IPEX 2023-06-15 01:19:35 +03:00
Disty0 a9f66cb33e loss / 2 for ipex train 2023-06-14 12:18:08 +03:00
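
The four IPEX loss-scaling commits above all revolve around torch's AMP GradScaler, which XPU devices lacked at the time. A minimal sketch of the standard scaler loop being patched around (model, optimizer, loss_fn, and the device type are placeholders, not the repository's training code):

    import torch

    scaler = torch.cuda.amp.GradScaler()

    def train_step(model, optimizer, loss_fn, batch, target):
        optimizer.zero_grad()
        with torch.autocast(device_type="cuda", dtype=torch.float16):
            loss = loss_fn(model(batch), target)
        scaler.scale(loss).backward()  # scale the loss so fp16 gradients don't underflow
        scaler.step(optimizer)         # unscales gradients; skips the step on inf/nan
        scaler.update()                # adjusts the scale factor for the next step
        return loss.detach()
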
Vladimir Mandic cb307399dd jumbo merge 2023-06-13 11:59:56 -04:00
Disty0 0d101b9def Revert xpu.optimize in training 2023-06-13 15:36:52 +03:00
Disty0 c9e95bec3f Fix loss=nan 2023-06-12 06:13:18 +03:00
Disty0 ab255b732b Remove unnecessary ipex code 2023-06-12 04:45:29 +03:00
Disty0 c9e58c9604 Fix train for IPEX 2023-06-12 00:21:32 +03:00
Disty0 3bef3e3eee Train patches for IPEX 2023-06-07 17:25:11 +03:00
Disty0 4265692505 Fix GradScaler doesn't exist for XPU 2023-06-03 17:02:44 +03:00
Vladimir Mandic 0ccda9bc8b jumbo patch 2023-05-17 14:15:55 -04:00
Vladimir Mandic c470f39913 merge fixes 2023-05-04 16:55:41 -04:00
Disty0 8171d57c36 Remove unnecessary IPEX imports 2023-05-04 02:34:34 +03:00
Disty0 de8d0bef9f More patches and Import IPEX after Torch 2023-04-30 18:19:37 +03:00
Vladimir Mandic 2ece9782e4 handle duplicate extensions and redo exception handler 2023-04-14 09:57:53 -04:00
Vladimir Mandic 81b8294e93 switch cmdflags to settings 2023-04-12 10:40:11 -04:00
Vladimir Mandic 86b83fc956 Merge pull request #66 from AUTOMATIC1111/master
merge from upstream
2023-03-28 16:43:39 -04:00
AUTOMATIC 1b63afbedc sort hypernetworks and checkpoints by name 2023-03-28 20:03:57 +03:00
Vladimir Mandic f6679fcc77 add global exception handler 2023-03-17 10:08:07 -04:00
AUTOMATIC1111 dfb3b8f398 Merge branch 'master' into weighted-learning 2023-02-19 12:41:29 +03:00
Shondoit edb10092de Add ability to choose using weighted loss or not 2023-02-15 10:03:59 +01:00
Shondoit bc50936745 Call weighted_forward during training 2023-02-15 10:03:59 +01:00
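
The two weighted-learning commits above route training through a weighted_forward path that scales the reconstruction loss by a per-element weight map. A minimal sketch of such a loss (names are illustrative, not the repository's API):

    import torch

    def weighted_mse_loss(pred: torch.Tensor, target: torch.Tensor,
                          weight: torch.Tensor) -> torch.Tensor:
        # Element-wise squared error scaled by the weight map, then averaged;
        # with weight == 1 everywhere this reduces to plain MSE.
        return (weight * (pred - target) ** 2).mean()
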
brkirch 4738486d8f Support for hypernetworks with --upcast-sampling 2023-02-06 18:10:55 -05:00
AUTOMATIC 81823407d9 add --no-hashing 2023-02-04 11:38:56 +03:00
AUTOMATIC 78f59a4e01 enable compact view for train tab
prevent previews from ruining hypernetwork training
2023-01-22 00:02:51 +03:00
AUTOMATIC 40ff6db532 extra networks UI
rework of hypernets: rather than via settings, hypernets are added directly to prompt as <hypernet:name:weight>
2023-01-21 08:36:07 +03:00
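
For context, the <hypernet:name:weight> syntax introduced above is written inline in the prompt rather than selected in settings; for example (the hypernetwork name here is made up):

    a watercolor landscape, mountains at dusk <hypernet:my_style:0.6>
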
AUTOMATIC 924e222004 add option to show/hide warnings
removed hiding warnings from LDSR
fixed/reworked a few places that produced warnings
2023-01-18 23:04:24 +03:00
aria1th 13445738d9 Fix tensorboard related functions 2023-01-16 03:02:54 +09:00
aria1th 598f7fcd84 Fix loss_dict problem 2023-01-16 02:46:21 +09:00
AngelBottomless 16f410893e fix missing 'mean loss' for tensorboard integration 2023-01-16 02:08:47 +09:00
AUTOMATIC d8b90ac121 big rework of the progressbar/preview system to allow multiple users to run prompts at the same time without seeing each other's previews 2023-01-15 18:51:04 +03:00
AUTOMATIC f9ac3352cb change hypernets to use sha256 hashes 2023-01-14 10:25:37 +03:00
AUTOMATIC a95f135308 change hash to sha256 2023-01-14 09:56:59 +03:00
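
The two hash commits above switch model identification to full SHA-256 digests of the file contents. A minimal sketch of how such a digest is computed with Python's hashlib (the function name is illustrative):

    import hashlib

    def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
        # Stream the file in chunks so large checkpoints need not fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()
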
AUTOMATIC1111 9cd7716753 Merge branch 'master' into tensorboard 2023-01-13 14:57:38 +03:00
Vladimir Mandic 3f43d8a966 set descriptions 2023-01-11 10:28:55 -05:00
aria1th a4a5475cfa Variable dropout rate
Implements variable dropout rate from #4549

Fixes the hypernetwork multiplier being modifiable during training; also guards against user error by forcing the multiplier to lower values during training.

Changes the function name to match the torch.nn.Module convention.

Fixes an RNG reset issue when generating previews by restoring the RNG state.
2023-01-10 14:56:57 +09:00
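
The RNG fix in the entry above works by snapshotting generator state before a preview is rendered and restoring it afterwards, so the training loop's random sequence continues unchanged. A minimal sketch of that pattern (preview_fn is a placeholder):

    import torch

    def preview_without_touching_rng(preview_fn):
        cpu_state = torch.get_rng_state()
        cuda_state = torch.cuda.get_rng_state_all() if torch.cuda.is_available() else None
        try:
            return preview_fn()
        finally:
            # Restore so training randomness is unaffected by the preview.
            torch.set_rng_state(cpu_state)
            if cuda_state is not None:
                torch.cuda.set_rng_state_all(cuda_state)
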
AUTOMATIC 1fbb6f9ebe make a dropdown for prompt template selection 2023-01-09 23:35:40 +03:00
dan 72497895b9 Move batchsize check 2023-01-08 02:57:36 +08:00
dan 669fb18d52 Add checkbox for variable training dims 2023-01-08 02:31:40 +08:00
AUTOMATIC 683287d87f rework saving training params to file #6372 2023-01-06 08:52:06 +03:00
timntorres b6bab2f052 Include model in log file. Exclude directory. 2023-01-05 09:14:56 -08:00
timntorres b85c2b5cf4 Clean up ti, add same behavior to hypernetwork. 2023-01-05 08:14:38 -08:00
AUTOMATIC1111 eeb1de4388 Merge branch 'master' into gradient-clipping 2023-01-04 19:56:35 +03:00
Vladimir Mandic 192ddc04d6 add job info to modules 2023-01-03 10:34:51 -05:00
AUTOMATIC1111 b12de850ae Merge pull request #5992 from yuvalabou/F541
Fix F541: f-string without any placeholders
2022-12-25 09:16:08 +03:00
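
For reference, flake8's F541 flags f-string literals that contain no placeholders; the fix is simply dropping the f prefix:

    message = f"training complete"  # F541: f-string without any placeholders
    message = "training complete"   # fixed: a plain string literal
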
Vladimir Mandic 5f1dfbbc95 implement train api 2022-12-24 18:02:22 -05:00