Commit Graph

68 Commits (4e74172f3a25e9993d01686c5ecc3a62a27192b5)

Author SHA1 Message Date
Vladimir Mandic 8cd5fbc926 lint 2025-09-12 16:28:53 -04:00
Disty0 a8b850adf4 move hf quantizer hijacks to sdnq 2025-09-12 20:54:44 +03:00
Disty0 4f2b829450 cleanup 2025-09-11 15:42:36 +03:00
Disty0 6a954ffcde SDNQ fuse repeating bitwise ops 2025-09-06 13:18:17 +03:00
Vladimir Mandic 9743c8e4bf keep previous processed state 2025-08-31 15:20:15 -04:00
Disty0 9035f4299c SDNQ fix new transformers 2025-08-30 22:44:57 +03:00
Disty0 bbb345cf44 Fix bias dtype mismatch 2025-08-30 02:31:41 +03:00
Disty0 6c36433a14 SDNQ fix row-wise FP8 matmul with fp32 and fp16 inputs 2025-08-30 02:27:15 +03:00
Disty0 4ec8603f63 SDNQ re-add bitpacking for uint1 2025-08-29 23:06:11 +03:00
Disty0 a20ab65359 SDNQ override matmul group size with int8 as well 2025-08-29 22:54:38 +03:00
Disty0 d49e954918 SDNQ listen to dequantize_fp32 option with re_quantize 2025-08-29 22:48:28 +03:00
Disty0 a8de3f7282 SDNQ add quantized matmul support for all quantization types and group sizes 2025-08-29 22:26:47 +03:00
Disty0 36bf998302 cleanup 2025-08-29 18:15:28 +03:00
Disty0 6e68dff381 Fix AuraFlow quant 2025-08-29 18:13:53 +03:00
Disty0 f324b7c0e5 SDNQ remove unnecessary .contiguous() 2025-08-21 02:21:05 +03:00
Disty0 e49814098e Add sdnq_modules_dtype_dict 2025-08-20 14:58:54 +03:00
Disty0 0946710662 Add sdnq_modules_to_not_convert to UI settings 2025-08-20 04:38:20 +03:00
Disty0 47ff01fd3b SDNQ add "*" support and upcast only the first and last layer's img_mod to 6 bit with Qwen Image 2025-08-20 03:24:19 +03:00
Disty0 47154db8b1 SDNQ Flux lora, use shape from sdnq_dequantizer 2025-08-18 19:53:57 +03:00
Vladimir Mandic fc547a3ccd sdnq with diffusers lora loader 2025-08-18 10:29:01 -04:00
Disty0 85d494ee84 revert .t_().contiguous().t_() 2025-08-17 05:19:44 +03:00
Disty0 8460be662c SDNQ use inplace transpose and use view instead of reshape 2025-08-17 05:07:55 +03:00
Disty0 8ca74d0cd2 SDNQ rename unused param_name arg to op 2025-08-13 22:10:30 +03:00
Disty0 cb0c5414a3 SDNQ use uint with minimum_bits <= 4 2025-08-13 00:37:44 +03:00
Disty0 7085db9add Update changelog 2025-08-13 00:17:15 +03:00
Disty0 15cb8fe9f8 SDNQ add modules_dtype_dict and fix Qwen Image with quants less than 5 bits 2025-08-13 00:07:36 +03:00
Disty0 9992338187 sdnq fix convs 2025-08-11 23:24:13 +03:00
Disty0 26461f1d8d fix conv in8 matmul 2025-08-11 23:15:30 +03:00
Disty0 f45e3342e6 Cleanup 2025-08-11 15:11:29 +03:00
Disty0 afb3a5a06d SDNQ move non_blocking to quant config 2025-08-11 15:07:02 +03:00
Disty0 dc7b25d387 Cleanup SDNQ and add SDNQ_USE_TENSORWISE_FP8_MATMUL env var 2025-08-11 14:50:17 +03:00
Disty0 3f45c4e570 Cleanup SDNQ and skip transpose on packed int8 matmul 2025-08-10 19:31:34 +03:00
Disty0 69db77e365 SDNQ remove eps 2025-08-09 02:39:01 +03:00
Disty0 22d86acda3 Make SDNQ MatMul listen to the dequantize fp32 setting 2025-08-09 01:10:07 +03:00
Disty0 aa0652caa9 SDNQ fix new transformers 2025-08-07 00:18:24 +03:00
Disty0 ab8badfe0d SDNQ use non-blocking ops 2025-08-06 17:07:36 +03:00
Disty0 1d5dce1fb1 cleanup 2025-08-02 17:41:53 +03:00
Disty0 c3d007b02c SDNQ split forward.py into layers and cleanup 2025-08-02 17:36:55 +03:00
Vladimir Mandic fa44521ea3 offload-never and offload-always per-module and new highvram profile 2025-07-31 11:40:24 -04:00
Vladimir Mandic 2656d3aa68 lint 2025-07-24 15:42:29 -04:00
Disty0 444974a6ff cleanup 2025-07-23 19:02:44 +03:00
Disty0 7a08e1a7f2 SDNQ always use custom tensorwise fp8 matmul 2025-07-23 19:01:10 +03:00
Disty0 25a4731a97 SDNQ use static compile 2025-07-20 16:25:57 +03:00
Disty0 30af1f8fb0 Use inference_context with SDNQ 2025-07-14 13:07:44 +03:00
Vladimir Mandic e6d97f4d44 monkeypatch numpy for gradio 2025-07-07 19:45:30 -04:00
Disty0 cf90e5621a Add _skip_layerwise_casting_patterns to SDNQ skip list 2025-07-04 00:04:01 +03:00
Vladimir Mandic c4d9338d2e major refactoring of modules 2025-07-03 09:18:38 -04:00
Disty0 3406083d14 Override lumina2 to use diffusers lora loading 2025-07-02 06:40:27 +03:00
Disty0 dc8fd006b2 Add modules_to_not_convert to pre-mode quants 2025-06-26 02:47:10 +03:00
Disty0 e43d1d2ba7 SDNQ use strings as target_dtype 2025-06-25 23:25:49 +03:00
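Many of the commits above concern SDNQ's group-wise quantization (e.g. "add quantized matmul support for all quantization types and group sizes", "override matmul group size with int8 as well"). A minimal sketch of the general technique, symmetric int8 quantization with a per-group scale; the function names and the `group_size` parameter are illustrative and do not mirror SDNQ's actual API:

```python
# Sketch of symmetric group-wise int8 quantization. Illustrative only;
# not SDNQ's implementation.

def quantize_int8(weights, group_size=4):
    """Split `weights` into groups; store each as int8 values plus a float scale."""
    groups, scales = [], []
    for i in range(0, len(weights), group_size):
        group = weights[i:i + group_size]
        # Per-group scale maps the largest magnitude to the int8 limit 127.
        scale = max(abs(w) for w in group) / 127 or 1.0  # guard all-zero groups
        scales.append(scale)
        groups.append([round(w / scale) for w in group])
    return groups, scales

def dequantize_int8(groups, scales):
    """Recover an approximation of the original weights."""
    out = []
    for group, scale in zip(groups, scales):
        out.extend(q * scale for q in group)
    return out

w = [0.5, -1.0, 0.25, 0.125, 2.0, -2.0, 1.0, 0.0]
q, s = quantize_int8(w, group_size=4)
w2 = dequantize_int8(q, s)
assert all(abs(a - b) < 1e-2 for a, b in zip(w, w2))
```

Smaller groups give each scale less dynamic range to cover, which is why several commits above track group size per quantization type.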
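The "SDNQ re-add bitpacking for uint1" commit refers to storing 1-bit weights eight to a byte. A minimal sketch of that packing scheme, assuming LSB-first bit order; the helpers below are hypothetical, not SDNQ's code:

```python
# Sketch of uint1 bitpacking: eight 0/1 values per byte, LSB first.
# Illustrative only; not SDNQ's implementation.

def pack_uint1(bits):
    """Pack a list of 0/1 values into bytes, 8 per byte."""
    packed = bytearray((len(bits) + 7) // 8)
    for i, b in enumerate(bits):
        packed[i // 8] |= (b & 1) << (i % 8)
    return bytes(packed)

def unpack_uint1(packed, n):
    """Recover the first n 0/1 values from packed bytes."""
    return [(packed[i // 8] >> (i % 8)) & 1 for i in range(n)]

bits = [1, 0, 1, 1, 0, 0, 1, 0, 1]
packed = pack_uint1(bits)          # 9 bits fit in 2 bytes
assert unpack_uint1(packed, len(bits)) == bits
```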