Added flux1-dev_8x8_scaled. Both the UNet and T5 are converted to fp8 scaled weights, which should be a sweet spot. This replaces the fp16 static weights, assuming I didn't mess up the alphas, lol. I am pretty sure it worked out.
I compiled a little collection of Flux.1 models. There are fp8 models with fp8 T5 and fp8 models with fp16 T5 for both Dev and Schnell, packaged as single files for use with the regular checkpoint loader. There are also fp16 models available now. All models have CLIP, T5, and VAE baked in. THESE ARE ALL STOCK FLUX.1.
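If you want to verify what actually got baked into one of these unified files, the safetensors format starts with an 8-byte little-endian header length followed by a JSON header listing every tensor's name and dtype, so you can tally components without loading any weights. A minimal sketch (not an official tool; the file name is just one from the list further down, and the key prefixes depend on the ComfyUI checkpoint layout):

```python
# Minimal sketch: inspect a unified checkpoint's header to see which
# components (UNet, text encoders, VAE) are present and in which dtype.
# Only the safetensors JSON header is read, never the weights themselves.
import json
import struct
from collections import Counter

path = "flux.1_dev_8x8_e4m3fn-marduk191.safetensors"  # any file from the list below

with open(path, "rb") as f:
    header_len = struct.unpack("<Q", f.read(8))[0]  # little-endian u64 header size
    header = json.loads(f.read(header_len))

counts = Counter()
for name, info in header.items():
    if name == "__metadata__":  # optional metadata block, not a tensor
        continue
    prefix = name.split(".")[0]  # e.g. "model", "text_encoders", "vae" (layout-dependent)
    counts[(prefix, info["dtype"])] += 1

for (prefix, dtype), n in sorted(counts.items()):
    print(f"{prefix:20s} {dtype:10s} {n} tensors")
```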
These all use bf16 upcasting; use the appropriate flags if you are running them on GTX cards for some reason.
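A quick way to check whether your card actually has native bf16 (a sketch, assuming a working PyTorch + CUDA install; ComfyUI's --force-fp16 launch flag is one common override, but use whatever fits your setup):

```python
# Quick, illustrative check: does this GPU support bf16 natively?
# Older cards (the GTX series included) generally do not, in which case
# a precision override such as --force-fp16 at launch is the usual fix.
import torch

if torch.cuda.is_available() and torch.cuda.is_bf16_supported():
    print("Native bf16 available; no extra flags needed.")
else:
    print("No native bf16; launch with a precision override (e.g. --force-fp16).")
```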
Unified single-file versions of Flux.1 for ComfyUI. All files have a baked-in VAE and CLIP-L included:
flux.1_dev_8x8_e4m3fn-marduk191.safetensors is Flux.1 Dev quantized to 8 bit with an 8 bit T5 XXL encoder included.
flux.1_dev_fp8_fp16t5-marduk191.safetensors is Flux.1 Dev quantized to 8 bit with a 16 bit T5 XXL encoder included.
flux.1_schnell_8x8_e4m3fn-marduk191.safetensors is Flux.1 Schnell quantized to 8 bit with an 8 bit T5 XXL encoder included.
flux.1_schnell_fp8_fp16t5-marduk191.safetensors is Flux.1 Schnell quantized to 8 bit with a 16 bit T5 XXL encoder included.
flux.1_dev_16x16-marduk191.safetensors is Flux.1 Dev in 16 bit precision with a 16 bit T5 XXL encoder included.
flux.1_schnell_16x16-marduk191.safetensors is Flux.1 Schnell in 16 bit precision with a 16 bit T5 XXL encoder included.
flux.1_dev_8x8_scaled-marduk191.safetensors is Flux.1 Dev quantized to 8 bit scaled stochastic weights with normalized outlying alphas. It includes an 8 bit scaled stochastic (tag limited to avoid loss) T5 XXL encoder. A rough sketch of the scaled fp8 idea follows below.
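For anyone curious what "fp8 scaled weights" means in practice, here is a minimal sketch of per-tensor scaled e4m3fn quantization, assuming a recent PyTorch build with float8 support. It is only an illustration of the general technique: the actual conversion for these files uses stochastic rounding and alpha normalization, neither of which is shown here, and the function names are made up for the example.

```python
# Minimal sketch of per-tensor "scaled" fp8 quantization (e4m3fn).
# Illustrative only; not the exact script used to build these checkpoints.
import torch

F8_MAX = torch.finfo(torch.float8_e4m3fn).max  # 448.0 for e4m3fn

def quantize_fp8_scaled(w: torch.Tensor):
    """Store weights as fp8 plus one fp32 scale per tensor."""
    scale = w.abs().max().clamp(min=1e-12) / F8_MAX
    w_fp8 = (w / scale).to(torch.float8_e4m3fn)  # round-to-nearest here, not stochastic
    return w_fp8, scale

def dequantize_fp8_scaled(w_fp8: torch.Tensor, scale: torch.Tensor):
    """Upcast back for compute; bf16 to match the upcasting note above."""
    return (w_fp8.to(torch.float32) * scale).to(torch.bfloat16)

if __name__ == "__main__":
    w = torch.randn(4096, 4096) * 0.02
    w_fp8, scale = quantize_fp8_scaled(w)
    err = (dequantize_fp8_scaled(w_fp8, scale).float() - w).abs().mean()
    print(f"per-tensor scale = {scale.item():.3e}, mean abs error = {err.item():.3e}")
```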
Workflow examples are available here: SOON
Repository is here: https://huggingface.co/marduk191/Flux.1_collection/tree/main
Discord: https://discord.gg/s3kj9VqpKc
Tips welcome: https://ko-fi.com/marduk191
