CivArchive
    wan2.1_t2v_14B_fp8_scaled

    I couldn't find these on Civitai, and I need them for ComfyUI on RunPod: the wan2.1_t2v_14B_fp8_scaled
    fp8 scaled 14B models.

    These are the models I use locally for rendering.

    wan2.1_t2v_14B_fp8_e4m3fn.safetensors - pending upload

    14.3 GB

    LFS

    Simple fp8

    RunPod: (One Click Deploy - ComfyUI Wan14B t2v i2v v2v). Not mine, but it works well.
    Navigate to diffusion_models,

    then run: download.py --model 1666198 (hope it works).
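As a sketch, the two steps above might look like this in a pod terminal. The path is an assumption about where the template puts ComfyUI, and the download.py interface beyond --model is the pod's own script, not something documented here:

```shell
# Assumed install location on the "One Click Deploy" pod; adjust to your deployment.
cd /workspace/ComfyUI/models/diffusion_models

# 1666198 is the CivitAI model version ID given in the instructions above.
python download.py --model 1666198
```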

    wan2.1_t2v_14B_fp8_scaled.safetensors - uploaded 4.15.25

    14.3 GB

    LFS

    fp8 scaled 14B models.



    https://huggingface.co/Comfy-Org/Wan_2.1_ComfyUI_repackaged/tree/main/split_files/diffusion_models


    Comments (11)

    BovisAndBeethead · Apr 15, 2025 · 1 reaction

    You can use wget to download any download link to runpod. Including huggingface links.
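For example, the fp8 scaled file in this listing can be pulled straight from the Comfy-Org repo linked below; the direct-download form of a Hugging Face link uses resolve/main where the browse link uses tree/main. The destination path is an assumption about a typical RunPod ComfyUI layout:

```shell
# Direct-download form of the Comfy-Org repackaged link ("resolve" instead of "tree").
URL="https://huggingface.co/Comfy-Org/Wan_2.1_ComfyUI_repackaged/resolve/main/split_files/diffusion_models/wan2.1_t2v_14B_fp8_scaled.safetensors"

# Assumed ComfyUI location on the pod; adjust to your deployment.
wget -P /workspace/ComfyUI/models/diffusion_models "$URL"
```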

    rickets_xxx (Author) · Apr 16, 2025

    I'll look into that. I'm still learning RunPod, and that means wasted money on each try/fail.

    axicec · Apr 16, 2025 · 2 reactions

    @rickets_xxx that's on you... do better

    Gradasho · Apr 16, 2025

    Is there any advantage to the fp8 scaled over the standard fp8 e4m3fn? I just tested them, and they don't seem noticeably different.

    rickets_xxx (Author) · Apr 16, 2025 · 1 reaction

    Shrug... still learning

    funscripter627 · Apr 16, 2025

    According to ComfyUI, it's like this:

    "Note: The fp16 versions are recommended over the bf16 versions as they will give better results.

    Quality rank (highest to lowest): fp16 > bf16 > fp8_scaled > fp8_e4m3fn"

    See https://comfyanonymous.github.io/ComfyUI_examples/wan/

    rickets_xxx (Author) · Apr 16, 2025

    @funscripter627 I've noticed issues with the bf16 model(s); that's mostly why I did this. Going to load the fp16 instead of the e4m3fn, I think.

    blyss · Apr 16, 2025 · 4 reactions

    fp8 scaled maintains about 2.5% quantization error versus 12.5% for pure e4m3fn with these models. It's really excellent! It's also recommended to avoid fp8_fast, as it tanks quality!

    LatteLeopard · Apr 18, 2025 · 6 reactions

    Those fucking thumbnails. Absolutely hilarious.

    snobbias124 · Apr 28, 2025

    Can anyone kindly point me towards a guide that explains the myriad settings of the WAN nodes? My search skills are apparently not advanced enough. I'm using kijai workflows myself, but any guide would do.

    nawani4450811 · May 15, 2025

    The wrapper in comfy doesn't support scaled models. So how are you running these?

    Checkpoint
    Wan Video

    Details

    Downloads
    781
    Platform
    CivitAI
    Platform Status
    Available
    Created
    4/15/2025
    Updated
    4/30/2026
    Deleted
    -

    Available On (1 platform)

    Same model published on other platforms. May have additional downloads or version variants.