CivArchive
    Instagirl WAN 2.2 - v2.0
    NSFW

    How to Use

    This is a style LoRA, designed to be the base layer of your LoRA stack. It creates the foundational aesthetic of realism, upon which you can add character or concept LoRAs.

    Note: The included ZIP archive contains both the high-noise and low-noise LoRA variants, along with our recommended ComfyUI workflows.

    • Trigger Word: Instacam

    • Recommended Strength: 1.0. Start here and adjust in small increments.

    I would also like to thank Danrisi, who originally taught us how to train LoRAs and helped make our work possible.

    Description

    Instagirl V2 is a complete overhaul, trained from the ground up to push the boundaries of realism on Wan 2.2.

    Unprecedented Realism: We've massively upgraded the dataset and training method with a focus on photorealistic skin textures, natural lighting, and flawless environmental details.

    Greater Diversity: V2 is trained on a much wider and more inclusive dataset, enabling a greater variety of faces, ethnicities, and styles right out of the box.

    Better Composition: The model now has a deeper understanding of world composition, resulting in more coherent and believable scenes with fewer artifacts.

    This model is trained on WAN 2.2, which means there are two versions: a high-noise and a low-noise variant.
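
    In practice the two variants run as a two-stage sampling pass, as in the bundled workflow: the high-noise model denoises the early steps, then hands off to the low-noise model. A minimal sketch of that split, assuming the 10-step total and step-4 handoff used in the included workflow's two KSamplerAdvanced nodes (the function name is just for illustration):

```python
def split_steps(total_steps=10, handoff=4):
    """Step ranges for the two sampling passes in the bundled workflow:
    the high-noise model denoises first, the low-noise model finishes."""
    high_noise = (0, handoff)           # high-noise pass: steps 0..handoff
    low_noise = (handoff, total_steps)  # low-noise pass: steps handoff..total
    return high_noise, low_noise

print(split_steps())  # ((0, 4), (4, 10))
```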

    FAQ

    Comments (65)

    ameameAug 6, 2025
    CivitAI

    possible make some young Asian girl next time?

    beignetsAug 6, 2025
    CivitAI

    The workflow in the file (credits to the model creator):

    {"id":"11e19437-56ee-487e-bb1f-3bd053f2f3c2","revision":0,"last_node_id":74,"last_link_id":136,"nodes":[{"id":68,"type":"LoraLoader","pos":[3345.027099609375,-345.39691162109375],"size":[259.98895263671875,126],"flags":{},"order":6,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":135},{"localized_name":"clip","name":"clip","type":"CLIP","link":121},{"localized_name":"lora_name","name":"lora_name","type":"COMBO","widget":{"name":"lora_name"},"link":null},{"localized_name":"strength_model","name":"strength_model","type":"FLOAT","widget":{"name":"strength_model"},"link":null},{"localized_name":"strength_clip","name":"strength_clip","type":"FLOAT","widget":{"name":"strength_clip"},"link":null}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","links":[134]},{"localized_name":"CLIP","name":"CLIP","type":"CLIP","links":[124,125]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.46","Node name for S&R":"LoraLoader","widget_ue_connectable":{}},"widgets_values":["Instagirlv5_high_noise.safetensors",1,1]},{"id":70,"type":"LoraLoaderModelOnly","pos":[3342.34765625,-164.2708282470703],"size":[270,82],"flags":{},"order":7,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":136},{"localized_name":"lora_name","name":"lora_name","type":"COMBO","widget":{"name":"lora_name"},"link":null},{"localized_name":"strength_model","name":"strength_model","type":"FLOAT","widget":{"name":"strength_model"},"link":null}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","links":[128]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.46","Node name for 
S&R":"LoraLoaderModelOnly","widget_ue_connectable":{}},"widgets_values":["wanWan21_T2V_14B_lightx2v_cfg_step_distill_lora_rank32.safetensors",0.6000000000000001]},{"id":72,"type":"LoraLoaderModelOnly","pos":[3348.833251953125,103.12315368652344],"size":[270,82],"flags":{},"order":12,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":129},{"localized_name":"lora_name","name":"lora_name","type":"COMBO","widget":{"name":"lora_name"},"link":null},{"localized_name":"strength_model","name":"strength_model","type":"FLOAT","widget":{"name":"strength_model"},"link":null}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","links":[132]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.46","Node name for S&R":"LoraLoaderModelOnly","widget_ue_connectable":{}},"widgets_values":["l3n0v0.safetensors",0.6000000000000001]},{"id":4,"type":"CLIPTextEncode","pos":[3642.8974609375,58.09259796142578],"size":[450,150],"flags":{"collapsed":false},"order":9,"mode":0,"inputs":[{"localized_name":"clip","name":"clip","type":"CLIP","link":125},{"localized_name":"text","name":"text","type":"STRING","widget":{"name":"text"},"link":null}],"outputs":[{"localized_name":"CONDITIONING","name":"CONDITIONING","type":"CONDITIONING","links":[59,64]}],"title":"Negative Prompt","properties":{"cnr_id":"comfy-core","ver":"0.3.43","Node name for S&R":"CLIPTextEncode","widget_ue_connectable":{"text":true}},"widgets_values":["色调艳丽,过曝,静态,细节模糊不清,字幕,风格,作品,画作,画面,静止,整体发灰,最差质量,低质量,JPEG压缩残留,丑陋的,残缺的,多余的手指,画得不好的手部,画得不好的脸部,畸形的,毁容的,形态畸形的肢体,手指融合,静止不动的画面,杂乱的背景,三条腿,背景人很多,倒着走, censored, sunburnt skin, rashy skin, red 
cheeks\n"],"color":"#2a363b","bgcolor":"#3f5159"},{"id":5,"type":"EmptyHunyuanLatentVideo","pos":[3647.158935546875,260.2233581542969],"size":[450,150],"flags":{},"order":0,"mode":0,"inputs":[{"localized_name":"width","name":"width","type":"INT","widget":{"name":"width"},"link":null},{"localized_name":"height","name":"height","type":"INT","widget":{"name":"height"},"link":null},{"localized_name":"length","name":"length","type":"INT","widget":{"name":"length"},"link":null},{"localized_name":"batch_size","name":"batch_size","type":"INT","widget":{"name":"batch_size"},"link":null}],"outputs":[{"localized_name":"LATENT","name":"LATENT","type":"LATENT","links":[60]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.43","Node name for S&R":"EmptyHunyuanLatentVideo","widget_ue_connectable":{"width":true,"height":true,"length":true,"batch_size":true}},"widgets_values":[1088,1440,1,1],"color":"#323","bgcolor":"#535"},{"id":67,"type":"Seed Generator","pos":[3659.67333984375,464.1074523925781],"size":[270,82],"flags":{},"order":1,"mode":0,"inputs":[{"localized_name":"seed","name":"seed","type":"INT","widget":{"name":"seed"},"link":null}],"outputs":[{"localized_name":"INT","name":"INT","type":"INT","links":[115,116]}],"properties":{"cnr_id":"comfy-image-saver","ver":"65e6903eff274a50f8b5cd768f0f96baf37baea1","Node name for S&R":"Seed 
Generator","widget_ue_connectable":{}},"widgets_values":[669817269655594,"increment"]},{"id":36,"type":"KSamplerAdvanced","pos":[4392.58154296875,-335.0544738769531],"size":[250,652.4117431640625],"flags":{},"order":13,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":132},{"localized_name":"positive","name":"positive","type":"CONDITIONING","link":63},{"localized_name":"negative","name":"negative","type":"CONDITIONING","link":64},{"localized_name":"latent_image","name":"latent_image","type":"LATENT","link":61},{"localized_name":"add_noise","name":"add_noise","type":"COMBO","widget":{"name":"add_noise"},"link":null},{"localized_name":"noise_seed","name":"noise_seed","type":"INT","widget":{"name":"noise_seed"},"link":116},{"localized_name":"steps","name":"steps","type":"INT","widget":{"name":"steps"},"link":null},{"localized_name":"cfg","name":"cfg","type":"FLOAT","widget":{"name":"cfg"},"link":null},{"localized_name":"sampler_name","name":"sampler_name","type":"COMBO","widget":{"name":"sampler_name"},"link":null},{"localized_name":"scheduler","name":"scheduler","type":"COMBO","widget":{"name":"scheduler"},"link":null},{"localized_name":"start_at_step","name":"start_at_step","type":"INT","widget":{"name":"start_at_step"},"link":null},{"localized_name":"end_at_step","name":"end_at_step","type":"INT","widget":{"name":"end_at_step"},"link":null},{"localized_name":"return_with_leftover_noise","name":"return_with_leftover_noise","type":"COMBO","widget":{"name":"return_with_leftover_noise"},"link":null}],"outputs":[{"localized_name":"LATENT","name":"LATENT","type":"LATENT","links":[65]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.46","Node name for 
S&R":"KSamplerAdvanced","widget_ue_connectable":{"add_noise":true,"noise_seed":true,"steps":true,"cfg":true,"sampler_name":true,"scheduler":true,"start_at_step":true,"end_at_step":true,"return_with_leftover_noise":true}},"widgets_values":["enable",100,"fixed",10,1,"res_2s","beta57",4,999,"disable",""],"color":"#223","bgcolor":"#335"},{"id":22,"type":"CLIPLoader","pos":[3014.672119140625,-88.59371185302734],"size":[302.9598083496094,125.00395202636719],"flags":{},"order":2,"mode":0,"inputs":[{"localized_name":"clip_name","name":"clip_name","type":"COMBO","widget":{"name":"clip_name"},"link":null},{"localized_name":"type","name":"type","type":"COMBO","widget":{"name":"type"},"link":null},{"localized_name":"device","name":"device","shape":7,"type":"COMBO","widget":{"name":"device"},"link":null}],"outputs":[{"localized_name":"CLIP","name":"CLIP","type":"CLIP","links":[121]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.43","Node name for S&R":"CLIPLoader","widget_ue_connectable":{"clip_name":true,"type":true,"device":true}},"widgets_values":["umt5_xxl_fp8_e4m3fn_scaled.safetensors","wan","default"],"color":"#2a363b","bgcolor":"#3f5159"},{"id":35,"type":"KSamplerAdvanced","pos":[4114.67236328125,-337.7133483886719],"size":[250,652.4117431640625],"flags":{},"order":11,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":134},{"localized_name":"positive","name":"positive","type":"CONDITIONING","link":58},{"localized_name":"negative","name":"negative","type":"CONDITIONING","link":59},{"localized_name":"latent_image","name":"latent_image","type":"LATENT","link":60},{"localized_name":"add_noise","name":"add_noise","type":"COMBO","widget":{"name":"add_noise"},"link":null},{"localized_name":"noise_seed","name":"noise_seed","type":"INT","widget":{"name":"noise_seed"},"link":115},{"localized_name":"steps","name":"steps","type":"INT","widget":{"name":"steps"},"link":null},{"localized_name":"cfg","name":"cfg","type":"FLOAT","widget":{"name":"cfg"},"li
nk":null},{"localized_name":"sampler_name","name":"sampler_name","type":"COMBO","widget":{"name":"sampler_name"},"link":null},{"localized_name":"scheduler","name":"scheduler","type":"COMBO","widget":{"name":"scheduler"},"link":null},{"localized_name":"start_at_step","name":"start_at_step","type":"INT","widget":{"name":"start_at_step"},"link":null},{"localized_name":"end_at_step","name":"end_at_step","type":"INT","widget":{"name":"end_at_step"},"link":null},{"localized_name":"return_with_leftover_noise","name":"return_with_leftover_noise","type":"COMBO","widget":{"name":"return_with_leftover_noise"},"link":null}],"outputs":[{"localized_name":"LATENT","name":"LATENT","type":"LATENT","links":[61]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.46","Node name for S&R":"KSamplerAdvanced","widget_ue_connectable":{"add_noise":true,"noise_seed":true,"steps":true,"cfg":true,"sampler_name":true,"scheduler":true,"start_at_step":true,"end_at_step":true,"return_with_leftover_noise":true}},"widgets_values":["enable",100,"fixed",10,1,"res_2s","beta57",0,4,"disable",""],"color":"#223","bgcolor":"#335"},{"id":8,"type":"VAELoader","pos":[3099.625732421875,84.64628601074219],"size":[210,58],"flags":{},"order":3,"mode":0,"inputs":[{"localized_name":"vae_name","name":"vae_name","type":"COMBO","widget":{"name":"vae_name"},"link":null}],"outputs":[{"localized_name":"VAE","name":"VAE","type":"VAE","links":[9]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.43","Node name for 
S&R":"VAELoader","widget_ue_connectable":{"vae_name":true}},"widgets_values":["wan_2.1_vae.safetensors"],"color":"#323","bgcolor":"#535"},{"id":9,"type":"VAEDecode","pos":[4427.91357421875,361.8782653808594],"size":[211.20114135742188,46],"flags":{},"order":14,"mode":0,"inputs":[{"localized_name":"samples","name":"samples","type":"LATENT","link":65},{"localized_name":"vae","name":"vae","type":"VAE","link":9}],"outputs":[{"localized_name":"IMAGE","name":"IMAGE","type":"IMAGE","links":[8]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.43","Node name for S&R":"VAEDecode","widget_ue_connectable":{}},"widgets_values":[],"color":"#323","bgcolor":"#535"},{"id":10,"type":"SaveImage","pos":[4669.361328125,-330.9233703613281],"size":[788.8705444335938,772.3082275390625],"flags":{},"order":15,"mode":0,"inputs":[{"localized_name":"images","name":"images","type":"IMAGE","link":8},{"localized_name":"filename_prefix","name":"filename_prefix","type":"STRING","widget":{"name":"filename_prefix"},"link":null}],"outputs":[],"properties":{"cnr_id":"comfy-core","ver":"0.3.43","Node name for S&R":"SaveImage","widget_ue_connectable":{"filename_prefix":true}},"widgets_values":["ComfyUI"],"color":"#233","bgcolor":"#355"},{"id":71,"type":"LoraLoaderModelOnly","pos":[3346.9189453125,-28.68145751953125],"size":[270,82],"flags":{},"order":10,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":128},{"localized_name":"lora_name","name":"lora_name","type":"COMBO","widget":{"name":"lora_name"},"link":null},{"localized_name":"strength_model","name":"strength_model","type":"FLOAT","widget":{"name":"strength_model"},"link":null}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","links":[129]}],"properties":{"cnr_id":"comfy-core","ver":"0.3.46","Node name for 
S&R":"LoraLoaderModelOnly","widget_ue_connectable":{}},"widgets_values":["WAN2.2_LowNoise_Instagirl_V2_Pruned.safetensors",1]},{"id":74,"type":"UnetLoaderGGUF","pos":[3042.22900390625,-197.35244750976562],"size":[270,58],"flags":{},"order":5,"mode":0,"inputs":[{"localized_name":"unet_name","name":"unet_name","type":"COMBO","widget":{"name":"unet_name"},"link":null}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","links":[136]}],"properties":{"cnr_id":"ComfyUI-GGUF","ver":"b3ec875a68d94b758914fd48d30571d953bb7a54","widget_ue_connectable":{},"Node name for S&R":"UnetLoaderGGUF"},"widgets_values":["Wan2.2-T2V-A14B-LowNoise-Q8_0.gguf"]},{"id":73,"type":"UnetLoaderGGUF","pos":[3041.61181640625,-302.9368591308594],"size":[270,58],"flags":{},"order":4,"mode":0,"inputs":[{"localized_name":"unet_name","name":"unet_name","type":"COMBO","widget":{"name":"unet_name"},"link":null}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","links":[135]}],"properties":{"cnr_id":"ComfyUI-GGUF","ver":"b3ec875a68d94b758914fd48d30571d953bb7a54","widget_ue_connectable":{},"Node name for S&R":"UnetLoaderGGUF"},"widgets_values":["Wan2.2-T2V-A14B-HighNoise-Q8_0.gguf"]},{"id":3,"type":"CLIPTextEncode","pos":[3637.72607421875,-338.9566955566406],"size":[450,350],"flags":{},"order":8,"mode":0,"inputs":[{"localized_name":"clip","name":"clip","type":"CLIP","link":124},{"localized_name":"text","name":"text","type":"STRING","widget":{"name":"text"},"link":null}],"outputs":[{"localized_name":"CONDITIONING","name":"CONDITIONING","type":"CONDITIONING","links":[58,63]}],"title":"Positive Prompt","properties":{"cnr_id":"comfy-core","ver":"0.3.43","Node name for 
S&R":"CLIPTextEncode","widget_ue_connectable":{"text":true}},"widgets_values":[""],"color":"#2a363b","bgcolor":"#3f5159"}],"links":[[8,9,0,10,0,"IMAGE"],[9,8,0,9,1,"VAE"],[58,3,0,35,1,"CONDITIONING"],[59,4,0,35,2,"CONDITIONING"],[60,5,0,35,3,"LATENT"],[61,35,0,36,3,"LATENT"],[63,3,0,36,1,"CONDITIONING"],[64,4,0,36,2,"CONDITIONING"],[65,36,0,9,0,"LATENT"],[115,67,0,35,5,"INT"],[116,67,0,36,5,"INT"],[121,22,0,68,1,"CLIP"],[124,68,1,3,0,"CLIP"],[125,68,1,4,0,"CLIP"],[128,70,0,71,0,"MODEL"],[129,71,0,72,0,"MODEL"],[132,72,0,36,0,"MODEL"],[134,68,0,35,0,"MODEL"],[135,73,0,68,0,"MODEL"],[136,74,0,70,0,"MODEL"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.7756848205605316,"offset":[-2564.324446129299,766.807977939455]},"frontendVersion":"1.23.4","ue_links":[],"links_added_by_ue":[],"VHS_latentpreview":true,"VHS_latentpreviewrate":0,"VHS_MetadataImage":true,"VHS_KeepIntermediate":true},"version":0.4}

    vAnN47Aug 6, 2025· 4 reactions
    CivitAI

    hi, thanks for the lora ! :)

    a question, what does it mean: "harmful content" ?

    can i create nsfw (xxx or x) content with it or not?

    edit: after generating some images now i understand your point haha! thanks anyway though!

    Escaflown2034Aug 6, 2025· 7 reactions
    CivitAI

    Could you post a workflow using the native WAN 2.2 instead of using all the 2.1 loras that have been known to not be super compatible with the Wan 2.2 weights?

    dailydoseofaiartAug 6, 2025· 1 reaction
    CivitAI

    I'm assuming this is mainly a low noise model, what strength should one put for high noise? both 1?

    Edit: I'm leaving it for the others with the same question, I'm dumb, download it and your questions will be answered

    lesteriaxAug 6, 2025· 8 reactions
    CivitAI

    Hi, thanks for the update. Could you please upload the safetensors and the workflow separately? I'm using the API to download LoRAs remotely, and the output is a .zip file while it's expecting a .safetensors. Thanks in advance

    Instara
    Author
    Aug 6, 2025

    It's not possible, I would have done it that way otherwise

    kingsimbaAug 6, 2025· 7 reactions
    CivitAI

    Does anyone have any recommendations on which Wan 2.2 model is best with 8gb VRAM?

    R3G4LAug 6, 2025· 1 reaction

    Try out the different Q4 2.2 models, then work your way down to smaller sizes if they're too much. Q4 should work fine if you have enough system RAM. I've used models over 16.5 GB on my 12 GB VRAM card with an optimized workflow and a good amount of system RAM. I'd recommend starting with 64 GB of system RAM, though you should scrape by with 32 GB; I go over 90 GB on some workflows. There is also the WanGP 2.2 one-click installer on Pinokio, which is great for low-VRAM users.

    1GirlUniversityAug 6, 2025· 4 reactions
    CivitAI

    Please, help me

    Prompt outputs failed validation:
    KSamplerAdvanced:
    - Value not in list: sampler_name: 'res_2s' not in (list of length 40)
    - Value not in list: scheduler: 'beta57' not in ['simple', 'sgm_uniform', 'karras', 'exponential', 'ddim_uniform', 'beta', 'normal', 'linear_quadratic', 'kl_optimal']
    KSamplerAdvanced:
    - Value not in list: sampler_name: 'res_2s' not in (list of length 40)
    - Value not in list: scheduler: 'beta57' not in ['simple', 'sgm_uniform', 'karras', 'exponential', 'ddim_uniform', 'beta', 'normal', 'linear_quadratic', 'kl_optimal']
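
    The names res_2s and beta57 come from a custom sampler node pack rather than ComfyUI core, so the workflow fails validation if that pack isn't installed. One workaround, if you'd rather not install extra nodes, is to rewrite both KSamplerAdvanced nodes to use built-in names. A hedged sketch (the widget index layout is taken from the workflow JSON above; the swap to euler/beta is an assumption, not the author's recommendation):

```python
def swap_custom_sampler(workflow: dict) -> dict:
    """Replace the custom sampler/scheduler names in all KSamplerAdvanced
    nodes with ComfyUI built-ins so the workflow validates without
    extra node packs. Indices 5 and 6 of widgets_values hold
    sampler_name and scheduler in this workflow."""
    for node in workflow.get("nodes", []):
        if node.get("type") == "KSamplerAdvanced":
            vals = node["widgets_values"]
            if vals[5] == "res_2s":
                vals[5] = "euler"  # built-in sampler (assumed substitute)
            if vals[6] == "beta57":
                vals[6] = "beta"   # built-in scheduler (assumed substitute)
    return workflow
```

    Load the workflow JSON, run it through this function, save it back, and the two nodes should pass validation (at the cost of the author's intended sampler).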

    1GirlUniversityAug 6, 2025

    hardnon thanks

    ducky66Aug 7, 2025

    hardnon hey, do you know if it's possible to install these files if you have the Pinokio version of ComfyUI? I can't find a venv folder anywhere.

    hardnonAug 7, 2025

    ducky66 No idea, I just fly by the seat of my pants. I don't know more than just what allows me to get things working.

    hardnonAug 7, 2025

    ducky66 Oh, you have to make a venv folder... it's a virtual environment you have to create through your terminal.

    In the ComfyUI root folder, run each line:

    Right-click in the ComfyUI folder -> Open in Terminal

    py -3.12 -m venv venv

    cd venv/Scripts

    ./Activate.ps1

    cd ../..


    alternative_UniverseAug 6, 2025· 2 reactions
    CivitAI

    This would go crazy on flux, hope you can do it someday

    techmangreatAug 6, 2025· 17 reactions
    CivitAI

    Please stop polluting this space with broken workflows that don't work. Your discord automatically bans people who ask about the broken workflow or missing models. What is a l3n0v0.safetensors? This sort of behavior should be banned from this community.

    engineerAug 7, 2025· 5 reactions

    TesterFranz That helps now. However it doesn't solve the underlying problem.

    theloraprodigyAug 7, 2025· 3 reactions

    wow the audacity to complain about something like this that is offered literally for free to you. what did you contribute? anything at all? of course your profile is completely empty. your ban already seems more than justified. what kind of low and useless human waste you are.

    poondoggleAug 7, 2025

    engineer There is no underlying problem. I loaded the workflow after downloading ALL of the loras listed and it worked perfectly.

    engineerAug 7, 2025· 4 reactions

    theloraprodigy I build a table. I give it to you for free. You set up a meal on it. The table collapses. Would you not at least raise an eyebrow and ask me to improve my carpentry skills? Heck, I would want to know.

    tetonasenjoyerAug 7, 2025· 1 reaction

    He literally links the lenovo model in the lora's description.

    youzong007131Aug 10, 2025

    Lol are people illiterate now? Can you not take 2 mins to read the description first before complaining?

    srrobertson100814Aug 11, 2025

    Workflow works fine for me... don't know why you're having issues

    mellinjohan297Aug 23, 2025

    @engineer wow, just wow. that analogy was jaw-dropping. NO, I WOULD NOT ask you to improve your carpenter skills. I would even hesitate to tell you it broke. I would just be glad you actually took the time to build a table and even more so, GIVE IT TO ME FOR FREE. I would not raise a single hair on any brow and I would just be grateful for your efforts and try to fix it myself if I wanted to keep it. How the hell are you reasoning? Disgraceful thinking.

    BinaryBottleBakeAug 7, 2025· 4 reactions
    CivitAI

    Are you able to upload the LoRAs separately? I use a Civit downloader with ComfyUI because my upload speed isn't very good, and I use RunPod, so I have to upload these myself, which takes hours.

    iwantcabbages799Aug 7, 2025

    you know you can just download the models directly in runpod right?

    robinhud5738936Aug 8, 2025

    Click download; once the download starts in your browser, copy the download link from your browser's downloads tab. Then go to the RunPod terminal, cd into the loras folder, and run the command below with the quoted link replaced by the one you copied. My link won't work for you, since it carries an expiry token -

    wget -O insta.zip "https://civitai-delivery-worker-prod.5ac0637cfd0766c97916cefa3764fbdf.r2.cloudflarestorage.com/model/7053464/instagirlv2202B.hgkp.zip?X-Amz-Expires=86400&response-content-disposition=attachment%3B%20filename%3D%22Instagirlv2%20%2B%20workflow.zip%22&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=e01358d793ad6966166af8b3064953ad/20250808/us-east-1/s3/aws4_request&X-Amz-Date=20250808T121239Z&X-Amz-SignedHeaders=host&X-Amz-Signature=babb3ad4d2c9ad889fe95715498a289401abcc5237707b8914f84c5d7bcb2715"

    FLiPNoTiKAug 12, 2025

    To upload a LoRa much faster to Jupyter in Runpod:

    Go to terminal

    pip install gdown

    cd /workspace/ComfyUI/models/loras

    gdown --fuzzy PASTE-LORA-GOOGLE-DRIVE-LINK

    mrjak01616Aug 7, 2025
    CivitAI

    Is it possible to just make standard gens?

    PopHorn1956Aug 7, 2025· 1 reaction
    CivitAI

    Nice loras. Unusual sampler and scheduler - res_2s and beta57. Where can I get them?

    robinhud5738936Aug 8, 2025

    Have you found it? I am struggling to install it.

    ducky66Aug 8, 2025

    marqs89 hey, I have ComfyUI installed from Pinokio and I do not have a venv folder, so I'm lost on how to install this. I even downloaded the portable version and tried installing it there; when following the instructions on the page you're linking, it gives an error. I ran the command in the custom_nodes folder and it appeared to work, but the samplers are still not showing up.

    tetonasenjoyerAug 7, 2025· 3 reactions
    CivitAI

    You are the real deal mate. Great lora!

    19ruby57ga284sAug 7, 2025
    CivitAI

    trying to run the example comfy workflow. everything works, only it seems like output resolution is lower compared to examples. any ideas?

    Lora_AddictAug 7, 2025

    You can change the output resolution in the workflow

    robinhud5738936Aug 8, 2025

    is this a video model?

    playtime_aiAug 9, 2025

    robinhud5738936 what is a video if not a series of images? yes, models trained on images work for video.

    saucheibOct 16, 2025

    @playtime_ai when I'm using your workflow it just generates black images, no errors at all...

    seshdayAug 7, 2025· 30 reactions
    CivitAI

    This looks great! Side note: uploading a LoRA trained on WAN, and presumably on a dataset you don't own, then slapping a restrictive license on it is not legally sound.

    Lora_AddictAug 8, 2025· 2 reactions
    CivitAI

    Only one negative thing about this LoRA: it tries to add a nose ring way too often, and even then it's usually only part of a nose ring.

    TheRealOniAug 9, 2025

    try adding a negative prompt for nose ring

    Agent_SmthAug 8, 2025
    CivitAI

    is this trained on photos or videos? ty

    ProvenFlawlessAug 15, 2025· 3 reactions

    Most likely photos, I imagine. Gooning for high-quality videos of girls on insta-sluts is rather rare. Hopefully OP will make a Pinterest/TikTok girls Wan lmao. I've made a Flux Pinterest girls one but I wouldn't release that lol

    robinhud5738936Aug 8, 2025
    CivitAI

    great one, where do I find the sampler and the scheduler

    robinhud5738936Aug 8, 2025

    marqs89 thanks mate. It is taking lots of time for 81 frames; it has been 30 mins on an A6000 48GB with sampler euler and scheduler simple. Is it gonna be better with the configured sampler and scheduler?

    Lora_AddictAug 8, 2025· 1 reaction

    robinhud5738936 I did not try videos with this workflow, only pics. Will try it later and give you feedback! The res sampler will likely take a bit longer but it's worth it! But if you generate 81 frames with such a big resolution and 10 steps it will likely take very long yes. You will likely have to either reduce resolution, steps or frames. Or all of it ;)

    Lora_AddictAug 8, 2025

    robinhud5738936 so i did 81 frames, 480x720, 10 steps, res_2s sampler -> 450 seconds on a 4090. Quality obviously not very good with that resolution. Wan 2.2 is awesome but using two models takes time sadly. Oh btw, 81 frames is not 5 seconds bc the videos are 24 FPS so if you want 5 seconds you have to do 24*seconds+1=121 frames, which takes even longer of course :D

    dasfajhiAug 8, 2025

    marqs89 the 14b is 16 fps; the 5b is 24 fps

    Lora_AddictAug 8, 2025

    dasfajhi i use the 14b and if i use 81 frames i only get like 3 seconds?
    EDIT: Forget it, i'm just a dummy :D
    You are correct!
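
    The frame-count arithmetic in this exchange is easy to sanity-check. A quick sketch, assuming the 14B model's native 16 fps (the helper names here are just for illustration):

```python
def frames_for(seconds, fps=16):
    # Wan-style clip length: fps * seconds + 1 frames
    return fps * seconds + 1

def seconds_for(frames, fps=16):
    return (frames - 1) / fps

print(frames_for(5))        # 81 frames for a 5-second clip at 16 fps
print(seconds_for(121, 24)) # 5.0 seconds at 24 fps
```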

    FluxFestAug 15, 2025· 1 reaction

    marqs89 you can use the Film VFI node at the end of the generation process to interpolate frames, taking it from the native 16fps to 32fps without too much additional compute.

    getJinxedAug 9, 2025· 1 reaction
    CivitAI

    how do i train a character lora for this?

    hackinz03563Aug 9, 2025

    How do I even run this LoRA? As of now, neither Replicate nor Fal provides a Wan 2.2 LoRA model to run.

    HearmemanAIAug 9, 2025· 54 reactions
    CivitAI

    Great LoRA.
    What's up with the license?
    This is a derivative of Wan.
    Do you own the rights to the dataset used? If not, your license is as good as solid food for my dentureless grandma

    AdaptiveVisionAug 9, 2025· 4 reactions

    Nobody will know how you use the model; those licenses are useless anyway.

    BellaMartinezAug 12, 2025

    I'm assuming it's just in case some scammy company uses it or something

    wyldhuntAug 15, 2025· 4 reactions

    It's a LoRa. Saying that they can't license it because it was trained on another model is like saying that you can't sell the tires you hand crafted and built yourself to fit a Ford because you don't own Ford motor company.

    knigitzAug 15, 2025· 11 reactions

    wyldhunt You're right that creating something compatible with another product doesn't automatically restrict its sale. But the LoRA situation isn't quite like crafting tires; it's more like modifying a car engine using proprietary parts and then trying to resell the whole thing under your own terms.

    LoRAs are trained on base models like Wan 2.2, which themselves may have licensing restrictions. If the base model or its training data isn't owned or fully licensed by the LoRA creator, then slapping a restrictive license on the derivative work can be legally shaky. It's not just about compatibility; it's about derivative rights and the provenance of the data used.

    The right-to-repair analogy works better when you're modifying something you own. But in this case, if the LoRA was trained on a model or dataset the creator doesn't have rights to, then asserting licensing control over the output is like selling a remix of a song you don't own the rights to, regardless of how much you tweaked the bassline.

    So while the spirit of open innovation is admirable, licensing in the AI space hinges on data ownership, model provenance, and derivative rights. That's why some folks are raising eyebrows about the restrictive license here.

    Alright123Aug 15, 2025

    AdaptiveVision Depends. If an investigation happens somewhere and they ask for proof, or you forget to strip the metadata, they can just paste the image/video into ComfyUI and see what you used :P

    One day they will scan stuff with AI and see the fingerprint of everything used haha

    kkkdsadasAug 17, 2025

    Yes, as if Wan 2.2 got licenses for the internet videos it was trained on before they gave you Wan 2.2.

    knigitzAug 17, 2025

    Alright123 All you need to do is save the image without metadata.

    wyldhuntSep 2, 2025

    @knigitz I can't agree. You base it off of the model to match the tensor shapes. In your analogy, that would be like saying that you made something to fit a Ford V8 engine. There is nothing proprietary about the tensors or the shape of the LoRa. The LoRa is its own model, with its own data, using a tensor pattern that is compatible with a specific model. How could someone claim ownership because their model uses the pattern that the LoRa was designed to fit?

    LORA
    Wan Video 14B t2v

    Details

    Downloads
    7,193
    Platform
    CivitAI
    Platform Status
    Available
    Created
    8/6/2025
    Updated
    5/4/2026
    Deleted
    -

    Files

    Instagirlv2 + workflow.zip