1 repeat, 100 epochs, saving a checkpoint every 20 epochs, for 5 checkpoints in total. Cosine-with-restarts scheduler, 5 total cycles, synchronized with the checkpoint interval. UNet learning rate 1e-4; text encoder not trained (learning rate 0).
Network dimension and alpha 256/256, with the same values for the convolution layers.
Trained on the NoobAI XL 1.1 base model.
Random crop enabled; artwork kept at its original resolution; tagged with the WD14 SwinV2 Tagger v3 at a threshold of roughly 0.1 to 0.25.
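The post does not say which trainer was used; assuming kohya-ss sd-scripts (a common choice for SDXL LoRA training), the settings above would map roughly to flags like the following. The model filename and dataset path are placeholders, and the repeat count of 1 would be set in the dataset folder name or dataset config:

```shell
accelerate launch sdxl_train_network.py \
  --pretrained_model_name_or_path "noobaiXL_v11.safetensors" \
  --train_data_dir "path/to/dataset" \
  --network_module networks.lora \
  --network_dim 256 --network_alpha 256 \
  --network_args "conv_dim=256" "conv_alpha=256" \
  --unet_lr 1e-4 --text_encoder_lr 0 --network_train_unet_only \
  --lr_scheduler cosine_with_restarts --lr_scheduler_num_cycles 5 \
  --max_train_epochs 100 --save_every_n_epochs 20 \
  --random_crop
```

With 100 epochs and 5 scheduler cycles, each cosine restart spans 20 epochs, which is what "synchronized with the checkpoint interval" refers to: every saved checkpoint lands at the end of a learning-rate cycle.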
Comments (1)
I love your LoRAs, but can you kindly explain why the sizes are so big? Usually other style LoRAs are around 200 to 300 MB.
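The size follows from the network dimension: this model uses dim 256, while most style LoRAs use something like dim 16 to 32, and a LoRA's parameter count grows linearly with dim. A rough illustration (the function name and the 2048x2048 layer shape are hypothetical, chosen only to show the scaling):

```python
def lora_layer_params(d_in: int, d_out: int, dim: int) -> int:
    """Parameters LoRA adds to one linear layer:
    a down-projection A (dim x d_in) plus an up-projection B (d_out x dim)."""
    return dim * d_in + d_out * dim

# Hypothetical square projection layer, comparing dim 256 vs a typical dim 32.
big = lora_layer_params(2048, 2048, 256)
small = lora_layer_params(2048, 2048, 32)
print(big // small)  # -> 8: file size scales linearly with dim
```

So at dim 256 the file is expected to be several times larger than a 200-300 MB LoRA trained at a lower rank, all else being equal.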