Follow me on X/Twitter for exclusive content: https://x.com/Citron_Legacy 😁
Please check out all my Pokemon Loras: https://civarchive.com/collections/261
The Pony version was posted around May:
Start May off right!
Description
Trained on: 250 images
Training Model: Animefull-final-pruned-fp16
flip_aug: False
Num of Repeats: 2
Unit is Epochs or Steps: Epochs
Number of Epochs or Steps: 10
Training Batch Size: 4
Total Steps: 915
Resolution: 512
Network Dim: 32, Network Alpha: 16
Lora Creation Process took: 00:19:01
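For anyone training with kohya-ss sd-scripts, the settings above map roughly onto a command like the one below. This is a sketch, not the exact invocation used here: the model/dataset paths and output name are placeholders, and a Colab notebook would normally wrap these flags for you.

```shell
# Repeats are encoded in the dataset folder name: "2_may" -> 2 repeats.
# flip_aug is False, so --flip_aug is simply omitted.
accelerate launch train_network.py \
  --pretrained_model_name_or_path="animefull-final-pruned-fp16.safetensors" \
  --train_data_dir="dataset" \
  --resolution=512 \
  --train_batch_size=4 \
  --max_train_epochs=10 \
  --network_module=networks.lora \
  --network_dim=32 \
  --network_alpha=16 \
  --output_name="May_Pokemon"
```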
Comments
Bye bye bye to a LoRA folder without May!
Pokegirl of the month 😍 (I love the prompt "perfect" to trigger her)
Lol thanks. I actually tagged my favorite pictures in the dataset with "perfect" so when "perfect" is used in the prompt the Lora leans towards those pictures.
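The trick described above (tagging favorite dataset images with a rare word so it acts as a quality trigger) can be done by hand, or with a small script like this. The directory layout, filenames, and helper name are hypothetical; it just appends the tag to the caption .txt files of the images you pick:

```python
from pathlib import Path

def tag_favorites(dataset_dir, favorites, tag="perfect"):
    """Append `tag` to the caption file of each hand-picked favorite image.

    Assumes kohya-style captions: one `<image-name>.txt` per image with
    comma-separated tags. Skips files that already contain the tag.
    """
    for name in favorites:
        caption = Path(dataset_dir) / f"{name}.txt"
        text = caption.read_text().rstrip()
        tags = [t.strip() for t in text.split(",")]
        if tag not in tags:
            caption.write_text(text + ", " + tag)

# Example (paths are illustrative):
# tag_favorites("dataset/2_may", ["img_007", "img_019"])
```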
Could you talk a bit more about merge reinforcement? I did something similar in my old SciWhite SD1.5/XL LORA.
Yes. When training XL I have an issue where the Loras aren't very strong, and high weights (like 1.3 or higher) end up looking messy.
When my Lora seems like it's not really understanding the details of a character/concept/etc, I merge that Lora with itself.
After the merge XL loras behave closer to the way you'd expect a SD1.5 Lora to operate. I suspect this wouldn't be needed if you simply trained a Lora with more steps/repeats/etc but since I'm using Free-Tier Google Colab I don't have the option to train with a powerful GPU.
Back in the day I used Automatic1111 and other tools for Lora merging, but lately I use ComfyUI so this screenshot shows how I'm doing the merge.
https://civitai.com/posts/2471151
As you can see, there is a "Merge Lora" node in the middle which would normally be used to merge two different Loras, but I use the 1st Lora as input for both "lora_1" and "lora_2".
I've messed around with the various settings and found that the specific ones in the image work best for what I am doing.
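At its core, the self-merge described above is a weighted sum of a LoRA's weights with themselves. The helper below is an illustrative sketch, not ComfyUI's actual node code: a real merge operates on the safetensors tensors per key, while plain numbers stand in here to keep it self-contained.

```python
def merge_loras(sd_a, sd_b, ratio_a=1.0, ratio_b=1.0):
    """Weighted sum of two LoRA state dicts with matching keys.

    For a self-merge, pass the same state dict as both sd_a and sd_b;
    the result is the original weights scaled by (ratio_a + ratio_b).
    """
    merged = {}
    for key in sd_a:
        if key in sd_b:
            merged[key] = sd_a[key] * ratio_a + sd_b[key] * ratio_b
        else:
            merged[key] = sd_a[key] * ratio_a
    return merged

# Self-merge: the same (toy) state dict fed to both inputs,
# mimicking wiring one Lora into both "lora_1" and "lora_2".
lora = {"lora_up.weight": 0.5, "lora_down.weight": -0.2}
strengthened = merge_loras(lora, lora, 1.0, 1.0)
```

Note this is effectively the same as raising the LoRA's weight at inference time; baking it in just means the merged file behaves strongly at weight 1.0 instead of needing 1.3+ in the prompt.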
@CitronLegacy Yup. Same technique I did except I double-triple trained them on Kohya under same datasets but diff settings.
This version of May is Tearin' Up My Heart! ;)