v5.0a
"unetLR": 0.0005,
"clipSkip": 2,
"networkDim": 128,
"numRepeats": 10,
"resolution": 1024,
"lrScheduler": "cosine_with_restarts",
"noiseOffset": 0.1,
"networkAlpha": ??,
"optimizerType": "AdamW8Bit",
"textEncoderLR": 0.00005,
v3.0b (bigger dataset):
Text Encoder learning rate: 0.000005
Unet learning rate: 0.0001
LR Scheduler: cosine
Optimizer: AdamW8bit
Network Dim: ??
Network Alpha: 32
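The cosine scheduler used here decays the learning rate smoothly from its initial value toward zero over training. A minimal sketch of the standard cosine-annealing curve (this is the textbook formula, not code from the trainer, and it ignores warmup and restarts):

```python
import math

def cosine_lr(base_lr: float, step: int, total_steps: int) -> float:
    """Standard cosine-annealed learning rate (no restarts, no warmup)."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# Unet LR 0.0001 at the start, midpoint, and end of training:
print(cosine_lr(1e-4, 0, 1000))     # full LR at step 0
print(cosine_lr(1e-4, 500, 1000))   # half LR at the midpoint
print(cosine_lr(1e-4, 1000, 1000))  # ~0 at the end
```

By contrast, v3.0a below uses a `constant` scheduler, which keeps the learning rate fixed for the whole run.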
v3.0a:
Bigger training dataset
0.8 gives the best results; 1.0 looks odd to me.
Text Encoder learning rate: 0.00001
Unet learning rate: 0.0001
LR Scheduler: constant
Optimizer: AdamW8bit
Network Dim: 32
Network Alpha: 16
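Network Alpha scales the low-rank update relative to the network dimension: under the common LoRA convention (used by kohya-ss trainers), the update is multiplied by alpha / dim. A sketch showing why the Dim 32 / Alpha 16 pairing here applies the LoRA delta at half strength (the convention is an assumption; the card does not state it):

```python
def lora_scale(network_alpha: float, network_dim: int) -> float:
    """Effective multiplier applied to the low-rank update (alpha / dim)."""
    return network_alpha / network_dim

# v3.0a: dim 32, alpha 16 -> the LoRA delta is applied at half strength.
print(lora_scale(16, 32))  # 0.5
```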
Description: different training method
Details
Downloads: 3,350
Platform: SeaArt
Platform Status: Available
Created: 3/27/2025
Updated: 5/25/2025
Deleted: -