@thor (1/2) That depends heavily on what model you are using. Most models out there are merges, and when you merge models, defects in one model get passed down. A very common defect is ignoring tokens in the prompt, for example skipping the second token, or the fifth. That is lowkey intentional. Also, moving the CFG scale up a little will help (there's a quick sketch of that at the end). Another problem is that retraining a model makes it lose information, for example, if you...
@thor Loss of information also gets passed down when merging.
@thor Almost forgot: nowadays every single model out there is a merge of some damaged model. If you want something that is both good and up to date, you may have to do weighted U-Net layer merging of older models with newer ones and pray it comes out all right, or at the very least you'll get more insight into what exactly you are doing. There's a rough sketch of that kind of merge right below.
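@thor Something like this is what I mean by weighted U-Net layer merging. This is just a minimal sketch assuming both checkpoints are plain PyTorch state dicts with matching keys (SD 1.x style, where the U-Net lives under `model.diffusion_model.*`); the file names and the per-block weights are made up for illustration, not a recipe.

```python
# Minimal sketch: weighted U-Net layer merge of two Stable Diffusion checkpoints.
# Assumes both checkpoints share the same architecture and key names.
import torch

ckpt_a = torch.load("old_model.ckpt", map_location="cpu")["state_dict"]
ckpt_b = torch.load("new_model.ckpt", map_location="cpu")["state_dict"]

# Hypothetical per-block weights: how much of model A to keep for each U-Net stage.
block_weights = {
    "model.diffusion_model.input_blocks": 0.7,
    "model.diffusion_model.middle_block": 0.5,
    "model.diffusion_model.output_blocks": 0.3,
}
default_weight = 0.5  # everything else (text encoder, VAE, ...) gets a plain average

merged = {}
for key, tensor_a in ckpt_a.items():
    tensor_b = ckpt_b.get(key)
    if tensor_b is None or tensor_a.shape != tensor_b.shape:
        merged[key] = tensor_a  # key missing or mismatched in B: keep A as-is
        continue
    w = default_weight
    for prefix, weight in block_weights.items():
        if key.startswith(prefix):
            w = weight
            break
    merged[key] = w * tensor_a + (1.0 - w) * tensor_b

torch.save({"state_dict": merged}, "merged_model.ckpt")
```

The "pray" part is real: which blocks to weight toward which parent is mostly trial and error, but merging block by block at least tells you where a defect is coming from.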
@thor (2/2) retrain a working model on improperly tagged pictures of corgis, now all dogs are going to start looking a little more like corgis.
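@thor And here's what I meant by bumping the CFG scale. A minimal sketch assuming the Hugging Face diffusers library and a model already in diffusers format; the model name is just a placeholder.

```python
# Minimal sketch: raising guidance_scale (CFG) so the sampler follows the prompt
# more strictly, which can partly compensate for a merge that drops prompt tokens.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "some/merged-model",  # placeholder, not a real repo
    torch_dtype=torch.float16,
).to("cuda")

prompt = "a photo of a dog in a park"

# Default is around 7.5; nudge it up a little. Too high and images get fried.
image = pipe(prompt, guidance_scale=9.0).images[0]
image.save("dog.png")
```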