JoyCaption now has both multi-GPU support and batch size support > https://www.patreon.com/posts/110613301
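For context, here is a minimal sketch of what multi-GPU batch captioning can look like: shard the image list across all visible GPUs and run one captioning worker per device, generating captions in batches and writing one .txt file per image. The model id, prompt, and chat-template handling below are assumptions (a standard transformers LLaVA-style checkpoint), not the implementation from the linked post.

```python
# Hypothetical sketch: shard an image folder across GPUs and caption in batches.
import os
import torch
import torch.multiprocessing as mp
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

MODEL_ID = "fancyfeast/llama-joycaption-alpha-two-hf-llava"  # assumed HF repo id
PROMPT = "Write a detailed description for this image."      # example instruction


def worker(rank: int, shards: list, batch_size: int) -> None:
    device = f"cuda:{rank}"
    processor = AutoProcessor.from_pretrained(MODEL_ID)
    processor.tokenizer.padding_side = "left"  # required for batched generation
    if processor.tokenizer.pad_token is None:
        processor.tokenizer.pad_token = processor.tokenizer.eos_token
    model = LlavaForConditionalGeneration.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    ).to(device).eval()

    # Standard LLaVA-style chat prompt with one image slot per request.
    convo = [{"role": "user",
              "content": [{"type": "image"}, {"type": "text", "text": PROMPT}]}]
    text = processor.apply_chat_template(convo, add_generation_prompt=True)

    paths = shards[rank]
    for i in range(0, len(paths), batch_size):
        batch = paths[i:i + batch_size]
        images = [Image.open(p).convert("RGB") for p in batch]
        inputs = processor(images=images, text=[text] * len(images),
                           return_tensors="pt", padding=True).to(device)
        with torch.no_grad():
            out = model.generate(**inputs, max_new_tokens=256, do_sample=False)
        captions = processor.batch_decode(
            out[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True)
        # Write one .txt caption next to each image (the usual LoRA dataset layout).
        for path, caption in zip(batch, captions):
            with open(os.path.splitext(path)[0] + ".txt", "w") as f:
                f.write(caption.strip())


if __name__ == "__main__":
    folder = "images"  # hypothetical dataset folder
    paths = sorted(os.path.join(folder, f) for f in os.listdir(folder)
                   if f.lower().endswith((".jpg", ".jpeg", ".png", ".webp")))
    n_gpus = torch.cuda.device_count()
    shards = [paths[i::n_gpus] for i in range(n_gpus)]  # round-robin split
    mp.spawn(worker, args=(shards, 4), nprocs=n_gpus)   # batch size 4 per GPU
```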
The FLUX LoRA training configurations have been fully updated and now work on GPUs with as little as 8 GB of VRAM. Yes, you can train a 12-billion-parameter model on an 8 GB GPU, with very good speed and quality > https://www.patreon.com/posts/110293257
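To illustrate why LoRA training needs far less memory than full fine-tuning, here is a minimal sketch (using diffusers and PEFT, not the linked configurations) that freezes the FLUX.1-dev transformer and attaches a small adapter to the attention projections. The rank and target modules are example values, and the actual 8 GB configurations rely on additional memory optimizations not shown here.

```python
# Hypothetical sketch: freeze the FLUX transformer and attach a LoRA adapter.
import torch
from diffusers import FluxTransformer2DModel
from peft import LoraConfig

# Load only the ~12B-parameter transformer, in bf16 to halve its footprint.
transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    torch_dtype=torch.bfloat16,
)
transformer.requires_grad_(False)  # the base model stays frozen

# Only the low-rank adapter weights are trainable, so gradients and optimizer
# state cover a few dozen million parameters instead of 12 billion.
lora_config = LoraConfig(
    r=16,
    lora_alpha=16,
    init_lora_weights="gaussian",
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],
)
transformer.add_adapter(lora_config)

trainable = sum(p.numel() for p in transformer.parameters() if p.requires_grad)
total = sum(p.numel() for p in transformer.parameters())
print(f"Trainable: {trainable / 1e6:.1f}M of {total / 1e9:.2f}B parameters")
```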
Check out the images to see all the details.