Doesn't really work. It worked fine once, but now every time I try to render something it fails with a CUDA out-of-memory error, and this persists even after a hard restart. I have a 2080 Ti with 11 GB of VRAM, and the program can't even allocate 3 GB to render a file. It seems the program allocates GPU memory that it never releases after it's done, so each subsequent load hogs more of it.
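For what it's worth, if the app is built on PyTorch and the diffusers library (an assumption on my part, I don't know its internals), the developer-side fix is usually to drop the pipeline and flush PyTorch's CUDA cache between renders, roughly like this sketch:

```python
import gc
import torch
from diffusers import StableDiffusionPipeline

# Hypothetical render helper, assuming a diffusers-based backend.
def render(prompt: str):
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # assumed model ID for illustration
        torch_dtype=torch.float16,
    ).to("cuda")
    image = pipe(prompt).images[0]

    # Explicitly release the pipeline and flush the CUDA cache so the
    # next render starts clean instead of accumulating VRAM.
    del pipe
    gc.collect()
    torch.cuda.empty_cache()
    return image
```

You can confirm whether VRAM is actually leaking by checking `torch.cuda.memory_allocated()` before and after each render.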
I don't think the program, or this Stable Diffusion implementation, is ready for prime time yet. 512x512 is about as large as I can render, which is really low resolution, and even after upscaling with Gigapixel the quality is subpar compared to other programs like Midjourney.