I'm running into an issue with the AUTOMATIC1111 Stable Diffusion web UI on an RTX 2060 (6 GB). This is the error I get:
OutOfMemoryError: CUDA out of memory.
Tried to allocate 1.50 GiB
(GPU 0; 5.77 GiB total capacity; 3.34 GiB already allocated; 1.07 GiB free; 3.41 GiB reserved in total by PyTorch)
If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.
See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
Time taken: 57.59s
Torch active/reserved: 4200/4380 MiB, Sys VRAM: 5687/5910 MiB (96.23%)
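As far as I can tell, the Sys VRAM figure on that last line is the whole card's usage, not just what PyTorch is holding, so some of the 6 GB is presumably going to the desktop. I've been checking what else is sitting on the GPU with plain nvidia-smi (nothing here is specific to the web UI):

nvidia-smi   # the Processes table at the bottom lists Xorg / the compositor and each process's VRAM use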
I know 6 GB is a low amount of VRAM, but I didn't get this error while running under Windows. I tried setting max_split_size_mb to 512 (the value is already in megabytes), as the error message suggests.
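In case I'm setting it in the wrong place, this is roughly what I'm doing: exporting PYTORCH_CUDA_ALLOC_CONF before launch. The shell export and launch command are just my setup (a line in webui-user.sh would work the same way); the only part the error message actually asks for is max_split_size_mb:

# PYTORCH_CUDA_ALLOC_CONF is read by PyTorch's CUDA caching allocator;
# max_split_size_mb caps the size of the blocks it will split, which is the
# fragmentation workaround the error message refers to. The value is already
# in MB, so it's "512", not "512mb".
export PYTORCH_CUDA_ALLOC_CONF="max_split_size_mb:512"
./webui.sh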