
I trained my neural nets and noticed that even after calling torch.cuda.empty_cache() and gc.collect(), my CUDA device memory is still in use. In Colab notebooks we can see the current variables in memory, but even after I delete every variable and run the garbage collector, the GPU memory stays busy. I heard this is because the Python garbage collector can't work on the CUDA device. Please explain what I should do.

Robert Crovella
  • please see [How to clear CUDA memory in PyTorch](https://stackoverflow.com/questions/55322434/how-to-clear-cuda-memory-in-pytorch) – jinyi wu Dec 01 '21 at 08:04

2 Answers


For me, I had to delete the model before emptying the cache:

import gc
import torch

del model                  # drop the last Python reference to the model
gc.collect()               # collect the now-unreferenced objects
torch.cuda.empty_cache()   # release cached GPU memory back to the driver

You can then check that the memory was freed using `nvidia-smi`.
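As a minimal sketch of why the `del` matters (the function name and tensor size here are illustrative, not from the answer): `torch.cuda.memory_allocated()` reports the bytes held by live tensors, so dropping the last reference is what actually lowers it, while `empty_cache()` only returns PyTorch's cached blocks to the driver so `nvidia-smi` reflects the change.

```python
import gc
import torch

def free_cuda_tensor_demo():
    """Allocate a CUDA tensor, drop every reference, and report allocated bytes.

    Returns (before, after) byte counts; falls back to (0, 0) when no
    CUDA device is available.
    """
    if not torch.cuda.is_available():
        return 0, 0
    x = torch.empty(1024, 1024, device="cuda")  # ~4 MB on the device
    before = torch.cuda.memory_allocated()
    del x                      # drop the last Python reference to the tensor
    gc.collect()               # collect any remaining unreferenced objects
    torch.cuda.empty_cache()   # hand cached blocks back to the CUDA driver
    after = torch.cuda.memory_allocated()
    return before, after
```

While the tensor is still referenced by a variable, neither `gc.collect()` nor `empty_cache()` can reclaim its memory, which matches the question's symptom.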

DvdG

You can do this:

import gc
import torch

gc.collect()               # run the Python garbage collector first
torch.cuda.empty_cache()   # then release PyTorch's cached GPU memory
razimbres