I have an N×M TensorFlow variable stored on the GPU. At some point I would like to delete some of its rows, i.e. free the GPU memory they occupy. I don't want to release all memory resources by calling `sess.close()` or deleting the session.

Is it possible in TensorFlow to pick an arbitrary row of the variable and completely erase it? I have read this answer about releasing memory, but it is about a TensorFlow op.
- Like the answer to the linked question says, as soon as a tensor is no longer needed for the computation its memory will be released. If you compute a new tensor without the rows you don't need (e.g. by [slicing](https://www.tensorflow.org/api_docs/python/tf/slice) or with [`tf.gather`](https://www.tensorflow.org/api_docs/python/tf/gather), see the sketch after this thread), and then don't use the original tensor anymore, TensorFlow should release the memory of the first one as soon as this is done. – jdehesa Oct 05 '18 at 09:43
- Thanks for the reply @jdehesa. So the released memory will be reused by the same process? I am creating new tensors but can't see the memory released by the old tensors using `watch nvidia-smi`. Is there any way to keep track of it? – Meenu Agarwal Oct 06 '18 at 10:45
- Ah, I see what you mean. Well, I do not know exactly how memory management works in TensorFlow, but I believe it reserves a lot of memory on the GPU and then assigns it internally to different tensors as needed (instead of allocating and releasing each time, which takes some time). So you would not see the changes in `nvidia-smi`, but in principle the memory is being used properly. I have seen this can be problematic if you want to run other GPU apps alongside TensorFlow, since it can be a memory hog (see the configuration sketch below). – jdehesa Oct 06 '18 at 11:18
- Oh, got it. Thanks! – Meenu Agarwal Oct 08 '18 at 05:31
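
To make the `tf.gather` suggestion from the first comment concrete, here is a minimal sketch written against the TF 1.x session API (which the question appears to use); the sizes and row indices are made up purely for illustration:

```python
import numpy as np
import tensorflow as tf

# Hypothetical sizes and rows to remove, purely for illustration.
N, M = 1000, 256
rows_to_drop = {3, 7}

var = tf.Variable(np.random.rand(N, M).astype(np.float32), name="big_matrix")

# Indices of the rows we want to KEEP.
keep_idx = tf.constant([i for i in range(N) if i not in rows_to_drop],
                       dtype=tf.int32)

# tf.gather builds a new, smaller tensor containing only the selected rows.
pruned = tf.gather(var, keep_idx)  # shape (N - len(rows_to_drop), M)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    result = sess.run(pruned)
    print(result.shape)  # (998, 256)
```

Note that the buffer backing the `tf.Variable` itself lives for as long as the session holds it; the point of this pattern is that downstream computation works with the smaller `pruned` tensor, and intermediate tensors that are no longer referenced are recycled by TensorFlow's internal allocator rather than handed back to the OS, which is why `nvidia-smi` does not reflect the change.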
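
Regarding the comment about TensorFlow reserving most of the GPU up front: if you need to run other GPU applications alongside it, a common mitigation with the TF 1.x session API is to adjust the session's GPU options, sketched below:

```python
import tensorflow as tf

# By default TensorFlow reserves (almost) all GPU memory up front and manages it
# with its own allocator, which is why freed tensors never show up in nvidia-smi.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True                      # allocate on demand instead
# config.gpu_options.per_process_gpu_memory_fraction = 0.5  # or: cap the reservation at ~50%

sess = tf.Session(config=config)
```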