According to this, "processes have separate memory." But PyTorch can somehow share memory among several processes, according to this link: "Once the tensor/storage is moved to shared_memory (see share_memory_()), it will be possible to send it to other processes without making any copies." Why is it possible to share memory between processes that each have their own separate memory? Doesn't that sound like a paradox?
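To make the question concrete, here is a minimal sketch of the behavior the quoted docs describe, assuming a single machine and the standard `torch.multiprocessing` API; the worker function and tensor values are just illustrative:

```python
import torch
import torch.multiprocessing as mp

def worker(t):
    # The child process receives a handle to the same underlying storage,
    # not a copy; writes made here are visible to the parent.
    t += 1

if __name__ == "__main__":
    t = torch.zeros(3)
    t.share_memory_()  # move the storage into shared memory (no copy when sent to workers)
    p = mp.Process(target=worker, args=(t,))
    p.start()
    p.join()
    print(t)  # expected: tensor([1., 1., 1.]) -- modified by the child process
```

So the child clearly sees the parent's data, even though the two are separate processes with "separate memory."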
Comments:

- Probably shared to the main process or by using a Manager.dict or similar. – Bram Vanroy May 27 '19 at 09:48
- Perhaps this question is more suitable to ask on PyTorch's own forum https://discuss.pytorch.org/ . You could try asking there. – akshayk07 May 27 '19 at 10:23
- I had posted on the PyTorch forum; no one has answered me yet :-( – jabberwoo May 27 '19 at 18:35
- @jabberwoo where is the PyTorch forum post with your question? – Charlie Parker Feb 15 '21 at 20:21