
According to this, 'processes have separate memory'. But PyTorch can apparently share memory among several processes, according to this link: 'Once the tensor/storage is moved to shared_memory (see share_memory_()), it will be possible to send it to other processes without making any copies.' How can memory be shared between processes that supposedly have separate memory? Doesn't that sound like a paradox?
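For concreteness, here is a minimal sketch of the pattern the docs describe (the `worker` function and the example tensor are my own illustration, not from the docs):

```python
import torch
import torch.multiprocessing as mp

def worker(t):
    # The child process writes into the same underlying storage;
    # no copy of the tensor data is made when it is sent here.
    t += 1

if __name__ == "__main__":
    t = torch.zeros(3)
    t.share_memory_()  # move the tensor's storage into shared memory
    p = mp.Process(target=worker, args=(t,))
    p.start()
    p.join()
    print(t)  # tensor([1., 1., 1.]) -- the parent sees the child's write
```

If I run this, the parent process observes the increment performed by the child, even though they are separate processes.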

jabberwoo
