
I wrote a scientific simulation environment using Python 2.7.

I start several instances of my simulation at the same time by directly using the Process interface:

from multiprocessing import Process

processes = []
for i in range(nr_cores):
    # Each worker gets its index, the total core count, and the shared state.
    p = Process(target=worker, args=(i, nr_cores, scheduler, job, nr_iter,
                                     return_values, extremes, parameters))
    processes.append(p)
    p.start()

# Wait for all workers to finish.
for process in processes:
    process.join()

This works flawlessly on

  • my Fedora 21 machine running Python 2.7.8 (kernel 3.19.3)
  • my OS X machine running Python 2.7.6

Now I tried to install it on a Debian 7.8 (kernel 3.2.63) machine with Python 2.7.3, and odd things started to happen:

  • The number of processes listed in top is greater than what I actually spawn (14 instead of 2)
  • Of these fourteen, only two are running; the rest are sleeping
  • The two running processes share one core; the other cores are idle

I downloaded and compiled Python 2.7.9, but the behavior is exactly the same.

I remember seeing a similar issue on another debian machine, but unfortunately I can't remember what version it was.

Has anyone encountered something like this before?

Thanks

1 Answer


OK, found it. After some poking around, I discovered that this was my issue: Why does multiprocessing use only a single core after I import numpy?

The workaround suggested there solved my problem as well.
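
For reference, here is a minimal sketch of that workaround as I understand it, assuming the affinity mask is being clobbered by a BLAS library pulled in through numpy. The 0xff mask assumes at most eight cores and should be adjusted to your machine:

import os

# Importing numpy may reset the process's CPU affinity when it is linked
# against certain BLAS builds (e.g. OpenBLAS), which is the situation
# described in the linked question.
import numpy

# Reset the affinity mask afterwards so this process, and any children it
# forks, can be scheduled on any core. 0xff enables cores 0-7; adjust the
# mask to your core count. taskset is Linux-specific (part of util-linux).
os.system("taskset -p 0xff %d" % os.getpid())

Running the reset before spawning the worker processes was enough in my case, since the children inherit the parent's affinity mask.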
