
I have noticed that sometimes code works when run as a Python script (python script.py) but doesn't work when run from a Jupyter notebook, with exactly the same conda environment and code.

The most recent case: a conda environment with Python 3.7.7, and the code is

import subprocess
from subprocess import Popen

p1 = Popen(['sbatch', 'run2.txt'], stdout=subprocess.PIPE, shell=False, cwd=working_path)
# communicate() already waits for the process to finish; calling wait() first
# while stdout is a PIPE can deadlock if the child fills the pipe buffer.
output = p1.communicate()[0]
print(output)
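As a side note, the same submission can be written more simply with subprocess.run (available since Python 3.5), which waits and captures output in one call and avoids the wait()/PIPE deadlock risk. This is only a sketch: 'echo' stands in for the real ['sbatch', 'run2.txt'] command so the snippet is runnable anywhere.

```python
import subprocess

# 'echo' substitutes for 'sbatch run2.txt' here; with a real cluster you would
# pass ['sbatch', 'run2.txt'] and cwd=working_path as in the question.
result = subprocess.run(
    ['echo', 'Submitted batch job 12345'],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
print(result.stdout.decode().strip())
```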

This code submits the job successfully both ways (python script / Jupyter notebook). However, when the job is submitted from the Jupyter notebook, the Slurm cluster can't run it.

So I guess there are some fundamental differences between the two ways, or some settings I should know about. Can anyone help?


I found a clue: from the output file of the submitted job, I can see that when I use the terminal or a .py file to submit the job, the job creates two threads, but when I use a .ipynb to submit it, it doesn't create any threads.


I talked with the cluster administrator, and they installed a system-wide Jupyter notebook, and that version works. So the problem may be that the previous version, which I installed myself in a conda environment, lacks some permission.
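One way to hunt for such differences between the two launch modes is to print what the process actually sees and diff the results. The snippet below is a diagnostic sketch, not part of the original code; the environment variables listed (including OMP_NUM_THREADS, which could explain the missing threads) are just examples of values that commonly differ between a plain python run and a Jupyter kernel.

```python
import os
import sys

# Run this in both the .py script and the notebook, then compare the output.
print("interpreter:", sys.executable)   # which Python binary is running
print("cwd:", os.getcwd())              # working directory the child inherits
for var in ("PATH", "LD_LIBRARY_PATH", "OMP_NUM_THREADS"):
    print(var, "=", os.environ.get(var))
```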

  • You might want to see this: https://stackoverflow.com/questions/38616077/live-stdout-output-from-python-subprocess-in-jupyter-notebook – raspiduino Jun 11 '21 at 03:17
  • Basically Jupyter notebook is running IPython, and your stdout will output to the shell that runs that IPython process. – raspiduino Jun 11 '21 at 03:18
  • @raspiduino The !{cmd} is pretty useful! Thank you! However, the problem is still there. After submitting the job successfully (I can see the cluster create an output file for the job), the job fails. I also tried ipython script.py with the same code, and the cluster handled the job well. – user9660558 Jun 11 '21 at 03:24
  • You said *I can see the cluster create an output file for the job*, but the job fails. How do you know it fails? – raspiduino Jun 11 '21 at 03:26
  • @raspiduino When I submit the job with "sbatch run2.txt", I submit an independent program to the cluster, and this program keeps writing output to a file. When I open the file, it stops at "Error when reading xxx file" (the xxx file is supposed to be created by that independent program). So the job is submitted to the cluster, but something goes wrong. When I use "sbatch run2.txt" in a terminal, or from a .py file, nothing goes wrong. – user9660558 Jun 11 '21 at 03:30
  • So I guess something is different between running "sbatch run2.txt" from a .py file and from a .ipynb file. – user9660558 Jun 11 '21 at 03:32
  • Could you check whether something is wrong with the working path? – raspiduino Jun 11 '21 at 03:38
  • Maybe you opened the Jupyter notebook in the wrong path? – raspiduino Jun 11 '21 at 03:38
  • @raspiduino I finally found a clue. The submitted job should spawn two threads, but it didn't when the job was submitted from the IPython notebook. – user9660558 Jun 11 '21 at 04:07

0 Answers