Questions tagged [sbatch]

sbatch submits a batch script to SLURM (Simple Linux Utility for Resource Management). The batch script may be given to sbatch through a file name on the command line, or if no file name is specified, sbatch will read in a script from standard input. The batch script may contain options preceded with "#SBATCH" before any executable commands in the script.

231 questions
166 votes, 2 answers

SLURM `srun` vs `sbatch` and their parameters

I am trying to understand what the difference is between SLURM's srun and sbatch commands. I will be happy with a general explanation, rather than specific answers to the following questions, but here are some specific points of confusion that can…
dkv
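
As a rough illustration of the distinction (not taken from the answers; script and file names are made up): srun runs a job step and blocks until it finishes, while sbatch queues a script and returns immediately.

    #!/bin/bash
    # job.sh -- minimal batch script; submit it with: sbatch job.sh
    #SBATCH --ntasks=2
    #SBATCH --output=job-%j.out
    srun hostname              # inside the allocation, srun launches the job step

    # by contrast, this runs interactively and blocks until the step finishes,
    # streaming output straight to the terminal
    srun --ntasks=2 hostname
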
37 votes, 3 answers

How to submit a job to any [subset] of nodes from nodelist in SLURM?

I have a couple of thousand jobs to run on a SLURM cluster with 16 nodes. These jobs should run only on a subset of 7 of the available nodes. Some of the tasks are parallelized, hence use all the CPU power of a single node, while others are…
Faber
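
A hedged sketch rather than the accepted answer, with made-up node names: --exclude keeps a job off the unwanted nodes, whereas --nodelist would force every listed node into the allocation, so excluding the other nine nodes is usually the simpler way to stay on a 7-node subset.

    #!/bin/bash
    #SBATCH --job-name=subset_demo
    #SBATCH --nodes=1
    #SBATCH --exclude=node[08-16]    # hypothetical names for the nine nodes to avoid
    srun ./my_task
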
16 votes, 2 answers

How to activate a specific Python environment as part of my submission to Slurm?

I want to run a script on a cluster (SBATCH file). How can I activate my virtual environment (path/to/env_name/bin/activate)? Do I only need to add the following code to my my_script.sh file? module load python/2.7.14 source…
bib
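
A minimal sketch, assuming the module name and virtualenv path from the question: the batch script simply sources the activate script before running Python.

    #!/bin/bash
    #SBATCH --job-name=venv_job
    #SBATCH --output=venv_job-%j.out
    module load python/2.7.14                 # module name taken from the question
    source path/to/env_name/bin/activate      # puts the env's python first in PATH
    python my_script.py
    deactivate                                # optional: leave the environment
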
15 votes, 3 answers

Is it possible to run SLURM jobs in the background using SRUN instead of SBATCH?

I was trying to run Slurm jobs with srun in the background. Unfortunately, because I currently have to run things through Docker, it's a bit annoying to use sbatch, so I am trying to find out if I can avoid it altogether. From my…
Charlie Parker
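
A hedged sketch (the program name is a placeholder): srun itself blocks, but ordinary shell job control still applies, so backgrounding or nohup-ing the call is one way to get sbatch-like detachment.

    # background the step in the current shell
    srun --ntasks=1 ./my_program &

    # or detach from the terminal entirely and capture output in a file
    nohup srun --ntasks=1 ./my_program > my_program.log 2>&1 &
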
15 votes, 3 answers

SLURM sbatch job array for the same script but with different input arguments run in parallel

I have a problem where I need to launch the same script but with different input arguments. Say I have a script myscript.py -p -i , where I need to consider N different par_values (between x0 and x1) and M trials for each value…
maurizio
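
A sketch of the usual array pattern, assuming for concreteness N = 10 parameter values and M = 10 trials: each array task derives its own arguments from $SLURM_ARRAY_TASK_ID.

    #!/bin/bash
    #SBATCH --array=0-99                       # 10 parameter values x 10 trials
    #SBATCH --output=array-%A_%a.out           # %A = array job ID, %a = task ID
    PAR_INDEX=$(( SLURM_ARRAY_TASK_ID / 10 ))  # which parameter value
    TRIAL=$(( SLURM_ARRAY_TASK_ID % 10 ))      # which trial for that value
    python myscript.py -p "$PAR_INDEX" -i "$TRIAL"
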
12 votes, 1 answer

SLURM: Changing the maximum number of simultaneously running tasks for a running array job

I have set up an array job as follows: sbatch --array=1-100%5 ... which will limit the number of simultaneously running tasks to 5. The job is now running, and I would like to change this number to 10 (i.e. I wish I'd run sbatch --array=1-100%10…
James Owers
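
Not quoting the answer, but newer Slurm releases let scontrol adjust the throttle on a running array; verify the exact field name against your site's scontrol man page.

    # raise the concurrency limit of the running array job from 5 to 10
    scontrol update JobId=<array_job_id> ArrayTaskThrottle=10
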
11 votes, 3 answers

How to get the ID of GPU allocated to a SLURM job on a multiple GPUs node?

When I submit a SLURM job with the option --gres=gpu:1 to a node with two GPUs, how can I get the ID of the GPU which is allocated for the job? Is there an environment variable for this purpose? The GPUs I'm using are all nvidia GPUs. Thanks.
Negelis
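
A hedged sketch: with the gres/gpu plugin Slurm usually exports the allocated device indices through environment variables, though exactly which ones are set depends on the Slurm version and configuration.

    #!/bin/bash
    #SBATCH --gres=gpu:1
    echo "CUDA_VISIBLE_DEVICES = $CUDA_VISIBLE_DEVICES"   # what CUDA applications will see
    echo "SLURM_JOB_GPUS       = $SLURM_JOB_GPUS"         # set by some Slurm versions
    echo "SLURM_STEP_GPUS      = $SLURM_STEP_GPUS"        # per-step variant
    nvidia-smi                                            # cross-check with the driver's view
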
9 votes, 1 answer

SLURM sbatch output buffering

I created some Slurm scripts and then tried to execute them with sbatch, but the output file is not updated frequently (maybe once a minute). Is there a way to change the output buffering latency in sbatch? I know stdbuf is used in such situations…
ahemya
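
Two common workarounds, sketched here rather than taken from the answers: line-buffer the program's output with stdbuf, or launch the step with srun's unbuffered flag.

    # inside the batch script: force line buffering of stdout and stderr
    stdbuf -oL -eL ./my_program

    # or let srun flush output as soon as it is produced
    srun --unbuffered ./my_program
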
9 votes, 3 answers

Is there a "one-liner" for submitting many jobs to SLURM (similar to LSF)?

Can I submit "one-liners" to SLURM? Using bsub from LSF and the standard Linux utility xargs, I can easily submit a separate job for uncompressing all of the files in a directory: ls *.gz | sed 's/.gz$//g' | xargs -I {} bsub 'gunzip -c {}.gz >…
Christopher Bottoms
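
sbatch accepts a command string via --wrap instead of a script file, so the bsub one-liner translates roughly as below (a sketch, not the accepted answer).

    # submit one decompression job per .gz file in the current directory
    ls *.gz | sed 's/\.gz$//' | xargs -I {} sbatch --wrap='gunzip -c {}.gz > {}'
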
8 votes, 1 answer

Using SBATCH Job Name as a Variable in File Output

With SBATCH you can use the job-id in automatically generated output files using the following syntax with %j: #!/bin/bash # omitting some other sbatch commands here ... #SBATCH -o slurm-%j.out-%N # name of the stdout, using the job number (%j)…
ctesta01
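
For reference, the filename pattern %x expands to the job name, alongside %j (job ID) and %N (node name); a minimal sketch:

    #!/bin/bash
    #SBATCH --job-name=my_analysis
    #SBATCH -o %x-%j.out        # e.g. my_analysis-123456.out
    srun ./run_analysis
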
6 votes, 1 answer

SLURM / Sbatch creates many small output files

I am running a pipeline on a SLURM cluster, and for some reason a lot of small files (between 500 and 2000 bytes in size) named along the lines of slurm-XXXXXX.out (where XXXXXX is a number) are being created. I've tried to find out what these files are on the…
erikfas
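
Those slurm-XXXXXX.out files are sbatch's default stdout/stderr capture, one per submitted job; a hedged sketch for steering them into a dedicated directory (Slurm will not create the directory, so it must exist before submission):

    #!/bin/bash
    #SBATCH --output=logs/%x-%j.out   # stdout into an existing logs/ directory
    #SBATCH --error=logs/%x-%j.err    # stderr separately, if desired
    srun ./pipeline_step
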
6 votes, 2 answers

SLURM Submit multiple tasks per node?

I found some very similar questions which helped me arrive at a script that seems to work; however, I'm still unsure I fully understand why, hence this question. My problem (example): on 3 nodes, I want to run 12 tasks on each node (so 36 tasks…
Shiwayari
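
A minimal sketch of the 3 x 12 layout using standard sbatch options; the program name is a placeholder.

    #!/bin/bash
    #SBATCH --nodes=3
    #SBATCH --ntasks-per-node=12      # 36 tasks in total
    srun ./my_task                    # srun starts one copy per task
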
6 votes, 2 answers

Why do I keep getting NonZeroExitCode when using sbatch SLURM?

I have a simple test.ksh that I am running with the command sbatch test.ksh. I keep getting "JobState=FAILED Reason=NonZeroExitCode" (from "scontrol show job"). I have already made sure of the following: slurmd and slurmctld are up and running…
user3200387
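
Not the diagnosis from the answers, just a hedged reminder: the job state mirrors the script's exit status, so a missing or wrong shebang, or a failing last command, is enough to produce NonZeroExitCode. A minimal test.ksh that should finish as COMPLETED, assuming ksh is installed on the compute node:

    #!/bin/ksh
    #SBATCH --output=test-%j.out
    hostname
    exit 0        # make the final exit status explicit
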
5 votes, 1 answer

Batch job submission failed: I/O error writing script/environment to file

I installed Slurm on a workstation and it seemed to work: I can use the Slurm commands, and srun works too. But when I try to launch a job from a script using sbatch test.sh, I get the following error: Batch job submission failed: I/O error writing…
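
A hedged troubleshooting sketch rather than a confirmed fix: this message usually points at the directory where slurmd spools batch scripts (SlurmdSpoolDir in slurm.conf), so checking that it exists, is writable by the Slurm user, and is not on a full filesystem is a reasonable first step.

    # find the configured spool directory, then check ownership and free space
    scontrol show config | grep -i SlurmdSpoolDir
    ls -ld /var/spool/slurmd    # example path; use the one reported above
    df -h /var/spool/slurmd
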
5 votes, 2 answers

Snakemake slurm output file redirect to new directory

I'm putting together a snakemake slurm workflow and am having trouble with my working directory becoming cluttered with slurm output files. I would like my workflow to, at a minimum, direct these files to a 'slurm' directory inside my working…
Ensa
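
A hedged sketch using the older --cluster style of invocation (newer Snakemake versions favour profiles/executors); the slurm/ directory name comes from the question and must exist before jobs are submitted.

    mkdir -p slurm
    snakemake --jobs 100 \
        --cluster "sbatch --output=slurm/%j.out --error=slurm/%j.err"
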