
I'm writing a script to be run through qsub on an SGE cluster.

I have several executables that I need to call in the script, and all of them are in the same location as the script itself. So I'm wondering how I can get the location of the script I'm executing. I have an automatic build system, and it is just not right to hard-code this location. The code should be something like:

#!/bin/bash  
#$ -S /bin/bash  
#$ -V  
#$ -cwd
#$ -N "Job"  

exec="/path/to/this/script/"execNameJob  
$exec

I tried a couple of things and they didn't work, and people marked this as a duplicate. None of the previous solutions worked. I'm running my script with qsub -o console.out /tmp/qsubScript/qsubScript.sh, and what I want to get inside the script is the /tmp/qsubScript/ path. Here is one of the solutions I tried, with some debug output:

SOURCE="${BASH_SOURCE[0]}"
while [ -h "$SOURCE" ]; do # resolve $SOURCE until the file is no longer a symlink
  DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
  SOURCE="$(readlink "$SOURCE")"
  [[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE" # if $SOURCE was a relative symlink, we need to resolve it relative to the path where the symlink file was located
done
DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"

echo " Directory SOURCE: $DIR"
echo " BaseDir 0: " $0
echo " SGE_ROOT: $SGE_ROOT"
echo " SGE_JOB_SPOOL_DIR: $SGE_JOB_SPOOL_DIR"

The output is:

Directory SOURCE: /opt/gridengine/default/spool/compute-2-8/job_scripts
BaseDir 0: /opt/gridengine/default/spool/compute-2-8/job_scripts/9534
SGE_ROOT: /opt/gridengine
SGE_JOB_SPOOL_DIR: /opt/gridengine/default/spool/compute-2-8/active_jobs/9534.1

The main problem is that the SGE implementation copies my script into job_scripts, so I lose the reference to the original script.

leo
    This? [Can a Bash script tell what directory it's stored in?](http://stackoverflow.com/q/59895/1983854) – fedorqui Jun 09 '16 at 09:27
  • I tried this before, and it looks like SGE is executing my script through another script. Suppose I place my script in /tmp/qsubDir/; when running this solution I get: Directory SOURCE: /opt/gridengine/default/spool/compute-3-15/job_scripts – leo Jun 09 '16 at 09:38
  • I added this as an example that didn't work – leo Jun 09 '16 at 11:49

2 Answers


My guess is that qsub stores the submission in a separate directory and then runs it from there. By that time, the original queue entry's path will be lost. You will simply need to pass it as a parameter if you need it to be included in the job's metadata.
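For example, a minimal sketch of passing the path as a parameter (SUBMIT_DIR is just an illustrative variable name, and it assumes the submission directory is also reachable from the compute node):

# at submission time, export the directory that contains the executables
# into the job's environment
qsub -o console.out -v SUBMIT_DIR="$(pwd)" /tmp/qsubScript/qsubScript.sh

# inside qsubScript.sh, use the exported variable instead of $0
"$SUBMIT_DIR/execNameJob"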

The submission directory is generally not guaranteed to be visible to the nodes which run the job, so what you are attempting may not be directly possible anyway.

Depending on the platform, you might have to create a self-contained bundle of some sort, or pull in the dependencies you need on the compute node somehow as part of the job's startup processing.
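As a rough sketch of the "pull in the dependencies" idea (the bundle path /shared/scratch/bundle.tgz is hypothetical and must live on storage the nodes can see; $TMPDIR is the per-job scratch directory SGE normally provides):

# at submission time: bundle the executables and tell the job where the bundle is
tar czf /shared/scratch/bundle.tgz -C /tmp/qsubScript execNameJob
qsub -o console.out -v BUNDLE=/shared/scratch/bundle.tgz /tmp/qsubScript/qsubScript.sh

# at the start of the job: unpack the bundle into the job's scratch space and run from there
mkdir -p "$TMPDIR/bin"
tar xzf "$BUNDLE" -C "$TMPDIR/bin"
"$TMPDIR/bin/execNameJob"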

tripleee

Not completely elegant, but it solves the problem in one step. I'm using the same script both to submit the qsub job and to set a variable with the original path:

#!/bin/bash  
#  
#$ -S /bin/bash
#$ -V
#$ -cwd
#$ -N "Job"

if [ -z "${var+x}" ]; then
  # first pass (not yet under SGE): remember this script's directory and resubmit it
  sPath="$(cd "$(dirname "$0")" && pwd)"
  qsub -o console.txt -e error.txt -v var="$sPath" "$0" > qsub.txt
else
  # second pass (inside the SGE job): $var holds the original directory
  echo " Set variable: $var"
fi
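
To use it, the script is launched directly rather than through qsub; the first pass records its own directory and resubmits itself, and the second pass (running under SGE) sees $var, so the executables can be called as, for instance, "$var/execNameJob":

/tmp/qsubScript/qsubScript.sh    # first pass: submits itself with var=/tmp/qsubScript
                                 # second pass prints: " Set variable: /tmp/qsubScript"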
leo