I am looking for some general advice rather than a coding solution. Basically, when submitting a job via bsub I can retrieve a log of the stdout by specifying either of the following:
bsub -o log.txt   # sends stdout to log.txt
bsub -u me@email  # sends stdout to email
These are both great, but my program creates a folder when submitted via bsub, and that folder is stored on the remote server. Essentially I want to:
a) retrieve the folder and its contents
b) do this automatically when the job finishes
I could technically do (a) by using scp -r, but I would have to do it manually. That's not too bad if I get an email alert when the job is finished, but I'd still have to do it by hand.
So, on to (b):
Well, I can't see any special flag for bsub to retrieve the actual results, only stdout. I suppose I could have a script which sleeps for the expected job time (perhaps a bit longer, just to be safe), something like:
#!/bin/bash
# copy the input file to the remote server, submit the job, wait, then fetch results
scp myfile.txt server:main/subfolder
ssh server 'bsub -u my@email < myprogram.sh'
sleep <job-time>
scp -r server:main/subfolder result_folder
However, I am slightly concerned about being logged out, or the script otherwise terminating, before the job is finished.
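One variant I'm considering, to avoid guessing the job time: parse the job ID out of bsub's "Job <12345> is submitted..." message and poll bjobs until the job leaves the queue, instead of sleeping a fixed amount. This is just a sketch — "server" and "main/subfolder" are placeholders for my setup, and the exact bjobs output format may vary by LSF version.

```shell
#!/bin/bash
# Extract the numeric job ID from bsub's submission message,
# e.g. "Job <12345> is submitted to default queue <normal>."
parse_job_id() {
    sed -n 's/^Job <\([0-9]*\)>.*/\1/p'
}

# Poll bjobs every minute until the job is no longer pending or
# running, then copy the results folder back.
wait_and_fetch() {
    local job_id=$1
    while ssh server "bjobs $job_id" 2>/dev/null | grep -qE 'PEND|RUN'; do
        sleep 60
    done
    scp -r server:main/subfolder result_folder
}

# Usage (not run here):
# job_id=$(ssh server 'bsub -u my@email < myprogram.sh' | parse_job_id)
# wait_and_fetch "$job_id"
```

Running this under nohup (or in screen/tmux) would also cover the logged-out worry, I think.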
Does anyone have any suggestions?
I essentially want to have an interface (a website, in future) where a user can submit a file, the file is analysed remotely, the user is emailed when the job starts/finishes, the results are automatically retrieved back to the local machine/webserver, and the user gets an email saying they can pick up their results.
One step at a time though!