I want to benchmark the execution time of a Python script that involves heavy computation. It runs in acceptable time against a few datasets on my local machine, but the waiting time becomes unacceptable when I test it against thousands of datasets, which is what I ultimately want to do, so I have to resort to a remote machine. The problem is that the remote machine is not used just by me but by a few other people, and their code often drains computing power from the CPUs I use, making time-based benchmarks barely meaningful.
At the moment, I just launch the Python script from a bash script like this:
python myscript.py --dataset dataset1
Asking the server owner to grant me exclusive access to the CPUs would of course be the ideal scenario, but that isn't an option. What I would like to do instead is something like this: check whether the CPUs my process runs on are currently used by anything else; if they are, freeze my process and wait until they are free again, then resume it. Is there a way to accomplish this, or are there existing tools up to this task?
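In case it helps clarify what I mean, here is a rough sketch of the kind of watcher I imagine wrapping my run in, using the third-party `psutil` library. The 20% threshold, the polling interval, and the `foreign_load` helper are just guesses on my part, not anything I know to be the right approach:

```python
# watcher.py -- hypothetical wrapper; threshold and interval are arbitrary guesses
import subprocess
import time

import psutil  # third-party: pip install psutil

BUSY_THRESHOLD = 20.0  # percent of total CPU used by *others* before we pause
CHECK_INTERVAL = 5.0   # seconds between load checks


def foreign_load(own):
    """System-wide CPU usage minus our own process's share (both on a 0-100 scale)."""
    total = psutil.cpu_percent(interval=1.0)  # average across all cores
    ours = own.cpu_percent(interval=None) / psutil.cpu_count()
    return total - ours


proc = subprocess.Popen(["python", "myscript.py", "--dataset", "dataset1"])
own = psutil.Process(proc.pid)
own.cpu_percent(interval=None)  # prime the per-process counter

while proc.poll() is None:  # until the benchmark finishes
    if foreign_load(own) > BUSY_THRESHOLD:
        own.suspend()  # sends SIGSTOP: freeze the benchmark
        # while we are stopped, any remaining load belongs to other users
        while psutil.cpu_percent(interval=CHECK_INTERVAL) > BUSY_THRESHOLD:
            pass  # wait until the machine is quiet again
        own.resume()  # sends SIGCONT: continue where it left off
    time.sleep(CHECK_INTERVAL)
```

I realize that with this scheme the wall-clock time would include the paused periods, so I would presumably have to measure CPU time inside the script (e.g. with `time.process_time()`) rather than wall time; also, `suspend()` only stops that one process, not any children it spawns. If there is a ready-made tool that handles these details, that would be even better.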