I am trying to profile an MPI/OpenACC Fortran code. I found a site that details how to run nvprof with MPI here. The examples given are for OpenMPI. However, I am limited to MPICH and I can't figure out the equivalent. Anyone know what it would be?
- Could you be more specific in what isn't working for you? – Yossarian Jun 09 '16 at 15:35
- The examples use a variable OMPI_COMM_WORLD_RANK like so: `-o output.%h.%p.%q{OMPI_COMM_WORLD_RANK}`. That variable is unique to OpenMPI. I need the MPICH equivalent. Thanks. – bob.sacamento Jun 09 '16 at 15:52
1 Answer
As far as I can tell, the only OpenMPI-specific part of the nvprof examples is the use of OMPI_COMM_WORLD_RANK to get a unique filename for each rank. According to the discussion here you may be able to use either PMI_RANK or PMI_ID instead.
On my system I have the following small program:

program env
  implicit none
  ! Echo the PMI_RANK environment variable set by the MPI launcher for each process.
  ! Note: system() is a common compiler extension rather than standard Fortran.
  call system("echo $PMI_RANK")
end program env
I compile with mpif90 env.f90 -o test, run with mpirun -np 2 ./test, and get

0
1

as output. So I think you can just replace -o output.%h.%p.%q{OMPI_COMM_WORLD_RANK} with -o output.%h.%p.%q{PMI_RANK}.
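Under MPICH the full invocation might then look something like the following (a sketch; ./myapp is a placeholder for your own executable, and only the environment variable name changes relative to the OpenMPI example):

mpirun -np 2 nvprof -o output.%h.%p.%q{PMI_RANK} ./myapp   # ./myapp: placeholder executable

Each rank then writes its own profile, with the hostname (%h), process ID (%p), and rank substituted into the filename.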
For the cray-mpt MPI library I believe the correct variable is in fact ALPS_APP_PE instead.
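On a Cray system where jobs are launched with aprun, the equivalent would presumably look something like this (a sketch; the aprun options depend on your job setup and ./myapp is again a placeholder):

aprun -n 2 nvprof -o output.%h.%p.%q{ALPS_APP_PE} ./myapp   # ./myapp: placeholder executable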

– d_1999
- Also, `-x ENV_VAR1 -x ENV_VAR2 ...` should be replaced by `-envlist ENV_VAR1,ENV_VAR2,...`. – Hristo Iliev Jun 09 '16 at 16:25
- @HristoIliev Sorry, you lost me. Is this a flag in the MPI execution command? I use aprun and it has no such flag. – bob.sacamento Jun 09 '16 at 17:27
- @d_1999 Thanks for the suggestion, but the system doesn't recognize either PMI_RANK or PMI_ID. Returns an error immediately. – bob.sacamento Jun 09 '16 at 17:28
-
1@bob.sacamento As you've just mentioned you're using `aprun` I guess you're actually running on a Cray system which likely uses Cray's mpt mpi library which I think is related to mpich but is heavily customised. As such I've updated the answer to include what should be the correct replacement for `PMI_RANK` (tested on a Cray system in the UK). – d_1999 Jun 09 '16 at 18:02
- @d_1999 Job now running. Looks like it understands it. ALPS_APP_PE. It's so obvious in retrospect. How did I miss it? :) Mind if I ask how you learned this? Thanks! – bob.sacamento Jun 09 '16 at 18:06
- @bob.sacamento, I noticed in the tutorial you've linked that they are passing certain environment variables to the MPI job. The option for Open MPI's `mpiexec` is `-x`. No idea how to do it on Cray. – Hristo Iliev Jun 09 '16 at 21:29
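Following up on Hristo Iliev's comments about forwarding environment variables to the ranks, the two launchers use different flags for the same idea (a sketch; MY_VAR1, MY_VAR2, and ./myapp are placeholder names):

# Open MPI: export selected environment variables to the ranks
mpiexec -x MY_VAR1 -x MY_VAR2 -np 2 ./myapp

# MPICH (Hydra): pass a comma-separated list instead
mpiexec -envlist MY_VAR1,MY_VAR2 -np 2 ./myapp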