I'm writing a Fortran MPI program.

After calling the subroutine MPI_COMM_RANK,

MPI_COMM_RANK (MPI_COMM_WORLD, my_id, ierr)

I get the rank my_id of the current process. I found that my_id gets corrupted after calling MPI_RECV().

program hello_world
implicit none
include 'mpif.h'
integer ierr, num_procs, my_id, status, i
INTEGER :: send

call MPI_INIT ( ierr )
call MPI_COMM_RANK (MPI_COMM_WORLD, my_id, ierr)
call MPI_COMM_SIZE (MPI_COMM_WORLD, num_procs, ierr)

if (my_id == 0) then
  send = 1
  print *,my_id,"==M=="
  !!! send data to the slave processes
  do i = 1,num_procs-1
      call MPI_SEND(send,1, MPI_INTEGER,i , &
           1020, MPI_COMM_WORLD, ierr )
  end do

else

  call MPI_RECV(send,1, MPI_INTEGER, 0, &
       1020, MPI_COMM_WORLD,status, ierr )

  print *,my_id,"==S=="

end if
call mpi_finalize(ierr)
end program

When I run it, I get the wrong printout:

0 ==M==
1020 ==S==
1020 ==S==

When I place

print *,my_id,"==S=="

in front of

call MPI_RECV(

I get the right printout (the reordered branch is sketched after the output below):

 1 ==S==
 2 ==S==
 0 ==M==
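
That is, the only change is moving the print statement above the receive in the else branch, roughly like this:

else
  print *,my_id,"==S=="     ! printing my_id before the receive shows the correct rank
  call MPI_RECV(send,1, MPI_INTEGER, 0, &
       1020, MPI_COMM_WORLD,status, ierr )
end if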

It seems like MPI_RECV disturbs the value of my_id, and I have no idea why.

Chenxin