
Question about MPI_Barrier in openmpi_ext package (possible bug)


From: Edwardo
Subject: Question about MPI_Barrier in openmpi_ext package (possible bug)
Date: Mon, 20 May 2013 12:01:48 -0700 (PDT)

Hi,

I posted earlier about some issues with the openmpi_ext package. I managed to
run it after installing Octave 3.6.4.

Now I can run the examples, but I found something strange with the
hellosparsemat.m example:

address@hidden ~ $ mpirun -np 3 octave -q --eval hellosparsemat
on rank 1 MPI_Send returned the following error code (0 = Success)
on rank 2 MPI_Send returned the following error code (0 = Success)
info = MPI_Recv returned the following error code (0 = Success) while
receving from rank info = 0
0
1
info = 0
This is the matrix received from rank 1: 
ans =

   0.60042   0.91984   0.00380   0.65244   0.52818
   0.00000   0.87083   0.06478   0.78401  error: MPI_Barrier: Please enter
octave comunicator object
error: called from:
error:   /home/shariff/hellosparsemat.m at line 76, column 5
 0.12855
   0.90207   0.05966   0.80096   0.46254   0.51436
   0.39674   0.71006  error: MPI_Barrier: Please enter octave comunicator
object
error: called from:
error:   /home/shariff/hellosparsemat.m at line 76, column 5
 0.08135   0.12374   0.49284
   0.42438   0.01605   0.13414   0.00000   0.82383

MPI_Recv returned the following error code (0 = Success) while receving from
rank 2
info = 0
This is the matrix received from rank 2: 
ans =

   0.12749   0.18532   0.00000   0.26919   0.09135
   0.36543   0.94180   0.23802   0.00000   0.29793
   0.88930   0.89241   0.51710   0.69006   0.84671
   0.13811   0.41671   0.05394   0.24462   0.84894
   0.81104   0.20716   0.57935   0.55185   0.11811

error: MPI_Barrier: Please enter octave comunicator object
error: called from:
error:   /home/shariff/hellosparsemat.m at line 76, column 5
--------------------------------------------------------------------------
mpirun has exited due to process rank 2 with PID 6776 on
node Zero exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

It looks like MPI_Barrier is not working correctly: it keeps throwing errors
saying that the communicator was not passed, even though it is.
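
For reference, the barrier call in my script follows the usual pattern from the
package's example scripts, roughly like this (a simplified sketch from memory,
not the exact hellosparsemat.m code; CW, TAG and the matrix sizes are just
placeholder names and values of mine):

MPI_Init ();
CW   = MPI_Comm_Load ("NEWORLD");   % Octave communicator object (wraps MPI_COMM_WORLD)
rank = MPI_Comm_rank (CW);
nprc = MPI_Comm_size (CW);
TAG  = 48;

if (rank != 0)
  % workers: send a small sparse matrix to rank 0
  M = sprand (5, 5, 0.9);
  info = MPI_Send (M, 0, TAG, CW);
else
  % master: receive one matrix from each worker and print it
  for src = 1:(nprc - 1)
    [M, info] = MPI_Recv (src, TAG, CW);
    disp (full (M));
  endfor
endif

MPI_Barrier (CW);      % <-- this is the call that complains about the communicator object
MPI_Finalize ();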

1- What is wrong here? How do I fix it?
2- I have used MPI a bit with C. Do we have something like MPI_Status and
MPI_ANY_SOURCE for MPI in Octave? (A sketch of what I mean follows below.)
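
To make question 2 clearer, this is the kind of thing I would like to write if
the Octave bindings had an equivalent of C's MPI_ANY_SOURCE (purely
hypothetical; CW, TAG and nprc as in the sketch above, and I do not know
whether openmpi_ext provides this, which is exactly what I am asking):

% hypothetical: receive from whichever worker is ready first,
% instead of looping over ranks in a fixed order
for k = 1:(nprc - 1)
  [msg, info] = MPI_Recv (MPI_ANY_SOURCE, TAG, CW);
  disp (msg);
endfor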




