
Re: MPI


From: Andy Jacobson
Subject: Re: MPI
Date: 02 Feb 2001 10:33:37 -0500
User-agent: Gnus/5.0806 (Gnus v5.8.6) Emacs/20.7

>>>>> "Alex" == Alex Verstak <address@hidden> writes:

    Alex> What happened to the MPI bindings for Octave?  I saw a
    Alex> posting of some source Nov 2000, but apparently it didn't
    Alex> make it into the mainstream.

I am the author of those patches.  I put up a web page at
"http://corto.icg.to.infn.it/andy/octave-mpi/"; that discusses the
issues.  I haven't bothered to make patches against 2.0.32 or 2.0.33,
but it should be trivial to do so.  I have had email correspondence
with three people who have played with it.

I'm not sure what John Eaton wants to do with such parallelization
schemes; perhaps he is just waiting to see whether there is sufficient
interest before committing.  

    Alex> Is there any interest in MPI?  I put together a quick and
    Alex> dirty binding and am wondering whether or not I should make
    Alex> it good and post the source.  If there is enough interest, I
    Alex> will; if not, I won't.

I'd be interested in combining our efforts to make a more robust
patch.  Did you by any chance try compiling with my patches?  I'm
curious to see whether it works transparently under MPICH.
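(For anyone following along who hasn't seen either patch set: a binding
of this sort is basically an oct-file that wraps the MPI C calls so they
can be called from the interpreter.  The snippet below is only a
hypothetical sketch of the idea, not code from either patch; the name
mpi_comm_rank is made up, and it assumes an MPI-2 library so that
MPI_Init accepts null argc/argv.)

    #include <octave/oct.h>
    #include <mpi.h>

    // Hypothetical example binding, not part of the actual patches:
    // expose the rank of this Octave process as mpi_comm_rank().
    DEFUN_DLD (mpi_comm_rank, args, nargout,
               "Return the rank of this process in MPI_COMM_WORLD.")
    {
      int initialized = 0;
      MPI_Initialized (&initialized);

      if (! initialized)
        MPI_Init (0, 0);   // MPI-2 permits null argc/argv; older MPICH may not

      int rank = 0;
      MPI_Comm_rank (MPI_COMM_WORLD, &rank);

      octave_value_list retval;
      retval(0) = rank;
      return retval;
    }

Something along those lines should build with mkoctfile, given the right
-I/-L flags (or a compiler wrapper like mpiCC) for the local MPI
installation, with each rank then running its own Octave under mpirun.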

At one point I was trying (and failing) to get ScaLAPACK to compile on
my system to see whether I could link its MPI-parallelized routines
into Octave.  I still think this would be an interesting thing to try.

        -Andy

-- 
address@hidden





