Re: mpi 1.1.1 released


From: Sukanta Basu
Subject: Re: mpi 1.1.1 released
Date: Fri, 3 Jan 2014 07:25:35 -0500

Hi Michael,

Thanks for catching the typo in the sample file; I put that sample code
together quickly for testing, and the outer loop should indeed use a
different index. In my work I have to send multi-dimensional arrays; there
is no alternative.
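
For concreteness, the pattern I need is roughly the following (a minimal
sketch only, not the actual MATLES_TEST code; the MPI_Comm_Load / MPI_Send /
MPI_Recv calls follow my reading of the mpi 1.1.x examples and may need
adjusting for your version of the package):

MPI_Init ();
CW  = MPI_Comm_Load ("NEWORLD");   # communicator handle, as in the package examples
rnk = MPI_Comm_rank (CW);
TAG = 100;
A = zeros (64, 64, 64);            # a 3-D array (size chosen arbitrarily for the sketch)
for it = 1:1000                    # repeated sends make any per-call growth visible
  if (rnk == 1)
    MPI_Send (A, 0, TAG, CW);      # send the whole 3-D array to rank 0
  elseif (rnk == 0)
    A = MPI_Recv (1, TAG, CW);     # receive it on rank 0
  endif
endfor
MPI_Finalize ();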

Best regards,
Sukanta

On Fri, Jan 3, 2014 at 7:13 AM, Michael Creel <address@hidden> wrote:
> I have run this, and there does appear to be a memory leak. I don't see
> problems with the code, either. At first, using i as an index for the outer
> loop and the nested loop looked like a problem to me, but it seems that
> Octave works the way you intended it to. I'm guessing that the problem may
> be related to sending n-dimensional arrays. In my work, I send scalars,
> vectors, and matrices, and I've never noticed memory leaks. I've never sent
> n-dimensional arrays, so perhaps that's the problem.
> Best,
> Michael
>
>
> On Thu, Jan 2, 2014 at 11:15 PM, Sukanta Basu <address@hidden>
> wrote:
>>
>> Dear Carlo and Michael,
>>
>> Many thanks for your prompt response!
>>
>> 1. The GSoC project is a great idea. I will be happy to be a mentor and
>> collaborate with you all on this topic. I would like to point out that I
>> have very limited knowledge/experience in C++ programming. I mostly work
>> with Octave/Matlab and Fortran.
>>
>> 2. In my report, I forgot to mention the following: (i) I used Octave
>> 3.6.4 on Cray CX1 and Octave 3.6.1 on Stampede; (ii) I used Ubuntu's default
>> openmpi (version 1.4.5); a few months ago I also tried 1.6.5 and 1.7.1 and
>> noticed the same memory leak problem; (iii) in the figures, you will notice
>> regular "spikes"; these are related to gathering and writing large data files
>> to disk.
>>
>> 3. MATLES is absolutely free. I am creating a web-portal for public
>> dissemination (it will take a few more weeks). In the meantime, you can
>> download the code here:
>> https://dl.dropboxusercontent.com/u/18718365/MATLES.tar.gz
>>
>> To run the code using Octave + MPI package, simply use:
>> mpirun --machinefile myhostfile -x LD_PRELOAD=/usr/lib/libmpi.so -np 8
>> octave -q --eval MLp_Les > Sample.log &
>>
>> or,
>>
>> modify and use the script: jobsuboctaveOMPI
>>
>>
>> 4. I have attached a small script (MATLES_TEST.m) which demonstrates the
>> memory leak problem. I have tested it only on Cray CX1. For job submission,
>> simply use:
>>
>> mpirun --machinefile myhostfile -x LD_PRELOAD=/usr/lib/libmpi.so -np 8
>> octave -q --eval MATLES_TEST > Sample.log &
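>>
>> To quantify the growth, a small helper like the one below can be called
>> once per time step on each rank and the printed values grepped out of
>> Sample.log (a sketch only; it is Linux-specific because it reads
>> /proc/self/status, and the function name is just an example, not part of
>> MATLES_TEST.m):
>>
>> function rss_kb = local_rss ()
>>   ## resident set size of the current Octave process, in kB
>>   txt    = fileread ("/proc/self/status");
>>   tok    = regexp (txt, 'VmRSS:\s*(\d+)\s*kB', "tokens", "once");
>>   rss_kb = str2double (tok{1});
>> endfunction
>>
>> ## example use inside the time loop:
>> printf ("step %d: RSS = %d kB\n", it, local_rss ());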
>>
>>
>> 5. The MPI package + MPICH2 installation was successful only with the GNU
>> compilers (not with the Intel compilers). I followed Carlo's suggestions.
>> Specifically:
>>
>> Download mpi-1.1.1.tar.gz from octave-forge and unpack it. In
>> mpi/src/Makefile, delete the first few lines and use:
>>
>> MPICC     ?= mpic++
>>
>> OFMPIINC  ?= $(shell $(MPICC) -showme:compile | sed -e
>> "s/-pthread/-lpthread/g")
>>
>> MPIINC    := $(OFMPIINC)
>>
>> OFMPILIBS ?= $(shell $(MPICC) -showme:link | sed -e "s/-pthread/ /g")
>>
>> MPILIBS   := $(OFMPILIBS)
>>
>> Then: tar -cvzf mpi-1.1.1.tar.gz mpi
>>
>> Then: start Octave as root (sudo octave) and run:
>>
>> setenv ("OFMPIINC", "-I/usr/include/mpich2 ")
>>
>> setenv ("OFMPILIBS", "-L/usr/lib -lmpich")
>>
>> Then: pkg install -auto mpi-1.1.1.tar.gz
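>>
>> As a quick sanity check after the install, one can try (a sketch; whether
>> MPI_Init works outside mpirun depends on the MPI library's singleton
>> support):
>>
>> octave -q --eval 'pkg load mpi; MPI_Init (); disp ("mpi package loaded"); MPI_Finalize ();'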
>>
>>
>>
>> Note: In openmpi, we have -showme:compile. In mpich2, the equivalent
>> option is -compile_info.
>>
>> However, I was unable to get -compile_info working in the Makefile.
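>>
>> For reference, the analogous Makefile lines for the mpich2 wrappers might
>> look like the sketch below (I have not verified it; note that unlike
>> -showme:compile, the -compile_info and -link_info options print the full
>> command line including the compiler name, so the first word has to be
>> stripped):
>>
>> MPICC     ?= mpicxx
>>
>> OFMPIINC  ?= $(shell $(MPICC) -compile_info | cut -d' ' -f2-)
>>
>> MPIINC    := $(OFMPIINC)
>>
>> OFMPILIBS ?= $(shell $(MPICC) -link_info | cut -d' ' -f2-)
>>
>> MPILIBS   := $(OFMPILIBS)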
>>
>> I am not using mpich2, since the simulations blow up after several hours.
>> I speculate that this problem is also related to a memory leak (the error
>> message alludes to memory issues).
>>
>> Best regards,
>>
>> Sukanta
>>
>>
>> On 1/2/2014 4:44 AM, c. wrote:
>>
>> On 2 Jan 2014, at 08:56, Michael Creel <address@hidden> wrote:
>>
>> Hi Sukanta and others,
>> I haven't been following this issue. I have been using the mpi package
>> with Open MPI, currently v1.6.5, and Octave v3.6.4, on Debian. I use it on a
>> daily basis, on up to 32 nodes, for runs that can go overnight. So far, I
>> have not noticed a problem, but perhaps I'm not using whatever part might
>> have a leak. Have you posted the code that shows the problem somewhere? If
>> not, could you send it to me, please?
>> Thanks,
>> Michael
>>
>> On 2 Jan 2014, c. <address@hidden> wrote:
>>
>> I am considering proposing a GSoC project about improvements to the MPI
>> package; in particular, I'd like to add the ability for users to start
>> parallel jobs and collect the output in an interactive Octave CLI/GUI
>> session.
>>
>> If you have a non-trivial application built on the Octave MPI package, it
>> would be great to use it for testing.
>> Would it be possible to use your MATLES application for this purpose? Is
>> it Free Software?
>>
>> In addition to solving the memory leak issue, do you have any other
>> improvements that could be part of the project?
>> Would you like to be a mentor for the project?
>>
>> On Thu, Jan 2, 2014 at 10:19 AM, c. <address@hidden> wrote:
>>
>> Michael,
>>
>> Would you also like to be listed as a possible mentor for this project?
>> Your help would be greatly appreciated.
>>
>> c.
>>
>> On 2 Jan 2014, at 10:22, Michael Creel <address@hidden> wrote:
>>
>> Hi Carlo,
>> Sure, no problem. I'm not much of a real hacker when it comes to the
>> internal workings, but I'm an experienced user.
>> M.
>>
>> Hi,
>>
>> Thanks, I started a small project description here:
>> http://wiki.octave.org/Summer_of_Code_Project_Ideas#Improve_MPI_package
>>
>> I am adding you and Sukanta as possible mentors; feel free to change/extend
>> the project description as you like.
>>
>> c.
>>
>>
>>
>>
>> --
>> Sukanta Basu
>> Associate Professor
>> North Carolina State University
>> http://www4.ncsu.edu/~sbasu5/
>
>



-- 
Sukanta Basu
Associate Professor
North Carolina State University
http://www4.ncsu.edu/~sbasu5/

