Re: empirical_inv() out of memory
From: Glenn Golden
Subject: Re: empirical_inv() out of memory
Date: Fri, 23 Jan 2004 15:21:00 -0700
"Dmitri A. Sergatskov" writes:
> Glenn Golden wrote:
> > Running 2.1.50:
> >
> > Is the following a reasonable out-of-memory situation?
> >
> >
> > 1> f = 1 - 10 .** [-1:-1/3:-6];
> > 2> g = abs(rand(4e6,1)) .** 2;
> > 3> empirical_inv (f, g)
> > error: memory exhausted -- trying to return to prompt
>
> I ran it on my computer while watching 'top' in another terminal.
> It went through, but the octave process size at its maximum was 1201 MB.
>
Spasibo (thanks). It runs fine on a larger-memory machine at my work, too.
My real question is: is it reasonable for a distribution-analysis
function like empirical_inv() -- which, by its nature, can be expected
to be passed fairly large datasets -- to exhaust 256 MB when called
with an argument that occupies only about 1/8 of the available memory?
Certainly this is something about which reasonable people can disagree,
but in this case the memory usage seemed to me a bit extravagant.
It looks like the core function (discrete_inv()) is written in a way
that creates a lot of temporaries, although it's not obvious at first
glance why it should use as much as 1.2 GB...
Glenn
-------------------------------------------------------------
Octave is freely available under the terms of the GNU GPL.
Octave's home on the web: http://www.octave.org
How to fund new projects: http://www.octave.org/funding.html
Subscription information: http://www.octave.org/archive.html
-------------------------------------------------------------