large arrays and error virtual memory exhausted
From: Joao Cardoso
Subject: large arrays and error virtual memory exhausted
Date: Wed, 24 May 1995 05:36:38 +0200
Hello,
I get the following error from octave-1.1.1 on SCO 3.2v4.0 with
16 Mbytes of memory and 20 Mbytes of swap,
"error: new: virtual memory exhausted -- stopping myself"
when working with large arrays (500x500 and greater).
It seems to me that the error "new: virtual memory exhausted" comes from gcc,
but I can't check it, as I have already removed the sources.
The strange thing is that virtual memory does not seem to be exhausted.
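For reference, my rough arithmetic (assuming Octave stores each element as
an 8-byte double) says that a single 500x500 matrix needs only about 2 Mbytes:
bytes = 500 * 500 * 8     # = 2000000, i.e. about 2 Mbytes
so even allowing for a few temporary copies during an expression, that should
stay well below the 20 Mbytes of swap.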
Here is a "sar -r" report:
05:01:45 257 25280
05:01:55 252 25280
05:02:05 245 24504
05:02:15 236 16280
05:02:25 307 16256
05:02:35 200 14080
05:02:45 169 14080
05:02:55 169 14080
05:03:05 169 14080
05:03:15 169 14080
05:03:25 169 14080
05:03:35 169 14080
05:03:45 169 14080
05:03:55 169 14080 <- octave has stopped itself
05:04:05 1272 26152
05:04:15 2119 26152
When creating large matrices, is it better, both in terms of memory
allocation/fragmentation and of execution speed, to preallocate the full
matrix? What is the best way to do that?
Within a function, is it wise to clear a variable that is no longer needed?
How: clear, or clear -x?
e.g.,
b = ones(size(a));        # is preallocating like this advantageous?
for i = 1:rows(a)
  b(i,:) = foo(a(i,:));   # foo = whatever operation is applied to each row
endfor
clear a;                  # any advantage in clearing a here?
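For comparison, the alternative I am worried about is letting the result
grow inside the loop (foo is only a placeholder for the real row-wise
operation), something like:
c = [];
for i = 1:rows(a)
  c = [c; foo(a(i,:))];   # reallocates and copies c on every iteration
endfor
My guess is that each concatenation has to allocate a new matrix and copy
the old contents, so the preallocated version above should be preferable,
but I would like confirmation.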
Thanks,
Joao Cardoso, address@hidden