
Memory allocation for very large matrices??


From: Dennis Gaidashev
Subject: Memory allocation for very large matrices??
Date: Sat, 2 Dec 2000 13:01:06 -0600 (CST)

I am trying to do the Singular Value Decomposition of a very large matrix
(16641 by 16641) whose rank is one less than its dimension. The
matrix is read from a file, and the three resulting matrices are
written to files. Here is the simple program I am using:
 
 
#! /usr/bin/local/bin/Octave-2.0.13 -qf

# Read the full 16641x16641 matrix from a plain-text file.
fid = fopen("A_129", "r");
A = fscanf(fid, '%f', [129*129, 129*129]);
fclose(fid);

# Singular value decomposition: A = U*S*V'.
[U, S, V] = svd(A);
clear A;

# Write U, the singular values, and V' to separate text files.
fid = fopen('Ut_129', 'w+');
fprintf(fid, '%E \n', U);
fclose(fid);
clear U;

fid = fopen('S_129', 'w+');
fprintf(fid, '%E \n', diag(S));
fclose(fid);
clear S;

fid = fopen('V_129', 'w+');
fprintf(fid, '%E \n', V');
fclose(fid);
clear V;
 
However, Octave exits at the third line,

A = fscanf(fid, '%f', [129*129, 129*129]);

with a memory error. I guess it is simply a matter of not having enough
RAM: it cannot read the full matrix from the file. But is there any smart
way to get around this? Maybe there are some memory allocation tricks in
Octave?
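
(For a rough sense of scale, here is a back-of-the-envelope estimate in
Octave; it assumes the usual 8-byte double storage and is only my own
arithmetic, not anything Octave itself reports.)

n = 129*129;                                  # 16641
bytes_per_matrix = n^2 * 8;                   # one full matrix of doubles
printf("one matrix:          %.2f GB\n", bytes_per_matrix / 2^30);
printf("A, U, S, V together: %.2f GB\n", 4 * bytes_per_matrix / 2^30);

So A alone already needs about 2 GB, and the decomposition has to hold
U, S and V of the same size on top of that.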
 
I would appreciate any kind of advice on this.
Thank you,
Dennis Gaidashev                                  







