help-octave
Importing large amounts of data


From: Hannes
Subject: Importing large amounts of data
Date: Mon, 18 Jun 2012 16:32:24 +0800

Hi everyone,

sorry if this question sounds lame, but I want to import a rather large
3-dimensional array into Octave that I output from a C++ program. For
simplicity, and because some of the formats I read about seemed obscure,
I have the C++ program output Octave code and read it from the Octave
script with the source() command. This also lets me easily pass multiple
pieces of data and settings, and thus makes for easy communication
between the two programs.
Since most documentation points to creating a 3-D array with reshape(),
I currently export a one-dimensional array that I then reshape to 3-D in
the Octave script.

The problem is that this is really slow. Working on smaller chunks of
data is no problem, but I am using this in a biological context, where I
need to work with large arrays.
Right now I am trying this with a 150x150x1000 int array. This array has
a small memory footprint in C++, and the file the C++ program passes to
the Octave script is around 65 MB. Reading it into Octave already
consumes 8 GB of RAM, which is quite a surprise, but not the main
problem (I have memory to spare right now). However, the reshaping has
now been running for two days on a multi-CPU Xeon server.

What's going wrong? How should I approach this to get it done?

Thanks for your help.

Best,
Hannes


