Re: Reading data from large ascii-file
From: Francesco Potortì
Subject: Re: Reading data from large ascii-file
Date: Tue, 19 Apr 2011 13:17:41 +0200
>I am analyzing data from a 400MB ascii-file. There are about 30 million
>data points in the file and the execution takes forever.
>
>Currently the program reads and processes the file element-by-element ( val
>= dlmread(fileName, "emptyvalue", [i, col, i, col]) ) and I suspect that
>this is a very inefficient way to do it. Could you suggest a better
>approach?
Reading one value at a time is the most inefficient way I can think of
:)
First of all, are you sure you can't simply use load()? Or use a trivial
conversion of your data so that you can use load()?
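As a sketch, assuming the file is a purely numeric table that load() accepts (file and column names here are hypothetical):

```octave
% Read the whole 30M-point matrix in one call instead of one
% dlmread() call per element:
data = load ("bigdata.txt");
% Then pick out the column you need in memory.  Note that dlmread's
% range arguments are 0-based, while matrix indexing is 1-based:
vals = data(:, col + 1);
```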
If your data is fixed, you should definitely consider the hint you were
already given: read the data once, then save it with 'save -binary';
loading it back with 'load' will then be as fast as possible.
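The one-time conversion suggested above might look like this (file names are hypothetical):

```octave
% One-time conversion: read the ASCII file once, then store it
% in Octave's binary format.
data = load ("bigdata.txt");
save ("-binary", "bigdata.bin", "data");

% In every later run, load the binary file instead; this restores
% the variable 'data' and is far faster than parsing ASCII.
load ("bigdata.bin");
```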
--
Francesco Potortì (ricercatore) Voice: +39 050 315 3058 (op.2111)
ISTI - Area della ricerca CNR Fax: +39 050 315 2040
via G. Moruzzi 1, I-56124 Pisa Email: address@hidden
(entrance 20, 1st floor, room C71) Web: http://fly.isti.cnr.it/