
Re: memory exhausted error.


From: GARY FORBIS
Subject: Re: memory exhausted error.
Date: Tue, 29 Apr 2008 20:03:54 -0700

Is top posting OK?
 
Is top posting socially acceptable?
 
Thanks a bunch.  Hmm... I only have 1GB of RAM on a Pentium 4 with
Physical Address Extension. 1GB was a bunch when I bought the system.
I've been thinking about moving to a home network with a mix of Linux and
Windows.  From what I've been told, 64-bit processors aren't of much use
in Windows because of a chicken-and-egg type situation.  Heck, we're
still running some DOS applications at work.  It looks like my maximum
virtual memory is set to 3072MB.
 
This certainly helps me analyze the problem. 
 
----- Original Message -----
Sent: Tuesday, April 29, 2008 11:33 AM
Subject: Re: memory exhausted error.

On 29-Apr-2008, GARY FORBIS wrote:

| ----- Original Message -----
|   From: Michael Goffioul <address@hidden>
|   To: GARY FORBIS <address@hidden>
|   Cc: address@hidden
|   Sent: Tuesday, April 29, 2008 1:29 AM
|   Subject: Re: memory exhausted error.
|
|
|   On Tue, Apr 29, 2008 at 9:13 AM, David Bateman
|   <address@hidden> wrote:
|   >  If this is the same bug report from the newsgroup comp.soft-sys.octave
|   >  then you need to point out that you are using the MSVC build under
|   >  Windows. I believe this is a known issue but don't remember the details,
|   >  so if some Windows user knows the registry magic needed to avoid this
|   >  issue send it here :-)
|
|   As I don't have that much time, it would really help if you could provide
|   the required files (a mat-file with all needed data and a m-file to run)
|   to reproduce the problem.
|
|   Michael.
| Thanks.
|
| I downloaded Octave 3.0.0 for Windows from http://sourceforge.net/project/showfiles.php?group_id=2888&package_id=40078 which also installed
|
|   SciTE
|   Version 1.73
|       Mar 31 2007 09:25:11
| I downloaded the problem software from http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html
| I was on this step:
|
|     6.  For training a deep autoencoder run mnistdeepauto.m in matlab.
|
| I ran the code under SciTE.  I've thought about moving the code into a directory high in the C: tree,
| with names without spaces, since trying to figure out how to cd to 'My Documents' from within Octave is just more side-tracking.
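(As an aside, changing into a path with embedded spaces only needs quoting; the path below is a guess at a typical Windows XP layout, not the poster's actual directory:

```octave
% Single quotes keep the embedded spaces intact; this path is an
% illustrative assumption, not the real one from the thread.
cd ('C:\Documents and Settings\gary\My Documents')
% Confirm where we landed
disp (pwd ())
```

No registry or drive-letter tricks are needed for this part.)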

How quickly do you get the memory exhausted error?  How much memory
and swap space do you have available on your system?  I'm running the
mnistdeepauto script now (3.0.0 on a Debian system on AMD64 hardware),
and, omitting the warnings

  warning: save: no such variable `-mat'
  warning: load: loaded ASCII file `test0.ascii' -- ignoring extra args

due to -mat and -ascii options appearing at the end instead of the
beginning of save and load commands, I see the following:
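For what it's worth, the ordering difference can be sketched like this (the variable and file names are made up for illustration):

```octave
A = rand (3);
% Option before the file name: parsed as an option, no warning.
save -mat backup.mat A
% Matlab-style trailing option: some Octave 3.0 builds instead try to
% parse '-mat' as a variable name, producing the warning quoted above.
save backup.mat A -mat
```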

  octave:1> more off
  octave:2> mnistdeepauto
  Converting Raw files into Matlab format
  You first need to download files:
   train-images-idx3-ubyte.gz
   train-labels-idx1-ubyte.gz
   t10k-images-idx3-ubyte.gz
   t10k-labels-idx1-ubyte.gz
   from http://yann.lecun.com/exdb/mnist/
   and gunzip them
  Starting to convert Test MNIST images (prints 10 dots)
  ..........
    980 Digits of class 0
   1135 Digits of class 1
   1032 Digits of class 2
   1010 Digits of class 3
    982 Digits of class 4
    892 Digits of class 5
    958 Digits of class 6
   1028 Digits of class 7
    974 Digits of class 8
   1009 Digits of class 9
  Starting to convert Training MNIST images (prints 60 dots)
  ...........................................................
   5923 Digits of class 0
   6742 Digits of class 1
   5958 Digits of class 2
   6131 Digits of class 3
   5842 Digits of class 4
   5421 Digits of class 5
   5918 Digits of class 6
   6265 Digits of class 7
   5851 Digits of class 8
   5949 Digits of class 9
  Pretraining a deep autoencoder.
  The Science paper used 50 epochs. This uses  10
  Size of the training dataset= 60000
  Size of the test dataset= 10000
  Pretraining Layer 1 with RBM: 784-1000
  epoch    1 error 819844.1 
  epoch    2 error 520952.7 
  epoch    3 error 476870.0 
  epoch    4 error 454953.4 
  epoch    5 error 442313.8 
  epoch    6 error 469727.1 
  epoch    7 error 438805.7 
  epoch    8 error 422759.3 
  epoch    9 error 412318.6 
  epoch   10 error 404754.2 

  Pretraining Layer 2 with RBM: 1000-500
  epoch    1 error 1257000.3 
  ...

Running top to look at memory usage, it is stable at about 1100MB
virtual and up to 900MB resident during the "Pretraining Layer 1" phase,
though the resident size was growing slowly.  However, since the
virtual size was stable for the entire run, I don't suspect a leak in
Octave.  More likely it is just the way these scripts work
(accumulating results as they run?).  Also, when the "Pretraining Layer
2" phase started, the virtual size dropped down to 970MB.

If you have enough memory on your system to run this code, then I'm
guessing it is some kind of bug in the Windows memory management system
and not a bug in Octave itself.  Perhaps there is a way to tune the
Windows memory management system, but I have no idea how to do that.
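If tuning Windows turns out to be a dead end, shrinking Octave's footprint on the 1GB machine might help instead; one common trick (an assumption on my part, not something tested against this script) is keeping the large training arrays in single precision:

```octave
% Halves the memory used by the training data; 'batchdata' is the
% variable name Hinton's scripts appear to use, assumed here purely
% for illustration.
batchdata = single (batchdata);
whos batchdata   % confirm the class and size after conversion
```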

jwe

