help-octave

Max array size


From: Nicholas Jankowski
Subject: Max array size
Date: Sat, 2 Nov 2024 11:39:49 -0400

As the note at the bottom of these messages says, this help mailing list is no longer used much. Active support has moved to the discussion forum at octave.discourse.group; you will likely receive more thorough assistance there.

That said, Octave does have a maximum array dimension limit: 2^31 - 1 (about 2.1e9) elements or 2^63 - 1 (about 9.2e18) elements, depending on which Windows installer you chose. The 1e12 vector would exceed the smaller limit and trigger the error message you showed, unless you used the w64-64 installer.

Even with 64-bit indexing, however, memory is the real constraint: a 1e9 vector of 8-byte doubles consumes about 8 GB of memory, so a 1e12 vector would try to consume about 8 TB, far exceeding what is available. If you need to work with data sets that large, you will probably have to handle the arrays specially, processing them in pieces that fit within available memory, swap, and disk storage. I do not know of any built-in functions or external packages in Octave made for this, but I'll again refer you to the general Octave forum at octave.discourse.group, where experts may have better advice or suggestions.
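To illustrate the arithmetic above, here is a minimal Octave sketch. It uses the builtin `sizemax ()` (the largest value of Octave's index type) to check the array-size limit, estimates memory up front, and then shows the chunked-processing idea: computing a statistic over a large conceptual vector while only ever holding one chunk in memory. The chunk size and chunk count are illustrative values, not a recommendation.

```octave
% Sketch: check Octave's index limit and estimate memory before allocating.
n = 1e12;                      % desired number of elements
bytes_per_double = 8;

printf ("index limit: %g elements\n", sizemax ());
printf ("estimated memory for n doubles: %g GB\n", n * bytes_per_double / 2^30);

if (n > sizemax ())
  error ("n exceeds Octave's maximum array size");
endif

% Instead of rand (1, n), accumulate a statistic chunk by chunk so only
% chunk_size elements are resident in memory at once.
chunk_size = 1e6;              % tune to available RAM (illustrative)
nchunks = 10;                  % illustrative; not the full n/chunk_size
total = 0;
for k = 1:nchunks
  x = rand (1, chunk_size);    % generate (or load) one chunk at a time
  total += sum (x);
endfor
printf ("mean over sampled chunks: %.4f\n", total / (nchunks * chunk_size));
```

The same pattern works for data read from disk with `fread`/`fwrite`: process one block, discard it, and move on, so peak memory stays bounded by the chunk size rather than the full data set.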


On Fri, Nov 1, 2024, 00:10 DuyKy Nguyen <dkn@unitedthc.com> wrote:

Hello

My info is on my personal page

http://unitedthc.com/

 

I'm evaluating my new AI structure using Octave 9.1 on my Windows 7 64-bit machine with 32 GB RAM.

It's happy with rand(1, 1e9)

but with rand(1, 1e12)

it says

error: out of memory or dimension too large for Octave's index type

so is it possible to create a huge array to evaluate a huge number of neurons?

Thanks a lot

DuyKy Nguyen PhD EE

Retired, ex-R&D engineer, Symmetricom

 

I can use 4 million neurons but


----------
We are transitioning to a web based forum
for community help discussions at
https://octave.discourse.group/c/help
