igraph-help

Re: [igraph] memory limitation


From: Gabor Csardi
Subject: Re: [igraph] memory limitation
Date: Mon, 16 Jun 2008 09:38:45 -0500
User-agent: Mutt/1.5.17+20080114 (2008-01-14)

What about doing it in two pieces? That should work I think.

G.
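[Editor's note: Gabor's "two pieces" suggestion amounts to computing the distance matrix one block of source vertices at a time, so only one block of rows is ever in memory at once. In R igraph this can be done by passing a slice of vertices as the `v` argument of `shortest.paths()`. The sketch below illustrates the same batching idea in plain Python with an unweighted BFS; the helper names are illustrative, not igraph's API.]

```python
from collections import deque

def bfs_distances(adj, source):
    """Unweighted shortest-path distances from one source vertex."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def distances_in_batches(adj, batch_size):
    """Yield (sources, distance_rows) one batch at a time, so only
    batch_size rows of the full distance matrix exist in memory."""
    vertices = sorted(adj)
    for i in range(0, len(vertices), batch_size):
        batch = vertices[i:i + batch_size]
        yield batch, [bfs_distances(adj, s) for s in batch]

# Tiny path graph 0-1-2-3, processed in two batches of two sources each.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
for sources, rows in distances_in_batches(adj, batch_size=2):
    print(sources, [row[3] for row in rows])  # distance from each source to vertex 3
```

Each batch can be written to disk (or summarized) before the next one is computed, which keeps peak memory at roughly `batch_size * n` doubles instead of `n * n`.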

On Mon, Jun 16, 2008 at 10:34:12AM -0400, Alisa Coffin wrote:
> 
> Thanks for your help Gabor.  Indeed I was able to work with the 3rd (smaller) graph
> object after I removed all other graph objects from the .rdata file and saved
> it as a new .rdata file with only the one graph object.  I'll have to add more
> memory to get that last, larger one done.
> 
> 
> 
> On Thu, Jun 12, 2008 at 1:35 PM, Gabor Csardi <address@hidden> wrote:
> 
>     Alisa, try calculating the shortest paths in two (or more) steps,
>     e.g. if you have 10000 vertices, first calculate the shortest distance
>     from the first 5000 to all others and then the second 5000.
> 
>     As for whether 2Gb is enough, igraph uses 32 bytes per edge and
>     16 bytes per vertex to store a network, so your network itself takes
>     only a small amount of memory.
> 
>     The shortest paths matrix for your graph takes about 570Mb of memory
>     (8 bytes for each element) and it is copied once, so you need two
>     times 570Mb. 2Gb is likely to be enough, but I cannot say for
>     sure, as I don't have intimate knowledge of R's memory management
>     on Windows.
> 
>     For your smaller network, only two times 335Mb is needed.
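[Editor's note: the figures above follow from simple arithmetic: a dense n-by-n matrix of 8-byte doubles costs 8*n^2 bytes, and the one temporary copy doubles that. A quick check, treating "Mb" as MiB (2^20 bytes):]

```python
def dist_matrix_mib(n):
    """Memory for a dense n-by-n matrix of 8-byte doubles, in MiB."""
    return n * n * 8 / 2**20

# Larger problem graph: 8613 vertices -> ~566 MiB (roughly the 570Mb above),
# so about 1.1 GiB with one copy.
print(round(dist_matrix_mib(8613)))   # -> 566

# Smaller problem graph: 6593 vertices -> ~332 MiB (close to the 335Mb above).
print(round(dist_matrix_mib(6593)))   # -> 332

# The graph itself, per the estimate above (32 bytes/edge, 16 bytes/vertex),
# is tiny by comparison: 10649 edges and 8613 vertices.
print(round((10649 * 32 + 8613 * 16) / 2**20, 2))   # -> 0.46
```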
> 
>     Hope this helps,
>     Gabor
> 
>     On Thu, Jun 12, 2008 at 01:23:31PM -0400, Alisa Coffin wrote:
>     > Dear igraph developers and users,
>     >
>     > I have 4 connected, simple, undirected graph objects for which I am
>     > trying to discover the shortest paths using igraph in R.  When I ran
>     > the "shortest.paths" routine on the first two, I had no problem, but
>     > for the last two, I come up against what appears to be a memory
>     > limitation.  I have tried the analysis on both my laptop and my
>     > desktop, both of which have 1Gb RAM on a Windows platform.  For both
>     > of these datasets, I get the message: "Error: cannot allocate vector
>     > of size ...".  The first problem graph has 6593 vertices and 8268
>     > edges.  The second one has 8613 vertices and 10649 edges.  These do
>     > not strike me as inordinately large, but the first two graphs were a
>     > bit smaller and the routine worked just fine for them.
>     >
>     > Please forgive the basic nature of this question, but will this
>     > problem be resolved if I try the analysis on a 2Gb machine?  Are
>     > there other possible solutions?  I have scoured the R manual for
>     > information on how to resolve this problem and think that the memory
>     > issue may be the problem, but I can't see any other possibilities.
>     >
>     > Your help is GREATLY appreciated!
>     >
>     > Alisa.
>     >
>     > p.s. I can send the .rdata file if anyone is interested in trying it out
>     on
>     > their machine.
>     >
>     > --
>     > Alisa Coffin, PhD. Candidate
>     > Department of Geography
>     > University of Florida
>     > Gainesville, FL 32611
> 
>     > _______________________________________________
>     > igraph-help mailing list
>     > address@hidden
>     > http://lists.nongnu.org/mailman/listinfo/igraph-help
> 
> 
>     --
>     Csardi Gabor <address@hidden>    UNIL DGM
> 
> --
> Alisa Coffin, PhD. Candidate
> Department of Geography
> University of Florida
> Gainesville, FL 32611



-- 
Csardi Gabor <address@hidden>    UNIL DGM



