From: Tamas Nepusz
Subject: Re: [igraph] the old "cannot reserve space for vector" problem
Date: Tue, 8 Jun 2010 10:22:38 +0100

Dear Pascal,

> Are there any practical limits to using shortest_paths() with much larger 
> networks?
*Theoretically*, the only limit should be the amount of memory you have 
available :) (Of course, your OS has to support allocating huge chunks of 
memory to a single process; on 32-bit machines, for instance, there is an 
address-space limit that cannot be circumvented without switching to a 64-bit 
platform.) If you run into some limit other than the memory requirements or 
the time complexity, then that's a bug in igraph and we'll try to fix it.

> Especially, are there memory leaks and / or other limits when using the 
> python interface? 
We regularly check both the C core and the higher level interfaces for memory 
leaks, and I'm not aware of any right now that should affect your calculation. 
This holds for the memory allocated by igraph itself; I cannot promise 
anything about the Python interpreter.

What I can tell you is that I successfully managed to calculate the *diameter* 
of an undirected network with 2.1 million vertices and 7.5 million edges -- 
this took about 11 days on a single core of an Intel Xeon X3360 running at 2.83 
GHz. So, time complexity should not be a problem for your network. I'm still 
worried about the storage requirements, though, because when using the Python 
interface, the result matrix is temporarily stored *twice*: once as a regular 
igraph_matrix_t and once as the Python list of lists that is returned to Python. 
The igraph_matrix_t is destroyed as soon as the Python representation is ready, 
but it still doubles the peak memory usage. To be on the safe side, I would 
build up the shortest path matrix row by row and store each computed row on 
disk, so even if Python runs out of memory at some point, nothing is lost: 
all the earlier results are already saved.
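The row-by-row approach above could be sketched as follows. This is a minimal, self-contained illustration: the graph is a toy adjacency list and `bfs_row` stands in for whatever per-source shortest-path call you actually use (e.g. one from igraph); the point is the pattern of writing each finished row to disk immediately.

```python
from collections import deque

def bfs_row(adj, source):
    """One row of the shortest-path matrix: unweighted distances from
    `source` to every vertex (-1 means unreachable)."""
    dist = [-1] * len(adj)
    dist[source] = 0
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if dist[w] == -1:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

# Toy undirected graph: a 4-cycle 0-1-2-3-0, as an adjacency list.
adj = [[1, 3], [0, 2], [1, 3], [0, 2]]

# Compute one row at a time and append it to disk immediately, so that
# a crash partway through loses at most the row currently in progress.
with open("distances.txt", "w") as fh:
    for source in range(len(adj)):
        row = bfs_row(adj, source)
        fh.write(" ".join(map(str, row)) + "\n")
```

For a 2-million-vertex graph you would of course use a more compact on-disk format, but the recovery property is the same: everything written before a failure survives it.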

> Using the unweighted algorithm (is it Johnson's?) with O(n(V+E)), this seems 
> feasible.
Actually, the unweighted shortest path implementation is a simple breadth-first 
search, not Johnson's algorithm (that one is for weighted graphs). And yes, it 
runs in O(n(V+E)), where n is the number of source vertices; a single run (from 
one source vertex) thus requires O(V+E) time.
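To make the O(V+E) bound for a single run concrete, here is a small sketch of a breadth-first search that also counts how many adjacency-list entries it examines; the graph and the `edges_examined` counter are illustrative additions, not part of igraph itself.

```python
from collections import deque

def bfs_distances(adj, source):
    """Single-source unweighted shortest paths via BFS. Each vertex is
    enqueued at most once, and each adjacency-list entry is examined at
    most once, so the total work is O(V + E)."""
    dist = {source: 0}
    queue = deque([source])
    edges_examined = 0  # counts adjacency-list entries touched
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            edges_examined += 1
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist, edges_examined

# Toy undirected path graph 0-1-2-3 as an adjacency list.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
dist, examined = bfs_distances(adj, 0)
# Every edge appears twice in an undirected adjacency list, so when all
# vertices are reachable, `examined` equals 2*E (here, 6 for 3 edges).
```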

-- 
Tamas

