
From: David Hunkins
Subject: [igraph] betweenness running time estimate and
Date: Sun, 13 Jul 2008 14:16:13 -0700

Thank you very much, Gabor, Tamas, and others, for the igraph library.

I am analyzing some online photo-sharing data donated to me by a large photo-sharing site.

In my first dataset I have about 5 million vertices and 5 million edges. In the next dataset I'll get about 50 million vertices and 200 million edges. I am currently running the analysis on a 2 GB / 2.0 GHz Intel Core Duo MacBook Pro. I am conditioning and loading the weighted, directed graph data in Pajek format.
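For concreteness, the loading step looks roughly like the sketch below (R interface; "photos.net" is just a placeholder for my conditioned Pajek file):

    library(igraph)
    # "photos.net" is a placeholder filename; read.graph() with
    # format="pajek" keeps the edge weights stored in the .net file.
    g <- read.graph("photos.net", format="pajek")
    summary(g)       # vertex/edge counts and attribute names
    is.directed(g)   # TRUE for this dataset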

I have a couple of questions with which your user community may have some experience.

First, I am having trouble determining how long I should expect various operations to take, even with the 5M/5M dataset, and I am wondering whether there is a simple metric (such as the density of the graph) that can be computed and then used to calculate order-of-magnitude running times for the various other functions of interest. Yesterday, for example, I ran betweenness.estimate(graph, cutoff=2) and it hasn't completed after 24 hours (and the MacBook is not paging at all, just getting a little 'warm').
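My rough understanding is that exact betweenness (Brandes' algorithm) costs on the order of O(V*E) per run on unweighted graphs, and more with weights, but I don't know how the cutoff changes that in practice. For what it's worth, here is the kind of empirical check I have been considering (a sketch only: the subsample sizes are arbitrary, random vertex sampling distorts the degree structure, and the subgraph()/vertex-indexing details may differ across igraph versions):

    # Time betweenness.estimate() on induced subgraphs of increasing
    # size and extrapolate before committing to the full 5M/5M run.
    sizes <- c(10000, 50000, 100000)        # arbitrary subsample sizes
    times <- sapply(sizes, function(n) {
      sub <- subgraph(g, sample(V(g), n))   # random induced subgraph
      system.time(betweenness.estimate(sub, cutoff=2))["elapsed"]
    })
    # Log-log plot: the slope hints at the empirical growth rate.
    plot(sizes, times, log="xy", type="b",
         xlab="vertices in subsample", ylab="seconds")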

Second, do you know anybody who has used commercial (or otherwise) scalable computing resources to run analyses on datasets of this size? I fear that, even if I can melt my laptop solving the problem for the 5M/5M dataset, I will need other resources when I move to the 50M/200M dataset.

Thanks for any assistance you and the community can provide. Kösz!

David Hunkins
