Hey Everyone,
I am working with the Python version of igraph to compare different community detection algorithms on larger datasets (about 100,000 nodes and 500,000 edges), but it always fails with a memory error along the lines of "memory failure", or the process is killed outright. I am running this on a Core i7 machine with 6 GB of RAM. Is this really a memory problem, or could something else be going on? I have not worked with graphs this large before, so I would be grateful if anyone could recommend the fastest way to analyse social graphs of this size: can it be done efficiently on a laptop, or do I need to run it on a server? My code is below.
############FastGreedy Algorithm###########
# g is the Graph loaded earlier (loading code not shown in this post)
fastgreedy = g.community_fastgreedy()  # builds the full merge dendrogram
fast = fastgreedy.as_clustering()      # cuts it at the max-modularity level
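As a sanity check on the memory question, here is a rough back-of-envelope estimate of what the raw graph itself should take. The per-node and per-edge byte costs below are assumptions for illustration, not measured igraph internals; the point is only the order of magnitude.

```python
# Back-of-envelope memory estimate for the graph described above.
# Byte costs per node/edge are rough assumptions, not igraph internals.
nodes = 100_000
edges = 500_000
bytes_per_edge = 2 * 8   # two 64-bit endpoint ids per edge (assumption)
bytes_per_node = 8       # one 64-bit slot per vertex (assumption)

graph_mb = (nodes * bytes_per_node + edges * bytes_per_edge) / 1e6
print(f"raw graph storage ~ {graph_mb:.1f} MB")  # well under 6 GB
```

Even with generous constants this is on the order of tens of megabytes, far below 6 GB, which suggests the failure comes from the algorithm's working structures (or other Python-level objects held alive during the run) rather than from storing the graph itself.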
cheers,
sam