Good afternoon,
I'm using the function all_simple_paths from the igraph R package to (1) generate the list of all simple paths in a network and (2) calculate the total number of simple paths.
I'm calling the function as follows:

library(igraph)

# all simple paths starting from every vertex, their vertex ids, and the total count
pathsMp <- unlist(lapply(V(graphMp), function(x) all_simple_paths(graphMp, from = x)),
                  recursive = FALSE)
List_paths_Mp <- lapply(pathsMp, as_ids)
n_paths <- length(List_paths_Mp)
The function does what I need, but as the number of variables (vertices) and interactions (edges) grows, the processing time increases sharply and it takes very long to get results. For a network with 11 variables and 60 interactions there are 146338 simple paths in total, and this already takes a long time to compute. For a larger network with 13 variables and 91 interactions, the function had still not finished after half an hour.
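In case it helps to reproduce the behaviour without my data, here is a sketch of the kind of test I mean, on a random directed graph of comparable size (sample_gnm with 11 vertices and 60 edges is only a stand-in for my actual network):

library(igraph)

# stand-in network: random directed graph of roughly the same size as mine
set.seed(1)
g <- sample_gnm(n = 11, m = 60, directed = TRUE)

# time the enumeration of all simple paths from every vertex
system.time({
  paths <- unlist(lapply(V(g), function(x) all_simple_paths(g, from = x)),
                  recursive = FALSE)
  n_paths <- length(paths)
})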
Is there a way to make this task more efficient, i.e. to get the results faster? Has anyone encountered a similar problem and found a solution? I know I could use a CPU with more processing power, but the point is to have the function run as efficiently as possible on an ordinary personal computer.
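For what it's worth, since for point (2) I only need the total, I have been wondering whether a counting-only traversal (which avoids storing every path) would be a sensible direction. A rough sketch of what I mean is below; count_simple_paths is a hypothetical helper written in plain R, not something from igraph, and being interpreted R it may well not beat the C code behind all_simple_paths. Whether it gives exactly the same total as all_simple_paths is also something I would still need to check.

# hypothetical counting-only helper: depth-first search that tallies simple
# paths with at least one edge starting at 'from', without storing them
count_simple_paths <- function(g, from, visited = logical(vcount(g))) {
  visited[from] <- TRUE
  total <- 0
  for (nb in as.integer(neighbors(g, from, mode = "out"))) {
    if (!visited[nb]) {
      # each unvisited neighbour extends the current path by one edge
      total <- total + 1 + count_simple_paths(g, nb, visited)
    }
  }
  total
}

# total over all starting vertices, analogous to the lapply over V(graphMp) above
n_paths <- sum(vapply(seq_len(vcount(graphMp)),
                      function(v) count_simple_paths(graphMp, v),
                      numeric(1)))

Would something along these lines make sense, or is there an igraph-native way to count the paths without building the full list?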
Kind regards,
Daniel Pereira
Visiting PhD Student at GEOMAR - Helmholtz Centre for Ocean Research Kiel
Düsternbrooker Weg 20
24105 Kiel, Germany