
[Help-gsl] Memory issues when using GSL ODE solver


From: Alex Shevtsov
Subject: [Help-gsl] Memory issues when using GSL ODE solver
Date: Mon, 25 Nov 2013 14:19:31 +0100

Hello,

I am using the GSL library from Cython for my simulations. Everything worked
fine as long as I used only matrix operations and complex arithmetic. The
problems started when I had to solve a system of ODEs numerically as part of
my code. I am quite happy with the speed of the new odeiv2 interface; however,
it has memory issues that prevent me from finishing my calculation.

I know that this question is not new; see
https://mail.python.org/pipermail/python-list/2009-March/530623.html for a
similar report. Still, the problem does seem to be real.

In short, the problem is the following. I have four coupled linear ODEs to
solve numerically. The system is stiff, and the method that handles it best is
'msbdf', available in the 'new' ode-initval2 interface. The function that I
evaluate in a loop calls the ODE solver to get an array of four values at a
given coordinate z (stepping) and uses that array to proceed. As the loop
runs, the program eats up my computer's RAM at about 10 MB/second until it
reaches the maximum (8 GB!) and starts using swap. I am certain the problem
lies with the ODE solver: when I call the same function only at the initial
value z=0, i.e. when the ODE does not need to be solved, RAM usage stays
constant for the entire run. Moreover, when I use scipy.ode (or scipy.odeint)
instead of GSL odeiv2, the program runs smoothly without any extraordinary
RAM usage. The reason I wanted to use GSL is speed, but this memory issue
makes it unusable for me.
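
For reference, here is a minimal C sketch of the workflow I describe. It is
only an illustration of the odeiv2 driver interface with the msbdf stepper:
the 4x4 matrix, tolerances, step sizes and number of steps below are
placeholders, not my actual values.

/* Sketch: one msbdf driver reused across an outer stepping loop in z.
   The system y' = A y with a made-up stiff 4x4 matrix A stands in for
   my real equations. */
#include <stdio.h>
#include <gsl/gsl_errno.h>
#include <gsl/gsl_odeiv2.h>

#define DIM 4

/* Right-hand side f(z, y) = A y, with A passed row-major via params. */
static int rhs(double z, const double y[], double f[], void *params)
{
  const double *A = (const double *) params;
  for (int i = 0; i < DIM; i++) {
    f[i] = 0.0;
    for (int j = 0; j < DIM; j++)
      f[i] += A[i * DIM + j] * y[j];
  }
  return GSL_SUCCESS;
}

/* Jacobian df/dy = A, df/dz = 0; msbdf is implicit and requires it. */
static int jac(double z, const double y[], double *dfdy, double dfdz[],
               void *params)
{
  const double *A = (const double *) params;
  for (int i = 0; i < DIM * DIM; i++) dfdy[i] = A[i];
  for (int i = 0; i < DIM; i++) dfdz[i] = 0.0;
  return GSL_SUCCESS;
}

int main(void)
{
  double A[DIM * DIM] = {     /* placeholder stiff matrix */
    -1000.0,  1.0,  0.0,  0.0,
        0.0, -1.0,  1.0,  0.0,
        0.0,  0.0, -2.0,  1.0,
        1.0,  0.0,  0.0, -3.0 };
  gsl_odeiv2_system sys = { rhs, jac, DIM, A };

  /* Allocate the driver once and reuse it for every z step. */
  gsl_odeiv2_driver *d =
    gsl_odeiv2_driver_alloc_y_new(&sys, gsl_odeiv2_step_msbdf,
                                  1e-6, 1e-8, 1e-8);

  double z = 0.0;
  double y[DIM] = { 1.0, 0.0, 0.0, 0.0 };

  for (int i = 1; i <= 1000; i++) {   /* outer stepping loop */
    double zi = i * 1e-3;
    int status = gsl_odeiv2_driver_apply(d, &z, zi, y);
    if (status != GSL_SUCCESS) {
      fprintf(stderr, "driver failed: %s\n", gsl_strerror(status));
      break;
    }
    /* ... use y[0..3] at coordinate zi here ... */
  }

  gsl_odeiv2_driver_free(d);          /* single matching free */
  return 0;
}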

I am well aware of the need to deallocate the memory used by the various GSL
objects, and I release every object created with a *_alloc function by calling
the corresponding *_free function. Apparently, either the deallocation in the
GSL ODE solver has a bug, or something else is wrong with it: it looks as if
each new call to the ODE solver allocates a fresh block of memory without
releasing the one used in the previous step.
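
To show what I mean by pairing each allocation with its free, here is a
stripped-down sketch of the per-call pattern I believe I am following (the
function name and tolerances are placeholders, and 'sys' is assumed to be an
already initialised gsl_odeiv2_system): the driver allocated inside each call
is freed before the call returns, so repeated calls from the outer loop
should not accumulate memory.

#include <gsl/gsl_errno.h>
#include <gsl/gsl_odeiv2.h>

/* Integrate from z = 0 to z_target, allocating and freeing the driver
   within a single call. */
int solve_to(gsl_odeiv2_system *sys, double z_target, double y[])
{
  gsl_odeiv2_driver *d =
    gsl_odeiv2_driver_alloc_y_new(sys, gsl_odeiv2_step_msbdf,
                                  1e-6, 1e-8, 1e-8);
  double z = 0.0;
  int status = gsl_odeiv2_driver_apply(d, &z, z_target, y);

  gsl_odeiv2_driver_free(d);   /* must run on every path, even on error */
  return status;
}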

Please help me fix this, or give me some advice, since my project cannot
proceed until I finish these simulations.

With the best wishes,
Alex

