Re: Optimize a function in parallel on a cluster


From: Olaf Till
Subject: Re: Optimize a function in parallel on a cluster
Date: Tue, 18 Dec 2018 20:36:37 +0100
User-agent: NeoMutt/20170113 (1.7.2)

On Mon, Dec 17, 2018 at 06:04:36PM +0200, Oleksii Zdorevskyi wrote:
> Dear Octave community,
> 
> I am new to using GNU Octave. Could you please tell me whether there
> are any functions for performing optimization in parallel on a cluster?

Such functionality has been prepared in Octave Forge packages. It
doesn't work yet with current Octave, but it should work with Octave
version 5.0, whose release will soon be in preparation.

You'll need current releases of the Octave Forge packages 'parallel'
and 'optim'.

The 'parallel' package has to be installed and loaded, and a cluster
has to be set up. Type 'parallel_doc' on the Octave command line for
help. The 'parallel' package can't be installed on Windows.

Once the 'parallel' package and the cluster are set up, some
functions of the 'optim' package can use them for computations which
can be performed in parallel. These functions are 'nonlin_residmin',
'nonlin_curvefit', 'nonlin_min', 'residmin_stat', and
'curvefit_stat'. To enable this, the option 'parallel_net' has to be
set. Type 'optim_doc' on the Octave command line for help.
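As a rough sketch of the usage (the model, data, and hostnames are
made up for illustration):

  pkg load optim parallel

  ## hypothetical data and an exponential decay model
  xdata = (0:0.1:5)';
  ydata = 2.5 * exp (-1.3 * xdata) + 0.05 * randn (size (xdata));
  model = @(p, x) p(1) * exp (-p(2) * x);
  pinit = [1; 1];

  ## hand the cluster connections to the optimizer through the
  ## 'parallel_net' option; the numerical gradient (Jacobian)
  ## computation is then distributed over the cluster
  conns = pconnect ({"node1", "node2"});
  settings = optimset ("parallel_net", conns);
  [p, fy] = nonlin_curvefit (model, pinit, xdata, ydata, settings);
  sclose (conns);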

Note: Optimization with automatic (numerical) gradient determination
is simpler to use, but optimization with user-supplied gradient
functions (if available) is sometimes faster (depending on the
gradient function...) and may have a higher success rate. The current
algorithms of 'optim' use parallel computation only for automatic
gradient determination and for a certain simulated annealing
algorithm. Parallelizing other steps of deterministic algorithms,
apart from automatic gradient determination, probably wouldn't give a
large benefit.
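For completeness, a sketch of supplying an analytical Jacobian
through the 'dfdp' setting (I'm assuming the Jacobian function is
called like the model function, with p and x; check 'optim_doc' for
the exact convention):

  ## analytical Jacobian of the model above with respect to p; with
  ## this, 'optim' skips the parallelized numerical gradient
  dfdp = @(p, x) [exp(-p(2) * x), -p(1) * x .* exp(-p(2) * x)];
  settings = optimset ("dfdp", dfdp);
  [p, fy] = nonlin_curvefit (model, pinit, xdata, ydata, settings);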

Olaf

-- 
public key id EAFE0591, e.g. on x-hkp://pool.sks-keyservers.net
