help-octave


From: Allen.Windhorn
Subject: RE: Fitting multiple datasets to "partially" the same model (global fit with shared parameters)
Date: Mon, 7 Dec 2015 16:47:19 +0000

> -----Original Message-----
> From: address@hidden
> 
> Indeed, the knowledge I was talking about is about the *values* of the
> parameters.
> 
> So, indeed, the model functions for datasets 1 to n should be written,
> in my example, as:
> 
> F_1(x) = (a0_1*ln(x)) * (b1 + b2*x) for dataset 1
> F_2(x) = (a0_2*ln(x)) * (b1 + b2*x) for dataset 2 ...
> F_n(x) = (a0_n*ln(x)) * (b1 + b2*x) for dataset n
> 
> With a separate ("unique") a0_i for each dataset and common parameters
> b1 and b2.  Now the fitting procedure could be performed simultaneously
> over all datasets, I guess, if that would be of some use.
> 
> I hope that clarifies my question.

If your penalty or error function is the sum of the error functions for all
the equations, then I think leasqr or another optimization routine can
be used without modification -- just glom all the parameters into one
big vector and solve.  But the result and the time to solve may not be
optimal.
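
For concreteness, here is a minimal Octave sketch of that one-big-vector
approach, assuming the optim package and three hypothetical datasets
(x1,y1), (x2,y2), (x3,y3); the parameter vector is
p = [a0_1; a0_2; a0_3; b1; b2] and leasqr is called once on the
concatenated data:

  pkg load optim                        % leasqr lives in the optim package

  % hypothetical data -- replace with your own column vectors
  x = [x1(:); x2(:); x3(:)];            % concatenated abscissae
  y = [y1(:); y2(:); y3(:)];            % concatenated ordinates
  idx = [1*ones(numel(x1),1);
         2*ones(numel(x2),1);
         3*ones(numel(x3),1)];          % dataset index of each data point

  % model: F_i(x) = a0_i*ln(x)*(b1 + b2*x), with a0_i selected by idx
  F = @(xx, p) p(idx) .* log(xx) .* (p(4) + p(5).*xx);

  pin = [1; 1; 1; 1; 1];                % starting guesses (made up here)
  [yfit, pout] = leasqr (x, y, pin, F); % pout = [a0_1; a0_2; a0_3; b1; b2]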

I realize your example function may not be realistic, but if I were trying
to find coefficients for it I would rearrange it into one that is easier to
solve.  If the ln(x) term doesn't vary over a wide range (in particular,
one that includes zero), you could define G_i(x) = F_i(x)/ln(x) and
p0_i = -1/a0_i, and the equation would become p0_i*G_i(x) + b1 + b2*x = 0,
which is a linear equation and can be solved using linear least squares.
The modified problem doesn't have the same solution as the original, but
it should be fairly close; it may just need a final nonlinear step or two
to clean it up.
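
Again just a sketch, reusing the x, y, idx vectors from above.  Note that
the linearized system is homogeneous (scaling all the a0_i up and b1, b2
down leaves F unchanged), so one parameter has to be pinned to avoid the
trivial zero solution; here I assume b1 = 1:

  G = y ./ log(x);                    % G_i = F_i/ln(x), with y standing in for F
  n = 3;                              % number of datasets
  A = zeros (numel(x), n + 1);
  A(sub2ind (size(A), (1:numel(x))', idx)) = G;  % one p0_i column per dataset
  A(:, end) = x;                      % shared b2 column
  theta = A \ (-ones (numel(x), 1));  % solves p0_i*G + 1 + b2*x = 0 (b1 = 1)
  a0 = -1 ./ theta(1:n);              % a0_i = -1/p0_i, in the b1 = 1 scale
  b2 = theta(end);

The resulting a0 and b2 (with b1 = 1) could then serve as starting guesses
for the final nonlinear pass with leasqr on the original model.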

Hmmm, quite a three-pipe problem...  Thanks.  Any chance of getting a
look at a typical dataset?

Regards,
Allen