Re: OT: finding the weights used in weighted least squares regression


From: Kamaraju S Kusumanchi
Subject: Re: OT: finding the weights used in weighted least squares regression
Date: Wed, 27 Apr 2011 08:09:03 -0400
User-agent: KNode/4.4.7

James Sherman Jr. wrote:

> This seems like an ill-posed problem to me (or, in other words, the
> answer is no: you can't solve for W, or at least not a specific W).
> 

I think so too! But I just wanted to make sure I am not missing something.

> 2) If Ax ~= b and you want to solve for a W such that WAx = Wb, that
> equates to W(Ax - b) = 0, which is to say that as long as Ax - b is in
> the null space of W (i.e., Ax - b is an eigenvector of W with
> eigenvalue 0), you are free to choose any W that satisfies this
> condition.  This is in essence a linear system with n^2 unknowns (the
> size of W) but only n equations.

FWIW, W is a diagonal matrix, so there are only n unknowns, not n^2.
Also, when A is a long, thin matrix and the system is solved by least
squares, W(Ax - b) is not exactly zero; there is always a residual.
In other words, W(Ax - b) ~= 0.
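
To make this concrete, here is a small Octave sketch (A, b, and the
weights below are made up purely for illustration). Even with a
strictly positive diagonal W, the weighted residual W*(A*x - b) stays
nonzero whenever b does not lie in the range of A:

% made-up data: A is a long thin (4x2) matrix, b is not in its range
A = [1 1; 1 2; 1 3; 1 4];
b = [1.1; 1.9; 3.2; 3.9];
w = [1; 2; 1; 4];          % hypothetical positive weights, one per row
W = diag (w);              % diagonal, so only n unknowns

x = (W*A) \ (W*b);         % weighted least squares solution
r = W*(A*x - b);           % weighted residual
norm (r)                   % nonzero in general

Picking a different positive diagonal W changes x a bit, but the
residual only vanishes if b happens to be an exact linear combination
of the columns of A.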


thanks
-- 
Kamaraju S Kusumanchi
http://malayamaarutham.blogspot.com/


