

Re: leasqr question

From: Mike Miller
Subject: Re: leasqr question
Date: Fri, 6 Sep 2002 00:08:23 -0500 (CDT)

I don't agree.  No matter how the regression is accomplished (linear,
nonlinear, least squares, random guesses, goat entrails), the geometry
should work just fine.  You have a set of observed values and a
corresponding set of predicted values.  Combining these, you produce a set
of residual values (observed minus predicted).  The R^2 is always computed
the same way:

   R^2 = [var(observed) - var(residual)] / var(observed)

So it's the proportion of variance accounted for by the regression.  It is
nearly as easy to compute R^2 in nonlinear as in linear cases.
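To make that concrete, here is a minimal sketch of the variance-ratio
formula above (in Python; the Octave equivalent is a one-liner with var()).
The observed/predicted numbers are made up purely for illustration:

```python
from statistics import pvariance

def r_squared(observed, predicted):
    """Proportion of variance accounted for by the fit:
    R^2 = [var(observed) - var(residual)] / var(observed).
    Nothing here depends on how 'predicted' was obtained."""
    residual = [o - p for o, p in zip(observed, predicted)]
    return (pvariance(observed) - pvariance(residual)) / pvariance(observed)

# Works identically whether the fit was linear or nonlinear:
obs = [1.0, 2.1, 2.9, 4.2]
pred = [1.1, 2.0, 3.0, 4.0]
print(round(r_squared(obs, pred), 4))  # -> 0.9876
```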

Norman Draper (whose book was recommended in the last message) taught an
advanced regression course I took at UW-Madison.  It was a required course
for all PhD students in stat (I was in a different dept., but I took the
course and received an A).  This doesn't prove that Draper would agree
with me, but I think he would.


Michael B. Miller, Ph.D.
Assistant Professor
Division of Epidemiology
University of Minnesota

On Thu, 5 Sep 2002, Dirk Eddelbuettel wrote:

> On Thu, Sep 05, 2002 at 09:49:07AM -0400, Tom Kornack wrote:
> > I am unfamiliar with all the complicated output of leasqr and I was
> > hoping someone could provide me with an algorithm to extract a
> > measure like chi^2 or R^2. Could someone suggest a reference that might
> > elucidate the output variables of leasqr?
> No -- R2 only works for linear models, but leasqr.m is for nonlinear ones.
> As you asked, R^2 should be defined in any semi-decent statistics intro --
> it is the ratio of regression sum of squares [SSR := sumsq(fitted -
> mean(fitted))] to total sum of squares [SST := sumsq (y - mean(y))]. Then
> you get R2 as SSR/SST.  Alternatively, define sum of squared residuals [SSE
> := sumsq (y - fitted)] and use R2 := 1 - SSE/SST. There are variations which
> account for the number of variables (since adding new variables, even if
> meaningless, will always increase R2, some people prefer adjusted R2).
> But more importantly, R2 is only for linear models. Its geometry cannot be
> applied to nonlinear fits such as those computed by leasqr.m. The Draper and
> Smith reference in leasqr.m is a good one; it will have details on
> appropriate diagnostics for nonlinear models.
> Dirk
> --
> Good judgement comes from experience; experience comes from bad judgement.
>                                                           -- Fred Brooks
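One numeric note on the two definitions quoted above: SSR/SST and
1 - SSE/SST coincide only for a least-squares linear fit with an intercept
(where SST = SSR + SSE holds exactly); for a general or nonlinear fit they
can differ, which is part of why the definition of R^2 gets contested.
A quick check with made-up numbers (Python here for brevity; the sumsq
expressions are as Dirk wrote them):

```python
# Hypothetical data: observed y and predictions from some nonlinear model.
y    = [1.0, 2.0, 4.0, 8.0]
yhat = [1.1, 1.9, 4.3, 7.5]

ybar = sum(y) / len(y)
fbar = sum(yhat) / len(yhat)
sst = sum((yi - ybar) ** 2 for yi in y)               # total SS
ssr = sum((fi - fbar) ** 2 for fi in yhat)            # regression SS
sse = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))  # residual SS

# For OLS with an intercept these agree; here they do not:
print(round(ssr / sst, 4), round(1 - sse / sst, 4))  # -> 0.8626 0.9875
```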
