
Why would one not want to apply least squares regression in a situation where there are continuous predictor variables and a dichotomous outcome?

2007-12-03 13:07:41 · 1 answers · asked by Anonymous in Science & Mathematics Mathematics

1 answer

Lots of reasons.

One obvious one is that least squares regression works nicely when the relationship between the outcome and the predictors is linear; it is much less useful when the relationship is non-linear. With a dichotomous outcome, the relationship between the predictors and the probability of a 1 is typically S-shaped rather than straight.
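For instance, here is a small sketch (in Python with NumPy and scikit-learn, my own choice of tools rather than anything from the original answer) contrasting a straight least-squares fit with logistic regression on a 0/1 outcome; the least-squares line can predict values below 0 or above 1, while the logistic fit stays inside that range.

```python
# Sketch: ordinary least squares vs. logistic regression on a dichotomous outcome.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50).reshape(-1, 1)
# Dichotomous outcome: the probability of a "1" rises with x along an S-curve.
y = (rng.random(50) < 1 / (1 + np.exp(-(x.ravel() - 5)))).astype(int)

ols = LinearRegression().fit(x, y)
logit = LogisticRegression().fit(x, y)

grid = np.array([[-2.0], [5.0], [12.0]])   # includes points outside the data range
print(ols.predict(grid))                   # can fall below 0 or above 1
print(logit.predict_proba(grid)[:, 1])     # always stays within (0, 1)
```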

Another is that classical least squares is built on the idea that it is the absolute magnitude of the errors that counts. In many cases, especially when the data span orders of magnitude, it is the relative errors that count. (In principle one can use weights with least squares to handle this, but in practice that step is often skipped.)
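As a rough illustration of the weighting remark (again a Python/NumPy sketch of my own, not the answerer's), dividing each observation through by its y value makes the fit minimize roughly relative rather than absolute errors:

```python
# Sketch: ordinary vs. weighted least squares when data span several decades.
import numpy as np

rng = np.random.default_rng(1)
x = np.logspace(0, 3, 40)                            # x spans three orders of magnitude
y = 2.0 * x * (1 + 0.2 * rng.standard_normal(40))    # multiplicative (relative) noise

X = x[:, None]                                       # one-parameter model: y ≈ b * x

# Ordinary least squares: the few largest y values dominate the fit.
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Weighted least squares with weights 1/y: each point contributes its relative error.
w = 1.0 / y
b_wls, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)

print(b_ols, b_wls)
```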

Yet another is that even a few bad data points (i.e. outliers) can completely dominate the result. Many people try to get around this by eliminating outliers, but then where does one stop?

Here is one article on an alternative:
http://en.wikipedia.org/wiki/Robust_regression
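A quick Python sketch (using scikit-learn's HuberRegressor, my own choice of robust method rather than code from the linked article) of how a single bad point drags ordinary least squares while a robust fit largely ignores it:

```python
# Sketch: one wild outlier vs. ordinary least squares and a robust (Huber) fit.
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor

rng = np.random.default_rng(2)
x = np.arange(20, dtype=float).reshape(-1, 1)
y = 3.0 * x.ravel() + 1.0 + rng.normal(0, 1, 20)
y[-1] = 500.0                                   # a single bad data point

ols = LinearRegression().fit(x, y)
robust = HuberRegressor().fit(x, y)

print("OLS slope:   ", ols.coef_[0])            # pulled toward the outlier
print("Huber slope: ", robust.coef_[0])         # stays close to the true slope of 3
```

Robust losses like Huber's downweight large residuals instead of requiring anyone to decide which points to throw away, which sidesteps the "where does one stop?" problem.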

2007-12-05 18:00:27 · answer #1 · answered by simplicitus 7 · 0 0
