In statistics, the Gauss-Markov theorem (or simply Gauss theorem for some authors) states that the ordinary least squares (OLS) estimator has the lowest sampling variance within the class of linear unbiased estimators, provided the errors in the linear regression model are uncorrelated, have equal variances, and have expectation zero. The errors do not need to be normal, nor do they need to be independent and identically distributed; they need only be uncorrelated with mean zero and homoscedastic with finite variance.

The requirement that the estimator be unbiased cannot be dropped, since biased estimators with lower variance exist. See, for example, the James-Stein estimator (which also drops linearity), ridge regression, or simply any degenerate estimator.

The theorem is named after Carl Friedrich Gauss and Andrey Markov, although Gauss's work significantly predates Markov's. While Gauss derived the result under the assumptions of independence and normality, Markov reduced the assumptions to the form stated above. A further generalization to non-spherical errors was given by Alexander Aitken.
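The claim can be checked empirically. The sketch below (not from the original text; the "endpoint" estimator is an illustrative choice) runs a small Monte Carlo simulation under the theorem's assumptions and compares the OLS slope estimator against another linear unbiased estimator of the slope, the endpoint estimator (y_n - y_1) / (x_n - x_1). Both are unbiased, but OLS should show the markedly smaller sampling variance.

```python
import numpy as np

# Monte Carlo illustration of the Gauss-Markov theorem: under
# uncorrelated, mean-zero, homoscedastic errors, the OLS slope
# estimator has lower variance than any other linear unbiased
# estimator -- here compared against the "endpoint" estimator.
rng = np.random.default_rng(0)
n, sigma, intercept, slope = 20, 1.0, 1.0, 2.0
x = np.linspace(0.0, 1.0, n)      # fixed design points
xc = x - x.mean()                 # centered x, used by the OLS formula

ols_draws, endpoint_draws = [], []
for _ in range(20_000):
    # Errors satisfy the Gauss-Markov conditions (here i.i.d. normal,
    # though normality is not required by the theorem).
    y = intercept + slope * x + rng.normal(0.0, sigma, n)
    ols_draws.append(xc @ y / (xc @ xc))                 # OLS slope estimate
    endpoint_draws.append((y[-1] - y[0]) / (x[-1] - x[0]))  # linear, unbiased

print("mean (OLS, endpoint):", np.mean(ols_draws), np.mean(endpoint_draws))
print("var  (OLS, endpoint):", np.var(ols_draws), np.var(endpoint_draws))
```

Both empirical means sit near the true slope of 2, confirming unbiasedness, while the OLS variance (theoretically sigma^2 / sum((x - x.mean())^2)) comes out well below the endpoint estimator's variance of 2*sigma^2 / (x_n - x_1)^2.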