New Estimators Improve Accuracy of Predictions in Linear Models
The article examines alternative ways to estimate the parameters of a linear model when the design matrix is ill-conditioned. The authors introduce two families of biased estimators, ridge estimators and shrunken estimators, and show that they handle problems such as multicollinearity and ill-conditioning effectively. Both families are derived as minimum norm estimators based on the least squares estimator: the ridge estimators minimize the Euclidean norm, while the shrunken estimators minimize a design-dependent norm. The authors further show that the shrunken estimators are minimum variance linear transforms of the least squares estimator. A numerical example illustrates how the different estimators behave in practice.
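The contrast between the two families can be sketched numerically. This is a minimal illustration, not the article's own example; it assumes the classical forms for these estimators, namely the ridge estimator (X'X + kI)⁻¹X'y and the shrunken estimator β̂/(1 + k), a scalar multiple of the least squares estimator β̂, with the parameter k fixed by hand rather than chosen by any data-driven rule.

```python
import numpy as np

# Toy ill-conditioned design: two nearly collinear predictors,
# so X'X is close to singular (multicollinearity).
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = x1 + 1e-3 * rng.normal(size=50)  # near-duplicate column
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=50)

# Ordinary least squares estimator (unbiased, but unstable here).
beta_ls = np.linalg.solve(X.T @ X, X.T @ y)

k = 0.1            # biasing/shrinkage parameter (assumed fixed)
p = X.shape[1]

# Ridge estimator: (X'X + kI)^{-1} X'y (assumed classical form).
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# Shrunken estimator: a scalar multiple of the LS estimator
# (assumed form beta_ls / (1 + k)).
beta_shrunk = beta_ls / (1.0 + k)

print("||LS||     =", np.linalg.norm(beta_ls))
print("||ridge||  =", np.linalg.norm(beta_ridge))
print("||shrunk|| =", np.linalg.norm(beta_shrunk))
```

Both biased estimators pull the coefficient vector toward the origin, which stabilizes it when the columns of X are nearly collinear; the ridge estimator does so by regularizing X'X directly, while the shrunken estimator rescales the least squares solution as a whole.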