Slab and Shrinkage Linear Regression Estimation
Asimit, V. ORCID: 0000-0002-7706-0066, Cidota, M. A., Chen, Z.
ORCID: 0009-0009-6376-3850 & Asimit, J.
Abstract
Shrinkage estimation is a statistical methodology used to improve parameter estimation by reducing the mean square error, which in turn is expected to improve out-of-sample performance. This paper focuses on multiple linear regression estimators, since the standard ordinary least squares (OLS) estimator is often computationally unstable. Penalized variants such as ridge and LASSO are the usual non-parametric solutions; such shrinkage methods lead to sparse models and reduce overfitting. Another shrinkage class is the Stein-type shrinkage estimators, which use Bayesian arguments to leverage prior information so that the resulting estimators dominate the maximum likelihood estimator. A third class of shrinkage estimators, used with great success, optimally combines various estimators so as to take advantage of the positive traits of each. We provide seven non-parametric multiplicative and linear shrinkage estimators, together with theoretical guarantees that these new estimators have a lower mean square error than the OLS estimator. We illustrate that these theoretical guarantees are borne out in synthetic and real-data experiments, with applications chosen from genetics, machine learning, and finance to demonstrate our contributions.
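The core phenomenon the abstract describes — a shrunk estimator beating OLS in mean square error when the design is ill-conditioned — can be sketched with a standard ridge example. This is purely illustrative and is not one of the paper's seven estimators; the penalty value, sample sizes, and correlation structure below are arbitrary choices for the demonstration.

```python
import numpy as np

# Illustrative sketch (not the paper's estimators): on near-collinear
# synthetic data, a ridge-shrunk estimator can attain lower mean square
# error for the coefficient vector than ordinary least squares.
rng = np.random.default_rng(0)

n, p = 50, 10
beta_true = np.ones(p)

# Strongly correlated design columns -> ill-conditioned X'X -> unstable OLS.
cov = 0.95 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
L = np.linalg.cholesky(cov)

def estimate_mse(lam, n_sims=200):
    """Monte Carlo MSE of the ridge estimator for penalty lam (lam=0 gives OLS)."""
    errs = []
    for _ in range(n_sims):
        X = rng.standard_normal((n, p)) @ L.T
        y = X @ beta_true + rng.standard_normal(n)
        # Closed-form ridge solution: (X'X + lam*I)^{-1} X'y.
        beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
        errs.append(np.sum((beta_hat - beta_true) ** 2))
    return np.mean(errs)

mse_ols = estimate_mse(0.0)
mse_ridge = estimate_mse(5.0)  # penalty chosen for illustration only
print(f"OLS MSE:   {mse_ols:.3f}")
print(f"Ridge MSE: {mse_ridge:.3f}")
```

The bias introduced by shrinkage is more than offset by the variance reduction on the small eigenvalues of X'X, which is the trade-off the paper's theoretical guarantees formalize for its own estimators.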
| Publication Type: | Other (Preprint) |
|---|---|
| Publisher Keywords: | Multivariate linear regression; Shrinkage estimation |
| Subjects: | H Social Sciences > HA Statistics |
| Departments: | Bayes Business School |