Least Squares Model Averaging By Prediction Criterion

QED Working Paper Number
1299

This paper proposes a new estimator for least squares model averaging. A model average estimator is a weighted average of common estimates obtained from a set of candidate models. We propose computing the weights by minimizing a model average prediction criterion (MAPC). We prove that the MAPC estimator is asymptotically optimal in the sense of achieving the lowest possible mean squared error. For statistical inference, we derive asymptotic tests for single and joint hypotheses on the average coefficients of the "core" regressors. These regressors are of primary interest and are included in every approximation model. To improve finite sample performance, we also consider bootstrap tests. In simulation experiments, the MAPC estimator is shown to deliver significant efficiency gains over existing model selection and model averaging methods. We also show that the bootstrap tests have more reasonable rejection frequencies than the asymptotic tests in small samples. As an empirical illustration, we apply the MAPC estimator to cross-country economic growth models.
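As a rough illustration of the weight-computation step described above, the Python sketch below averages fitted values from a set of nested least squares models and chooses the weights by minimizing a prediction criterion over the unit simplex. It is a minimal sketch only: the Mallows-type criterion, the simulated data, and the nested candidate set used here are illustrative stand-ins, not the paper's MAPC, which is defined in the paper itself.

# Minimal sketch of least squares model averaging with criterion-based
# weights. The Mallows-type penalty below is an illustrative stand-in
# for the paper's MAPC; data and candidate models are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.8, 0.5, 0.2, 0.1, 0.05])
y = X @ beta + rng.normal(size=n)

# Nested candidate models: model m uses the first m regressors, so the
# "core" regressor (column 0) appears in every approximation model.
fits, ks = [], []
for m in range(1, p + 1):
    Xm = X[:, :m]
    bm = np.linalg.lstsq(Xm, y, rcond=None)[0]
    fits.append(Xm @ bm)   # fitted values of model m
    ks.append(m)           # number of parameters in model m
fits = np.column_stack(fits)
ks = np.array(ks, dtype=float)

# sigma^2 estimated from the largest candidate model
resid_full = y - fits[:, -1]
sigma2 = resid_full @ resid_full / (n - p)

def criterion(w):
    # squared error of the averaged fit plus a penalty on average model size
    e = y - fits @ w
    return e @ e + 2.0 * sigma2 * (ks @ w)

# The weights lie on the unit simplex, so minimizing this quadratic
# criterion is a convex optimization problem.
M = p
cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
bounds = [(0.0, 1.0)] * M
w0 = np.full(M, 1.0 / M)
res = minimize(criterion, w0, bounds=bounds, constraints=cons)
print("model weights:", np.round(res.x, 3))

The model average estimate of the conditional mean is then fits @ res.x, a weighted combination of the candidate fits rather than the output of a single selected model.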

Author(s)

Tian Xie

JEL Codes

Keywords

Model Averaging
MAPC
Convex Optimization
Optimality
Statistical Inference

Working Paper

Download [PDF] (257.22 KB)