Model averaging for generalized linear models in diverging model spaces with effective model size

  • Chaoxia Yuan
  • Fang Fang*
  • Jialiang Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

While plenty of frequentist model averaging methods have been proposed, existing weight selection criteria for generalized linear models (GLMs) are usually based on a Kullback-Leibler (KL) loss penalized by the model size, or simply on cross-validation. In this article, when the data are generated from an exponential family distribution, we propose a novel model averaging approach for GLMs motivated by an asymptotically unbiased estimator of the KL loss penalized by an “effective model size” that incorporates the model misspecification. When all the candidate models are misspecified, the proposed method achieves asymptotic optimality while allowing both the number of candidate models and the dimension of the covariates to diverge. Furthermore, when correct models are included in the candidate model set, we prove that the weights of the wrong candidate models converge to zero, and hence the weighted regression coefficient estimator is consistent. Simulation studies and two real-data examples demonstrate the advantage of our new method over existing frequentist model averaging methods.
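To make the general idea concrete, the following is a minimal sketch of frequentist model averaging for a logistic GLM: several candidate models are fitted, each is scored by a penalized log-likelihood criterion, and the fitted probabilities are combined with weights on the simplex. Note that this illustration uses a plain parameter-count (AIC-style) penalty with exponential ("smoothed AIC") weights, not the paper's effective-model-size criterion or its optimal weight selection; all function names here are hypothetical.

```python
import numpy as np

def fit_logistic(X, y, iters=50):
    """Fit a logistic regression by Newton-Raphson (small ridge for stability)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        H = X.T @ ((p * (1 - p))[:, None] * X) + 1e-8 * np.eye(X.shape[1])
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

def loglik(X, y, beta):
    """Bernoulli log-likelihood at the fitted coefficients."""
    eta = X @ beta
    return np.sum(y * eta - np.log1p(np.exp(eta)))

def average_glm(X, y, candidate_cols):
    """Model averaging over candidate logistic GLMs.

    Simplified illustration: each candidate is penalized by its raw
    parameter count and weighted by exp(-criterion/2), NOT by the
    effective-model-size criterion proposed in the paper."""
    crits, preds = [], []
    for cols in candidate_cols:
        b = fit_logistic(X[:, cols], y)
        crits.append(-2.0 * loglik(X[:, cols], y, b) + 2.0 * len(cols))
        preds.append(1.0 / (1.0 + np.exp(-X[:, cols] @ b)))
    crits = np.array(crits)
    w = np.exp(-(crits - crits.min()) / 2.0)   # smoothed-AIC-style weights
    w /= w.sum()                                # weights lie on the simplex
    return w, np.array(preds).T @ w             # averaged fitted probabilities

# Toy data: only the first covariate matters, so larger candidates are overfit.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
eta = 0.5 * X[:, 0] + 1.0 * X[:, 1]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-eta))).astype(float)

# Nested candidate models: {1,x1}, {1,x1,x2}, {1,x1,x2,x3}.
weights, p_avg = average_glm(X, y, [[0, 1], [0, 1, 2], [0, 1, 2, 3]])
print(weights.round(3), p_avg.shape)
```

The same scaffold would accommodate the paper's approach by replacing the parameter-count penalty with an estimate of the effective model size and choosing the weight vector to minimize the resulting criterion over the simplex rather than using exponential weights.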

Original language: English
Pages (from-to): 71-96
Number of pages: 26
Journal: Econometric Reviews
Volume: 43
Issue number: 1
State: Published - 2024

Keywords

  • Asymptotic optimality
  • Kullback-Leibler loss
  • diverging dimension
  • effective model size

