Evaluating Predictors’ Relative Importance Using Bayes Factors in Regression Models

  • Xin Gu*
  • *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

This study presents a Bayesian inference approach to evaluate the relative importance of predictors in regression models. Depending on the interpretation of importance, a number of indices are introduced, such as the standardized regression coefficient, the average squared semipartial correlation, and the dominance analysis measure. Researchers’ theories about relative importance are represented by order constrained hypotheses. Support for or against the hypothesis is quantified by the Bayes factor, which can be computed from the prior and posterior distributions of the importance index. As the distributions of the indices are often unknown, we specify prior and posterior distributions for the covariance matrix of all variables in the regression model. The prior and posterior distributions of each importance index can be obtained from the prior and posterior samples of the covariance matrix. Simulation studies are conducted to show different inferences resulting from various importance indices and to investigate the performance of the proposed Bayesian testing approach. The procedure of evaluating relative importance using Bayes factors is illustrated using two real data examples.
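The core computation the abstract describes can be sketched briefly: sample covariance matrices from their posterior, transform each draw into an importance index, and quantify support for an order-constrained hypothesis by comparing the posterior and prior proportions of draws satisfying the constraint. The sketch below is an illustrative Python implementation under simplifying assumptions (a simulated dataset, standardized regression coefficients as the importance index, and a vague inverse-Wishart prior for the covariance matrix); it is not the paper's exact prior specification or software.

```python
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(1)

# Hypothetical data: y depends more strongly on x1 than on x2.
n = 200
X = rng.normal(size=(n, 2))
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(size=n)
Z = np.column_stack([y, X])                  # all variables: (y, x1, x2)
S = (Z - Z.mean(0)).T @ (Z - Z.mean(0))      # scatter matrix

def std_betas(sigma):
    """Standardized regression coefficients from a covariance matrix of (y, x1, x2)."""
    b = np.linalg.solve(sigma[1:, 1:], sigma[1:, 0])          # raw coefficients
    return b * np.sqrt(np.diag(sigma)[1:] / sigma[0, 0])      # standardize

# Posterior draws of the covariance matrix: inverse-Wishart with an
# illustrative vague prior (df = p + 1, identity scale matrix).
p = Z.shape[1]
draws = invwishart.rvs(df=n + p + 1, scale=S + np.eye(p), size=5000, random_state=1)

# Importance index for each draw; hypothesis H: |beta1| > |beta2|.
post_betas = np.abs(np.array([std_betas(s) for s in draws]))
post_frac = np.mean(post_betas[:, 0] > post_betas[:, 1])

# Under an exchangeable prior the constraint holds with probability 1/2,
# so the Bayes factor of H against the unconstrained model is:
prior_frac = 0.5
bf = post_frac / prior_frac
print(f"posterior proportion: {post_frac:.3f}, Bayes factor: {bf:.2f}")
```

Values of `bf` above 1 indicate support for the order constraint; other indices (e.g., dominance analysis measures) would simply replace `std_betas` with a different function of the sampled covariance matrix.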

Original language: English
Pages (from-to): 825-842
Number of pages: 18
Journal: Psychological Methods
Volume: 28
Issue number: 4
State: Published - 4 Nov 2021

Keywords

  • Bayes factor
  • dominance analysis
  • order constrained hypotheses
  • relative importance
  • semipartial correlation

