Multiple choice from competing regression models under multicollinearity based on standardized update

Masao Ueki, Yoshinori Kawasaki

    Research output: Contribution to journal › Article › peer-review

    7 Citations (Scopus)

    Abstract

    This paper proposes a new method for choosing regression models that, unlike traditional model selection procedures aiming at a single best model, may produce multiple models with sufficient explanatory power and parsimony. The method ensures interpretability of the resulting models even under strong multicollinearity. The algorithm proceeds in a forward stepwise manner and requires the selected regression models to satisfy two criteria: goodness of fit and the magnitude of the update in the loss function. For the latter criterion, a standardized update is newly introduced, which is closely related to model selection criteria including Mallows' Cp, the Akaike information criterion and the Bayesian information criterion. Simulation studies demonstrate that the proposed algorithm works well with and without strong multicollinearity, even when there are many explanatory variables. An application to real data is also provided.
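    The abstract only outlines the search strategy; the exact form of the standardized update is defined in the paper itself. As a rough illustration of a forward stepwise search that can retain several competing models rather than a single best one, the sketch below uses a BIC-style penalized improvement as a stand-in criterion. The function names, the `tol` threshold, and the criterion itself are assumptions made for illustration, not the authors' method.

    ```python
    import numpy as np

    def rss(X, y, cols):
        """Residual sum of squares of an OLS fit with intercept on `cols`."""
        Xs = np.column_stack([np.ones(len(y))] + [X[:, j] for j in sorted(cols)])
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        resid = y - Xs @ beta
        return float(resid @ resid)

    def forward_multiple_choice(X, y, tol=0.0):
        """Forward search that keeps every extension whose penalized
        improvement clears `tol`, so several competing parsimonious
        models can survive instead of a single best model."""
        n, p = X.shape
        accepted = set()
        frontier = {frozenset()}            # start from the intercept-only model
        while frontier:
            next_frontier = set()
            for cols in frontier:
                base = rss(X, y, cols)
                for j in set(range(p)) - cols:
                    cand = cols | {j}
                    if cand in accepted:
                        continue
                    # BIC-style stand-in for the standardized update:
                    # likelihood-ratio-type gain minus a log(n) penalty per added term
                    update = n * np.log(base / rss(X, y, cand)) - np.log(n)
                    if update > tol:
                        accepted.add(cand)
                        next_frontier.add(cand)
            frontier = next_frontier
        return [sorted(m) for m in accepted]

    # Toy usage: two strongly collinear predictors plus noise variables
    rng = np.random.default_rng(0)
    n = 200
    x1 = rng.normal(size=n)
    x2 = x1 + 0.05 * rng.normal(size=n)      # strong multicollinearity
    X = np.column_stack([x1, x2, rng.normal(size=(n, 3))])
    y = 2.0 * x1 + rng.normal(size=n)
    print(forward_multiple_choice(X, y))     # may report both collinear variables as competing models
    ```

    Under strong multicollinearity, either collinear predictor explains the response about equally well, so a multiplicity-allowing search of this kind can report both single-variable models as acceptable alternatives.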

    Original language: English
    Pages (from-to): 31-41
    Number of pages: 11
    Journal: Computational Statistics and Data Analysis
    Volume: 63
    DOIs
    Publication status: Published - 2013

    Keywords

    • Multicollinearity
    • Regression model choice
    • Standardized update

    ASJC Scopus subject areas

    • Statistics and Probability
    • Computational Mathematics
    • Computational Theory and Mathematics
    • Applied Mathematics
