Model selection criteria based on Kullback information measures for nonlinear regression
Akaike information criterion
Divergence (statistics)
Bayesian information criterion
Information Criteria
Kullback–Leibler divergence
Information Theory
DOI:
10.1016/j.jspi.2004.05.002
Publication Date:
2004-07-24
AUTHORS (2)
ABSTRACT
In statistical modeling, selecting an optimal model from a class of candidates is a critical issue. During the past three decades, a number of model selection criteria have been proposed based on estimating Kullback's (Information Theory and Statistics, Dover, Mineola, NY, 1968, p. 5) directed divergence between the model generating the data and a fitted candidate model. The Akaike (Second International Symposium on Information Theory, Akadémiai Kiadó, Budapest, Hungary, 1973, pp. 267–281; IEEE Trans. Automat. Control AC-19 (1974) 716) information criterion, AIC, was the first of these. AIC is justified in a very general framework, and as a result, offers a crude estimator of the directed divergence: one which exhibits a potentially high degree of negative bias in small-sample applications (Biometrika 76 (1989) 297). The “corrected” Akaike information criterion (Biometrika 76 (1989) 297), AICc, adjusts for this bias, and consequently often outperforms AIC as a selection criterion. However, AICc is less broadly applicable than AIC since its justification depends upon the structure of the candidate model. AIC_I (Biometrika 77 (1990) 709) is an “improved” version of AIC featuring a simulated bias correction. Recently, model selection criteria have been proposed based on estimating Kullback's (Information Theory and Statistics, Dover, Mineola, NY, 1968, p. 6) symmetric divergence between the generating model and a fitted candidate model (Statist. Probab. Lett. 42 (1999) 333; Austral. New Zealand J. Statist. 46 (2004) 257). KIC, KICc, and KIC_I are criteria devised to target the symmetric divergence in the same manner that AIC, AICc, and AIC_I target the directed divergence. AICc has been justified for the nonlinear regression framework by Hurvich and Tsai (Biometrika 76 (1989) 297). In this paper, we justify KICc for this framework, and propose versions of AIC_I and KIC_I suitable for nonlinear regression applications.
We evaluate the selection performance of AIC, AICc, AIC_I, KIC, KICc, and KIC_I in a simulation study. Our results generally indicate that the “improved” criteria outperform the “corrected” criteria, which in turn outperform the non-adjusted criteria. Moreover, the KIC family performs favorably against the AIC family.
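For concreteness, the non-adjusted and “corrected” criteria discussed in the abstract have simple closed forms: AIC = −2 log L + 2k, AICc = AIC + 2k(k+1)/(n−k−1), and KIC = −2 log L + 3k, where log L is the maximized log-likelihood, k the number of estimated parameters, and n the sample size. The sketch below is an illustrative Python comparison, not code from the paper: the polynomial candidate family and the simulated data are assumptions, and the simulated-bias-correction criteria (AIC_I, KIC_I) and KICc are omitted because they require model-specific or Monte Carlo corrections.

```python
import numpy as np

def gaussian_loglik(resid, n):
    # Maximized Gaussian log-likelihood with the ML error-variance estimate.
    sigma2 = resid @ resid / n
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

def criteria(loglik, k, n):
    # Closed-form criteria; k counts regression coefficients plus the variance.
    aic = -2 * loglik + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample bias correction
    kic = -2 * loglik + 3 * k                    # targets the symmetric divergence
    return {"AIC": aic, "AICc": aicc, "KIC": kic}

# Hypothetical small-sample example: true model is a degree-1 polynomial.
rng = np.random.default_rng(0)
n = 25
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=n)

for degree in (1, 2, 3):
    X = np.vander(x, degree + 1)                 # candidate design matrix
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    k = degree + 2                               # coefficients + error variance
    print(degree, criteria(gaussian_loglik(resid, n), k, n))
```

The penalty gap between AIC and AICc (and analogously KIC and KICc) grows as k approaches n, which is why the corrected criteria tend to dominate in small-sample selection.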
REFERENCES (15)
CITATIONS (21)