Gradient boosting for generalised additive mixed models
Boosting
Gradient boosting
Generalized additive model
Mixed model
DOI:
10.1007/s11222-025-10612-y
Publication Date:
2025-04-29T07:57:15Z
AUTHORS (4)
ABSTRACT
Generalised additive mixed models are a common tool for modelling grouped or longitudinal data, where random effects are incorporated into the model to account for within-group or inter-individual correlations. As an alternative to established penalised maximum likelihood approaches, several types of boosting routines have been developed to make more demanding data situations manageable. However, when mixed models are estimated with component-wise gradient boosting, random and fixed effects compete within the variable selection mechanism. This can result in irregular selection properties and biased parameter estimates, particularly when covariates are constant within clusters. Moreover, while researchers are typically more interested in the covariance structure of the random effects than in the effects themselves, current gradient boosting implementations focus solely on estimating the random effects. To overcome these drawbacks, we propose a novel gradient boosting scheme for generalised additive mixed models. This approach is implemented as an R package, mermboost, seamlessly wrapped around the established mboost framework, maintaining its flexibility while enhancing functionality. The improved performance of the new framework is demonstrated via an extensive simulation study and real-world applications.
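The variable selection mechanism the abstract refers to can be illustrated with a minimal component-wise L2-boosting sketch (hedged: this is a generic textbook version of the algorithm on hypothetical toy data, not the mermboost or mboost implementation; all names and the data-generating setup are illustrative assumptions):

```python
import numpy as np

# Toy data (assumed for illustration): y depends on x0 only; x1 is pure noise.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=n)

def componentwise_l2_boost(X, y, n_iter=200, nu=0.1):
    """Component-wise gradient boosting with squared-error loss.

    Each iteration fits a simple (no-intercept) linear base learner to the
    current residuals for every covariate separately, then updates only the
    coefficient of the best-fitting one -- this per-iteration competition
    between base learners is the variable selection mechanism in which,
    for mixed models, random and fixed effects end up competing.
    """
    n, p = X.shape
    coef = np.zeros(p)
    f = np.zeros(n)                      # current additive fit
    for _ in range(n_iter):
        u = y - f                        # negative gradient of squared loss
        best_j, best_rss, best_b = 0, np.inf, 0.0
        for j in range(p):
            xj = X[:, j]
            b = xj @ u / (xj @ xj)       # least-squares fit of u on x_j alone
            rss = np.sum((u - b * xj) ** 2)
            if rss < best_rss:
                best_j, best_rss, best_b = j, rss, b
        coef[best_j] += nu * best_b      # update only the selected component
        f += nu * best_b * X[:, best_j]  # weak update with step length nu
    return coef

coef = componentwise_l2_boost(X, y)
print(coef)  # coefficient for x0 should approach 2; x1 should stay near 0
```

Because only the best-fitting base learner is updated in each iteration, an early stop leaves unselected coefficients at exactly zero, which is what makes the selection behaviour (and its biases in the mixed-model setting) matter.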