Generalized linear mixed model


In statistics, a generalized linear mixed model (GLMM) is an extension of the generalized linear model (GLM) in which the linear predictor contains random effects in addition to the usual fixed effects. GLMMs also inherit from GLMs the idea of extending linear mixed models to non-normal data.
GLMMs provide a broad range of models for the analysis of grouped data, since the differences between groups can be modelled as a random effect. These models are useful in the analysis of many kinds of data, including longitudinal data.

Model

GLMMs are generally defined such that, conditioned on the random effects $u$, the dependent variable $y$ is distributed according to an exponential family, with its expectation related to the linear predictor $X\beta + Zu$ via a link function $g$:

$$g(\operatorname{E}[y \mid u]) = X\beta + Zu.$$

Here $X$ and $\beta$ are the fixed effects design matrix and fixed effects, and $Z$ and $u$ are the random effects design matrix and random effects.

The complete likelihood,

$$\int f(y \mid u)\, f(u)\, du,$$
has no general closed form, and integrating over the random effects is usually extremely computationally intensive. In addition to numerically approximating this integral, methods motivated by Laplace approximation have been proposed. For example, the penalized quasi-likelihood method, which essentially involves repeatedly fitting a weighted normal mixed model with a working variate, is implemented by various commercial and open source statistical programs.
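To make the conditional structure concrete, the following is a minimal Python sketch (not part of the article) that simulates data from a random-intercept logistic GLMM; the group sizes, coefficient values and variance are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and parameter values, chosen only for illustration.
m, n = 30, 20                              # m groups, n observations per group
beta = np.array([-0.5, 1.0])               # fixed effects: intercept and slope
sigma_u = 0.8                              # standard deviation of the random intercepts

group = np.repeat(np.arange(m), n)         # group label of each observation
X = np.column_stack([np.ones(m * n), rng.normal(size=m * n)])  # fixed-effects design matrix
u = rng.normal(0.0, sigma_u, size=m)       # random effects u ~ N(0, sigma_u^2)

# Linear predictor X beta + Z u; here Z simply selects each observation's group intercept.
eta = X @ beta + u[group]
p = 1.0 / (1.0 + np.exp(-eta))             # inverse logit link, so g(E[y | u]) = X beta + Z u

# Conditional on u, the responses are Bernoulli, i.e. exponential-family distributed.
y = rng.binomial(1, p)
```

In this sketch the exponential family is the Bernoulli distribution and the link function is the logit; other choices (Poisson with a log link, binomial counts, and so on) follow the same pattern.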

Fitting a model

Fitting GLMMs via maximum likelihood involves integrating over the random effects. In general, those integrals cannot be expressed in analytical form. Various approximate methods have been developed, but none has good properties for all possible models and data sets. For this reason, methods involving numerical quadrature or Markov chain Monte Carlo have increased in use, as increasing computing power and advances in methods have made them more practical.
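As a sketch of what numerical integration over the random effects involves, the function below (a hypothetical helper written for this illustration, not a standard library routine) evaluates the marginal log-likelihood of a random-intercept logistic GLMM by Gauss-Hermite quadrature, one group at a time.

```python
import numpy as np

def marginal_loglik(y, X, group, beta, sigma_u, n_points=20):
    """Marginal log-likelihood of a random-intercept logistic GLMM,
    integrating each group's random effect out by Gauss-Hermite quadrature."""
    # Nodes/weights for integrals of the form  integral of f(x) * exp(-x^2) dx.
    nodes, weights = np.polynomial.hermite.hermgauss(n_points)
    # Change of variables so the quadrature weight matches a N(0, sigma_u^2) density.
    u_vals = np.sqrt(2.0) * sigma_u * nodes
    log_w = np.log(weights) - 0.5 * np.log(np.pi)

    total = 0.0
    for g in np.unique(group):
        idx = group == g
        eta = X[idx] @ beta[:, None] + u_vals[None, :]          # shape (n_g, n_points)
        # Bernoulli log-likelihood of the group's data at each quadrature node.
        loglik_nodes = np.sum(y[idx, None] * eta - np.logaddexp(0.0, eta), axis=0)
        # Log of the weighted sum over nodes: the approximate integral over u.
        total += np.log(np.sum(np.exp(loglik_nodes + log_w)))
    return total
```

Applied to the simulated data above, marginal_loglik(y, X, group, beta, sigma_u) approximates the integral of f(y | u) f(u) du for each group; a maximum likelihood fit would maximize this quantity over beta and sigma_u with a general-purpose optimizer.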
The Akaike information criterion (AIC) is a common criterion for model selection. Estimates of AIC for GLMMs based on certain exponential family distributions have recently been obtained.
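As a simple illustration, using the hypothetical marginal_loglik function sketched above, a marginal AIC can be computed from the maximized log-likelihood; note that conventions for counting parameters in mixed models vary, and counting the fixed effects plus the random-effect standard deviation is only one of them.

```python
# Marginal AIC = 2 * (number of estimated parameters) - 2 * (maximized marginal log-likelihood).
k = len(beta) + 1
aic = 2 * k - 2 * marginal_loglik(y, X, group, beta, sigma_u)
# In practice beta and sigma_u would be the maximum-likelihood estimates, not the simulation values.
```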

Software