Applied Multilevel Models

Instructor(s):

  • Mark Manning, Wayne State University

Multilevel models go by many names (e.g., hierarchical linear models, general linear mixed models). Their defining feature is the capacity to quantify and predict random variation arising from multiple sampling dimensions (across occasions, persons, or groups). Multilevel models offer many advantages for analyzing longitudinal data, such as flexible strategies for modeling change and individual differences in change, the examination of time-invariant or time-varying predictor effects, and the use of all available complete observations. They are also useful for analyzing clustered data (e.g., persons nested in groups) in which one wishes to examine predictors pertaining to individuals or to groups. This 4-week course will serve as an applied introduction to multilevel models, focusing primarily on longitudinal data and then continuing on to clustered data.

The course will be organized to take participants through each of the cumulative steps in multilevel analysis: deciding which type of model is appropriate, setting up the data file and coding predictor variables, evaluating fixed and random effects and/or alternative covariance structures for time, predicting between- and within-unit variation using covariates, interpreting and displaying empirical findings, and presenting results in both verbal and written form. Participants should be familiar with the general linear model (e.g., ANOVA and regression), but no prior experience with multilevel models or knowledge of advanced mathematics (e.g., matrix algebra) is assumed.
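To give a flavor of the core idea behind the steps above, the sketch below simulates clustered data (persons nested in groups) and partitions the total variance into between- and within-group components, yielding the intraclass correlation. This is an illustrative example only, not course material: the variance values, sample sizes, and the balanced one-way ANOVA estimators are assumptions chosen for simplicity, and the course itself uses dedicated multilevel modeling software rather than hand-computed estimators.

```python
import numpy as np

rng = np.random.default_rng(0)
n_groups, n_per = 200, 25        # hypothetical balanced design
tau2, sigma2 = 4.0, 9.0          # assumed true between- and within-group variances

# Simulate y_ij = grand mean + group random effect u_j + residual e_ij
u = rng.normal(0.0, np.sqrt(tau2), n_groups)
y = 50.0 + u[:, None] + rng.normal(0.0, np.sqrt(sigma2), (n_groups, n_per))

# Method-of-moments (one-way random-effects ANOVA) estimators for a balanced design
group_means = y.mean(axis=1)
grand_mean = y.mean()
msb = n_per * ((group_means - grand_mean) ** 2).sum() / (n_groups - 1)
msw = ((y - group_means[:, None]) ** 2).sum() / (n_groups * (n_per - 1))

sigma2_hat = msw                     # within-group (residual) variance
tau2_hat = (msb - msw) / n_per       # between-group variance
icc = tau2_hat / (tau2_hat + sigma2_hat)  # share of variance lying between groups

print(f"within-group variance:  {sigma2_hat:.2f}")
print(f"between-group variance: {tau2_hat:.2f}")
print(f"intraclass correlation: {icc:.3f}")
```

With these assumed variances, the true intraclass correlation is 4 / (4 + 9) ≈ 0.31, and the estimates should land near that value. In practice, the same decomposition is obtained (along with fixed effects and standard errors) by fitting a random-intercept model in multilevel software.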

Fees: Consult the fee structure.

Tags: multilevel, hierarchical, clustered data, nested models, general linear mixed models

Course Sections

Section 1

Location: ICPSR -- Ann Arbor, MI

Date(s): July 21 - August 15

Time: 3:00 PM - 5:00 PM

Instructor(s):

  • Mark Manning, Wayne State University

Syllabus:
