Regression Analysis II: Linear Models


  • Brian Pollins, Ohio State University

The course is designed to give students a thorough foundation in Regression Analysis to prepare them for more advanced work in statistical modeling (Structural Equations, Maximum Likelihood Estimation, Bayesian Analysis, etc.). A brief introduction to Matrix Algebra will be provided, and the Ordinary Least Squares (OLS) estimator and its basic Gaussian properties will be re-introduced in matrix form. We then relax individual OLS assumptions to derive the Generalized Least Squares (GLS) estimator and learn the diagnostic procedures and corrections that constitute "Best Practice" for everyday research. More advanced questions in Ordinary and Generalized Least Squares regression are next examined in depth, including Regression Diagnostics, Collinearity, Model Specification, and Measurement Error. While the theoretical underpinnings of these issues and their solutions are presented, the main emphasis is on practical application. We finish with a brief look at the world beyond linear models: Maximum Likelihood Estimators.
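For orientation, the matrix-form results the course builds on are the standard textbook ones (this sketch is not drawn from the course materials themselves, and the notation is the conventional choice, not necessarily the instructor's):

```latex
% Linear model in matrix form: y is n x 1, X is n x k, beta is k x 1
y = X\beta + \varepsilon, \qquad
\mathbb{E}[\varepsilon] = 0, \qquad
\operatorname{Var}(\varepsilon) = \sigma^2 I_n

% OLS estimator and its variance under the Gaussian assumptions
\hat{\beta}_{\mathrm{OLS}} = (X^\top X)^{-1} X^\top y, \qquad
\operatorname{Var}(\hat{\beta}_{\mathrm{OLS}}) = \sigma^2 (X^\top X)^{-1}

% Relaxing the spherical-error assumption to Var(eps) = sigma^2 * Omega
% yields the Generalized Least Squares estimator:
\hat{\beta}_{\mathrm{GLS}} = (X^\top \Omega^{-1} X)^{-1} X^\top \Omega^{-1} y
```

When \(\Omega = I_n\), the GLS formula reduces to OLS, which is why the course can reach GLS by relaxing OLS assumptions one at a time.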

All necessary reading materials for this class will be available online or in a course packet (roughly 200 pages) provided by the ICPSR Program for a nominal fee. The packet will also contain all slides used in lecture by the instructor. A syllabus containing further suggested readings will also be provided.

Software: R is the class estimation package. You are strongly encouraged to install both R and RStudio before the course begins.

Fees: Consult the fee structure.

Tags: regression, linear models

Course Sections

Section 1

Location: ICPSR -- Ann Arbor, MI

Date(s): July 22 - August 16

Time: 3:00 PM - 5:00 PM


  • Brian Pollins, Ohio State University