Regression Analysis II: Linear Models

Instructor(s):

  • Brian Pollins, Ohio State University

The course is designed to give students a thorough foundation in Regression Analysis to prepare them for more advanced work in statistical modeling (Structural Equations, MLE, Bayesian Analysis, etc.). A brief introduction to Matrix Algebra will be provided, and the Ordinary Least Squares (OLS) estimator and its basic Gaussian properties will be re-introduced in matrix form. We then relax individual OLS assumptions to arrive at the Generalized Least Squares (GLS) estimator and learn the diagnostic procedures and corrections that constitute "Best Practice" for everyday research. More advanced questions in Ordinary and Generalized Least Squares regression are next examined in depth, including Regression Diagnostics, Collinearity, Model Specification, and Measurement Error. While the theoretical underpinnings of these issues and their solutions are presented, the main emphasis is on practical applications. We finish with a brief look at the world beyond linear models: Maximum Likelihood Estimators.
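
For readers who want a preview of the matrix notation the course builds on, the following is a minimal sketch (not part of the official course materials): with outcome vector y, design matrix X, and error covariance Omega, the two estimators named above take the form

\[
\hat{\beta}_{\mathrm{OLS}} = (X'X)^{-1}X'y,
\qquad
\hat{\beta}_{\mathrm{GLS}} = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y,
\]

where OLS is the special case \(\Omega = \sigma^2 I\), i.e., homoskedastic and uncorrelated errors.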

All necessary reading materials for this class will be readily available on-line or within a course packet (roughly 200 pages) made available by the ICPSR Program for a nominal fee. This packet will also contain all slides used in lecture by the Instructor. A syllabus containing further suggested readings will also be provided.

Fees: Consult the fee structure.

Tags: regression, linear models

Course Sections

Section 1

Location: ICPSR -- Ann Arbor, MI

Date(s): July 20 - August 14

Time: 1:00 PM - 3:00 PM

Instructor(s):

  • Brian Pollins, Ohio State University
