AMS 317, INTRODUCTION TO LINEAR REGRESSION ANALYSIS

Catalog Description:  Basic inference procedures and linear regression, model adequacy checking, transformations and weighted least squares, handling influential observations and outliers, polynomial regression modeling, use of indicator variables, multicollinearity, variable selection, introduction to logistic regression, and conventional and less common uses of linear regression in today’s cutting-edge scientific research. Understanding of the basic principles of applied regression model-building techniques in various fields of study, including engineering, management, and the health sciences.


Prerequisite: AMS 315; AMS 210 or MAT 211

 

Course Materials for Fall 2022:

"Linear Models with R" by Julian J. Faraway, 2nd edition, 2014, CRC Press/Taylor & Francis; ISBN:  978-1-4398-8733-2 (required; online version also acceptable)


Week 1.

Introduction: Statistical techniques for investigating and modeling the relationship between variables

Week 2.

Simple Linear Regression: least-squares estimation and its properties, Gauss-Markov theorem, statistical inference for parameters, prediction
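
For illustration, a minimal simple-linear-regression sketch in R (the simulated data frame dat and its variables x and y are placeholders, not course data):

    set.seed(1)
    dat <- data.frame(x = runif(50, 0, 10))
    dat$y <- 2 + 3 * dat$x + rnorm(50)

    fit <- lm(y ~ x, data = dat)      # least-squares fit of y on x
    summary(fit)                      # t-tests for the intercept and slope
    confint(fit)                      # interval estimates for the parameters
    predict(fit, newdata = data.frame(x = 5),
            interval = "prediction")  # prediction interval for a new observation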

Week 3.

Review of Linear Algebra; Multiple Linear Regression: least-squares estimation and its properties, Gauss-Markov theorem, statistical inference for parameters, prediction
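
A short R sketch of least squares in matrix form on simulated data (dat, x1, x2, y are illustrative names); the closed-form estimate is compared with lm():

    set.seed(2)
    n <- 100
    dat <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
    dat$y <- 1 + 2 * dat$x1 - 0.5 * dat$x2 + rnorm(n)

    X <- model.matrix(~ x1 + x2, data = dat)       # design matrix with intercept column
    beta_hat <- solve(t(X) %*% X, t(X) %*% dat$y)  # solves the normal equations (X'X) b = X'y
    beta_hat
    coef(lm(y ~ x1 + x2, data = dat))              # lm() gives the same estimates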

Week 4.

Multiple Linear Regression: Maximum likelihood estimation, design matrix, multicollinearity, variance inflation factor
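
A brief R sketch of the design matrix and a variance inflation factor, again with simulated, illustrative data; the vif() function in the car package computes the same quantity:

    set.seed(3)
    n <- 200
    x1 <- rnorm(n)
    x2 <- x1 + rnorm(n, sd = 0.3)            # deliberately correlated with x1
    dat <- data.frame(x1, x2, y = 1 + x1 + x2 + rnorm(n))

    fit <- lm(y ~ x1 + x2, data = dat)
    head(model.matrix(fit))                  # the design matrix X
    r2 <- summary(lm(x1 ~ x2, data = dat))$r.squared
    1 / (1 - r2)                             # VIF for x1: 1 / (1 - R^2 of x1 on the other predictors)
    # car::vif(fit)                          # same result via the car package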

Week 5.

Model Adequacy Checking: residual analysis, residual plots, partial regression
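
A minimal R sketch of residual analysis (simulated data; dat, x, y are placeholders):

    set.seed(4)
    dat <- data.frame(x = runif(80, 0, 5))
    dat$y <- 1 + 2 * dat$x + rnorm(80)
    fit <- lm(y ~ x, data = dat)

    rst <- rstandard(fit)                    # standardized residuals
    plot(fitted(fit), rst,
         xlab = "Fitted values", ylab = "Standardized residuals")
    abline(h = 0, lty = 2)
    par(mfrow = c(2, 2)); plot(fit)          # R's built-in diagnostic plots
    termplot(fit, partial.resid = TRUE)      # partial-residual plot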

Week 6.

Data Transformation: variance-stabilizing transformations, Box-Cox method, generalized regression
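
A short R sketch of the Box-Cox approach, using boxcox() from the MASS package on simulated data (all names illustrative):

    library(MASS)
    set.seed(5)
    dat <- data.frame(x = runif(100, 1, 10))
    dat$y <- exp(0.2 + 0.3 * dat$x + rnorm(100, sd = 0.2))   # right-skewed, positive response

    fit <- lm(y ~ x, data = dat)
    bc  <- boxcox(fit, lambda = seq(-1, 1, 0.05))   # profile likelihood over the power lambda
    bc$x[which.max(bc$y)]                           # lambda maximizing the likelihood
    fit_log <- lm(log(y) ~ x, data = dat)           # lambda near 0 suggests a log transform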

Week 7.

Regression Diagnostics: leverage points and influential points
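
A minimal R sketch of leverage and influence diagnostics on simulated data with one deliberately unusual point (names are illustrative):

    set.seed(6)
    dat <- data.frame(x = c(rnorm(40), 8))           # last point has high leverage
    dat$y <- 1 + 2 * dat$x + c(rnorm(40), 6)         # ...and a large error
    fit <- lm(y ~ x, data = dat)

    hatvalues(fit)                     # leverages (diagonal of the hat matrix)
    cooks.distance(fit)                # Cook's distance
    dffits(fit); dfbetas(fit)          # DFFITS and DFBETAS
    covratio(fit)                      # COVRATIO
    summary(influence.measures(fit))   # flags potentially influential cases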

Week 8.

Polynomial Regression: quadratic and higher-order models, piecewise regression
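
A brief R sketch of polynomial and simple piecewise fits (simulated data; the knot location is a hypothetical choice):

    set.seed(7)
    dat <- data.frame(x = runif(120, 0, 10))
    dat$y <- 1 + 0.5 * dat$x + 0.3 * dat$x^2 + rnorm(120)

    quad  <- lm(y ~ poly(x, 2), data = dat)                 # orthogonal quadratic polynomial
    quad2 <- lm(y ~ x + I(x^2), data = dat)                 # same model with raw powers
    knot  <- 5
    pw    <- lm(y ~ x + I(pmax(x - knot, 0)), data = dat)   # slope allowed to change at the knot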

Week 9.

Indicator Variables: regression with categorical variables, use of indicator variables, regression approach to ANOVA
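
A minimal R sketch of indicator variables for a categorical predictor and the regression route to one-way ANOVA (simulated data; group labels are illustrative):

    set.seed(8)
    dat <- data.frame(group = factor(rep(c("A", "B", "C"), each = 30)),
                      x     = rnorm(90))
    dat$y <- 1 + 2 * dat$x + c(A = 0, B = 1.5, C = -1)[dat$group] + rnorm(90)

    fit <- lm(y ~ x + group, data = dat)   # R builds 0/1 indicator columns automatically
    head(model.matrix(fit))                # shows the indicator columns groupB and groupC
    anova(lm(y ~ group, data = dat))       # one-way ANOVA as regression on indicators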

Week 10.

Multicollinearity: effect of multicollinearity, diagnostics, variance inflation factor, Ridge regression, principal component regression
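
A short R sketch of ridge and principal-component regression, using lm.ridge() and select() from MASS and prcomp() for the components (simulated, illustrative data):

    library(MASS)
    set.seed(9)
    n  <- 100
    x1 <- rnorm(n); x2 <- x1 + rnorm(n, sd = 0.2); x3 <- rnorm(n)
    dat <- data.frame(x1, x2, x3, y = 1 + x1 + x2 + 0.5 * x3 + rnorm(n))

    ridge <- lm.ridge(y ~ x1 + x2 + x3, data = dat, lambda = seq(0, 10, 0.1))
    select(ridge)                                             # suggested values of lambda

    pcs <- prcomp(dat[, c("x1", "x2", "x3")], scale. = TRUE)  # principal components
    pcr <- lm(dat$y ~ pcs$x[, 1:2])                           # regress y on the first two components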

Week 11.

Variable Selection: coefficient of multiple determination, Mallows’ Cp statistic, Akaike information criterion
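
A minimal R sketch of AIC-based stepwise selection on simulated data with two noise predictors (names illustrative):

    set.seed(10)
    n <- 150
    dat <- data.frame(x1 = rnorm(n), x2 = rnorm(n), x3 = rnorm(n), x4 = rnorm(n))
    dat$y <- 1 + 2 * dat$x1 - dat$x2 + rnorm(n)     # x3 and x4 carry no signal

    full <- lm(y ~ x1 + x2 + x3 + x4, data = dat)
    summary(full)$r.squared                         # coefficient of multiple determination
    sel <- step(full, direction = "both")           # stepwise search minimizing AIC
    AIC(full, sel)                                  # compare the two models by AIC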

Week 12.

Regression Validation: validation techniques, analysis of model coefficients and predicted values, cross validation
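
A brief R sketch of hold-out validation and a by-hand 5-fold cross-validation of prediction error (simulated data; the split sizes are arbitrary):

    set.seed(11)
    n <- 200
    dat <- data.frame(x = runif(n, 0, 10))
    dat$y <- 1 + 2 * dat$x + rnorm(n)

    train <- sample(n, size = 150)                          # hold-out split
    fit   <- lm(y ~ x, data = dat[train, ])
    mean((dat$y[-train] - predict(fit, dat[-train, ]))^2)   # hold-out mean squared error

    folds <- sample(rep(1:5, length.out = n))               # 5-fold cross-validation
    cv_mse <- sapply(1:5, function(k) {
      f <- lm(y ~ x, data = dat[folds != k, ])
      mean((dat$y[folds == k] - predict(f, dat[folds == k, ]))^2)
    })
    mean(cv_mse)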

Week 13.

Nonlinear Regression: expectation function, nonlinear least squares, transformation to linear model, parameter estimation
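
A minimal R sketch of nonlinear least squares with nls(), alongside the linearizing log transform (simulated data; starting values are illustrative):

    set.seed(12)
    dat <- data.frame(x = runif(80, 0, 5))
    dat$y <- 3 * exp(0.7 * dat$x) * exp(rnorm(80, sd = 0.1))

    nl <- nls(y ~ a * exp(b * x), data = dat,
              start = list(a = 1, b = 0.5))   # nls() needs starting values for a and b
    coef(nl)
    lin <- lm(log(y) ~ x, data = dat)         # transformation to a linear model
    coef(lin)                                 # intercept estimates log(a); slope estimates b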

Week 14.

Review


Learning Objectives for AMS 317, Introduction to Linear Regression Analysis: 

         

 • Describe and apply the linear regression process from conception through conclusion. This process involves:
      * Formulating the problem and collecting data;
      * Fitting a regression model, analyzing the data, and drawing inferences;
      * Interpreting the results and communicating conclusions.

• Demonstrate facility with, and a solid conceptual understanding of, the key tools of regression analysis, including:
      * Model fitting through least squares estimation;
      * Hypothesis testing and interval estimation for regression parameters;
      * Prediction of new observations.

• Demonstrate facility with, and a solid conceptual understanding of, the key tools of model adequacy checking, including:
      * definition of residuals;
      * residual plots;
      * partial regression and partial residual plots.

• Perform important transformations and weighting, such as:
       * variance stabilizing transformation;
       * Box-Cox method;
       * generalized regression technique.

• Demonstrate knowledge of diagnostics for leverage and influence, including:
      * Cook’s distance;
      * DFFITS and DFBETAS;
      * COVRATIO.

• Demonstrate knowledge of regression with categorical variables, by defining indicator variables.

• Understand methods for dealing with multicollinearity, including:
     * collecting additional data;
     * ridge regression;
     * principal component regression.

• Demonstrate knowledge of variable selection, using:
     * all possible regressions;
     * stepwise regression;
     * the Akaike information criterion.

• Demonstrate knowledge of generalized linear models and understand how to interpret them (see the sketch after this list), using:
     * logistic regression model;
     * link functions;
     * statistical inference on model parameters.
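
For illustration, a minimal R sketch of logistic regression as a generalized linear model with a logit link (simulated data; dat, x, y are placeholder names):

    set.seed(13)
    n <- 300
    dat <- data.frame(x = rnorm(n))
    p <- plogis(-0.5 + 1.2 * dat$x)                # inverse logit (the link function's inverse)
    dat$y <- rbinom(n, size = 1, prob = p)

    fit <- glm(y ~ x, family = binomial(link = "logit"), data = dat)
    summary(fit)                                   # Wald tests for the model parameters
    exp(coef(fit))                                 # exponentiated coefficients (odds scale)
    predict(fit, newdata = data.frame(x = 1), type = "response")   # fitted probability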